Why are my file uploads missing/deleted? Azure App Service

I'm currently developing a web application using Flask. The app takes information from a SQL database to generate a .pdf that the end user can download from the browser.
My problem is that some hours after I deploy the app using Azure App Service, make some changes to the SQL database, and generate some .pdf files, the app automatically resets to its original state. I lose all the changes to the SQL database and the generated documentation, as if the app had some kind of ephemeral memory.
Is there any way I can store these files without losing them after a while?
In this case, what Azure service/solution do you recommend for storing these files?
The amount of data generated once the app is in use should be fairly small; the SQL database will be updated once a month, with a couple of .pdf files generated.

Related

How can I save images uploaded to the file system across deploys on Google App Engine?

In my app deployed on Google App Engine, I save images uploaded by users in the public folder of my file system (using Multer).
Now, if I deploy a new version of my app, the uploaded images are lost. How do I solve this issue?
Is there any method with which I can deploy a new version of my app while keeping the uploaded images intact?
Basically, how can I back up my file system?
App Engine is a serverless and stateless product. You lose the images if you deploy a new version, but also when the service scales up and down, or if your instance is stopped for maintenance and restarted on another server (this is the case for App Engine Flex, which restarts at least once a week).
In this context, your design is not suited to a serverless product. You need to save the files elsewhere, typically in Cloud Storage, and load them from there. If you need to index, search, or list your files, it's also a common pattern to save the metadata in a database, Cloud Firestore for example, so you can easily search for the files and then download them from Cloud Storage.
Note: there is no persistent file system; it's an in-memory file system in the serverless environment. You can also hit out-of-memory errors if you store too many files.
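As a minimal sketch of that pattern in Python, using the google-cloud-storage and google-cloud-firestore client libraries; the bucket name, collection name, and field names here are hypothetical:

    # pip install google-cloud-storage google-cloud-firestore
    from google.cloud import firestore, storage

    BUCKET_NAME = "my-app-uploads"  # hypothetical bucket name

    def save_upload(local_path: str, object_name: str) -> str:
        """Upload a file to Cloud Storage and index it in Firestore."""
        # Write the file to Cloud Storage instead of the local disk.
        bucket = storage.Client().bucket(BUCKET_NAME)
        blob = bucket.blob(f"uploads/{object_name}")
        blob.upload_from_filename(local_path)

        # Keep searchable metadata in Firestore so files can be listed later.
        db = firestore.Client()
        db.collection("uploads").document(object_name).set({
            "bucket": BUCKET_NAME,
            "path": blob.name,
            "uploaded_at": firestore.SERVER_TIMESTAMP,
        })
        return blob.name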

Apply local DB changes to Azure SQL Database

I have a backup file that came from Server A. I copied that .bak file to my local machine and restored the DB in SQL Server Management Studio. After setting it up, I deployed it to Azure SQL Database. But there have since been changes to the data on Server A, because it's still being used, so I need to get all those changes into the Azure SQL Database I just deployed. How am I going to do that?
Note: I'm using Azure for my server and I have a local copy of the Server A database. So, in terms of data and structure, my local DB and the original Server A DB are the same. But after a few days, Server A's data is now updated, while my local DB is still as it was when I backed it up from Server A.
How can I update the DB in Azure to take all the changes in Server A and deploy them to Azure?
You've got a few choices. It's just a matter of migrating data, and a question of which data you're going to migrate. Let's say it's a neat, complete replacement. Then I'd suggest looking at the bacpac mechanism. That's a way to export a database, its structure and data, then import it into a new location. This is one mechanism for moving to Azure.
If you can't simply replace everything, you need to look at other options. First, there's SSIS. You can build a pipeline to move the data you need. There's also export and import through sqlcmd, which can connect to Azure SQL Database. You can also look at a third-party tool like Redgate SQL Data Compare as a way to pick and choose the data that gets moved. There are a whole bunch of other Extract/Transform/Load (ETL) tools out there that can help.
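If you end up hand-rolling the incremental option, here is a rough Python sketch using pyodbc; the table, its columns, and the LastModified tracking column are hypothetical, and you'd track the last sync time yourself:

    # pip install pyodbc  (also needs the Microsoft ODBC Driver for SQL Server)
    import pyodbc

    # Placeholder connection strings; substitute your own servers and credentials.
    SRC = ("DRIVER={ODBC Driver 18 for SQL Server};"
           "SERVER=serverA;DATABASE=mydb;UID=user;PWD=pass")
    DST = ("DRIVER={ODBC Driver 18 for SQL Server};"
           "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=user;PWD=pass")

    def copy_delta(last_sync: str) -> None:
        """Copy rows changed since last_sync from Server A into Azure SQL Database."""
        with pyodbc.connect(SRC) as src, pyodbc.connect(DST) as dst:
            rows = src.execute(
                "SELECT Id, Name, Amount FROM dbo.Orders WHERE LastModified > ?",
                last_sync,
            ).fetchall()
            cur = dst.cursor()
            for r in rows:
                # Upsert each changed row into the Azure copy.
                cur.execute(
                    """MERGE dbo.Orders AS t
                       USING (SELECT ? AS Id, ? AS Name, ? AS Amount) AS s
                       ON t.Id = s.Id
                       WHEN MATCHED THEN UPDATE SET Name = s.Name, Amount = s.Amount
                       WHEN NOT MATCHED THEN
                           INSERT (Id, Name, Amount) VALUES (s.Id, s.Name, s.Amount);""",
                    r.Id, r.Name, r.Amount,
                )
            dst.commit()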
Do you want to sync schema changes as well as data changes, or just data? If it is just data, then the best service to use would be the Azure Database Migration Service, which can help you copy the data delta to Azure incrementally, in both online and offline manner, and you can also decide on the schedule.

Initial remote data - Xamarin Forms

I have a Xamarin Forms application, and I have to fetch initial remote data (with images, maybe as URLs) and save that data as a cache in my app. Every time the application starts, the data has to be refreshed, and if it can't be, the cached data is used.
So far I have looked at Easy Tables, but it seems its focus is on saving user data in the cloud, and I don't want to do that.
I only want to get the initial data for the application, cache it, and refresh it every time the app starts.
I didn't find a scenario with Easy Tables where the app administrator loads the initial data (maybe by REST calls) and then the app only consumes that data without modifying it.
Could you give some advice on how to do this using Azure?
Thanks!
So far I have looked at Easy Tables, but it seems its focus is on saving user data in the cloud, and I don't want to do that.
Easy Tables work with a Node.js backend; you just need to add the table and your backend is automatically created for you. By using Offline Data Sync, you can create and modify data in your local store (e.g. SQLite) when your app is in offline mode; then, when your app is online, you can push local changes to your server or pull changes from your server into your local store. This may be an approach for you: you could just pull the data from the server and only read data from your local store.
I have a Xamarin Forms application, and I have to fetch initial remote data (with images, maybe as URLs) and save that data as a cache in my app.
I didn't find a scenario with Easy Tables where the app administrator loads the initial data (maybe by REST calls) and then the app only consumes that data without modifying it.
Per my understanding, if your initial data is mostly images and settings, without any sensitive data, I'd assume you could just leverage Azure Blob storage (image URLs or settings within a *.json file) or Azure Table storage, and use the related client SDK to retrieve the data and store it in your local SQLite DB or files.
I would prefer blob storage, since you can control access (anonymous access or delegated access permissions) to your blob data. For more details, you can refer to Managing security for blobs.
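As a minimal sketch of that pattern (shown in Python for brevity; the Xamarin client SDK follows the same shape), assuming a publicly readable container named app-config holding a settings.json blob; the account URL and names are hypothetical:

    # pip install azure-storage-blob
    import json
    from pathlib import Path
    from azure.storage.blob import BlobClient

    CACHE = Path("cache/settings.json")  # local cached copy

    def load_settings() -> dict:
        """Refresh settings from Blob storage, falling back to the cache."""
        blob = BlobClient(
            account_url="https://myaccount.blob.core.windows.net",  # placeholder
            container_name="app-config",
            blob_name="settings.json",
        )
        try:
            data = blob.download_blob().readall()
            CACHE.parent.mkdir(parents=True, exist_ok=True)
            CACHE.write_bytes(data)  # update the cache on a successful refresh
        except Exception:
            data = CACHE.read_bytes()  # offline: fall back to the cached copy
        return json.loads(data)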
You absolutely can do that with a Sync Table.
https://learn.microsoft.com/en-us/azure/app-service-mobile/app-service-mobile-xamarin-forms-get-started-offline-data
Just do a PullAsync in the splash screen to retrieve the values. You don't need to make use of the Post methods, and you can even remove them (or return errors) in your Azure TableController.

Pushing Bluemix app wipes files in the public folder

I have a Bluemix app (with a Node.js backend) to which I upload some files (into the folder public/uploads).
Whenever I change my server code and cf push the app, the files in the uploads folder are wiped. How can I publish my app without touching the files and folders I'd like to keep?
Thanks in advance.
This is happening because of the way Cloud Foundry works (Bluemix runs on Cloud Foundry). The cause is that the file system is ephemeral; it should not be used to store uploaded files.
When an app restarts, crashes, scales, or you upload a new version, the file system is wiped.
Additionally, if you scale your app to, for example, 5 instances, each instance of your app would have different uploads.
I would highly encourage you to check out the 12 Factor App. One of its tenets is not storing files on disk.
I would encourage you to use a shared object store such as OpenStack Swift. It is available in Bluemix and is called Object Storage.
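For illustration, a minimal sketch against a Swift-based Object Storage service using the python-swiftclient library (the app in the question is Node.js, where a Swift or S3 client plays the same role); the auth URL, credentials, and container name are placeholders:

    # pip install python-swiftclient
    import swiftclient

    # Placeholder credentials; take the real values from the Object Storage
    # service credentials bound to your app.
    conn = swiftclient.Connection(
        authurl="https://identity.example.com/v2.0",
        user="account:username",
        key="secret-key",
        auth_version="2",
    )

    def save_upload(name: str, payload: bytes) -> None:
        """Store an uploaded file in Object Storage instead of public/uploads."""
        conn.put_container("uploads")  # idempotent: creates the container if missing
        conn.put_object("uploads", name, contents=payload)

    def read_upload(name: str) -> bytes:
        """Fetch a previously stored file."""
        _headers, body = conn.get_object("uploads", name)
        return body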
Restaging will wipe your files on Cloud Foundry, as there is no permanent local storage available. Try using Bluemix Live Sync to edit your code on the fly, with no restaging required. You can either download the Bluemix Live CLI or use IBM DevOps Services to take advantage of Live Edit. The documentation goes over all the options.
For more permanent storage solutions, check out the Bluemix catalog for services like Cloudant and Redis.
The file system for Cloud Foundry applications is ephemeral. Every time you push, restart, or scale, you get a fresh file system. Your application should not store files on disk (except cache/temp files). It should store uploaded files in some kind of database or blob store. Look into the Cloudant service.
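As a rough sketch of the Cloudant route, using the python-cloudant library (the question's Node.js app would use the cloudant npm package instead); the credentials are placeholders, and document attachments only suit fairly small files:

    # pip install cloudant
    from cloudant.client import Cloudant

    # Placeholder credentials from the Cloudant service bound to your app.
    client = Cloudant("username", "password",
                      url="https://username.cloudant.com", connect=True)
    db = client.create_database("uploads", throw_on_exists=False)

    def save_upload(doc_id: str, filename: str, payload: bytes) -> None:
        """Store an uploaded file as an attachment on a Cloudant document."""
        doc = db.create_document({"_id": doc_id, "type": "upload"})
        doc.put_attachment(filename, "application/octet-stream", payload)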
Considerations for Designing and Running an Application in the Cloud

SQLite on Azure website

I've been trying to get SQLite to work on an Azure website. I have deployed everything successfully, but I need to point it at a file name for the database. I have looked at creating Blob storage, but I'm unsure how to turn that into a file name that SQLite will accept.
I'm sure this has been done, as I can see references to other SQLite-on-Azure issues.
I have read http://www.sqlite.org/whentouse.html.
Based on my experience, if you want to use SQLite with Azure Websites, you can keep the database file within your deployment package so it stays on the same server as your website. Azure Websites provide 1 GB of application storage, which is plenty for a database file. Your content persists with the website and access to the SQLite DB will be fast. This is super easy, and you can do it with an ASP.NET web application or any other.
The problem with choosing Azure Blob storage is that if the database file is stored in Azure Blob storage, there is no API through which SQLite can write to that file. So one option is to write locally first and then sync to Azure Blob storage back and forth, though others on SO may have other options. If you want to back up your database file to Azure Blob storage for any reason, you sure can do that separately; however, I think if you choose SQLite, the best approach is to keep the database file with the website to keep it simple.
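A minimal sketch of that "keep the file with the site, back it up separately" approach in Python, assuming the database lives under the site's persistent home folder and using the azure-storage-blob package for the optional backup; the storage connection setting and container name are placeholders:

    # pip install azure-storage-blob
    import os
    import sqlite3
    from azure.storage.blob import BlobClient

    # Keep the DB file under the site's persistent content area.
    DB_PATH = os.path.join(os.environ.get("HOME", "."), "data", "app.db")

    def get_conn() -> sqlite3.Connection:
        """Open (and create, if needed) the site-local SQLite database."""
        os.makedirs(os.path.dirname(DB_PATH), exist_ok=True)
        return sqlite3.connect(DB_PATH)

    def backup_to_blob() -> None:
        """Optionally copy the DB file to Blob storage as an off-site backup."""
        blob = BlobClient.from_connection_string(
            os.environ["AZURE_STORAGE_CONNECTION_STRING"],  # placeholder app setting
            container_name="backups",
            blob_name="app.db",
        )
        with open(DB_PATH, "rb") as f:
            blob.upload_blob(f, overwrite=True)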
