Windows Azure Blob Storage Emulator File Storage Location

Where does the Windows Azure Blob Storage Emulator (yes, the emulator) store files uploaded to it? As in, what is the actual folder (path) where it stores blobs on my local machine? I have it set up and running, and I can successfully upload and retrieve blobs, but I'd like to know where the emulator is actually storing the files. After uploading a blob, the address I get is:
http://127.0.0.1:10000/devstoreaccount1/mycontainer/picture.png
I am using XAMPP and the files don't seem to be in my htdocs folder. I'm just curious, that's all.

The storage emulator listens at the address you see there, but when requests come in it uses a SQL store as the back-end storage.
The storage emulator uses SQL Server LocalDB by default, or you can use the DSInit.exe command-line utility to point it at a full SQL Server instance. All table, queue, and blob data is then saved in that database. For blobs, the metadata is stored in the database, but the file contents are written to a directory under AppData. For example, one of mine was in c:\users\michael\appdata\local\developmentstorage\sql\blockblobroot\1\c1ba3640-ad8e-4cbb-8818-95c7d866cb71.
If you point your emulator at a SQL Express or SQL Server instance, you can then use SQL Management Studio to connect to that instance and dig into the tables. There is a table named Blob with a DirectoryPath column, which will tell you where the files are. I wouldn't mess around with the structure of this database, or with the file structure outside of the API or a storage tool, or you may destabilize your local emulator.
Also note that this is NOT how the data is stored in Windows Azure, only how the local emulator simulates it.
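For illustration, here is a minimal Python sketch (using the azure-storage-blob package) that uploads a blob to the emulator with the well-known devstoreaccount1 development credentials; the container and file names are just examples. The returned address matches the one in the question, while the bytes themselves land in the SQL/AppData backing store described above.

from azure.storage.blob import BlobServiceClient

# Well-known development-storage credentials baked into the emulator.
EMULATOR_CONN_STR = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IMPTdnFwTo79fcqT8/8vWVXGcGHcBHPhXGlkKJQ==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
)

service = BlobServiceClient.from_connection_string(EMULATOR_CONN_STR)
container = service.get_container_client("mycontainer")  # example container
try:
    container.create_container()
except Exception:
    pass  # container already exists

with open("picture.png", "rb") as data:
    container.upload_blob(name="picture.png", data=data, overwrite=True)
# Reachable at http://127.0.0.1:10000/devstoreaccount1/mycontainer/picture.png,
# yet no picture.png appears in any web-server folder such as htdocs.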

As of version 4.6
The storage emulator stores data of all types as files (without extensions) under the root folder:
C:\Users\username\AppData\Local\AzureStorageEmulator

Related

How to share Azure Storage Emulator among multiple development pc?

Hi, I am new to Azure development. We are planning to use blobs to store images. At development time, the local storage emulator stores blobs on each developer's local PC. Can we make it shared so all developers working on this project can use it to store and retrieve those blobs?
I dug around a lot but didn't find an answer.
Any help would be highly appreciated.
Thanks in advance.
Hi, the storage emulator just uses LocalDB to save the data.
See these answers: Windows Azure Blob Storage Emulator File Storage Location
DSInit has disappeared on upgrading Azure SDK to 2.3
You can change the save location to a SQL Server instance hosted on a shared server, which the whole team can then use:
WAStorageEmulator init /sqlInstance <shared sql server instance>

Azure configuration advice

I have an ASP.NET v2 website which stores uploaded content to the file system and a SQL Server database (using Full-Text Search).
I'm trying to work out what the best configuration option would be for me on Azure?
I would like to have the site scalable, but if I do this how can I ensure that the uploaded content is shared across all the sites?
Also, SQL Azure does not support Full-Text Searching, so does this mean I should set up a Virtual Machine and host it myself?
For your database, you'll want to run SQL Server in a Virtual Machine, as you'll then get the full functionality of SQL Server, including FTS. It's very simple to get up and running with SQL Server VMs, as there's a gallery image with SQL Server preinstalled.
Regarding your file system storage: This won't scale to multiple instances. You'll need another mechanism for storage. Typically this would be Blob Storage, but... it depends on what you're doing with the files. If you're just serving / storing content (you mentioned uploaded content), this works great, and it's accessible across many instances. If, on the other hand, it's some type of file-based database or index, that won't really work well.
If you need to do some type of local processing on the files (e.g. photo or movie rendering), you can easily copy a blob's contents to a local VM disk, process the file with typical drive paths, then upload the results to another blob.
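As a rough Python sketch of the blob-storage approach (the connection string, container name, and helper function are placeholders, not a prescribed design):

from azure.storage.blob import BlobServiceClient

# Placeholder connection string: substitute your real account name and key.
service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=<account>;"
    "AccountKey=<key>;EndpointSuffix=core.windows.net"
)
uploads = service.get_container_client("uploads")  # hypothetical container

def save_upload(filename, data):
    # Store uploaded content centrally so every instance sees the same files.
    blob = uploads.get_blob_client(filename)
    blob.upload_blob(data, overwrite=True)
    return blob.url  # serve the content from this URL on any instance

Because every instance talks to the same container, scaling out no longer depends on a shared local file system.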

SQLite on Azure website

I've been trying to get SQLite to work on an Azure website. I have deployed everything successfully but I need to point it to a file name for the database. I have looked at creating Blob storage but I'm unsure how to convert this into a file name that SQLite will accept.
I'm sure this has been done as I can see references to other issues related to SQLite on Azure.
I have read http://www.sqlite.org/whentouse.html.
Based on my experience, if you want to use SQLite with Azure Websites, you can keep the database file within your deployment package so it stays on the same server as your website. Azure Websites provide 1 GB of application storage, which is plenty for a database file. Your content persists with the website, and access to the SQLite DB will be fast. This is super easy and works with an ASP.NET web application or any other.
The problem with choosing Azure Blob storage is that if the database file is stored in Azure Blob storage, there is no API that lets SQLite write to that file. So one option would be to write locally first and then sync to Azure Blob storage back and forth, though others on SO may have other options. If you want to back up your database file to Azure Blob storage for any reason, you certainly can do that separately; however, if you choose SQLite, the best approach is to keep the database file with the website to keep things simple.
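A minimal sketch of the keep-it-with-the-site approach, in Python for brevity (the App_Data folder is just a conventional choice; any folder under the site root works):

import os
import sqlite3

# Assumes the working directory is the site root; the database file then
# lives on the website's persistent application storage, not in blob storage.
os.makedirs("App_Data", exist_ok=True)
db_path = os.path.join("App_Data", "site.db")

conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
conn.commit()
conn.close()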

Access Azure Storage Blob as file system

We have a Worker role on Windows Azure that runs ffmpeg to convert media files using MediaHandler Pro. The files that we'd like to process are saved in blob storage, and the resulting files should also be stored there.
Our problem is that ffmpeg works on local files and not on URIs from the blob storage. Is there any way to mount a blob storage container and access the files there directly as a file system?
If this is not possible, is it OK to download the files (they can be quite large, perhaps 1-2 GB) to the local file system*, process them there, and then upload them? This seems redundant.
*) We have set up a CloudDrive that downloads this blob to a virtual disk
You have a couple of ways of doing this: you can either create a cloud drive (a VHD uploaded as a page blob) and mount it, or you can download the source files locally and work on the scratch (local temp) disk. Of the two choices, I would download locally and use the scratch disk.
If you were to use a cloud drive, there would be three primary problems. First, it is a VHD and you have to mount it to get at the files. Second, only one instance can mount it for read/write, so you cannot split the encoding work across multiple workers saving to the same drive. Third, it is the slowest of all the storage options. For encoding, probably not a great choice.
Your best bet is to download the source files from blob storage (that is very fast, btw) into a 'Local Resource' (aka scratch disk) and work from there. Upload the resulting file into blob storage.
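A rough Python sketch of that download/encode/upload flow (the connection string, container names, and output format are placeholders; a real worker would use its Local Resource path rather than a temp directory):

import os
import subprocess
import tempfile

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
source = service.get_container_client("source-media")    # placeholder name
results = service.get_container_client("encoded-media")  # placeholder name

def encode(blob_name):
    scratch = tempfile.mkdtemp()  # stand-in for the role's Local Resource
    local_in = os.path.join(scratch, blob_name)
    local_out = os.path.join(scratch, "output.mp4")

    # 1. Pull the source blob down to local disk (fast inside the datacenter).
    with open(local_in, "wb") as f:
        source.download_blob(blob_name).readinto(f)

    # 2. ffmpeg works on plain file paths, so run it against the local copy.
    subprocess.run(["ffmpeg", "-i", local_in, local_out], check=True)

    # 3. Push the encoded result back up to blob storage.
    with open(local_out, "rb") as f:
        results.upload_blob(name=blob_name + ".mp4", data=f, overwrite=True)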
If your systems support SMB 3.0, you can simply map an Azure Storage file share as a drive using the Azure Files features now available.

How to write to a tmp/temp directory in Windows Azure website

How would I write to a tmp/temp directory in a Windows Azure website? I can write to a blob, but I'm using an npm package that requires me to give it file names so that it can write directly to those files.
Are you using Cloud Services (PaaS) or Virtual Machines (IaaS)?
If PaaS, look at Windows Azure Local Storage. This option gives you up to 250 GB of disk space per core. It's a great location for temporary storage of information, in a way that traditional apps will be familiar with. However, it's not persistent, so if you put anything there that needs to survive the VM instance being repaved, copy it to blob storage. Also, this storage is specific to a given role instance, so if you have two instances of the same role, they each have their own local storage buckets.
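A small Python sketch of that write-locally-then-persist pattern (the container name is a placeholder, and tempfile stands in for the role's local storage path):

import os
import tempfile

from azure.storage.blob import BlobServiceClient

# Write scratch output to local, non-durable disk first.
tmp_path = os.path.join(tempfile.gettempdir(), "report.txt")
with open(tmp_path, "w") as f:
    f.write("output that must survive the instance being repaved")

# Then copy anything worth keeping to blob storage, which is durable.
service = BlobServiceClient.from_connection_string("<connection-string>")
persisted = service.get_container_client("persisted")  # placeholder name
with open(tmp_path, "rb") as f:
    persisted.upload_blob(name="report.txt", data=f, overwrite=True)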
Alternatively, you can use Azure Drive, which allows you to keep the information persisted, but still doesn't allow multiple parallel writes.
If IaaS, then you can just mount a data disk to the VM and write to it directly. Data disks are already persisted to blob storage so there's little risk of data loss.
Just from my understanding, and please correct me if anything is wrong.
In Windows Azure Web Sites, the content of your website is stored in blob storage and mounted as a drive, which is used by all instances of your web site. And since it's in blob storage, it's persistent. So if you need the local file system, I think you can use the folders under your web site root path. But I don't think you can use the system tmp or temp folder.
