How to share the Azure Storage Emulator among multiple development PCs? - azure

Hi, I am new to Azure development. We are planning to use blobs to store images. At development time, the local storage emulator stores blobs on each developer's own PC. Can we make it shared so that all developers working on this project can use it to store and retrieve those blobs?
I have searched a lot but couldn't find any answer.
Any help would be highly appreciated.
Thanks in advance.

Hi, the storage emulator just uses LocalDB to save the data.
See these answers: Windows Azure Blob Storage Emulator File Storage Location
and DSInit has disappeared on upgrading Azure SDK to 2.3.
You can change the save location to a SQL Server instance hosted on a shared server, and hence share it among the other developers:
WAStorageEmulator init /sqlInstance <shared sql server instance>
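A minimal sketch of the full sequence (the server and instance names are placeholders; substitute your shared SQL Server instance):

```shell
# Stop the emulator before re-initializing its backing store
WAStorageEmulator stop

# Point the emulator at a shared SQL Server instance
# "SQLSERVER01\SQLEXPRESS" is a placeholder for your shared server\instance
WAStorageEmulator init /sqlInstance SQLSERVER01\SQLEXPRESS

WAStorageEmulator start
```

Note that each developer still runs their own emulator endpoint (127.0.0.1:10000) locally; this only moves the backing database to a shared server.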

Related

Upload backup files on Azure file storage from Ubuntu server

I need to upload backup files from my Ubuntu server to Azure file storage, but I am unable to do so. Please share any ideas or suggestions.
Thank you in advance!
Do you just want to store your GitLab backup files, or do you want to store and share them?
If you just want to store them, you can create Azure storage blobs to hold the backup files. On Linux you can install Azure CLI 1.0 or Azure CLI 2.0 to upload files to Azure blobs.
For more information about how to use CLI 1.0 or CLI 2.0 to upload files to Azure, please refer to the link.
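With Azure CLI 2.0 a blob upload might look like this (the account, container, and file names are placeholders):

```shell
# Upload a backup file to a blob container (assumes "az login" has been run)
az storage blob upload \
  --account-name mystorageacct \
  --container-name backups \
  --name gitlab-backup.tar \
  --file /var/opt/gitlab/backups/gitlab-backup.tar
```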
If you want to store and share the backup files, you can use an Azure file share. Azure Files speaks SMB 3.0, so you can mount the file share on your Ubuntu server and upload the backup files to it. You can then mount the same share on other machines to share the backup files.
For more information about the Azure file share service, please refer to the link.
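Mounting the share over SMB 3.0 on Ubuntu might look like this sketch (the storage account name "mystorageacct" and share name "backups" are placeholders):

```shell
# Install the CIFS utilities and mount an Azure file share
sudo apt-get install -y cifs-utils
sudo mkdir -p /mnt/backups
sudo mount -t cifs //mystorageacct.file.core.windows.net/backups /mnt/backups \
  -o vers=3.0,username=mystorageacct,password=<storage-account-key>,serverino
```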
Have you thought of using an agent tool to back up the data from Ubuntu to Azure cloud storage? I think it could be a way out. Have a look at CloudBerry; it may help you. I see no other option that doesn't take a lot of time and effort.
Azure File Storage on-premises access from all regions is now supported out of the box on Ubuntu 17.04; no extra setup is needed.
https://azure.microsoft.com/en-us/blog/azure-file-storage-on-premises-access-for-ubuntu/

Using AzCopy in azure virtual machine

I have an Azure virtual machine holding some application-specific CSV files (retrieved via FTP from on-premise) that need to be stored in a blob (and will eventually be read and pushed into an Azure SQL DB via a worker role). The question is about pushing the files from the VM to the blob. Is it possible to get AzCopy without installing the SDK to have the files copied to the blob? Is there a better solution? Please read the points below for further information.
Points to note:
1) Though the files could be uploaded directly to a blob rather than being brought into the VM first and copied from there, for security reasons the files must be pulled into the VM, and this cannot be changed.
2) I also thought about a worker role talking to a VM folder share (via a common virtual network) to pull the files and upload them to the blob, but after reading some blogs this does not appear to be the right solution, as it requires changes to both VMs (the worker role VM and the IaaS VM).
3) Azure File Service is still in preview (?) and hence cannot be used.
Is it possible to get AzCopy without installing the SDK to have the files copied to the blob?
Absolutely yes. You can directly download AzCopy binaries without installing SDK using the following links:
Version 3.1.0: http://aka.ms/downloadazcopy
Version 4.1.0: http://aka.ms/downloadazcopypr
Source: http://blogs.msdn.com/b/windowsazurestorage/archive/2015/01/13/azcopy-introducing-synchronous-copy-and-customized-content-type.aspx
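Once the standalone binaries are on the VM, a copy to blob storage might look like this sketch (the local folder, account, container, and key are placeholders):

```shell
# Copy all CSV files from a local folder on the VM to a blob container
# (classic AzCopy 3.x/4.x Windows syntax)
AzCopy /Source:C:\CsvDrop ^
       /Dest:https://mystorageacct.blob.core.windows.net/csvfiles ^
       /DestKey:<storage-account-key> ^
       /Pattern:*.csv
```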

Backup and restore Azure Storage Emulator data

Is there any information out there on how to backup and restore the data in the Azure Storage Emulator? (Notice this is not the live version of Azure.)
We are developing several different solutions against Windows Azure Storage but since the storage emulator is basically just "one storage account" we need to separate the data per project. Are there easy/handy ways to do this without having to manually extract & put back all the data?
Storage emulator data is stored in a local database. I'd assume it could be backed up and restored as needed, but I've never tried it personally.
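As a sketch, the emulator's LocalDB database could be backed up with sqlcmd; the instance name and database name below are assumptions that vary by SDK version, so adjust them to match yours:

```shell
# Back up the emulator's LocalDB database (names are version-dependent placeholders)
sqlcmd -S "(localdb)\v11.0" \
  -Q "BACKUP DATABASE WAStorageEmulatorDb34 TO DISK='C:\backup\emulator.bak'"

# Restore it later, replacing the current contents
sqlcmd -S "(localdb)\v11.0" \
  -Q "RESTORE DATABASE WAStorageEmulatorDb34 FROM DISK='C:\backup\emulator.bak' WITH REPLACE"
```

Note that block blob content lives as files under an AppData directory (only the metadata is in the database), so that folder would need to be copied alongside the database backup.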

Windows Azure Blob Storage Emulator File Storage Location

Where does the Windows Azure Blob Storage Emulator (yes, the emulator) store files uploaded to it? As in, what is the actual folder (path) that it is storing blobs on my local machine? I have it setup and running and I can successfully upload blobs and retrieve blobs, but I'd like to know where the emulator is actually storing the files. After uploading a blob, the address I get is:
http://127.0.0.1:10000/devstoreaccount1/mycontainer/picture.png
I am using XAMPP and the files don't seem to be in my htdocs folder. I'm just curious, that's all.
The storage emulator listens at the address you see there, but when requests come in it uses the SQL store as the back end storage.
The storage emulator uses SQL Server LocalDB by default, or you can use the DSInit.exe command line utility to point it at a full SQL Server instance. All table, queue and BLOB data is then saved in that database. In the case of BLOBs the metadata is stored in the database, but then the file is stored in an appdata directory. For example, one of mine was in c:\users\michael\appdata\local\developmentstorage\sql\blockblobroot\1\c1ba3640-ad8e-4cbb-8818-95c7d866cb71.
If you point your emulator at a SQL Express or SQL Server instance, you can use SQL Management Studio to connect to that instance and dig into the tables. There is a table named Blob with a DirectoryPath column which will tell you where the files are. I wouldn't mess with the structure of this database, or the file structure, other than through the API or a storage tool, or you may destabilize your local emulator.
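For example, a query like this against the emulator's database should reveal the on-disk locations (the table and column names come from the answer above; the exact schema may differ between SDK versions):

```sql
-- List where the emulator keeps each blob's backing file
SELECT DirectoryPath FROM Blob;
```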
Also note that this is NOT how the data is stored in Windows Azure, only how the local emulator simulates it.
As of version 4.6, the storage emulator stores data of all types as files (without extensions) under the root folder:
C:\Users\username\AppData\Local\AzureStorageEmulator

Shared Umbraco Media Folder on Azure Cloud Instances

I have just implemented Umbraco in an Azure Cloud Instance. I was able to migrate my existing SQL database to SQL Azure and everything runs fine, except for the images and documents inside the media folder.
By default the media folder resides in [siteroot]/Media.
Is there a way to map this folder to Azure storage? If not, I don't think I'll be able to scale out my cloud instances, since the images depend on the virtual server's local storage.
Edit: Bounty Started
What I have so far is this:
1) Define a standalone web role which would hold the media directory and all the files.
2) Map this folder to the Azure Blob Storage service with Cloud Drive, in order to minimize the risk of losing data and relying on a single point of storage.
3) Somehow (and this is the part I don't know how to accomplish) keep the [siteRoot]/media folder on all running instances synced with this shared drive.
I've seen a similar approach taken with the Azure Accelerator project from Umbraco here: http://azureaccelerators.codeplex.com/releases
But they haven't updated the release since 2011, and I'm not sure it would work with the current version of Azure.
Edit 2:
Umbraco has their own accelerator, but they've deprecated it in favor of using Websites instead of Web Roles:
https://github.com/Microsoft-DPE/wa-accelerator-umbraco
This release works with the 1.6 SDK; the current version is 1.8, I believe.
I'm not sure about a way of mapping the path to storage, but depending on the version of Umbraco you are using, I think from 4.9 (possibly 4.10) they introduced FileSystemProviders configuration which may help solve your problem.
My understanding is that it allows you to replace the default Umbraco file system provider, Umbraco.Core.IO.PhysicalFileSystem, with your own custom implementation. I'm pretty sure you could implement an Azure-based provider that writes to and reads from blob storage. In the source it looks fairly straightforward: a matter of implementing their IFileSystem.
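As a sketch, wiring a custom provider in via FileSystemProviders.config might look like this (the AzureBlobFileSystem type, its assembly, and its parameter keys are hypothetical; only the alias and config shape follow Umbraco's convention):

```xml
<!-- FileSystemProviders.config: replace the default media provider
     with a hypothetical Azure Blob Storage implementation -->
<FileSystemProviders>
  <Provider alias="media" type="MyProject.AzureBlobFileSystem, MyProject">
    <Parameters>
      <add key="connectionString"
           value="DefaultEndpointsProtocol=https;AccountName=mystorageacct;AccountKey=..." />
      <add key="containerName" value="media" />
    </Parameters>
  </Provider>
</FileSystemProviders>
```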
Ended up using Matt Brailsford's Universal Media Picker solution:
http://our.umbraco.org/projects/backoffice-extensions/universal-media-picker
The final solution actually circumvents the Umbraco media folder and reads directly from Blob Storage, so I had to rewrite all the macros and templates that previously rendered images and point them directly at the Blob Storage account.
Unfortunately there's no way to map an NTFS directory to Blob Storage directly.
Have a look at the CloudDrive class of the Windows Azure SDK. This feature allows you to upload a Virtual Hard Disk file (.vhd file) into your blob storage and mount it as a local drive inside Windows Azure Instances.
You should know that (if you're using multiple instances) only one cloud instance can mount the VHD in read/write mode; the rest have only read access to the drive. If the media folder stores static content that you update manually only occasionally, this is okay. But if user content is placed there too, you might want only one instance to mount the VHD and grant the other instances access to it via a network share.
This package by Ali Sheikh Taheri solves the media folder problem:
http://our.umbraco.org/projects/backoffice-extensions/ast-azure-media-sync
