Where are Azure Storage Explorer configurations stored?

On macOS, I'm trying to restore Microsoft Azure Storage Explorer settings/configurations from an old hard drive backup. I'd like to get all the previous account connections back without having to set them up again manually. Where is this data stored in the macOS directory structure so I can copy it to the new hard drive?

Not sure if that's doable as of today, but this is definitely a much-requested feature:
Feature request : backup file #754
Export Settings (include Quick Access) #2880
Persisting the Transfer and such Explorer settings at machine level. #4169
Feel free to comment on any or all of the above issues to add more context, or create a new issue for the Storage Explorer team to evaluate and prioritize.

Related

Scanning for malware in files uploaded to Azure

I am building a .NET app (Azure App Service) with a file upload feature that uploads PDF/DOCX files to Azure Blob Storage.
I was just wondering: if a malicious user uploads a virus-infected file, or a Word file with a bad macro, is Azure able to scan and remove/quarantine that file?
In the app, an admin user will be able to download the file using a URL. Will that file be virus-free? Or do I need to explicitly run an antivirus product like Symantec Endpoint Protection after the admin downloads the file?
Please let me know your expert thoughts.
Looks like Azure Advanced Threat Protection is now available for Blob Storage (and other Azure resources).
At this moment, there's no solution for that available on Azure; you'd need to use a third-party product or build one yourself. In the future, you'll be able to use https://learn.microsoft.com/en-us/azure/security-center/threat-protection
There is an open source solution that my team implemented. It's a small antivirus system that sends every blob uploaded to a specific container to an antivirus scan (using a VM with Microsoft Defender) and then moves the blob to a different container based on the scan result.
The remediation step can also be modified.
You can use that solution to scan each uploaded blob and download blobs only from a "clean" container.
Here's a link to the repo: azure-storage-av-automation.
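To illustrate the routing part of that design, here is a minimal sketch of moving a blob into a clean or quarantine container based on a scan verdict. It assumes the current Azure.Storage.Blobs package, and the container names are placeholders rather than the repo's actual identifiers:

using System.Threading.Tasks;
using Azure.Storage.Blobs;

class BlobQuarantine
{
    // Copy the scanned blob into the appropriate container, then delete the original.
    static async Task MoveByScanResultAsync(string connectionString, string blobName, bool isClean)
    {
        var service = new BlobServiceClient(connectionString);
        var source = service.GetBlobContainerClient("upload-container").GetBlobClient(blobName);
        var target = service
            .GetBlobContainerClient(isClean ? "clean-container" : "quarantine-container")
            .GetBlobClient(blobName);

        // Server-side copy; within the same storage account no SAS is needed.
        await target.StartCopyFromUriAsync(source.Uri);
        await source.DeleteAsync();
    }
}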

Rename Azure Storage Table?

Is it not possible to rename an Azure Storage Table?
I cannot seem to find anything online (not even cmdlets). There are no options for this in Visual Studio Server Explorer, Cloud Storage Studio or TableXplorer.
You're correct. It is not possible to rename an Azure Storage Table (or Blob Container or Queue for that matter).
A possible solution would be to download all entities from the table and upload them again into another table. Once all entities are uploaded, you can delete the old table. When downloading entities, please do keep the continuation token in mind, as querying a table returns at most 1,000 entities per request.
You can download all entities using either Cloud Storage Studio (or Azure Management Studio) from Cerebrata, or TableXplorer. If you want, you can use Azure Management Cmdlets from Cerebrata as well; it has cmdlets to export a table (Export-Table) and restore a table (Restore-Table).
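If you'd rather script the copy-then-delete approach, a rough sketch with the current Azure.Data.Tables package looks like this (the table names are placeholders):

using Azure.Data.Tables;

class TableRename
{
    static void CopyTable(string connectionString)
    {
        var service = new TableServiceClient(connectionString);
        var source = service.GetTableClient("oldtable");
        var target = service.GetTableClient("newtable");
        target.CreateIfNotExists();

        // Query<T> follows continuation tokens for you, paging through
        // the at-most-1,000 entities the service returns per request.
        foreach (TableEntity entity in source.Query<TableEntity>())
        {
            target.AddEntity(entity);
        }

        source.Delete(); // remove the old table once everything is copied
    }
}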
Now, you can rename Azure Tables with Microsoft's "Microsoft Azure Storage Explorer" (after version 0.8.3). You can also rename containers and file shares with this tool. See the release notes here.
Note that this feature has the following disclaimer during usage.
Renaming works by copying to the new name, then deleting the source item. Renaming a table currently loses the table's properties and metadata, and may take a while if there are lots of entities.
Therefore this is not an actual rename behind the scenes, and it incurs read/write/transaction costs.
You can also use AzCopy, a Microsoft command-line tool for downloading and moving table data.
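Note that table support lived in the older AzCopy (v7.x); the invocation below follows that version's /Source, /Dest, and /Manifest pattern, but the exact flags depend on your AzCopy version, so treat the account name, key, and paths as placeholders:

AzCopy /Source:https://myaccount.table.core.windows.net/oldtable/ /Dest:C:\export\ /SourceKey:<key> /Manifest:oldtable.manifest
AzCopy /Source:C:\export\ /Dest:https://myaccount.table.core.windows.net/newtable/ /DestKey:<key> /Manifest:oldtable.manifest /EntityOperation:InsertOrReplace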

Create Azure Table Storage

Can I check if my understanding is correct here?
To create an Azure Storage table, do I have to use C#, JavaScript, PHP, etc.?
Is there no GUI for simply creating a table? And if there is a GUI, is it a popular/recommended approach or a niche thing?
What you're looking for is a storage explorer. There are many storage explorers available in the market today, both open source and commercial (paid and free). Please see this blog post from the Windows Azure Storage Team for a list of storage explorers: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx.
Apart from these, Visual Studio also has a storage explorer built into it. You can find that in the Server Explorer. I haven't used Eclipse but I have heard that there's a storage explorer there as well.
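For completeness, creating a table from code only takes a couple of lines; here is a minimal sketch using the current Azure.Data.Tables package (the connection string and table name are placeholders):

using Azure.Data.Tables;

var service = new TableServiceClient("<storage-connection-string>");
service.CreateTableIfNotExists("mytable"); // no-op if the table already exists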

Shared Umbraco Media Folder on Azure Cloud Instances

I have just implemented Umbraco in an Azure Cloud Instance. I was able to migrate my existing SQL Database to run on SQL Azure and everything runs fine, except for the images and documents inside the media folder.
By default the media folder resides in [siteroot]/Media.
Is there a way to map this folder to Azure storage? If not, I don't think I'm going to be able to scale out my cloud instances, since the images depend on the virtual server's local storage.
Edit: Bounty Started
What I have so far is this:
1. Define a stand-alone web role which would hold the media directory and all the files.
2. Map this folder to the Azure Blob Storage service with Cloud Drive, in order to minimize the risk of losing data and relying on a single point of storage.
3. Somehow (and this is the part I don't know how to accomplish) keep the [siteRoot]/media folder synced with this shared drive on all running instances.
I've seen a similar approach taken with the Azure Accelerator project from Umbraco here: http://azureaccelerators.codeplex.com/releases
But they haven't updated the release since 2011, and I'm not sure it would work with the current version of Azure.
Edit 2:
Umbraco has their own accelerator, but they've deprecated it in favor of using Websites instead of Web Roles:
https://github.com/Microsoft-DPE/wa-accelerator-umbraco
This release works with the 1.6 SDK; the current version is 1.8, I believe...
I'm not sure about a way of mapping the path to storage, but depending on the version of Umbraco you are using (I think from 4.9, possibly 4.10), they introduced FileSystemProviders configuration, which may help solve your problem.
My understanding is that it allows you to replace the default Umbraco file system provider, Umbraco.Core.IO.PhysicalFileSystem, with your own custom implementation. I'm pretty sure you could implement an Azure-based provider that writes to and reads from blob storage. In the source it looks fairly straightforward: a matter of implementing their IFileSystem.
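As a rough illustration of that idea, a blob-backed provider could look something like the sketch below. The member names mirror what I recall of Umbraco's IFileSystem, and the blob calls use the current Azure.Storage.Blobs package rather than the SDK of that era, so treat it as a starting point only:

using System.IO;
using Azure.Storage.Blobs;

// Sketch of a blob-backed media file system; only a few representative members shown.
public class AzureBlobFileSystem // would implement Umbraco.Core.IO.IFileSystem
{
    private readonly BlobContainerClient _container;

    public AzureBlobFileSystem(string connectionString)
    {
        _container = new BlobContainerClient(connectionString, "media");
        _container.CreateIfNotExists();
    }

    public void AddFile(string path, Stream stream)
        => _container.GetBlobClient(path).Upload(stream, overwrite: true);

    public Stream OpenFile(string path)
        => _container.GetBlobClient(path).OpenRead();

    public bool FileExists(string path)
        => _container.GetBlobClient(path).Exists().Value;

    public void DeleteFile(string path)
        => _container.GetBlobClient(path).DeleteIfExists();

    // GetUrl, GetFiles, GetLastModified, etc. omitted.
}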
Ended up using Matt Brailsford's Universal Media Picker solution:
http://our.umbraco.org/projects/backoffice-extensions/universal-media-picker
The final solution actually circumvents the Umbraco media folder and reads directly from blob storage, so I had to rewrite all the macros and templates that previously rendered images and point them directly at the Blob Storage account.
Unfortunately there's no way to map an NTFS directory to blob storage directly.
Have a look at the CloudDrive class of the Windows Azure SDK. This feature allows you to upload a Virtual Hard Disk file (.vhd file) into your blob storage and mount it as a local drive inside Windows Azure Instances.
You should know that (if you're using multiple instances) only one cloud instance can mount the VHD in read/write mode; the rest of them get only read access to the drive. If the Media folder stores static content that you update manually only a few times, this is okay. But if user content is placed there too, you might want only one instance to mount the VHD and grant the other instances access to it via a network share.
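For reference, mounting looked roughly like this in the Windows Azure SDK of that era. Method names here are from memory and changed between SDK versions, so take this as a sketch rather than exact API:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

// Mount a page-blob VHD as a local drive inside a role instance.
var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

// A local resource declared in the service definition serves as the drive cache.
LocalResource cache = RoleEnvironment.GetLocalResource("DriveCache");
CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

CloudDrive drive = account.CreateCloudDrive("drives/media.vhd");
drive.CreateIfNotExist(1024);                                      // size in MB
string driveLetterPath = drive.Mount(25, DriveMountOptions.None);  // e.g. "a:\"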
This package provided by Ali Sheikh Taheri solves the problem of the media folder
http://our.umbraco.org/projects/backoffice-extensions/ast-azure-media-sync

How to perform a Windows Azure Backup?

I'm starting to use Windows Azure to manage my Azure databases. I'm not experienced in the IT world; I'm just looking for a way to back up my database (preferably to a local computer) and restore it.
I started reading from here:
http://msdn.microsoft.com/en-us/library/jj650016.aspx#copy
And I ran this code:
CREATE DATABASE destination_database_name
AS COPY OF [source_server_name].source_database_name
But I'm not sure if it's working: in the image below, contoso2 is my original database and the other one is the copy, and the copy does not have any tables from the original source.
So, please guide me on how to back up my databases without using commercial products.
If you need additional data, please let me know.
I recommend reading Business Continuity in Windows Azure SQL Database, which explains the underlying infrastructure available to you and the two main mechanisms for backup: copy database and export/import.
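For the copy-database route, note that the copy is asynchronous, so an empty-looking destination may just mean the copy hasn't finished yet. You can check its state from the master database, for example:

-- The destination shows state_desc = 'COPYING' until the copy completes.
SELECT name, state_desc FROM sys.databases WHERE name = 'destination_database_name';
-- Progress details for in-flight copies; the row disappears when the copy is done.
SELECT * FROM sys.dm_database_copies;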
You have third-party products available, some of which don't require you to purchase anything. Here is a good summary, which is still valid. You can also use the Export/Import feature available right off the management portal of Windows Azure.
Well, it is easy if you are using SQL Server 2012. If you are not, you can install the Express version.
Select the database you want to back up in the new Windows Azure portal: https://manage.windowsazure.com
In the footer you will have an option to import/export. Click Export. This opens a modal popup; select the storage account you want to use and type in an appropriate name for the *.bacpac file.
Once the file is saved to storage, download it locally and open SQL Server 2012 Management Studio. Select the database server, right-click it, and in the context menu you will find Import Data-tier Application. Select the bacpac file from your local machine and follow the settings.
At the end you will have your data residing on your local machine.
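If you prefer the command line to the portal and Management Studio, the SqlPackage tool that ships with SQL Server 2012 can do the same export and import (server names, credentials, and paths below are placeholders):

SqlPackage.exe /Action:Export /SourceServerName:myserver.database.windows.net /SourceDatabaseName:contoso2 /SourceUser:admin /SourcePassword:<password> /TargetFile:C:\backups\contoso2.bacpac
SqlPackage.exe /Action:Import /SourceFile:C:\backups\contoso2.bacpac /TargetServerName:(local) /TargetDatabaseName:contoso2_restored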
