I am building a .NET app (Azure App Service) with a file upload feature that uploads PDF/DOCX files to Azure Blob Storage.
I was just wondering: if a malicious user uploads a virus-infected file or a Word file with a bad macro, is Azure able to scan and remove/quarantine that file?
In the app, an admin user will be able to download the file using a URL. Will that file be virus-free, or do I need to explicitly run an antivirus such as Symantec Endpoint Protection after the admin downloads the file?
Please let me know your expert thoughts.
Looks like Advanced Threat Protection for Azure Storage is now available for Blob Storage (and other Azure resources).
At this moment, there's no solution for that available on Azure. You'd need to use a third-party product or build one yourself. In the future, you'll be able to use https://learn.microsoft.com/en-us/azure/security-center/threat-protection
There is an open-source solution that my team implemented. It's a small antivirus system that sends every blob uploaded to a specific container to an antivirus scan (using a VM with Microsoft Defender), and based on the scan result the blob is moved to a different container.
The remediation step can also be customized.
You can use that solution to send each uploaded blob to be scanned, and download blobs only from a "clean" container.
Here's a link to the repo: azure-storage-av-automation.
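If you adopt that layout, the app code itself stays simple: uploads go to the container the scanner watches, and the admin download path reads only from the clean container. Here is a minimal sketch with the Azure.Storage.Blobs SDK; the container names are assumptions, use whatever the tool is configured with:

    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;

    public static class ScannedFileStore
    {
        // User uploads go to the container that the AV pipeline watches.
        public static Task UploadForScanningAsync(
            BlobServiceClient service, string fileName, Stream content) =>
            service.GetBlobContainerClient("uploads")          // assumed incoming container name
                   .GetBlobClient(fileName)
                   .UploadAsync(content, overwrite: true);

        // Admin downloads read exclusively from the clean container, so a blob
        // that never passed the scan simply won't be found there.
        public static Task<Stream> DownloadCleanAsync(BlobServiceClient service, string fileName) =>
            service.GetBlobContainerClient("clean-container")  // assumed clean container name
                   .GetBlobClient(fileName)
                   .OpenReadAsync();
    }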
I have a problem that I have been wracking my brain about and figured I would need some perspective and insight from people who are a lot more knowledgeable about this.
What I have currently: a web-based application hosted in Azure uses Azure blob storage to store files that are generated as part of data import processes. We have a separate application that extends the original web application and allows users to upload files; these files are currently also stored in Azure blob storage.
Where I am trying to go: I have a requirement for the ability to map network file shares on a user's laptop and access the files that currently reside in blob storage.
Since Azure Blob Storage does not support SMB, I have no way of actually doing this with blob storage alone.
I could use Azure Files in conjunction with a file server running the sync agent. However, this requires a lot of work in terms of refactoring, setup, and a custom service that adds/removes permissions on the file server.
I'm wondering if there is a service or a piece of software on the market today that allows me to continue using blob storage and perhaps sync the blob files to a file server, which can then allow users to access and open the files using Windows File Explorer. I found one that looks like an open-source project, but it only does a one-way sync from the blob to the file share. Ideally I'd like to find a solution that does a two-way sync like Azure File Sync does.
Any thoughts and ideas will be appreciated.
Since the maximum number of blob containers and file shares is unlimited, per my understanding you could leverage the following approaches:
Migrate the data from blob storage to an Azure file share, so that subsequent files are stored in Azure File Storage.
Note: currently you must specify the storage account key when mounting file shares; for details you could follow this feedback item. For that reason, I would recommend not mapping network file shares on users' laptops.
You could still use blob storage: create a blob container for each user and generate a container-level SAS token for each of them, then the users could leverage Azure Storage Explorer to manage their blob files, or use AzCopy and other command-line tools to download the blob files to their laptop's file system.
Note: for security, you could combine a stored access policy with the SAS; to revoke the permissions you just need to invalidate the related access policy instead of regenerating the account key. For details, see Controlling a SAS with a stored access policy and Shared Access Signatures, Part 2: Create and use a SAS with Blob storage.
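As a concrete illustration of that note, here is a rough sketch (not the code from the linked articles) of creating a stored access policy on a user's container and issuing a container SAS tied to it, using the current Azure.Storage.Blobs SDK; the policy name, lifetime and permissions are just examples:

    using System;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;
    using Azure.Storage.Sas;

    public static class UserContainerSas
    {
        // Creates (or replaces) a stored access policy on the user's container and
        // returns a container-level SAS URI tied to that policy. Revoking access
        // later only requires removing/changing the policy, not the account key.
        public static async Task<Uri> CreateAsync(BlobContainerClient container, string policyName)
        {
            var policy = new BlobSignedIdentifier
            {
                Id = policyName,
                AccessPolicy = new BlobAccessPolicy
                {
                    PolicyStartsOn = DateTimeOffset.UtcNow.AddMinutes(-5),
                    PolicyExpiresOn = DateTimeOffset.UtcNow.AddDays(30),
                    Permissions = "rl" // read + list
                }
            };
            await container.SetAccessPolicyAsync(permissions: new[] { policy });

            // The SAS references the policy by name instead of carrying its own
            // start/expiry/permissions, so invalidating the policy kills the SAS.
            var sasBuilder = new BlobSasBuilder
            {
                BlobContainerName = container.Name,
                Resource = "c",
                Identifier = policyName
            };
            return container.GenerateSasUri(sasBuilder);
        }
    }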
Our application stores files uploaded by our customers in blob storage. These files are exchanged between different parties (our customers and their suppliers). Is there a way to check the uploaded files for viruses? The Antimalware service seems to only check virtual machines, and I cannot find any information about using it to check files as a service.
A great solution would be if we could store such a file in Azure Storage as an "on hold" file until it is checked. Then we would need a service that checks this file and returns the result. If the file is virus-free, we could then move it to its final destination.
Azure Storage is just... storage. There are no utilities built in, such as antivirus. You'd need to do your antivirus check on your own. Since antivirus tools typically only work with local OS storage, you'd need to place your "on hold" content (as you referred to it) on a local disk somewhere that you have antivirus installed and then copy to blob storage once your antivirus check is done.
How you accomplish managing this, and which software you use, is up to you. But VMs, App Services, and Cloud Services (web/worker roles) all have local disks available.
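For example, the flow could look roughly like this: write the upload to a local "on hold" folder, run whatever antivirus is installed on that machine, and only copy the file to blob storage if it comes back clean. This is only a sketch; RunScanner stands in for your AV product's command line or API:

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;

    public static class OnHoldUpload
    {
        // Saves the uploaded content to a local "on hold" folder, runs the scanner
        // installed on the machine, and only copies clean files to blob storage.
        public static async Task<bool> ScanThenUploadAsync(
            Stream upload, string fileName, BlobContainerClient container)
        {
            var holdDir = Path.Combine(Path.GetTempPath(), "onhold");
            Directory.CreateDirectory(holdDir);
            var holdPath = Path.Combine(holdDir, fileName);

            await using (var file = File.Create(holdPath))
                await upload.CopyToAsync(file);

            if (!RunScanner(holdPath))      // scanner reported the file as infected
            {
                File.Delete(holdPath);
                return false;
            }

            await using var clean = File.OpenRead(holdPath);
            await container.GetBlobClient(fileName).UploadAsync(clean, overwrite: true);
            File.Delete(holdPath);
            return true;
        }

        // Placeholder: invoke your antivirus here and interpret its result.
        private static bool RunScanner(string path) => throw new NotImplementedException();
    }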
As the other answer states, Azure Storage is just storage. There are a couple of ways you could do this, though.
The first solution would be to run your own antivirus and either use it as a gateway, or programmatically download the file from blob storage, check the file, and then take the appropriate action. It's possible to run something like ClamAV to do this yourself.
Alternatively you could use a third-party service like AttachmentScanner (which is exactly what you mention in your comment), which will accept a URL or a direct file upload. With Azure you can generate a temporary URL pointing to the file with an expiration of a few minutes, pass that URL to AttachmentScanner, and then take the appropriate action depending on the result.
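Generating that short-lived link is a one-liner with the current Azure.Storage.Blobs SDK (a sketch; how you then submit the URL to the scanning service depends on their API):

    using System;
    using Azure.Storage.Blobs;
    using Azure.Storage.Sas;

    public static class TemporaryLinks
    {
        // Read-only URL that expires after a few minutes, suitable for handing to
        // an external scanning service. The BlobClient must be constructed with
        // the account key (e.g. from a connection string) so it can sign the SAS.
        public static Uri GetTemporaryReadUrl(BlobClient blob, int minutes = 5) =>
            blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(minutes));
    }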
I read an article about virus scanning for blob storage. Might be useful for you.
The author uses an Azure Function with a blob trigger to catch new and changed blobs and sends the blob content to a virus scanner. The virus scanner runs in a Docker container.
Full implementation details are available at the link below:
https://peterrombouts.nl/2019/04/15/scanning-blob-storage-for-viruses-with-azure-functions-and-docker/
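The shape of that function is roughly the following (a sketch only; the scanner endpoint here is a placeholder, and the article's actual implementation details may differ):

    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class ScanUploadedBlob
    {
        private static readonly HttpClient Http = new HttpClient();

        // Fires whenever a blob lands in the "uploads" container and posts its
        // content to a scanner endpoint for analysis.
        [FunctionName("ScanUploadedBlob")]
        public static async Task Run(
            [BlobTrigger("uploads/{name}")] Stream blob,
            string name,
            ILogger log)
        {
            var response = await Http.PostAsync(
                "http://your-scanner-host:9000/scan",   // hypothetical scanner endpoint
                new StreamContent(blob));

            log.LogInformation("Scan of {Name} returned {Status}", name, response.StatusCode);
            // Based on the result you could move or delete the blob here.
        }
    }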
You can use Azure Defender for Storage to detect the following:
Suspicious access patterns - such as successful access from a Tor exit node or from an IP considered suspicious by Microsoft Threat Intelligence
Suspicious activities - such as anomalous data extraction or unusual change of access permissions
Upload of malicious content - such as potential malware files (based on hash reputation analysis) or hosting of phishing content
To enable it, go to the storage account's Advanced security settings in the Azure portal.
I set up an "azinbox" folder on an Azure file storage share, and a console application (job) on a VM that checks that folder every 30 seconds. If the job finds a file, it moves it from azinbox to a vminbox folder on the VM. As soon as the file shows up on the VM, if it has a virus it gets quarantined and the file is deleted from vminbox. The job then checks 30 seconds later whether the file is still in vminbox; if it is, it must be OK, and the job moves the validated file to an azoutbox folder on the Azure file storage share. From the website's perspective: 1) upload the file to azinbox, 2) wait a minute and check the azoutbox; if the file is found there, the website moves it from azoutbox to its final destination.
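For reference, the job on the VM amounts to a loop like this (a sketch; the paths and the 30-second interval are illustrative, with azinbox/azoutbox being folders on the mounted file share and vminbox a local folder covered by the VM's antivirus):

    using System;
    using System.IO;
    using System.Threading;

    class Program
    {
        // Illustrative paths: azinbox/azoutbox sit on the mounted Azure file
        // share, vminbox is a local folder watched by the on-box antivirus.
        const string AzInbox  = @"Z:\azinbox";
        const string VmInbox  = @"C:\vminbox";
        const string AzOutbox = @"Z:\azoutbox";

        static void Main()
        {
            while (true)
            {
                // Pull new files onto the VM so the local antivirus sees them.
                foreach (var file in Directory.GetFiles(AzInbox))
                    File.Move(file, Path.Combine(VmInbox, Path.GetFileName(file)));

                Thread.Sleep(TimeSpan.FromSeconds(30));

                // Anything the antivirus quarantined has been deleted by now;
                // whatever is still present is treated as clean and handed back.
                foreach (var file in Directory.GetFiles(VmInbox))
                    File.Move(file, Path.Combine(AzOutbox, Path.GetFileName(file)));
            }
        }
    }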
I admit it is a crappy solution because it takes a LONG time to complete a file upload. A minute or two can seem like a long time to a user uploading a simple PDF, especially if they have more than one to upload.
Also, this requires you to set up an entire VM server JUST to validate a file for a virus.
If anyone has a better option, please let me know.
As the title says, I'm looking for a way to access an Azure Files share (in preview) directly from an Azure Website. I cannot use any REST API or anything like that, and I was looking into the possibility of mounting an SMB share directly into the website (through the new portal or any other way).
I found the following links, from which I understand that this is still under review (http://feedback.azure.com/forums/169385-web-apps-formerly-websites/suggestions/6084609-allow-map-azure-file-share-microsoft-azure-file-s) and also a SO question (Can the new Azure File Service be used from Azure WebSites?) that doesn't answer my question.
To be honest, and for the sake of giving more details, my scenario is pretty simple: I have some websites and also some virtual machines that should access the files from the Azure Files service. For the VMs the approach is pretty straightforward and easy, but for the Websites I can't find any way to do it at the moment.
On the other hand, regardless of the answer to the above question, does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Thank you very much.
As of today, no single technology will serve your purpose. You can't use the File Service, as you don't have the capability to mount a share in an Azure Website, and it is not suited for streaming purposes (all access to files there needs to be authorized, and there's no concept of a Shared Access Signature in the File Service today).
I guess you would have to pick one of the two technologies (Blob Service and File Service) and make some compromises to make it work in both Websites and Virtual Machines.
Assuming you go with the File Service, you can mount the share in the Virtual Machine and do the processing on the files there. On the website front, you would need to use the storage client library to download the relevant files into some folder in your website and stream those files from there.
Assuming you go with the Blob Service, you can simply stream the files in your website directly from blob storage (no need to have those files in your website). In the Virtual Machine, when you need to process those files (blobs), you would simply download them to your VM for processing and then re-upload them to blob storage.
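For the website side of the Blob Service option, streaming a blob straight into the response is only a few lines. The answer predates the current SDK, so treat this as a sketch with today's Azure.Storage.Blobs package rather than the storage client library of that era:

    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;

    public static class BlobStreaming
    {
        // Copies a blob's content straight into an output stream (e.g. the HTTP
        // response body), so the file never has to be persisted on the web server.
        public static async Task StreamBlobAsync(
            BlobContainerClient container, string blobName, Stream output)
        {
            var blob = container.GetBlobClient(blobName);
            await using var source = await blob.OpenReadAsync();
            await source.CopyToAsync(output);
        }
    }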
Does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Currently it is not possible to serve Azure File Service files via CDN.
I have an application that is deployed on Windows Azure. The application has a reporting part, and the reports work as follows:
1. The application generates the report as a PDF file and saves it in a certain folder in the application.
2. A PDF viewer in the application takes the URL of the file and displays it.
As you know, in Windows Azure I will have several VMs handled through a load balancer, so I cannot ensure that the request in step 2 will go to the same VM as in step 1, and this causes a problem for me.
Any help is very appreciated.
I know that I can use blob storage, but this is not the problem.
The problem is that after creating the file on a certain VM, I give the PDF viewer the URL of the PDF file, such as "http://..../file.pdf". This generates a new request that I cannot control, and I cannot know which VM will serve it, so even if I saved the file in a blob it would not solve my problem.
As in any farm environment, you have to consider saving files in storage that is common to all machines in the farm. In Windows Azure, such common storage is Windows Azure Blob Storage.
You have to make some changes to your application so that it saves the files to blob storage. If these are public files, then you just mark the blob container as public and provide the full URL of the file in blob storage to the PDF viewer.
If your PDF files are private, you have to mark your container as private. The second step is to generate a Shared Access Signature (SAS) URL for the PDF and provide that URL to the PDF viewer.
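For the private case, the upload-plus-SAS step could look roughly like this (a sketch with the current Azure.Storage.Blobs SDK rather than the storage client library of that era; the SAS lifetime and parameter names are just examples):

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Sas;

    public static class ReportPublisher
    {
        // Uploads a generated PDF to a private container and returns a short-lived
        // read-only SAS URL that can be handed to the PDF viewer. The client must
        // be built with the storage account key so it can sign the SAS.
        public static async Task<Uri> PublishAsync(
            BlobContainerClient reports, string fileName, Stream pdf)
        {
            var blob = reports.GetBlobClient(fileName);
            await blob.UploadAsync(pdf, overwrite: true);

            return blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddMinutes(30));
        }
    }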
Furthermore, while developing you can explore your Azure storage using any of the (freely and not so freely) available tools for Windows Azure Storage. Here are some:
Azure Storage Explorer
Azure Cloud Storage Studio
There are a lot of samples of how to upload files to Azure Storage. Just search for them with your favorite search engine, or check out these resources:
http://msdn.microsoft.com/en-us/library/windowsazure/ee772820.aspx
http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/11/using-windows-azure-page-blobs-and-how-to-efficiently-upload-and-download-page-blobs.aspx
http://wely-lau.net/2012/01/10/uploading-file-securely-to-windows-azure-blob-storage-with-shared-access-signature-via-rest-api/
The Windows Azure Training Kit has a great lab named "Exploring Windows Azure Storage".
Hope this helps!
UPDATE (following question update)
The problem is that after creating the file on a certain VM, I give the PDF viewer the URL of the PDF file, such as "http://..../file.pdf". This generates a new request that I cannot control, and I cannot know which VM will serve it, so even if I saved the file in a blob it would not solve my problem.
Try changing your logic a bit and follow my instructions: when your VM creates the PDF, upload the file to a blob, then give the full blob URL for your PDF file to the PDF viewer. That way the request will not go to any VM, but straight to the blob. The full blob URL will be something like http://youraccount.blob.core.windows.net/public_files/file.pdf
Or am I missing something? As I understand it, your process flow is as follows:
User makes a special request which would cause PDF file generation
File is generated on the server
Full URL to the file is sent back to the client so that a client-side PDF viewer can render it
If this is the flow, then with the suggested changes it will look like the following:
User makes a special request which causes PDF file generation
File is generated on the server
File is uploaded to blob storage
Full URL for the file in blob storage is returned to the client, so that it can be rendered on the client.
What is not clear? Or what is different in your process flow? I do exactly the same thing for on-the-fly report generation and it works quite well. The only difference is that my app is Silverlight-based and I force file download instead of displaying the file inline.
An alternative approach is not to persist the file at all.
Rather, generate it in memory, set the content type of the response to "application/pdf" and return the binary content of the report. This is particularly easy if you're using ASP.NET MVC, but you can use an HttpHandler instead. It is a technique I regularly use in similar circumstances (though lately with Excel reports rather than PDF).
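In ASP.NET MVC that boils down to a small controller action along these lines (a sketch; GenerateReportPdf stands in for whatever PDF library you use):

    using System;
    using System.Web.Mvc;

    public class ReportsController : Controller
    {
        // Generates the report in memory and returns it directly, so nothing is
        // persisted on any particular VM.
        public ActionResult Report(int id)
        {
            byte[] pdf = GenerateReportPdf(id);
            return File(pdf, "application/pdf", $"report-{id}.pdf");
        }

        private byte[] GenerateReportPdf(int id)
        {
            // ... build the PDF with your reporting/PDF library ...
            throw new NotImplementedException();
        }
    }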
The usefulness of this approach does depend on how you're generating the PDF, how big it is and what the load is on your application.
But if the report is to be served only once, persisting it just so that another request can be made by the browser to retrieve it is wasteful (and you have to provide the persistence mechanism).
If the same file is to be served multiple times and it is resource-intensive to create, then it makes sense to persist it.
You want to save your PDF to centralized, persistent storage, and a VM's hard drive is neither centralized nor persistent. Azure Blob Storage is likely the simplest and best solution: it is dirt cheap to store and access, and the API for storing and accessing files is very simple.
There are two things you could consider.
Windows Azure Blob + Queue Storage
Blob Storage is a cost-effective way of storing binary data and sharing it between instances. You would most likely use a worker role to create the report, which would store the report to Blob Storage and drop a completed message on the Queue.
Your web role instance could monitor the queue looking for reports that are ready to be displayed.
It would be similar to the concept used in the Windows Azure Guest Book app.
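The hand-off itself is just a couple of queue calls. The answer is about the old worker/web role model, so this is only a sketch using the current Azure.Storage.Queues SDK; the message format (just the blob name) is an assumption:

    using System.Threading.Tasks;
    using Azure.Storage.Queues;

    public static class ReportQueue
    {
        // Worker role side: after writing the report blob, drop a message with the
        // blob name so the web role knows the report is ready.
        public static async Task AnnounceAsync(QueueClient queue, string blobName)
        {
            await queue.CreateIfNotExistsAsync();
            await queue.SendMessageAsync(blobName);
        }

        // Web role side: poll the queue for completed reports.
        // Returns null when no report is ready yet.
        public static async Task<string> TryGetCompletedAsync(QueueClient queue)
        {
            var message = await queue.ReceiveMessageAsync();
            if (message.Value is null) return null;

            await queue.DeleteMessageAsync(message.Value.MessageId, message.Value.PopReceipt);
            return message.Value.MessageText;
        }
    }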
Windows Azure Caching Service
Similarly (and much more expensively) you could share the binary data using the Caching Service. This gives you a common layer between your VMs in which to store things; however, you won't be able to provide a URL to the PDF, so you'd have to download the binary and either use an HttpHandler or change the content-type of the response.
This would be much harder to implement, very expensive to run, and is not guaranteed to work in your scenario. I'd still suggest blobs over any other means.
Another option would be to implement a sticky session handler of your own. Take a look at:
http://dunnry.com/blog/2010/10/14/StickyHTTPSessionRoutingInWindowsAzure.aspx
I have a build script that it would be very useful to configure to dump some files into Azure blob storage so they can be picked up by my Azure web role.
My preferred plan was to find some way of mounting the blob storage on my build server as a mapped drive and simply using Robocopy to copy the files over. This would involve the least amount of friction, as I am already deploying some files like this to other web servers using WebDrive.
I found a piece of software that will allow me to do that: http://www.gladinet.com/
However on further investigation I found that it needs port 80 to run without some hairy looking hacking about on the server.
So is there another piece of software I could use or perhaps another way I haven't considered, such as deploying the files to a local folder that is automagically synced with blob storage?
Update in response to @David Makogon
I am using http://waacceleratorumbraco.codeplex.com/ which performs two-way synchronisation between blob storage and the web roles. I have tested this with http://cloudberrylab.com/ and I can deploy files manually to the blob, and they are deployed correctly to the web roles. I have also done the reverse and updated files in the web roles, which were then synced back to the blob, and I have subsequently edited/downloaded them from blob storage.
What I'm really looking for is a way to automate the CloudBerry side of things, so I don't have a manual step to copy a few files over. I will investigate the PowerShell solutions in the meantime.
I know this is an old post - but in case someone else comes here... the answer is now "yes". I've been working on a CodePlex project to do exactly that. (All source code is available).
http://azuredrive.codeplex.com/
If you're comfortable using PowerShell in your build process then you could use the Cerebrata cmdlets to upload the files. If that doesn't work for you, you could write a custom activity (but this sounds quite a bit more involved).
Mounting a cloud drive from a non-Windows Azure compute instance (e.g. your local build machine) is not supported.
Having said that: Even if you could mount a Cloud Drive from your build machine, your compute instances would need access to it too, and there can only be one writer. If your compute instances only needed read-only access, they'd need to create a snapshot after you upload new files.
This really doesn't sound like a good idea though. As knightpfhor suggested, the Cerebrata cmdlets provide this capability (look at Import-File). This allows you to push individual files into their own blobs. You can optimize further by pushing a single ZIP file into a blob. You can then use a technique similar to the one described by Nate Totten in his multi-tenant web role sample, to detect new zip files and expand them to your local storage. Nate's blog post is here.
Oh, and if you don't want to use the Cerebrata cmdlets, you can upload blobs directly with the Windows Azure Storage REST API (though the cmdlets are very simple to use and integrate seamlessly with PowerShell).
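If you'd rather not take a dependency on the cmdlets or hand-roll REST calls, a small console step in the build can do the zip-and-upload described above. This is only a sketch using the current Azure.Storage.Blobs SDK (which did not exist when this was written); the paths, container name and connection-string environment variable are illustrative:

    using System;
    using System.IO;
    using System.IO.Compression;
    using Azure.Storage.Blobs;

    class UploadBuildOutput
    {
        // Zips the build output folder and pushes the single archive to a blob,
        // along the lines of the "one ZIP per deployment" suggestion above.
        static void Main(string[] args)
        {
            string sourceDir = args[0];                               // e.g. bin\Release\files
            string zipPath   = Path.Combine(Path.GetTempPath(), "deploy.zip");

            if (File.Exists(zipPath)) File.Delete(zipPath);
            ZipFile.CreateFromDirectory(sourceDir, zipPath);

            var container = new BlobContainerClient(
                Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"),
                "deployments");
            container.CreateIfNotExists();

            using var zip = File.OpenRead(zipPath);
            container.GetBlobClient("deploy.zip").Upload(zip, overwrite: true);
        }
    }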