I would like to copy files from a Linux server to a Google Cloud Storage bucket. Currently I am using a service account for that activity. Now I would like to do the file transfer without exposing the service account keys to the Linux machine.
I have searched and found that GCP provides an option for that: impersonating a service account. But for that I need to create a user profile and grant that profile the permission to create short-lived service account tokens.
What I would like to know is whether there is any way I can copy the files from the Linux server to GCS without creating a new user profile (specific to the Linux server that copies the files into GCS) and without exposing the service account keys.
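For reference, the impersonation flow I found looks roughly like this in Python. This is only a sketch, assuming the Linux box has some ambient credential (for example, a GCE VM's attached service account) that has been granted roles/iam.serviceAccountTokenCreator on the target account; the bucket, project, and account emails below are placeholders:

```python
import google.auth
from google.auth import impersonated_credentials
from google.cloud import storage

# Source credentials come from the environment (e.g. the VM's attached
# service account), so no key file ever touches the Linux machine.
source_credentials, project = google.auth.default()

# Mint a short-lived token for the service account that has bucket access.
# The target_principal email is a placeholder.
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal="gcs-writer@my-project.iam.gserviceaccount.com",
    target_scopes=["https://www.googleapis.com/auth/devstorage.read_write"],
    lifetime=300,  # seconds; tokens expire quickly by design
)

client = storage.Client(project=project, credentials=target_credentials)
client.bucket("my-bucket").blob("uploads/data.csv").upload_from_filename(
    "/srv/exports/data.csv"
)
```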
Please let me know if you need any more details.
Any help or knowledge shared in this regard would be much appreciated. Thanks!!!
I am building a .NET app (Azure App Service) with a file upload feature that would upload PDF/DOCX files to Azure Blob storage.
I was just wondering: if a malicious user uploads a virus-infected file or a Word file with a bad macro, is Azure able to scan and remove/quarantine that file?
In the app, an admin user will be able to download the file using a URL. Will that file be virus-free, or do I need to explicitly run antivirus software like Symantec Endpoint Protection after the admin downloads the file?
Please let me know your expert thoughts.
Looks like Azure Advanced Threat Protection is now available for Blob Storage (and other Azure resources).
At this moment, there's no solution for that available on Azure. You'd need to use a 3rd-party product or build one yourself. In the future, you'll be able to use https://learn.microsoft.com/en-us/azure/security-center/threat-protection
There is an open-source solution that my team implemented. It's a small antivirus system that sends every blob uploaded to a specific container to an antivirus scan (using a VM with Microsoft Defender) and moves the blob to a different container based on the scan result.
The remediation step can also be customized.
You can use that solution to scan each uploaded blob and let users download blobs only from a "clean" container.
Here's a link to the repo: azure-storage-av-automation.
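As a rough illustration of the "move based on scan result" step (this is not the repo's actual code; the connection string and container names are placeholders), the remediation boils down to a server-side copy followed by a delete:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; in a real system this comes from config.
service = BlobServiceClient.from_connection_string("<connection-string>")

def remediate(blob_name: str, is_clean: bool) -> None:
    """Move a scanned blob out of the upload container based on the verdict."""
    source = service.get_blob_client("uploads", blob_name)
    dest_container = "clean-container" if is_clean else "quarantine"
    dest = service.get_blob_client(dest_container, blob_name)

    # Server-side copy within the same storage account, then delete the
    # original. For very large blobs you would poll the copy status first.
    dest.start_copy_from_url(source.url)
    source.delete_blob()
```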
I have a problem that I have been racking my brain about and figured I would need some perspective and insight from people who are a lot more knowledgeable about this.
What I have currently: a web-based application hosted in Azure uses Azure Blob storage to store files that are generated as part of data import processes. We have a separate application that extends the original web application and allows users to upload files; these files are currently also stored in Azure Blob storage.
Where I am trying to go: I have a requirement to map network file shares on a user's laptop and be able to access the files that currently reside in the blob store.
Since Azure Blob storage does not support SMB, I have no way of actually doing this with a blob store.
I could use Azure Files in conjunction with a file server running the sync agent. However, this requires a lot of work in terms of refactoring and setup, plus some custom service that adds/removes permissions on the file server.
I'm wondering if there is a service or a piece of software on the market that allows me to continue using blob storage and sync the blob files into a file server, so users can access and open the files using Windows File Explorer. I found one open-source project, but it only does a one-way sync from the blob store to the file share. Ideally I'd like a solution that does a two-way sync, like Azure File Sync does.
Any thoughts and ideas will be appreciated.
Since the maximum number of blob containers and file shares is unlimited, per my understanding you could leverage the following approaches:
Migrate the data from Blob storage to an Azure file share, so that all subsequent files are stored in Azure File storage.
Note: currently you must specify the storage account key when mounting file shares; for details you could follow this feedback. Because of that, I would recommend against mapping network file shares on a user's laptop.
You could still use Blob storage: create a blob container for each user and generate a per-container SAS token for each of them. The users could then leverage Azure Storage Explorer to manage their blob files, or use AzCopy and other command-line tools to download the blob files to their laptop's file system.
Note: for security, you could combine a stored access policy with the SAS. To revoke permissions, you then only need to invalidate the related access policy instead of regenerating the account key. For details, see Controlling a SAS with a stored access policy and Shared Access Signatures, Part 2: Create and use a SAS with Blob storage.
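A minimal Python sketch of that second approach, generating a container-level SAS tied to a stored access policy (the account name, key, container, and policy name below are all placeholders):

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccessPolicy,
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

account_name = "mystorageaccount"  # placeholder
account_key = "<account-key>"      # placeholder
container = "user-alice"           # one container per user

service = BlobServiceClient(
    f"https://{account_name}.blob.core.windows.net", credential=account_key
)
container_client = service.get_container_client(container)

# Define a stored access policy; revoking it later invalidates every SAS
# that references it, without regenerating the account key.
policy = AccessPolicy(
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
container_client.set_container_access_policy(
    signed_identifiers={"alice-readonly": policy}
)

# The SAS itself carries no permissions; it just points at the policy.
sas = generate_container_sas(
    account_name, container, account_key=account_key, policy_id="alice-readonly"
)
print(f"https://{account_name}.blob.core.windows.net/{container}?{sas}")
```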
Working in an IaaS environment in Azure, I need to create a file share for applications that will be sharing the same files uploaded by end users. The file share needs to be seen on various servers and appear as a fixed drive letter or mount point. I have already created a storage account and a file share in Azure, but I cannot overcome the issue that the mapped drive is associated with a user's profile.
Was wondering if anyone has come up with a solution. ... I'm the system administrator assigned to this task and can do things in PowerShell or pass code information to developers for their review.
Did not resolve the issue; the developers are going to use Blob storage instead.
The trick with this was getting the application to see the drive letter. For us, having a local user running as a service with the associated Azure file share mapping might have worked.
NOTE: to map the Azure drive, a user would need the Azure storage account name and the key generated for that account to access it.
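For completeness, the mapping itself is just an SMB mount authenticated with the storage account name and key. One way to script it is to shell out to net use; a sketch, with the account, share, and key as placeholders (the service-account user mentioned above would run something like this at startup):

```python
import subprocess

account = "mystorageaccount"   # placeholder
share = "myshare"              # placeholder
key = "<storage-account-key>"  # placeholder

# Map the share to a fixed drive letter. The mapping is still tied to the
# profile of whichever user runs this, which is exactly the issue above.
subprocess.run(
    [
        "net", "use", "Z:",
        rf"\\{account}.file.core.windows.net\{share}",
        key,
        rf"/user:Azure\{account}",
        "/persistent:no",
    ],
    check=True,
)
```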
As the title says, I'm looking for a way to access an Azure Files share (in preview) directly from an Azure website. I cannot use any REST API or anything like that, and I was looking into the possibility of mounting an SMB share directly into the website (through the new portal or any other way).
I found the following links, from which I understand that this is still under review (http://feedback.azure.com/forums/169385-web-apps-formerly-websites/suggestions/6084609-allow-map-azure-file-share-microsoft-azure-file-s), and also an SO question (Can the new Azure File Service be used from Azure WebSites?) that doesn't answer my question.
To be honest, and for the sake of giving more details, my scenario is pretty simple: I have some websites and also some virtual machines that should access files from the Azure Files service. For the VMs, the approach is pretty straightforward and easy, but for the websites I can't find any way at this moment.
On the other hand, regardless of the answer to the above question, does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Thank you very much.
As of today, no single technology will serve your purpose. You can't use the File Service, since you don't have the capability to mount a share in an Azure Website, and it is also not suited for streaming purposes (all access to files there needs to be authorized, and there's no concept of a Shared Access Signature in the File Service today).
I guess you would have to pick one of the two technologies (Blob Service or File Service) and make some compromises to make it work in both Websites and Virtual Machines.
Assuming you go with the File Service, you can mount the share in the Virtual Machine and process the files there. On the website front, you would need to use the storage client library to download the relevant files into some folder in your website and stream those files from there.
Assuming you go with the Blob Service, you can simply stream the files in your website directly from blob storage (no need to keep those files in your website). In the Virtual Machine, when you need to process those files (blobs), you would simply download them to your VM for processing and then re-upload them to blob storage.
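For the Blob Service route, the download/streaming part is only a few lines with any storage client library. A sketch in Python (the connection string, container, and blob names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string, container, and blob names.
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client("media", "videos/intro.mp4")

# Stream the blob in chunks rather than buffering the whole file in
# memory, which is what makes serving large files from storage practical.
downloader = blob.download_blob()
with open("intro.mp4", "wb") as out:
    for chunk in downloader.chunks():
        out.write(chunk)
```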
Does it make sense to (or do I have the possibility to) enable CDN over an Azure Files Share?
Currently it is not possible to serve Azure File Service files via CDN.
I'm planning to host a simple Node.js app on the website tier of Azure.
The application uses a .pem file to sign responses. The issue is that I'd like to deploy the application by simply pushing the git repo, but I don't want to include the .pem file in that repo, because that seems like it would be a big security issue.
Is there a way I can manually upload one file? What's the best way to store a .pem file on Windows Azure? What are common ways to handle situations like this?
This question is a bit open-ended, as I'm sure there are several viable ways to securely transfer a file.
Having said that, from an Azure-specific standpoint: you should be able to upload a file to your Web Site via FTP. Alternatively, you could push the file to a specific blob and have your app check periodically for said blob. To upload (or download later), you'd need the storage account's access key, and as long as you're the only one with that key, you should be OK. When uploading from outside of Azure, you can connect to storage via SSL, further securing the upload.
While you could probably use a queue (either a Storage queue or a Service Bus queue) with the .pem file as the payload of a message that your node app monitors, you'd need to ensure that the file fits within the message size limits (64 KB for an Azure Storage queue, 256 KB for a Service Bus queue).
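To make the blob option concrete, here is a sketch of the app-side fetch at startup. The question's app is Node.js, but the flow is the same few calls in every storage SDK; this example is Python, and the container, blob, and environment-variable names are placeholders:

```python
import os

from azure.storage.blob import BlobServiceClient

# Keep the connection string itself out of the repo too, e.g. in an
# application setting exposed to the app as an environment variable.
service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION"])
blob = service.get_blob_client("secrets", "signing-key.pem")  # placeholders

# Fetch the key at startup instead of committing it to git.
with open("signing-key.pem", "wb") as f:
    f.write(blob.download_blob().readall())
```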