I've tried removing a file in an Azure File Share using:
the az CLI
Azure Storage Explorer
Both yield the error:
The specified resource may be in use by an SMB client. (ErrorCode: SharingViolation)
I've tried listing file handles with the Azure Powershell and az CLI commands, but no file handles are shown. Supposedly, this should reveal any file locks.
I've also tried rebooting everything (that I know of!) that is connected to this file share. Other files in the same directory can be deleted. Everything else with this file share seems normal.
Any idea how I can find the source of the lock, and how to delete it?
Can you check whether any other client is accessing the share?
Create another test file in the same storage account (file share) and see whether you face the same issue.
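For that test, something along these lines works (a sketch using the Az.Storage module; the account, share, and directory names are placeholders):

```powershell
# Minimal sketch: create a test file next to the locked one and try to delete it,
# to confirm whether the problem is specific to the original file.
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"
"test" | Out-File -FilePath .\probe.txt
Set-AzStorageFileContent -Context $ctx -ShareName "myshare" -Source .\probe.txt -Path "samedir/probe.txt"
Remove-AzStorageFile     -Context $ctx -ShareName "myshare" -Path "samedir/probe.txt"
```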
SharingViolation: The operation failed because the object is already opened and does not allow the sharing mode that the caller requested.
Based on the error message, you may refer to this article, which provides detailed information on file locks: https://learn.microsoft.com/en-us/rest/api/storageservices/managing-file-locks
Try to unlock all of the Azure file share locks.
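A minimal sketch of listing and force-closing open handles with the Az.Storage module (the share and directory names are placeholders):

```powershell
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"

# List open handles on the problem file's directory, recursively
Get-AzStorageFileHandle -Context $ctx -ShareName "myshare" -Path "samedir" -Recursive

# Force-close every handle under that directory (releases any SMB locks)
Close-AzStorageFileHandle -Context $ctx -ShareName "myshare" -Path "samedir" -Recursive -CloseAll
```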
The Azure Files troubleshooting article for Windows clients lists common problems, their possible causes, and resolutions; in particular, it has a section on being unable to delete files, which is worth checking in addition to the steps above.
Related
I am attempting to copy a file to an Azure File Share via the .NET Storage client (v12) for an integration.
The plan is to copy a file from Azure Storage to an Azure File Share in the same account. That works fine.
I am using the StartCopyAsync method and it works as expected during testing.
However, in the live integration, the consumer is reading the file during the copy operation while it still has zero bytes, and then deletes it. In other words, they have accessed, copied, and deleted the file before the copy was complete. Is there anything I can do to prevent access to the file during the copy operation?
They have the file share mounted as an SMB share and monitored, and BizTalk sees the file the instant it appears.
Options looked at:
Uploading the file and renaming it - Azure File Share doesn't seem to support rename?
Uploading the file marked ReadOnly and Hidden - SMB doesn't care whether the file is ReadOnly, and the other party in the integration can't change their side to ignore hidden files.
Taking a file lease - prevents deletion, but still has the problem of the client reading the empty file as soon as it hits the file share.
Something else?
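One possible mitigation, sketched below in PowerShell rather than the .NET v12 client (all names are placeholders): copy into a staging directory that the consumer does not monitor, wait for the server-side copy to finish, and only then move the completed file into the monitored folder. On an SMB mount of the same share, Move-Item is a rename, so the consumer never sees a partially written file.

```powershell
# Sketch: stage the server-side copy in an unmonitored directory, poll until it
# completes, then move the finished file into the monitored folder.
# Account, share, and paths are placeholders.
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"

Start-AzStorageFileCopy -Context $ctx `
    -SrcContainerName "uploads" -SrcBlobName "incoming/order.xml" `
    -DestShareName "integration" -DestFilePath "staging/order.xml"

# Wait for the asynchronous server-side copy to finish before exposing the file
Get-AzStorageFileCopyState -Context $ctx -ShareName "integration" `
    -FilePath "staging/order.xml" -WaitForComplete

# Move into the folder BizTalk watches; on an SMB mount of the same share this
# is a rename, so the consumer only ever sees the complete file.
Move-Item -Path "Z:\staging\order.xml" -Destination "Z:\inbox\order.xml"
```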
I am looking for a solution to my issue. I would like to use one folder with files for my VMs.
I have tested a few solutions, but I always get the same result: my shared folder is disconnected after every VM restart.
The problem is that Windows Server has credentials in Credential Manager.
I have tried to do this with net use, PowerShell, and cmdkey, which is supposedly the easiest way to establish a persistent connection.
Has anybody had the same issue and found a solution?
I'm using Azure Files on my laptop and it reconnects just fine after months of using it, rebooting, and shutting down (I never hibernate). I think I set it up with net use, PowerShell, and manually from Explorer; all paths lead to the same outcome.
Another option - Azure File Sync, quote:
Use Azure File Sync to centralize your organization's file shares in Azure Files, while keeping the flexibility, performance, and compatibility of an on-premises file server. Azure File Sync transforms Windows Server into a quick cache of your Azure file share. You can use any protocol that's available on Windows Server to access your data locally, including SMB, NFS, and FTPS. You can have as many caches as you need across the world.
Have you looked at the "persisting Azure File Share credentials in Windows" section in the following document: https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows? Let me know if you have additional questions.
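For reference, that approach boils down to storing the storage account key with cmdkey and then mapping with a persistent drive; a minimal sketch (the account name, share name, and key are placeholders):

```powershell
# Store the storage account key so Windows can re-authenticate after a reboot
cmdkey /add:mystorageacct.file.core.windows.net /user:AZURE\mystorageacct /pass:"<storage-account-key>"

# Map with -Persist so the mapping is recreated at sign-in
$cred = New-Object System.Management.Automation.PSCredential `
    -ArgumentList "AZURE\mystorageacct", (ConvertTo-SecureString "<storage-account-key>" -AsPlainText -Force)
New-PSDrive -Name Z -PSProvider FileSystem `
    -Root "\\mystorageacct.file.core.windows.net\myshare" -Credential $cred -Persist
```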
I'm trying to restore a file from a backup content database in SharePoint 2016 by using Get-SPContentDatabase -ConnectAsUnattachedDatabase and drilling down to the item level to call OpenBinary(). I believe this is failing because the BLOB is externalized via StoragePoint, but I'm not sure how to allow this command to access the external BLOB data. Any ideas on what permissions might be necessary? The BLOB and endpoint still exist in SharePoint and on the file share, and I am successfully able to see the item and its properties within PowerShell.
I found a similar issue where the OP said they solved it by giving explicit permissions to the StoragePoint databases, but I'm not sure what permissions or which databases need them: listItem.File.OpenBinary() not working - Remote Blob Storage / FileStreaming not enabled on SQL Server the culprit?
I was able to figure this out. I was testing from a server that didn't have the full StoragePoint installation. Testing the same call from one of the web servers in the farm, I was able to open and download the file.
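For anyone hitting the same thing, the drill-down involved looks roughly like this (a sketch; the server, database, and URLs are placeholders), and it only worked when run from a farm server with the full StoragePoint installation:

```powershell
# Sketch: restore a single file from an unattached content database.
# Run from a farm server that has StoragePoint installed so externalized BLOBs resolve.
Add-PSSnapin Microsoft.SharePoint.PowerShell

$db   = Get-SPContentDatabase -ConnectAsUnattachedDatabase `
            -DatabaseServer "SQL01" -DatabaseName "WSS_Content_Backup"
$site = $db.Sites | Where-Object { $_.Url -eq "https://portal.contoso.com/sites/team" }
$web  = $site.OpenWeb()
$item = $web.GetListItem("https://portal.contoso.com/sites/team/Shared Documents/report.docx")

# Fails for externalized BLOBs unless StoragePoint can be reached from this server
$bytes = $item.File.OpenBinary()
[System.IO.File]::WriteAllBytes("C:\Restore\report.docx", $bytes)
```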
Our application stores files uploaded by our customers in blob storage. These files are exchanged between different parties (our customers and their suppliers). Is there a way to check the uploaded files for viruses? The Antimalware service seems to only cover virtual machines, and I cannot find any information about using it to check files as a service.
A great solution would be if we could store such a file in Azure Storage as an "on hold" file until it has been checked. Then we would need a service that checks the file and returns the result. If the file is virus-free, we could then move it to its final destination.
Azure Storage is just... storage. There are no utilities built in, such as antivirus. You'd need to do your antivirus check on your own. Since antivirus tools typically only work with local OS storage, you'd need to place your "on hold" content (as you referred to it) on a local disk somewhere that you have antivirus installed and then copy to blob storage once your antivirus check is done.
How you accomplish managing this, and which software you use, is up to you. But VMs, App Services, and Cloud Services (web/worker roles) all have local disks available.
As the other answer states, Azure Storage is just storage. There are a couple of ways you could do this, though.
The first option is to run your own antivirus and use it either as a gateway, or to programmatically download the file from Blob storage, check it, and then take the appropriate action. It's possible to run something like ClamAV to do this yourself.
Alternatively, you could use a third-party service like AttachmentScanner (which is exactly what you mention in your comment), which will accept a URL or a direct file upload. With Azure you can generate a temporary URL pointing to the file with an expiration of a few minutes, pass the URL to AttachmentScanner, and then take the appropriate action depending on the result.
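Generating the short-lived URL is a one-liner with a SAS token; a sketch using the Az.Storage module (the container, blob name, and scanner endpoint are placeholders):

```powershell
# Create a read-only SAS URL that expires in a few minutes and hand it to the scanner
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"
$sasUrl = New-AzStorageBlobSASToken -Context $ctx -Container "uploads" -Blob "invoice.pdf" `
    -Permission r -ExpiryTime (Get-Date).AddMinutes(5) -FullUri

# Pass the URL to the scanning service, then act on the verdict it returns
Invoke-RestMethod -Method Post -Uri "https://scanner.example.com/scan" -Body @{ url = $sasUrl }
```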
I read an article about virus scanning for blob storage. Might be useful for you.
The author uses an Azure Function blob trigger to catch changes and sends the blob file to a virus scanner, which runs in a Docker container.
Full implementation details are available in the link below
https://peterrombouts.nl/2019/04/15/scanning-blob-storage-for-viruses-with-azure-functions-and-docker/
You can use Azure Defender for Storage to detect the following:
Suspicious access patterns - such as successful access from a Tor exit node or from an IP considered suspicious by Microsoft Threat Intelligence
Suspicious activities - such as anomalous data extraction or unusual change of access permissions
Upload of malicious content - such as potential malware files (based on hash reputation analysis) or hosting of phishing content
And to enable it, you need to go to Advanced security on the storage account.
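If you prefer scripting it over the portal, the same protection can be turned on with the Az.Security module (a sketch; this enables it at the subscription level, which is an assumption about how you want to scope it):

```powershell
# Enable Microsoft Defender for Storage (formerly Azure Defender / Advanced Threat
# Protection) for all storage accounts in the current subscription.
Set-AzSecurityPricing -Name "StorageAccounts" -PricingTier "Standard"
```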
I set up an "azinbox" folder on an Azure file share. I set up a console application (job) on a VM to check every 30 seconds for a file in that folder. If the job finds one, it moves the file from azinbox to a vminbox folder on the VM. As soon as the file shows up on the VM, if it has a virus, it gets quarantined and the file is deleted from vminbox. The job on the VM then checks 30 seconds later to see whether the file is still in vminbox. If it is, it must be OK, and the job moves the validated file to an azoutbox folder on the same file share. From the website's perspective: 1) upload the file to azinbox; 2) wait a minute and check the azoutbox. If the file is found there, the website moves it from the azoutbox to its final destination.
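The VM-side job is essentially a polling loop; a stripped-down sketch of it (the account, share, and folder names are placeholders, and it assumes the VM's antivirus quarantines infected files out of C:\vminbox on its own):

```powershell
# Sketch: pull files from azinbox to the VM, give the local antivirus time to
# quarantine anything malicious, then publish survivors to azoutbox.
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"

while ($true) {
    foreach ($f in Get-AzStorageFile -Context $ctx -ShareName "transfer" -Path "azinbox" | Get-AzStorageFile) {
        $local = "C:\vminbox\$($f.Name)"
        Get-AzStorageFileContent -Context $ctx -ShareName "transfer" -Path "azinbox/$($f.Name)" -Destination $local
        Remove-AzStorageFile     -Context $ctx -ShareName "transfer" -Path "azinbox/$($f.Name)"

        Start-Sleep -Seconds 30   # give the local AV a chance to quarantine the file

        if (Test-Path $local) {   # still present, so it was not quarantined
            Set-AzStorageFileContent -Context $ctx -ShareName "transfer" -Source $local -Path "azoutbox/$($f.Name)"
            Remove-Item $local
        }
    }
    Start-Sleep -Seconds 30
}
```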
I admit it is a crappy solution because it takes a LONG time to complete a file upload. A minute or two can seem like a long time to the user just to upload a simple PDF, especially if they have more than one to upload.
Also, this requires you to set up an entire VM server JUST to validate a file for viruses.
If anyone has a better option, please let me know.
Working in an IaaS environment in Azure, I need to create a shared file location for applications that will be sharing the same files uploaded by end users. The file share needs to be seen on various servers and appear as a fixed drive letter or mount point. I have already created a storage account and a file share in Azure, but cannot overcome the issue that the mapped drive is associated with a user's profile.
Was wondering if anyone has come up with a solution. ... I'm the system administrator assigned to this task and can do things in PowerShell or pass code information to developers for their review.
Did not resolve the issue; the developers are going to use Blob storage instead.
The trick with this was getting the application to see the drive letter. For us, having a local user run as a service with the associated Azure file share mapping might have worked.
NOTE: to map the Azure drive, a user would need the Azure storage account name and the key generated for that account in order to access it.
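For reference, one technique that sidesteps the per-profile issue entirely is a machine-wide SMB mapping; a sketch (assuming Windows Server 2016 or later, with placeholder names and key):

```powershell
# Map the Azure file share machine-wide so every account and service on the
# server sees the same drive letter, rather than a per-user mapping.
$key  = "<storage-account-key>"
$cred = New-Object System.Management.Automation.PSCredential `
    -ArgumentList "AZURE\mystorageacct", (ConvertTo-SecureString $key -AsPlainText -Force)
New-SmbGlobalMapping -RemotePath "\\mystorageacct.file.core.windows.net\myshare" `
    -Credential $cred -LocalPath "Z:" -Persistent $true
```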