Azure blob storage overwriting duplicate files

I am using Azure Blob storage to upload and download files. The problem is, if I upload a new file to the blob container that has the same name as an already uploaded file, it automatically overwrites the content of the previously uploaded file.
For example
These are the files uploaded on azure blob storage -
file1.docx
file2.png
file1.png
So if I upload a new file named "file1.docx" which has different content, blob storage replaces the previously uploaded file1.docx, and I lose the previously uploaded file.
Is there any way for blob storage to automatically detect the duplicate and append _1 or (1) to the end of the name, or some other way to solve this problem?

Is there any way for blob storage to automatically detect the duplicate and append _1 or (1) to the end of the name, or some other way to solve this problem?
Out of the box this feature is not available, and you will have to handle it in your application. If your upload operation fails with a Conflict (HTTP status code 409) error, that means a blob with the name of the uploaded file already exists. You would then retry the operation after appending _1 or (1), increasing the counter until the upload no longer fails with the conflict status code.

You can also append a GUID to your file name, which will make the blob name unique.
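For illustration, here is a minimal sketch of the retry-on-conflict approach, assuming the Microsoft.WindowsAzure.Storage .NET client (the library referenced elsewhere on this page); the helper name is made up, and the If-None-Match access condition is what makes an upload over an existing blob fail with 409 Conflict:

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class UniqueNameUpload
{
    // Sketch only: upload under the original name and, on 409 Conflict, retry as
    // "name (1).ext", "name (2).ext", ... until a free name is found.
    public static string UploadWithUniqueName(CloudBlobContainer container, string fileName, Stream content)
    {
        string baseName = Path.GetFileNameWithoutExtension(fileName);
        string extension = Path.GetExtension(fileName);
        string candidate = fileName;
        int counter = 0;

        while (true)
        {
            CloudBlockBlob blob = container.GetBlockBlobReference(candidate);
            try
            {
                // If-None-Match: * tells the service to reject the upload if the blob already exists.
                blob.UploadFromStream(content, AccessCondition.GenerateIfNoneMatchCondition("*"));
                return candidate;
            }
            catch (StorageException ex) when (ex.RequestInformation.HttpStatusCode == 409)
            {
                counter++;
                candidate = $"{baseName} ({counter}){extension}";
                content.Position = 0; // rewind the (seekable) stream before retrying
            }
        }
    }
}

Appending a GUID instead (for example $"{baseName}-{Guid.NewGuid()}{extension}") avoids the retry loop entirely, at the cost of less readable names.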

Related

How to programmatically delete uncommitted blocks in Azure Storage using the blob service?

We get the error 'The specified blob or block content is invalid' if we try to upload a block that is already present on the server. How do we clear those uncommitted blocks before the user retries uploading the same blob?
code:'InvalidBlobOrBlock'
message:'The specified blob or block content is invalid.\nRequestId:1b015a55-201e-00be-7b1c-7e8fb8000000\nTime:2021-07-21T10:35:48.0075829Z'
name:'StorageError'
requestId:'1b015a55-201e-00be-7b1c-7e8fb8000000'
stack:'StorageError: The specified blob or block content is invalid
statusCode:400
How do we clear those uncommitted blocks before the user retries uploading the same blob?
AFAIK, there's no direct way to delete uncommitted blocks. The simplest way would be to create a blob with the same name and then delete that blob. When creating the blob with the same name, please ensure that it is uploaded without splitting the contents into blocks.
I wrote a blog post about this error some time back that you may find useful: https://gauravmantri.com/2013/05/18/windows-azure-blob-storage-dealing-with-the-specified-blob-or-block-content-is-invalid-error/.
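As a rough sketch of that workaround with the .NET Microsoft.WindowsAzure.Storage client (the error in the question comes from the JavaScript SDK, but the same put-then-delete pattern applies from any SDK; the helper name is made up):

using Microsoft.WindowsAzure.Storage.Blob;

public static class UncommittedBlockCleanup
{
    // Sketch only: overwrite the blob name with a single-request upload (no block list),
    // which discards the uncommitted blocks, then delete the placeholder blob.
    public static void ClearUncommittedBlocks(CloudBlobContainer container, string blobName)
    {
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

        // A tiny payload goes up as one Put Blob request, so no blocks are involved.
        blob.UploadText(string.Empty);

        // Remove the placeholder so the user can retry the original upload cleanly.
        blob.DeleteIfExists();
    }
}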

Access a file from a directory in Azure Blob storage through an Azure Logic App

I am using a Logic App to import a set of files which are inside the directory (/devcontainer/sample1/abc.csv).
The problem here is that I could not even locate the file in Azure Blob storage from my Logic App; I am getting the following error:
verify that the path exists and does not contain the blob name. List Folder is not allowed on blobs.
Screenshots for reference
The problem here is that I could not even locate the file in Azure Blob storage from my Logic App
The file explorer shows all the containers and blobs when you choose a blob path, and it caches the data for a period of time to keep the operation smooth. If a blob was added to the container recently, it will not yet show up in the file explorer. The workaround is to click the change connection link and use a new connection to retrieve the data.
Is your blob connection pointing to the correct storage account? One thing you can try: instead of typing the path, browse to it so you can see which containers and blobs are present in the storage account you are trying to access.

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1) -
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is that you're uploading to File storage instead of Blob storage. To upload the files to Blob storage, you need to specify the blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there any way to keep the date created (or similar) values of the old files?
Assuming you want to keep the blob's creation date the same as that of the local file, that is not possible. A blob's Last Modified date/time is a system property that gets assigned when the blob is created and is updated every time the blob is changed. You could, however, make use of the blob's metadata and store the file's creation date/time there.
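As a sketch of that metadata approach with the .NET Microsoft.WindowsAzure.Storage client, run after the upload (the metadata key name here is just an example):

using System;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

public static class CreationTimeMetadata
{
    // Sketch only: record the local file's creation time as user-defined metadata,
    // since the blob's own Last-Modified property cannot be set by the client.
    public static void StampOriginalCreationTime(CloudBlockBlob blob, string localFilePath)
    {
        DateTime createdUtc = File.GetCreationTimeUtc(localFilePath);
        blob.Metadata["OriginalCreatedOn"] = createdUtc.ToString("o"); // ISO 8601
        blob.SetMetadata();
    }
}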
I think you have to get the instance of the blob where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file to include the virtual directory in its name (e.g., vd/abc.txt in the example above).
Please refer to the documentation: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy

Verifying CloudBlob.UploadFromStream completed with no errors?

I want to save files users upload to my site into my Azure blob container, and I am using the CloudBlob.UploadFromStream method to do so, but I want to make sure the file finished saving to the blob with no problems before doing some more work. I am currently just uploading the blob and then checking whether a reference to the new blob exists using GetBlockBlobReference inside an if statement. Is there a better way of verifying the upload completed fine?
If there's any problem while uploading the blob, the CloudBlob.UploadFromStream method will throw an exception, so that would be the first place to check whether the upload went fine.
I don't think creating a reference to a blob using GetBlockBlobReference would do you any good, as it just creates an instance of CloudBlockBlob; it doesn't check whether the blob exists in storage. If you want to check whether the blob exists in storage, you could either fetch the blob's attributes using the CloudBlockBlob.FetchAttributes method or create an instance of CloudBlob using CloudBlobContainer.GetBlobReferenceFromServer or CloudBlobClient.GetBlobReferenceFromServer. All three methods will fetch information about the blob from storage and will throw appropriate errors if something is not right (e.g. a Not Found error if the blob does not exist).
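A minimal sketch combining both checks, assuming the Microsoft.WindowsAzure.Storage client and an already-initialized container reference (the helper name is made up):

using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class UploadVerification
{
    // Sketch only: catch exceptions from the upload itself, then optionally
    // round-trip to the service with FetchAttributes to confirm the blob exists.
    public static bool TryUpload(CloudBlobContainer container, string blobName, Stream content)
    {
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        try
        {
            blob.UploadFromStream(content); // throws a StorageException on failure
        }
        catch (StorageException ex)
        {
            Console.WriteLine("Upload failed: " + ex.RequestInformation.HttpStatusMessage);
            return false;
        }

        try
        {
            // Hits the service; throws a 404 Not Found if the blob does not actually exist.
            blob.FetchAttributes();
            return true;
        }
        catch (StorageException)
        {
            return false;
        }
    }
}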

Upload to Blob Storage while overwriting existing Blob

When using the StorageClient and calling UploadByteArray() with a blob name that already exists, will this cause any data corruption? (In other words, do I have to call Delete() before uploading, which would cost another transaction?)
You should just be able to call UploadByteArray() without doing the delete first; it will overwrite the blob that is already there.
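A minimal sketch with the old Microsoft.WindowsAzure.StorageClient library the question refers to (the container reference is assumed to already exist):

using Microsoft.WindowsAzure.StorageClient;

public static class OverwriteExample
{
    // Sketch only: uploading to an existing blob name simply replaces its content,
    // so no Delete() call (and no extra transaction) is needed first.
    public static void Upload(CloudBlobContainer container, string blobName, byte[] data)
    {
        CloudBlob blob = container.GetBlobReference(blobName);
        blob.UploadByteArray(data); // overwrites any existing blob with this name
    }
}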
