I want to upload a string value as file content to an Azure Storage location, but without writing any code and using Azure components only. Is there any option available?
There are a lot of ways to do this:
Write the string into a .txt file and upload the file to a storage container through the Azure Portal.
Generate a long-lived SAS token in the Azure Portal, and use the Blob REST API to upload the string to a blob; you can do this directly in Postman:
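For example, the raw Put Blob request that Postman sends would look roughly like this sketch (the account, container, blob name, and string value are placeholders for your own values):
PUT https://youraccount.blob.core.windows.net/mycontainer/myfile.txt?<your-sas-token> HTTP/1.1
x-ms-blob-type: BlockBlob
Content-Type: text/plain

my string value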
Use an Azure Logic App to do this. Let me know if you have any more questions.
I have Azure blob storage, which contains some CSV files.
My task:
1. Create a logic app for the blob storage.
2. Retrieve the data from the blob storage.
3. Convert the retrieved file to JSON.
4. Upload that data to an online portal through an API.
I've tried retrieving the data with the "Get blob content" action, but I'm not sure where to see the result. I have just created the logic app, but I'm stuck on what to do next.
For your question about:
I've tried retrieving the data by "Get blob content" action, but not sure where to see the result.
When you run the logic app, you can see the content of your CSV file in the OUTPUTS of the "Get blob content" action in the run history.
If you want to use an API to upload the data, you just need to add an HTTP action that sends the blob content in the request body (though the exact shape depends on the type of your API's request body):
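For reference, in the Logic App code view the HTTP action would look roughly like the sketch below; the URI is a placeholder for your online portal's API, and it assumes your "Get blob content" action is named Get_blob_content:
"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "POST",
        "uri": "https://your-online-portal.example.com/api/upload",
        "headers": {
            "Content-Type": "application/json"
        },
        "body": "@body('Get_blob_content')"
    },
    "runAfter": {
        "Get_blob_content": [ "Succeeded" ]
    }
}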
Since you mentioned your Azure Storage container holds several CSV files, you may need to loop over them: you can use the "List blobs" action and a "For each" action to loop through them and then get each blob's content.
Hope this helps; if you have any further problems, please let me know.
In Azure, I have Blob storage with two containers, Input and Output. I have a file, say Test1.csv, which after processing I want to copy to the Output container. I need to do this as a step in an Azure Logic App. However, I am not able to understand how to configure the Copy blob action in the Logic App; I cannot configure the source path URL correctly.
It gives the error message "file not found".
Thanks
If you want to use the Copy blob action to copy a blob, the simplest way to get the blob URL is to use the Create SAS URI by path action. Then pass the URL to the Copy blob action, along with the destination.
Apart from this, you could use Create blob to copy the blob: first use Get blob content using path to get the blob content, then use Create blob to upload the content to the destination container.
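If you ever want to do the same copy in code instead, a minimal sketch with the .NET storage SDK (the connection string and container names are placeholders) could look like this:
var account = CloudStorageAccount.Parse(connectionString);
var blobClient = account.CreateCloudBlobClient();
var sourceBlob = blobClient.GetContainerReference("input").GetBlockBlobReference("Test1.csv");
var destBlob = blobClient.GetContainerReference("output").GetBlockBlobReference("Test1.csv");
// server-side copy; the data is not downloaded through the client
destBlob.StartCopy(sourceBlob);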
I have an Azure storage account where I have created a folder to upload and download files. I am also performing a rename operation on it, e.g. when I perform the rename operation and upload the file into the blob, all blob metadata gets updated successfully.
Please suggest the changes.
How to download the Azure blob content with the same file name
As Gaurav Mantri said, you could specify the ContentDisposition property for your blob. Using Azure Storage Explorer, you can quickly set the ContentDisposition property as follows:
But when downloading the image, the ContentDisposition did not seem to work at all. Then I found a similar issue: you need to set the DefaultServiceVersion for your blob storage service, and this can only be done through code. For more details, you could refer here and choose your development language.
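As a minimal C# sketch (the connection string is a placeholder, and the version string is just one valid example), setting the DefaultServiceVersion looks like this:
var account = CloudStorageAccount.Parse(connectionString);
var client = account.CreateCloudBlobClient();
// read the current service properties, change only the default version, and write them back
var properties = client.GetServiceProperties();
properties.DefaultServiceVersion = "2017-07-29";
client.SetServiceProperties(properties);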
Additionally, if you upload/download your blob files programmatically, you could refer to issue1 and issue2.
I have set up a postgreSQL database on a linux VM in Azure, and I have a .csv file in blob storage that I'd like to upload to that database.
However, I can't find any documentation regarding how (or even if it's possible) to reference files that are stored in blob storage as if it were part of the file system, or otherwise transfer files from blob storage to a server also running in Azure.
All the references I've found are about importing directly into pre-built SQL Server VMs, which is not my problem.
Any references or other help anyone can provide would be much appreciated.
As far as I know, PostgreSQL supports the PROGRAM keyword in its COPY command, so I suggest you use this keyword to read the CSV file from blob storage.
Normally we use curl to fetch the file; you can download curl from the URL below:
https://curl.haxx.se/download.html#Linux
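Once curl is installed, you can first verify that the blob is reachable with a plain curl call (this uses the same SAS URL as in the COPY example below, shortened here to a placeholder):
curl -o test2.csv "https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv?<your-sas-token>"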
For more details, you could refer to the following example:
COPY persons(first_name, last_name, dob, email)
FROM PROGRAM 'curl "https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv?sv=2016-05-31&sr=c&sig=jtNRuzR7G98hHogHHZyKY9gYN0r%2FSgr2j78HGKihYlc%3D&st=2017-03-09T02%3A43%3A17Z&se=2017-03-11T02%3A43%3A17Z&sp=rl"'
DELIMITER ',' CSV HEADER;
The query downloads the CSV through curl and copies the rows into the persons table.
Here I used the SAS token to protect my blob file.
If you don't want to use this token, you could set the container's public access level in the portal.
Then you could directly access the file by a URL like this:
https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv
If you want to use a SAS token to protect your blob file, you could generate the SAS token in the Azure Portal, under the storage account's Shared access signature blade.
Then you could append this token to the blob URL.
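For example, the final URL would look like this (with the generated token shortened to a placeholder):
https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv?<your-sas-token>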
For more details, you could refer to the following link:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-shared-access-signature-part-2
I have a .NET app which uses the WebClient and the SAS token to upload a blob to the container. The default behaviour is that a blob with the same name is replaced/overwritten.
Is there a way to change this on the server, i.e. prevent an already existing blob from being replaced?
I've seen Avoid over-writing blobs AZURE, but it is about the client side.
My goal is to secure the server from overwriting blobs.
AFAIK the file is uploaded directly to the container without a chance to intercept the request and check e.g. existence of the blob.
Edited
Let me clarify: My client app receives a SAS token to upload a new blob. However, an evil hacker can intercept the token and upload a blob with an existing name. Because of the default behavior, the new blob will replace the existing one (effectively deleting the good one).
I am aware of different approaches to deal with the replacement on the client. However, I need to do it on the server, somehow even against the client (which could be compromised by the hacker).
You can issue the SAS token with "create" permissions and without "write" permissions. This will allow the user to upload blobs up to 64 MB in size (the maximum allowed by a single Put Blob operation) as long as they are creating a new blob and not overwriting an existing one. See the explanation of SAS permissions for more information.
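As a minimal sketch with the .NET storage SDK (the container name and token lifetime are placeholders), issuing such a token could look like this:
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
var policy = new SharedAccessBlobPolicy
{
    // Create but not Write: new blobs can be uploaded, existing blobs cannot be overwritten
    Permissions = SharedAccessBlobPermissions.Create,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
};
string sasToken = container.GetSharedAccessSignature(policy);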
There is no configuration on the server side, but you can implement such a check in code using the storage client SDK:
// retrieve reference to a previously created container
var container = blobClient.GetContainerReference(containerName);
// retrieve reference to a blob
var blobReference = container.GetBlockBlobReference(blobName);
// if the blob already exists, do nothing; else upload it
if (!blobReference.Exists())
{
    blobReference.UploadFromFile(filePath);
}
You could do something similar using the REST API:
https://learn.microsoft.com/en-us/rest/api/storageservices/fileservices/blob-service-rest-api
Use Get Blob Properties, which returns 404 if the blob does not exist.
Is there a way to change this on the server, i.e. prevent an already existing blob from being replaced?
Azure Storage exposes the Blob Service REST API for you to perform operations against blobs. To upload/update a blob (file), you need to invoke the Put Blob REST API, which states the following:
The Put Blob operation creates a new block, page, or append blob, or updates the content of an existing block blob. Updating an existing block blob overwrites any existing metadata on the blob. Partial updates are not supported with Put Blob; the content of the existing blob is overwritten with the content of the new blob.
In order to avoid over-writing existing blobs, you need to explicitly specify conditional headers for your blob operations. The simplest way is to leverage the Azure Storage SDK for .NET (essentially a wrapper over the Azure Storage REST API) and upload your blob (file) as follows:
try
{
    var container = new CloudBlobContainer(new Uri($"https://{storageName}.blob.core.windows.net/{containerName}{containerSasToken}"));
    var blob = container.GetBlockBlobReference("{blobName}");
    // the If-None-Match condition makes the upload fail if the blob already exists
    blob.UploadFromFile("{filepath}", accessCondition: AccessCondition.GenerateIfNotExistsCondition());
}
catch (StorageException se)
{
    var requestResult = se.RequestInformation;
    if (requestResult != null)
    {
        // 409: The specified blob already exists.
        Console.WriteLine($"HttpStatusCode:{requestResult.HttpStatusCode},HttpStatusMessage:{requestResult.HttpStatusMessage}");
    }
}
Also, you could combine your blob name with the MD5 hash of the blob file before uploading to Azure Blob Storage, so that two different files never compete for the same name.
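A rough sketch of that naming scheme (filePath is a placeholder) could look like this:
using (var md5 = System.Security.Cryptography.MD5.Create())
using (var stream = System.IO.File.OpenRead(filePath))
{
    // hex-encode the MD5 of the file content and embed it in the blob name
    string hash = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
    string blobName = $"{System.IO.Path.GetFileName(filePath)}-{hash}";
}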
As far as I know, there is no configuration in the Azure Portal or the storage tools that achieves this on the server side. You could post your feedback to the Azure Storage team.