I want to download a file from Azure Blob Storage to the Node.js server's file system, do some processing on that file on the Node.js side, and then upload the updated copy back to Blob Storage from the file system.
Please suggest a way to do this, or share your approach if you have implemented the same.
Thanks :)
Downloading a Blob using Node.js:
getBlobToFile - writes the blob contents to file
getBlobToStream - writes the blob contents to a stream
getBlobToText - writes the blob contents to a string
createReadStream - provides a stream to read from the blob
The following example demonstrates using getBlobToStream to download the contents of the myblob blob and store it in the output.txt file using a stream:
var fs = require('fs');

blobSvc.getBlobToStream('mycontainer', 'myblob', fs.createWriteStream('output.txt'), function(error, result, response){
  if(!error){
    // blob retrieved
  }
});
How to: Upload a blob into a container
A blob can be either block-based or page-based. Block blobs allow you to upload large data more efficiently, while page blobs are optimized for random read/write operations. For more information, see Understanding block blobs and page blobs.
Block blobs
To upload data to a block blob, use the following:
createBlockBlobFromLocalFile - creates a new block blob and uploads the contents of a file.
createBlockBlobFromStream - creates a new block blob and uploads the contents of a stream.
createBlockBlobFromText - creates a new block blob and uploads the contents of a string.
createWriteStreamToBlockBlob - provides a write stream to a block blob.
The following example uploads the contents of the test.txt file into myblob:
blobSvc.createBlockBlobFromLocalFile('mycontainer', 'myblob', 'test.txt', function(error, result, response){
  if(!error){
    // file uploaded
  }
});
Related
I am trying to make a blob-triggered Azure Function for log files, but the problem is that it passes through the entire blob content whenever any blob is created or updated.
So I am wondering: is there a way to get only the appended blob content?
module.exports = async function main(context, myBlob) {
  // I am using JavaScript; myBlob contains the entire content of a single blob.
  // For logs in an append blob, this results in duplicated logs, which is not ideal.
};
So I am wondering is there a way to only get the appended blob content?
No, not directly. You will need to:
maintain the byte index/position per log file where you last read, in some persistent place (e.g. a file, a database, or any other persistent storage), or use a Durable Function to hold that state; then,
on each change notification, look up that last byte index/position and read starting from that location using the appropriate SDK/API. Here is the REST API (for ADLS Gen2; find the right one if you're using Gen1 or Blob) and some description on how to read a byte range out of a file in blob storage.
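As a sketch of the offset-tracking idea, the helper below (all names are hypothetical) keeps the last-read offset per blob in an in-memory map — in a real function this state must live somewhere persistent, such as Table Storage or a Durable Function's state — and produces the HTTP Range header covering only the newly appended bytes:

```javascript
// Hypothetical offset store: blob name -> last byte index already read.
// In a real Azure Function this must be persistent (file/DB/Durable Function state).
const offsets = new Map();

// Given the blob's current length (available from trigger metadata), return the
// Range header for just the appended bytes, or null if nothing new was appended.
function nextRangeHeader(blobName, currentLength) {
  const start = offsets.get(blobName) || 0;
  if (start >= currentLength) {
    return null; // no new content since the last read
  }
  offsets.set(blobName, currentLength);
  return 'bytes=' + start + '-' + (currentLength - 1);
}
```

The returned value can be sent as the Range header of a Get Blob (or ADLS read) request, so only the new log lines come back instead of the whole file.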
I am having a problem updating a CSV file in my blob storage. I have an existing CSV file inside my blob container, and when I press the download button it automatically downloads the file to my machine.
I have now created a Logic App that updates the CSV file. When I run the app's trigger, it updates the file, but when I press download it opens a new tab where the CSV content is displayed instead.
I want it to behave the way it originally did: pressing download should save the file to my machine.
Any help, or confirmation of whether this is even possible, would be appreciated.
I already tried the "compose" and "create to csv" actions, but that way the result is not stored in a blob.
From my testing, when you create a .csv file with the "Create blob" action in a Logic App, you will always hit this problem: the blob's content type is "text/plain", which causes the browser to display it in another tab.
So I suggest you use an Azure Function to create the blob instead. In the Azure Function you can set: blob.Properties.ContentType = "application/octet-stream";
Here is the create-blob code in the Azure Function:
var storageAccount = CloudStorageAccount.Parse(connectionString);
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference("data");
await container.CreateIfNotExistsAsync();

var blob = container.GetBlockBlobReference(name);
blob.Properties.ContentType = "application/octet-stream";

using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(data)))
{
    await blob.UploadFromStreamAsync(stream);
}
For more detailed code, you can refer to this article. After following it, you will be able to download your blob to your local machine.
Note: when you create the Azure Function action in the Logic App, it may show an error. Just delete the "AzureWebJobsSecretStorageType" app setting (or set it to "blob").
I want to save a PDF to an Azure Blob Storage container and retrieve it. How can I do this?
I tried the following, which saves the PDF to the local drive, but I want to store it directly in Blob Storage without saving it locally first.
let doc = new jsPDF('p', 'pt', 'a4');
doc.addHTML(document.body, function() {
  doc.save('html.pdf');
});
Can anyone suggest a solution?
How do I send the PDF content to Azure Blob Storage?
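One possible approach, assuming the @azure/storage-blob browser SDK and a SAS URL for the target blob (both are assumptions, not shown in the question): jsPDF can emit the document as a Blob via doc.output('blob'), which can then be uploaded directly without ever touching the local drive.

```javascript
// Assumes a BlockBlobClient from @azure/storage-blob, created elsewhere, e.g.:
//   const { BlockBlobClient } = require('@azure/storage-blob');
//   const blockBlobClient = new BlockBlobClient(sasUrlForTargetBlob); // hypothetical SAS URL
async function uploadPdf(doc, blockBlobClient) {
  // jsPDF returns the finished document as a Blob instead of saving it locally.
  const pdfBlob = doc.output('blob');
  await blockBlobClient.uploadData(pdfBlob, {
    blobHTTPHeaders: { blobContentType: 'application/pdf' }
  });
  return pdfBlob;
}
```

Setting blobContentType to application/pdf means the blob will be served back to browsers as a PDF rather than as raw octets.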
When I upload a JPG image to Azure Blob Storage and then pull it back out, it always comes back as a PNG. Is this a safe default to assume? I could not find any documentation. I can easily convert it back to a JPG in my byte conversion, since I store the extension when I pull it out of the container, but can I safely assume that any image type stored in Azure Blob Storage is stored as a PNG?
but can I safely assume any image type that is stored in Azure Blob Storage is done as a png
As far as I know, Blob Storage does not have a default file type. Each blob has a content type that tells the user what the blob's content is. If you set the blob name to xxx.jpg, then it will store xxx.jpg in blob storage.
Here is an example.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connection string");
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
CloudBlockBlob blockBlob = container.GetBlockBlobReference("brandotest.jpg");
// Create or overwrite the "brandotest.jpg" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\1.PNG"))
{
    blockBlob.UploadFromStream(fileStream);
}
Result:
By using Storage Explorer, you can see the image's type in blob storage: it is brandotest.jpg.
I guess you are setting the downloaded file's type yourself when you download it with code.
Like this (here I set the download file's path to myfile.png):
CloudBlockBlob blockBlob = container.GetBlockBlobReference("brandotest.jpg");
// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite(@"D:\authserver\myfile.png"))
{
    blockBlob.DownloadToStream(fileStream);
}
The result will be myfile.png.
All in all, the type of the file you download locally or upload to the blob is determined by what you specify in your code when getting or uploading it.
It is best not to assume that PNG is the default. It depends on your upload mechanism and the name you give the file during the upload process.
I have a zipped 2 GB file that I need to upload to Windows Azure over my home cable connection. How long should I expect the upload to take? Has anyone measured average upload times for their files?
Also, when I do this to create a blob for uploading a file:
CloudBlob _blob = _container.GetBlobReference("file1");
is this creating a CloudBlockBlob or a CloudPageBlob by default? I have been using the above code to upload files and it has been quite slow.
CloudBlob _blob = _container.GetBlobReference("file1");
It creates neither a CloudBlockBlob nor a CloudPageBlob by default; it returns a generic CloudBlob reference.
If you want to use CloudBlockBlob (Azure SDK v2.0):
// Retrieve reference to a blob named "myblob".
CloudBlockBlob blob = container.GetBlockBlobReference("myblob");
Now split your file into small pieces (4 MB max) and upload each piece like this:
blob.PutBlock(blockId, memoryStream, null);
where blockId is a base64-encoded block ID that identifies the block, and memoryStream is a stream that provides the data for the block.
Once all blocks are uploaded, commit them with blob.PutBlockList(...), passing the block IDs in order; until then the blob is not committed.
MSDN