I’m using an Azure Function with Node.js to create a zip file (abc.zip) containing some files in the function app’s temp folder, and in the next step I need to upload the zip file to Azure Blob Storage. The problem is that the blob storage path has to be something like ‘/xyz/pqr/ghk’. How can I achieve this?
As @GauravMantri indicated, try this:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = '';
const container = '';
const destBlobName = 'test.zip';
// The "folder" path is just a prefix on the blob name
const blob = 'xyz/pqr/ghk/' + destBlobName;
const zipFilePath = "<some function temp path>/<filename>.zip";

const blobClient = BlobServiceClient
  .fromConnectionString(connectionString)
  .getContainerClient(container)
  .getBlockBlobClient(blob);

// Await this inside your async function
await blobClient.uploadFile(zipFilePath);
Let me know if you have any more questions :)
I have a file which was stored in some Azure blob directory, "folder1/folder2/file.txt". This file was soft-deleted - I can see it in the Azure web console. I need a function which checks whether this file exists.
I tried the library "azure-storage". It works perfectly with files that are NOT removed:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback)
Maybe someone knows how to use the same approach with soft-removed files?
I also tried the lib "@azure/storage-blob",
but I got stuck with its endless entities (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to see a particular file in a particular blob directory.
Following this MS Doc, I was able to restore the soft-deleted blobs and get their names with the below code snippet.
const { BlobServiceClient } = require('@azure/storage-blob');

const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net";
if (!connstring) throw Error('Azure Storage Connection string not found');

const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);

async function main() {
  const containerName = 'kpjohncontainer';
  const blobName = 'TextFile05.txt';
  const containerClient = blobServiceClient.getContainerClient(containerName);
  await undeleteBlob(containerClient, blobName);
}

main()
  .then(() => console.log(`done`))
  .catch((ex) => console.log(ex.message));

async function undeleteBlob(containerClient, blobName) {
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.undelete(); // restore the soft-deleted blob
  console.log(`undeleted blob ${blobName}`);
}
To check whether the blob exists, and whether it exists but in a soft-deleted state, I found the relevant code, but it is in C# and was provided by @Gaurav Mantri. To achieve the same in Node.js, refer here.
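In the meantime, here is a minimal Node.js sketch of that check, assuming the @azure/storage-blob package and listing blobs with the includeDeleted option (the container client and blob path below are placeholders):
const { BlobServiceClient } = require('@azure/storage-blob');

// Returns whether the blob exists, even if it is currently soft-deleted.
async function blobExistsIncludingDeleted(containerClient, blobPath) {
  // includeDeleted makes the listing also return soft-deleted blobs
  for await (const item of containerClient.listBlobsFlat({ includeDeleted: true, prefix: blobPath })) {
    if (item.name === blobPath) {
      return { exists: true, softDeleted: !!item.deleted };
    }
  }
  return { exists: false, softDeleted: false };
}

// Usage (placeholder names):
// const containerClient = BlobServiceClient
//   .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
//   .getContainerClient('mycontainer');
// console.log(await blobExistsIncludingDeleted(containerClient, 'folder1/folder2/file.txt'));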
I am following (the previous tutorials and) this tutorial: https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/10-exercise-connect-with-your-azure-storage-configuration?pivots=javascript to upload an image to the Azure Storage account.
After following all the steps and copying the final file given for index.js, I run the app with node index.js:
$ node index.js
I expected the created blobs to be logged along with their sizes. Instead, the only output is:
Container photos already exists
Therefore I suppose the file upload is not successful. Where could the error be (or what should I check to see why the file upload does not work)?
Any comments are welcome. Thanks!
I tried this in my environment and got successful results:
Code:
#!/usr/bin/env node
require('dotenv').config();
const { BlobServiceClient } = require("#azure/storage-blob");
const storageAccountConnectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;
const blobServiceClient = BlobServiceClient.fromConnectionString(storageAccountConnectionString);
async function main() {
// Create a container (folder) if it does not exist
const containerName = 'container2';//container name
const containerClient = blobServiceClient.getContainerClient(containerName);
const containerExists = await containerClient.exists()
if ( !containerExists) {
const createContainerResponse = await containerClient.createIfNotExists();
console.log(`Create container ${containerName} successfully`, createContainerResponse.succeeded);
}
else {
console.log(`Container ${containerName} already exists`);
}
// Upload the file
const filename = 'image1.png';//filename
const blockBlobClient = containerClient.getBlockBlobClient(filename);
blockBlobClient.uploadFile(filename);
// Get a list of all the blobs in the container
let blobs = containerClient.listBlobsFlat();
let blob = await blobs.next();
while (!blob.done) {
console.log(`${blob.value.name} --> Created: ${blob.value.properties.createdOn} Size: ${blob.value.properties.contentLength}`);
blob = await blobs.next();
}
}
main();
Container photos already exists
This message appears because the container named photos was already created by a previous run, so on subsequent runs the create step is skipped and only that message is printed; that is what is happening here. I ran mine again as well and got the same message.
I am trying to connect to the Azure container and save a text file.
const azure = require('azure-storage');
const BlobServiceClient = azure.createBlobService();
const AZURE_STORAGE_CONNECTION_STRING = "my key";
const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
const containerName = "tempp";
console.log('\nCreating container...');
console.log('\t', containerName);
// Get a reference to a container
const containerClient = blobServiceClient.getContainerClient(containerName);
const blobName = 'test' + uuidv1() + '.txt';
// Get a block blob client
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
console.log('\nUploading to Azure storage as blob:\n\t', blobName);
const data = "message";
const uploadBlobResponse = blockBlobClient.upload(data, data.length);
console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
I got the Azure Storage connection string from Security + networking > Access keys by following the documentation, but when running the project I am getting an error:
Error: Credentials must be provided when creating a service client.
As suggested by Gaurav Mantri, I tried to repro this locally and was able to upload files to my container in Azure.
Make sure you have installed the following NuGet packages:
Azure.Storage.Blobs (12.10.0, the latest SDK), Microsoft.Extensions.Configuration & Microsoft.Extensions.Configuration.Json
Here is the open-source code that I followed to upload files to my container.
For more information, please refer to this MS Doc for .NET & this MS Doc for Node.js.
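Note that the error in the question usually comes from mixing the legacy azure-storage package with the newer @azure/storage-blob API: calling azure.createBlobService() with no arguments only works when the AZURE_STORAGE_CONNECTION_STRING environment variable is set, otherwise it throws "Credentials must be provided when creating a service client." Below is a minimal Node.js sketch that sticks to @azure/storage-blob only (the container name and uuid usage come from the question; reading the connection string from an environment variable is an assumption):
const { BlobServiceClient } = require("@azure/storage-blob");
const { v1: uuidv1 } = require("uuid");

// Assumption: the connection string is supplied via an environment variable
const AZURE_STORAGE_CONNECTION_STRING = process.env.AZURE_STORAGE_CONNECTION_STRING;

async function main() {
  // Build the service client directly from the connection string
  const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
  const containerClient = blobServiceClient.getContainerClient("tempp");
  await containerClient.createIfNotExists();

  const blobName = "test" + uuidv1() + ".txt";
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);

  const data = "message";
  // upload() returns a promise; await it so requestId is populated
  const uploadBlobResponse = await blockBlobClient.upload(data, data.length);
  console.log("Blob was uploaded successfully. requestId:", uploadBlobResponse.requestId);
}

main().catch((err) => console.error(err.message));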
I have to make an API call passing a compressed file as input. I have a working on-premise example, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Function blob trigger. I have the below code that works with files, but I don't know how to do the same with Azure Blob Storage and an Azure Function in Node.js.
const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();
const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');

input.pipe(def).pipe(output);
This code reads a file as a stream, compresses it, and writes another file as a stream.
What I would like to do is read the file any time I upload it to a container in Azure Blob Storage, then compress it and save it to a different container with a different name, and then make an API call passing the compressed file saved in the other container as input.
I tried this code for compressing the incoming file:
const fs = require("fs");
const zlib = require('zlib');
const {Readable, Writable} = require('stream');
module.exports = async function (context, myBlob) {
context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
// const fin = fs.createReadStream(context.bindingData.blobTrigger);
const def = zlib.createDeflate();
const s = Readable.from(myBlob.toString())
context.log(myBlob);
context.bindings.outputBlob = s.pipe(def)
};
The problem with this approach is that in the last line of the code,
context.bindings.outputBlob = s.pipe(def)
I don't get the compressed file, while if I use this:
s.pipe(def).pipe(process.stdout)
I can read the compressed output.
As you can see above, I also tried using fs.createReadStream(context.bindingData.blobTrigger), which contains the name of the uploaded file together with the container name, but it doesn't work either.
Any ideas?
Thank you
This is the solution:
// Read the input blob binding into a Buffer
var input = context.bindings.myBlob;
var inputBuffer = Buffer.from(input);
// Compress the buffer synchronously with zlib
var deflatedOutput = zlib.deflateSync(inputBuffer);
// Write the compressed data to the output blob binding
context.bindings.myOutputBlob = deflatedOutput;
https://learn.microsoft.com/en-us/answers/questions/500368/compress-and-write-a-file-in-another-container-wit.html
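For completeness, the snippet above assumes a blob trigger input binding named myBlob and a blob output binding named myOutputBlob. A function.json along these lines would wire them up (the container names and paths are placeholders, not taken from the original question):
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-container/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-container/{name}.deflate",
      "connection": "AzureWebJobsStorage"
    }
  ]
}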
What would be the best way to copy a blob from one storage account to another storage account using @azure/storage-blob?
I would imagine using streams would be best instead of downloading and then uploading, but would like to know if the code below is the correct/optimal implementation for using streams.
const srcCredential = new ClientSecretCredential(<src-ten-id>, <src-client-id>, <src-secret>);
const destCredential = new ClientSecretCredential(<dest-ten-id>, <dest-client-id>, <dest-secret>);
const srcBlobClient = new BlobServiceClient(<source-blob-url>, srcCredential);
const destBlobClient = new BlobServiceClient(<dest-blob-url>, destCredential);
const sourceContainer = srcBlobClient.getContainerClient("src-container");
const destContainer = destBlobClient.getContainerClient("dest-container");
const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name)
// copy blob
await destBlob.uploadStream((await sourceBlob.download()).readableStreamBody);
Your current approach downloads the source blob and then re-uploads it, which is not really optimal.
A better approach would be to make use of async blob copy. The method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions). You would need to create a Shared Access Signature URL on the source blob with at least Read permission. You can use the generateBlobSASQueryParameters SDK method to create it.
const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);
const sourceBlobSasUrl = GenerateSasUrlWithReadPermissionOnSourceBlob(sourceBlob);
// copy blob
await destBlob.beginCopyFromURL(sourceBlobSasUrl);
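For reference, here is a rough sketch of how the GenerateSasUrlWithReadPermissionOnSourceBlob placeholder could be implemented, assuming you have the source account's name and shared key available (the snippet above uses ClientSecretCredential, so this shared-key approach is an assumption; with Azure AD credentials you would generate a user delegation SAS via getUserDelegationKey instead):
const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions
} = require("@azure/storage-blob");

// Hypothetical helper: builds a read-only SAS URL for the source blob
// using the source storage account's shared key (name/key are placeholders).
function generateSasUrlWithReadPermissionOnSourceBlob(sourceBlob, accountName, accountKey) {
  const sharedKeyCredential = new StorageSharedKeyCredential(accountName, accountKey);

  const sasToken = generateBlobSASQueryParameters({
    containerName: sourceBlob.containerName,
    blobName: sourceBlob.name,
    permissions: BlobSASPermissions.parse("r"),       // read-only
    expiresOn: new Date(Date.now() + 60 * 60 * 1000)  // valid for one hour
  }, sharedKeyCredential).toString();

  return `${sourceBlob.url}?${sasToken}`;
}
beginCopyFromURL returns a poller, so if you need to wait for the copy to complete you can call pollUntilDone on it, e.g. await (await destBlob.beginCopyFromURL(sourceBlobSasUrl)).pollUntilDone().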