Copy blob from one storage account to another using @azure/storage-blob - node.js

What would be the best way to copy a blob from one storage account to another storage account using #azure/storage-blob?
I would imagine using streams would be better than downloading and then re-uploading, but I'd like to know whether the code below is a correct/optimal implementation using streams.
const { ClientSecretCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const srcCredential = new ClientSecretCredential("<src-ten-id>", "<src-client-id>", "<src-secret>");
const destCredential = new ClientSecretCredential("<dest-ten-id>", "<dest-client-id>", "<dest-secret>");

const srcBlobClient = new BlobServiceClient("<source-blob-url>", srcCredential);
const destBlobClient = new BlobServiceClient("<dest-blob-url>", destCredential);

const sourceContainer = srcBlobClient.getContainerClient("src-container");
const destContainer = destBlobClient.getContainerClient("dest-container");

const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);

// copy blob by piping the download stream into uploadStream
await destBlob.uploadStream((await sourceBlob.download()).readableStreamBody);

Your current approach downloads the source blob and then re-uploads it, which is not optimal.
A better approach would be to make use of async blob copy. The method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions). You would need to create a Shared Access Signature (SAS) URL on the source blob with at least Read permission; you can use the generateBlobSASQueryParameters SDK method to create it.
const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);

// placeholder helper – see the SAS generation sketch below
const sourceBlobSasUrl = GenerateSasUrlWithReadPermissionOnSourceBlob(sourceBlob);

// start the server-side (async) copy
await destBlob.beginCopyFromURL(sourceBlobSasUrl);
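For reference, here is a minimal sketch of what that helper could look like, assuming you have the source storage account's name and key available for a StorageSharedKeyCredential (the helper name and the 15-minute expiry are illustrative, not part of the SDK):

const {
  BlobSASPermissions,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters
} = require("@azure/storage-blob");

// Illustrative helper: builds a read-only, short-lived SAS URL for the source blob.
function generateSasUrlWithReadPermissionOnSourceBlob(sourceBlob, accountName, accountKey) {
  const sharedKeyCredential = new StorageSharedKeyCredential(accountName, accountKey);
  const sasToken = generateBlobSASQueryParameters(
    {
      containerName: sourceBlob.containerName,
      blobName: sourceBlob.name,
      permissions: BlobSASPermissions.parse("r"),       // Read only
      expiresOn: new Date(Date.now() + 15 * 60 * 1000)  // valid for 15 minutes
    },
    sharedKeyCredential
  );
  return `${sourceBlob.url}?${sasToken.toString()}`;
}

// beginCopyFromURL returns a poller, so you can wait for the server-side copy to finish:
const copyPoller = await destBlob.beginCopyFromURL(sourceBlobSasUrl);
await copyPoller.pollUntilDone();

Note that if you only have Azure AD credentials (as in the question), you would need a user delegation SAS instead, obtained via BlobServiceClient.getUserDelegationKey, rather than an account key.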

Related

How to check existence of soft deleted file in Azure blob container with node js?

I have a file that was stored in an Azure blob directory "folder1/folder2/file.txt". This file was soft deleted - I can see it in the Azure web console. I need a function which checks whether this file exists.
I tried the library "azure-storage". It works perfectly with files that have NOT been removed:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback)
Does anyone know how to use the same approach with soft-deleted files?
I also tried the "@azure/storage-blob" library, but I got stuck in its endless entities (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to see a particular file in a particular blob directory.
Following this MS doc, I was able to restore the soft-deleted blobs (and get their names) with the code snippet below.
const { BlobServiceClient } = require('@azure/storage-blob');

const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net";
if (!connstring) throw Error('Azure Storage Connection string not found');
const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);

async function main() {
  const containerName = 'kpjohncontainer';
  const blobName = 'TextFile05.txt';
  const containerClient = blobServiceClient.getContainerClient(containerName);
  await undeleteBlob(containerClient, blobName);
}

main()
  .then(() => console.log(`done`))
  .catch((ex) => console.log(ex.message));

async function undeleteBlob(containerClient, blobName) {
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.undelete(); // restore the soft-deleted blob
  console.log(`undeleted blob ${blobName}`);
}
To check whether the blob exists but is in a soft-deleted state, I found relevant code, but it's in C# (provided by @Gaurav Mantri). To achieve the same in NodeJS, refer here.
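As a rough Node.js sketch of that idea with @azure/storage-blob (the helper name is mine), you can list the container's blobs with includeDeleted: true and look for a match that is flagged as deleted:

// Sketch: check whether a blob exists only in a soft-deleted state.
async function softDeletedBlobExists(containerClient, blobName) {
  for await (const blob of containerClient.listBlobsFlat({ includeDeleted: true, prefix: blobName })) {
    if (blob.name === blobName && blob.deleted) {
      return true; // found, but soft deleted
    }
  }
  return false;
}

// e.g. await softDeletedBlobExists(containerClient, 'folder1/folder2/file.txt');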

compress and write a file in another container with azure blob storage trigger in nodejs

I have to make an API call passing a compressed file as input. I have a working example on premises, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Functions blob trigger. I have the code below that works with local files, but I don't know how to do the same with Azure Blob Storage and an Azure Function in Node.js.
const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();
const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');
input.pipe(def).pipe(output);
This code reads a file as a stream, compresses it, and writes another file as a stream.
What I would like to do is read the file any time I upload it to a container in Azure Blob Storage, compress it and save it in a different container under a different name, and then make an API call passing the compressed file saved in the other container as input.
I tried this code for compressing the incoming file:
const fs = require("fs");
const zlib = require('zlib');
const { Readable, Writable } = require('stream');

module.exports = async function (context, myBlob) {
  context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
  // const fin = fs.createReadStream(context.bindingData.blobTrigger);
  const def = zlib.createDeflate();
  const s = Readable.from(myBlob.toString());
  context.log(myBlob);
  context.bindings.outputBlob = s.pipe(def);
};
The problem with this approach is that in the last line of the code
context.bindings.outputBlob = s.pipe(def)
I don't have the compressed file, while if I use
s.pipe(def).pipe(process.stdout)
I can read the compressed file.
As you can see above, I also tried fs.createReadStream(context.bindingData.blobTrigger), which contains the name of the uploaded file with the container name, but it doesn't work.
Any idea?
Thank you.
This is the solution:
// Inside the function body: read the triggering blob as a Buffer,
// compress it synchronously, and assign the result to the output binding.
var input = context.bindings.myBlob;
var inputBuffer = Buffer.from(input);
var deflatedOutput = zlib.deflateSync(inputBuffer);
context.bindings.myOutputBlob = deflatedOutput;
https://learn.microsoft.com/en-us/answers/questions/500368/compress-and-write-a-file-in-another-container-wit.html
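Putting it together, a minimal sketch of the complete trigger function could look like the following; it assumes function.json defines the trigger binding as myBlob and a blob output binding named myOutputBlob pointing at the destination container:

const zlib = require('zlib');

module.exports = async function (context, myBlob) {
  context.log("Blob trigger fired for:", context.bindingData.blobTrigger, "size:", myBlob.length, "bytes");

  // Compress the incoming blob synchronously; assigning the Buffer to the output
  // binding makes the Functions runtime write it to the destination container.
  context.bindings.myOutputBlob = zlib.deflateSync(Buffer.from(myBlob));
};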

How to read Azure Blob Url?

I am creating a list of blog posts in React and Express with an Azure SQL db. I am able to use Azure Blob Storage to store the image associated with a post. I am also able to get the blob URL and store it in my SQL db. However, when I try to read the URL directly, it throws a "resource not found" error. After searching the docs and other Stack Overflow answers, I could infer that it has something to do with a SAS token. Can anyone explain what would be the better way to approach this?
https://yourdomain.blob.core.windows.net/imagecontainer/yourimage.png
Below is the Node.js code.
router.post('/image', async function (req, res) {
  try {
    console.log(req.files.files.data);
    const blobName = 'test' + uuidv1() + '.png';
    const containerClient = blobServiceClient.getContainerClient(containerName);
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);
    const uploadBlobResponse = await blockBlobClient.upload(req.files.files.data, req.files.files.data.length);
    res.send({ tempUrl: blockBlobClient.url });
  } catch (e) {
    console.log(e);
  }
});
However when I want to read the url directly it threw an error "resource not found".
Most likely you're getting this error because the blob container containing the blob has a Private ACL and because of that anonymous access is disabled. To enable anonymous access, please change the blob container's ACL to Blob or Public and that will solve this problem.
If you can't (or don't want to) change the blob container's ACL, another option would be to create a Shared Access Signature (SAS) on the blob. A SAS essentially gives time- and permission-bound access to a blob. For your needs, you would need to create a short-lived SAS token with just Read permission.
To generate a SAS token, you will need to use generateBlobSASQueryParameters method. Once you create a SAS token, you will need to append it to your blob's URL to get a SAS URL.
Here's the sample code to do so. It makes use of the @azure/storage-blob node package.
const {
  BlobSASPermissions,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters
} = require("@azure/storage-blob");

const permissions = new BlobSASPermissions();
permissions.read = true; // set read permission only

const currentDateTime = new Date();
const expiryDateTime = new Date(currentDateTime.setMinutes(currentDateTime.getMinutes() + 5)); // expire the SAS token in 5 minutes

const blobSasModel = {
  containerName: 'your-blob-container-name',
  blobName: 'your-blob-name',
  permissions: permissions,
  expiresOn: expiryDateTime
};

const sharedKeyCredential = new StorageSharedKeyCredential('your-storage-account-name', 'your-storage-account-key');
const sasToken = generateBlobSASQueryParameters(blobSasModel, sharedKeyCredential);
const sasUrl = blockBlobClient.url + "?" + sasToken.toString(); // return this SAS URL to the client
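Applied to the upload route from the question, the response could return this SAS URL instead of the bare blob URL (a sketch only, reusing the sharedKeyCredential from above):

// Inside the router.post('/image', ...) handler, after the upload:
const sasToken = generateBlobSASQueryParameters({
  containerName: containerName,
  blobName: blobName,
  permissions: BlobSASPermissions.parse("r"),      // Read only
  expiresOn: new Date(Date.now() + 5 * 60 * 1000)  // expires in 5 minutes
}, sharedKeyCredential);

res.send({ tempUrl: blockBlobClient.url + "?" + sasToken.toString() });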

(Azure Storage - nodeJS) Getting SAS policies that are applied on blob container and queue

I'm trying to get the expiration date of the SAS policies that are applied on blob container and queue.
I'm able to get the information via PowerShell with Get-AzStorageQueueStoredAccessPolicy and Get-AzStorageContainerStoredAccessPolicy, but I cannot find a way to do the same via NodeJS.
I've gone through the MS Node SDK for storage; I was able to find a way to set up a SAS policy but not to retrieve an existing one.
Do I need to go through MS Graph?
Thank you for your help.
To get the access policies for a blob container, the method you would want to use is getAccessPolicy(), which is in the ContainerClient class.
import { BlobServiceClient } from '@azure/storage-blob';
const connectionString = "your-storage-account-connection-string";
const containerName = "your-container-name";
const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient(containerName);
const accessPolicyResult = await containerClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers);//access policies are defined in "signedIdentifiers"
Similarly, to get the access policies for a queue, the method you would want to use is getAccessPolicy(), which is in the QueueClient class.
import { QueueServiceClient } from '@azure/storage-queue';
const connectionString = "your-storage-account-connection-string";
const queueName = "your-queue-name";
const queueServiceClient = QueueServiceClient.fromConnectionString(connectionString);
const queueClient = queueServiceClient.getQueueClient(queueName);
const accessPolicyResult = await queueClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers);//access policies are defined in "signedIdentifiers"
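Since the original goal was the expiration date, here is a small follow-up sketch (using the containerClient from above; the same loop works for the queue client's result) that pulls expiresOn out of each stored access policy:

const accessPolicyResult = await containerClient.getAccessPolicy();
for (const identifier of accessPolicyResult.signedIdentifiers) {
  // Each stored access policy has an id and an accessPolicy with startsOn/expiresOn/permissions.
  console.log(`policy "${identifier.id}" expires on: ${identifier.accessPolicy.expiresOn}`);
}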

Azure Blob Storage Compressing files by default?

I am uploading JSONs to Azure Blob storage using the Azure Blob storage API's function:
const response = await blobClient.upload(content, content.length);
There is absolutely no compression logic in the code nor any encoding headers being added, but the files seem to be around 60% of their original size when they reach the storage. Also, monitoring the PUT requests using Fiddler, it seems that the file is compressed and then uploaded by the API.
My question is, does Azure do compression by default?
EDIT:
I was stringifying and then uploading the JSON objects. They get all the white-space removed, hence the reduced size.
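As a quick illustration of that effect (the object below is just an example):

const obj = { title: "post", tags: ["azure", "blob"] };
// Compact stringify (what gets uploaded) vs. a pretty-printed copy on disk:
console.log(Buffer.byteLength(JSON.stringify(obj)));          // no whitespace
console.log(Buffer.byteLength(JSON.stringify(obj, null, 4))); // indented, noticeably larger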
Based on my test, there is no compression happening by default. Here is my sample:
const { BlobServiceClient } = require("#azure/storage-blob");
var fs = require('fs');
async function main() {
const AZURE_STORAGE_CONNECTION_STRING = "Your_Stroage_Account_Connection_String";
const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
const containerName = 'demo';
const blobName = 'test.txt';
const containerClient = blobServiceClient.getContainerClient(containerName);
if(!await containerClient.exists()){
await containerClient.create();
}
const contents = fs.readFileSync('test.txt');
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
await blockBlobClient.upload(contents,contents.length);
}
main().then(() => console.log('Done')).catch((ex) => console.log(ex.message));
The test.txt file's size is about 99.9KB.
And, from the portal, the uploaded file's size is 99.96KB,which is in line with our expectations.
You should also use the byte length when uploading, as the Storage Blob API expects the number of bytes; the string length can be different:
const content = "Hello 世界!";
console.log(`length: ${content.length}`);
console.log(`byteLength: ${Buffer.byteLength(content)}`);
the output:
length: 9
byteLength: 15
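So when uploading a string, a safer call is to pass the byte length as the content length, e.g. with the blockBlobClient and content from the snippets above:

// Pass the number of bytes, not the number of characters.
await blockBlobClient.upload(content, Buffer.byteLength(content));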
