I am following (the previous modules and) this tutorial: https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/10-exercise-connect-with-your-azure-storage-configuration?pivots=javascript to upload an image to an Azure Storage account.
After following all the steps and copying the final file given for index.js, I run the app with node index.js:
$ node index.js
I expect the created blobs and their sizes to be logged. However, the only output is:
Container photos already exists
Therefore I suppose the file upload was not successful. Where could the error be, or what should I check to see why the file upload does not work?
Any comments are welcome. Thanks!
I tried this in my environment and got successful results:
Code:
#!/usr/bin/env node
require('dotenv').config();

const { BlobServiceClient } = require("@azure/storage-blob");

const storageAccountConnectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;
const blobServiceClient = BlobServiceClient.fromConnectionString(storageAccountConnectionString);

async function main() {
    // Create a container (folder) if it does not exist
    const containerName = 'container2'; // container name
    const containerClient = blobServiceClient.getContainerClient(containerName);
    const containerExists = await containerClient.exists();
    if (!containerExists) {
        const createContainerResponse = await containerClient.createIfNotExists();
        console.log(`Create container ${containerName} successfully`, createContainerResponse.succeeded);
    } else {
        console.log(`Container ${containerName} already exists`);
    }

    // Upload the file (await it so the listing below sees the new blob)
    const filename = 'image1.png'; // filename
    const blockBlobClient = containerClient.getBlockBlobClient(filename);
    await blockBlobClient.uploadFile(filename);

    // Get a list of all the blobs in the container
    let blobs = containerClient.listBlobsFlat();
    let blob = await blobs.next();
    while (!blob.done) {
        console.log(`${blob.value.name} --> Created: ${blob.value.properties.createdOn} Size: ${blob.value.properties.contentLength}`);
        blob = await blobs.next();
    }
}

main();
Console:
Portal:
Container photos already exists
The above message appears when you run the code again: because the container named photos has already been created, you get that message instead of the creation log, and that is what is happening here.
I also ran it once more and got the same message.
I have a file which was stored in an Azure blob directory "folder1/folder2/file.txt". This file was soft deleted - I can see it in the Azure web console. I need a function which checks whether this file exists.
I tried the "azure-storage" library. It works perfectly with files that have NOT been removed:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback)
Maybe someone knows how to use the same approach with soft-deleted files?
I also tried the "@azure/storage-blob" library.
But I got stuck among the endless entities there (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to see a particular file in a particular blob directory.
Following this MS DOC, I was able to restore the soft-deleted blobs and get their names with the below code snippet.
const { BlobServiceClient } = require('@azure/storage-blob');

const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net"
if (!connstring) throw Error('Azure Storage Connection string not found');

const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);

async function main() {
    const containerName = 'kpjohncontainer';
    const blobName = 'TextFile05.txt';
    const containerClient = blobServiceClient.getContainerClient(containerName);
    await undeleteBlob(containerClient, blobName);
}

main()
    .then(() => console.log(`done`))
    .catch((ex) => console.log(ex.message));

async function undeleteBlob(containerClient, blobName) {
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);
    await blockBlobClient.undelete(); // restore the soft-deleted blob
    console.log(`undeleted blob ${blobName}`);
}
Output:
To check if the blob exists, and if it exists but is in a soft-deleted state, I found the relevant code, but it's in C#, provided by @Gaurav Mantri. To achieve the same in NodeJS, refer here.
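In case it is useful, below is a minimal sketch of an existence check that also covers soft-deleted blobs with @azure/storage-blob, assuming the listBlobsFlat includeDeleted option; the container name and blob path are illustrative, not taken from the question:
const { BlobServiceClient } = require('@azure/storage-blob');

// Assumed: the connection string is available in the environment.
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);

// Returns 'active', 'soft-deleted', or 'not-found' for the given blob path.
async function checkBlobExistence(containerName, blobPath) {
    const containerClient = blobServiceClient.getContainerClient(containerName);

    // exists() only reports blobs that are not soft-deleted.
    if (await containerClient.getBlobClient(blobPath).exists()) {
        return 'active';
    }

    // Listing with includeDeleted also surfaces soft-deleted blobs under the "directory" prefix.
    for await (const item of containerClient.listBlobsFlat({ includeDeleted: true, prefix: blobPath })) {
        if (item.name === blobPath && item.deleted) {
            return 'soft-deleted';
        }
    }
    return 'not-found';
}

checkBlobExistence('container1', 'folder1/folder2/file.txt').then(console.log);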
I'm following this article to download objects from a GCP Cloud Storage bucket: https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-nodejs
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadIntoMemory() {
    // Downloads the file into a buffer in memory.
    const contents = await storage.bucket(bucketName).file(fileName).download();
    return contents;
}

downloadIntoMemory().catch(console.error);
I'm currently getting buffer data in contents. I have this code hooked up to an API on a Node.js backend, and I'm using React with TypeScript on the frontend. Calling the API gives me the data buffer. How can I use it to download the file instead of the data buffer?
I tried the above method, explicitly providing a file destination, but I'm still getting the following error: EISDIR: illegal operation on a directory, open '{file_path_which_i_was_set}'. Err: -21
As rightly pointed out by @John Hanley, you are referring to the documentation where the code sample downloads an object into a buffer in memory. If you want to download an object from a bucket to a file, refer to this code sample, where the 'options' parameter has to be passed to the download() method.
The code goes like this:
// The ID of your GCS bucket
const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
const fileName = 'your-file-name';

// The path to which the file should be downloaded
const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
    const options = {
        destination: destFileName,
    };

    // Downloads the file to the destination file path
    await storage.bucket(bucketName).file(fileName).download(options);

    console.log(
        `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
    );
}

downloadFile().catch(console.error);
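Two follow-up notes. The EISDIR error mentioned above usually means destFileName points at a directory, so the destination has to include the file name itself, not just the folder. And if the goal is for the React frontend to trigger a file download rather than receive a raw buffer, one option is to return the buffer from the API with a Content-Disposition header; below is a rough sketch assuming an Express-style route (the route, bucket, and handler names are illustrative, not from the original post):
const express = require('express');
const {Storage} = require('@google-cloud/storage');

const app = express();
const storage = new Storage();
const bucketName = 'your-unique-bucket-name'; // assumption: replace with the real bucket

app.get('/download/:fileName', async (req, res) => {
    try {
        // download() resolves to [Buffer]
        const [contents] = await storage.bucket(bucketName).file(req.params.fileName).download();
        // Tell the browser to save the response as a file instead of displaying it.
        res.setHeader('Content-Disposition', `attachment; filename="${req.params.fileName}"`);
        res.setHeader('Content-Type', 'application/octet-stream');
        res.send(contents);
    } catch (err) {
        res.status(500).send(err.message);
    }
});

app.listen(3000);
On the frontend, pointing the browser at that route (for example via a plain link) then downloads the file directly.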
I am using Node.js and the Azure SDK v12. I want to copy an existing blob whose access tier === 'Archive'. To do so, I want to copy the blob and write it to the same container with a different blob name and a changed (rehydrated) access tier.
I could change the access tier of the existing 'Archive' blob directly, but that is not my goal. I want to keep the blob with access tier 'Archive' and create a new blob with access tier === 'Cool' || 'Hot'.
I am proceeding as per the documentation (https://learn.microsoft.com/en-us/azure/storage/blobs/archive-rehydrate-overview).
The below code works if the blob has access tier === 'Cool' || 'Hot'. It fails for blobs with access tier === 'Archive', though.
Aside: I think the SDK's 'syncCopyFromURL' and 'beginCopyFromURL' do not work for copying blobs with access tier === 'Archive'. If I try, 'syncCopyFromURL' gives me "This operation is not permitted on an archived blob." and 'beginCopyFromURL' gives me "Copy source blob has been modified" - when I check, the blob has not been modified (I check the last modification date and it is in the past).
How do I copy the archived blob and save a new blob in the same container with a different access tier?
const { BlobServiceClient, generateBlobSASQueryParameters, BlobSASPermissions } = require("@azure/storage-blob");

export default async (req, res) => {
    if (req.method === 'POST') {
        const connectionString = 'DefaultEndpointsProtocol=...'
        const containerName = 'container';
        const srcFile = 'filename' // this is the filename as it appears on Azure portal (i.e. the blob name)

        async function getSignedUrl(blobClient, options = {}) {
            options.permissions = options.permissions || "racwd"
            const expiry = 3600;
            const startsOn = new Date();
            const expiresOn = new Date(new Date().valueOf() + expiry * 1000);
            const token = await generateBlobSASQueryParameters(
                {
                    containerName: blobClient.containerName,
                    blobName: blobClient.name,
                    permissions: BlobSASPermissions.parse(options.permissions),
                    startsOn, // Required
                    expiresOn, // Optional
                },
                blobClient.credential,
            );
            return `${blobClient.url}?${token.toString()}`;
        }

        (async () => {
            try {
                const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
                const containerClient = blobServiceClient.getContainerClient(containerName);
                const sourceBlobClient = containerClient.getBlockBlobClient(srcFile);
                const targetBlobClient = containerClient.getBlockBlobClient('targetFileName');

                const url = await getSignedUrl(sourceBlobClient);
                console.log(`source: ${url}`);

                const result = await targetBlobClient.syncCopyFromURL(url);
                // const result = await targetBlobClient.beginCopyFromURL(url);
                console.log(result)
            } catch (e) {
                console.log(e);
            }
        })();
    }
}

export const config = {
    api: {
        bodyParser: {
            sizeLimit: '1gb',
        },
    },
}
The main step we need is to change the access tier of the blob.
With the code below we can set the access tier from JS:
// Archive the blob - Log the error codes
await blockBlobClient.setAccessTier("Archive");
try {
    // Downloading an archived blockBlob fails
    console.log("// Downloading an archived blockBlob fails...");
    await blockBlobClient.download();
} catch (err) {
    // BlobArchived Conflict (409) This operation is not permitted on an archived blob.
    console.log(
        `requestId - ${err.details.requestId}, statusCode - ${err.statusCode}, errorCode - ${err.details.errorCode}`
    );
    console.log(`error message - ${err.details.message}\n`);
}
And the rest of the operation can be done with the help of copy events, as below:
import logging
import sys
import os

import azure.functions as func
from azure.storage import blob
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient, __version__


def main(myblob: func.InputStream):
    try:
        logging.info(f"Python blob trigger function processed blob \n")
        CONN_STR = "ADD_CON_STR"
        blob_service_client = BlobServiceClient.from_connection_string(CONN_STR)

        # MAP SOURCE FILE
        blob_client = blob_service_client.get_blob_client(container="newcontainer0805", blob="source.txt")

        # SOURCE CONTENTS
        content = blob_client.download_blob().content_as_text()

        # WRITE HEADER TO AN OUTPUT FILE
        output_file_dest = blob_service_client.get_blob_client(container="target", blob="target.csv")

        # INITIALIZE OUTPUT
        output_str = ""

        # STORE COLUMN HEADERS
        data = list()
        data.append(list(["column1", "column2", "column3", "column4"]))
        output_str += ('"' + '","'.join(data[0]) + '"\n')

        output_file_dest.upload_blob(output_str, overwrite=True)
        logging.info(' END OF FILE UPLOAD')
    except Exception as e:
        template = "An exception of type {0} occurred. Arguments:\n{1!r}"
        message = template.format(type(e).__name__, e.args)
        print(message)


if __name__ == "__main__":
    main("source.txt")
This helps you copy the blob and append data to it; if you want to save the blob in the same container, modify the destination to be the same container as the source.
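Since the question is in JavaScript, here is a minimal sketch (not the original answer's code) of the copy-based rehydration described in the linked archive-rehydrate documentation, assuming beginCopyFromURL accepts the tier and rehydratePriority options; the container and blob names are illustrative:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = process.env.AZURE_STORAGE_CONNECTION_STRING;

async function rehydrateByCopy() {
    const containerClient = BlobServiceClient
        .fromConnectionString(connectionString)
        .getContainerClient("container");

    const sourceBlobClient = containerClient.getBlockBlobClient("archivedBlobName");
    const targetBlobClient = containerClient.getBlockBlobClient("rehydratedCopyName");

    // Copy the archived blob to a new blob in the same container, requesting an
    // online tier for the destination; the source blob stays in the Archive tier.
    // Within the same storage account the plain blob URL should suffice; otherwise
    // a SAS URL (like the question's getSignedUrl helper) would be needed.
    const poller = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url, {
        tier: "Hot",                   // or "Cool"
        rehydratePriority: "Standard"  // or "High"
    });
    const result = await poller.pollUntilDone();
    console.log(`copy status: ${result.copyStatus}`);
}

rehydrateByCopy().catch(console.error);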
I am trying to connect to an Azure container and save a text file.
const azure = require('azure-storage');
const BlobServiceClient = azure.createBlobService();
const AZURE_STORAGE_CONNECTION_STRING = "my key";
const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
const containerName = "tempp";
console.log('\nCreating container...');
console.log('\t', containerName);
// Get a reference to a container
const containerClient = blobServiceClient.getContainerClient(containerName);
const blobName = 'test' + uuidv1() + '.txt';
// Get a block blob client
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
console.log('\nUploading to Azure storage as blob:\n\t', blobName);
const data = "message";
const uploadBlobResponse = blockBlobClient.upload(data, data.length);
console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
I got the azure_storage connection string from Security + networking / Access keys by following the documentation, but when running the project I am getting an error:
Error: Credentials must be provided when creating a service client.
As suggested by Gaurav Mantri, I tried to reproduce this locally and was able to upload files to my container in Azure.
Make sure you have installed the following NuGet packages:
Azure.Storage.Blobs (12.10.0, the latest SDK), Microsoft.Extensions.Configuration and Microsoft.Extensions.Configuration.Json.
Here is the open source code that I followed to upload files to my container.
For more information please refer to this MS DOC for .NET and this MS DOC for Node.js.
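On the Node.js side, note that the question's code mixes the legacy azure-storage module with the v12 API; azure.createBlobService() with no configured credentials is the likely source of "Credentials must be provided when creating a service client". A minimal sketch using only the @azure/storage-blob v12 package, assuming the connection string from the Access keys blade is valid (container and blob names are taken from the question):
const { BlobServiceClient } = require('@azure/storage-blob');
const { v1: uuidv1 } = require('uuid');

const AZURE_STORAGE_CONNECTION_STRING = "my key"; // connection string from Access keys

async function main() {
    const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);

    const containerName = "tempp";
    const containerClient = blobServiceClient.getContainerClient(containerName);
    await containerClient.createIfNotExists();

    const blobName = 'test' + uuidv1() + '.txt';
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);

    console.log('\nUploading to Azure storage as blob:\n\t', blobName);
    const data = "message";
    const uploadBlobResponse = await blockBlobClient.upload(data, data.length);
    console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
}

main().catch(console.error);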
I'm using an Azure Function with Node.js to create a zip file (abc.zip) with some files in the function app temp folder, and in the next step I need to upload the zip file to Azure Blob Storage. The problem is that the blob storage path has to be something like '/xyz/pqr/ghk'. How do I achieve this?
As @GauravMantri indicated, try this:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = ''
const container = ''

const destBlobName = 'test.zip'
// Blob storage has a flat namespace: the "folder" path is just a prefix on the blob name
// and shows up as a virtual folder in the portal.
const blob = 'xyz/pqr/ghk/' + destBlobName
const zipFilePath = "<some function temp path>/<filename>.zip"

const blobClient = BlobServiceClient.fromConnectionString(connectionString)
    .getContainerClient(container)
    .getBlockBlobClient(blob)

blobClient.uploadFile(zipFilePath)
Result:
Let me know if you have any more questions :)