Azure Storage emulator with Node.js createQueueService error

I'm receiving the following error when I try connecting to the emulated storage queue service:
The MAC signature found in the HTTP request '...' is not the same as any computed signature.
Make sure the value of Authorization header is formed correctly including the signature.
This is the approach I'm using to connect to Azure Storage:
var azure = require('azure-storage');

var storageAccount = 'devstoreaccount1';
var accessKey = 'Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==';

var queueSvc = azure.createQueueService(storageAccount, accessKey);

queueSvc.createMessage('myqueue', "Hello world!", function(error, results, response) {
  if (!error) {
    // Message inserted
  }
});
I also tried using the following connection strings, without success:
UseDevelopmentStorage=true
and
DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;
TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;
QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
Everything works properly in production environments; the issue only affects the emulated service, and specifically the queues (emulated blobs work as expected).
Any idea?

I have reproduced your error after testing your code with SDK 2.8.1.
I got a detailed log from the console by setting queueSvc.logger.level = azure.Logger.LogLevels.DEBUG;.
The request URI generated by this call is https://devstoreaccount1.queue.core.windows.net:443/myqueue/messages, which targets an online storage account named devstoreaccount1 rather than the emulator.
To access the storage emulator instead:
var azure = require('azure-storage');
var devStoreCreds = azure.generateDevelopmentStorageCredentials();
var queueSvc = azure.createQueueService(devStoreCreds);
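The emulator-backed service can then be used exactly like in your original snippet. A minimal sketch, assuming the storage emulator is running locally and reusing the same myqueue name:

queueSvc.createQueueIfNotExists('myqueue', function(error) {
  if (error) return console.error(error);
  // The request now goes to http://127.0.0.1:10001/devstoreaccount1/myqueue/messages
  queueSvc.createMessage('myqueue', 'Hello world!', function(error, result, response) {
    if (error) return console.error(error);
    console.log('Message inserted into the emulated queue');
  });
});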

Related

Is it possible to fetch a VM by IP address, without knowing the resource group, in Node.js runtime on Azure Function?

I am running Node.js in an Azure Function and I am trying to retrieve the resource group and name of a VM while only knowing the IP that was sent. Is it possible to retrieve all public IP addresses associated with a subscription using the Node.js SDK in Azure?
In PowerShell I can run az network public-ip list, which returns JSON containing information like:
"id": "/subscriptions/444444-4444-43444d8-44444c/resourceGroups/testserver/providers/Microsoft.Network/publicIPAddresses/publicip",
"idleTimeoutInMinutes": 4,
"ipAddress": "55.55.55.55",
However, in Node, when calling a similar function networkClient.PublicIPAddresses.listAll() I receive a list of IDs but not an IP address, i.e.:
[1/25/2019 7:19:47 PM] publicIPAddressVersion: 'IPv4',
[1/25/2019 7:19:47 PM] ipConfiguration:
[1/25/2019 7:19:47 PM] { id:
[1/25/2019 7:19:47 PM] '/subscriptions/444444-4444-43444d8-44444/resourceGroups/TEST/providers/Microsoft.Network/networkInterfaces/test-vm1968/ipConfigurations/ipconfig1' },
Is it possible in Node to fetch all public IP's and use that to determine the Resource group and associated VM?
Thank you.
Yes, it's possible in Node. On Azure, all SDKs are built by wrapping the REST APIs, so if you can find the REST API for the feature you want, you can also use the corresponding SDK API for your language.
There is a REST API, PublicIPAddress(Preview) - List All, whose response is the same as the result of the command az network public-ip list. Note that the value of the required parameter api-version is 2018-11-01, the newest version of this ARM API.
Because different SDK versions use different api-version values, first upgrade the azure-arm-network package to the newest version via npm update azure-arm-network --save, then run code like the snippet below, which follows the sample code vm-sample.js in GitHub.
var util = require('util');
var path = require('path');
var msRestAzure = require('ms-rest-azure');
var NetworkManagementClient = require('azure-arm-network');
var FileTokenCache = require('../../lib/util/fileTokenCache');

var tokenCache = new FileTokenCache(path.resolve(path.join(__dirname, '../../test/tmp/tokenstore.json')));

// Environment setup
_validateEnvironmentVariables();
var clientId = process.env['CLIENT_ID'];
var domain = process.env['DOMAIN'];
var secret = process.env['APPLICATION_SECRET'];
var subscriptionId = process.env['AZURE_SUBSCRIPTION_ID'];

var credentials = new msRestAzure.ApplicationTokenCredentials(clientId, domain, secret, { 'tokenCache': tokenCache });
var networkClient = new NetworkManagementClient(credentials, subscriptionId);

function listAllPublicIP(options, callback) {
  return networkClient.publicIPAddresses.listAll(options, callback);
}
You can check via Fiddler whether the API call above hits the correct REST API with the query parameter api-version=2018-11-01.
Alternatively, you can call the REST API directly, using an Authorization header whose value is obtained from the credentials variable in the code above.
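As a sketch of the mapping the original question asks for (not part of the answer above, and assuming each item in the 2018-11-01 response exposes id, name and ipAddress as shown by az network public-ip list), you can parse the resource group out of each resource id:

listAllPublicIP(null, function (err, result) {
  if (err) return console.error(err);
  result.forEach(function (pip) {
    // id has the form /subscriptions/{sub}/resourceGroups/{rg}/providers/Microsoft.Network/publicIPAddresses/{name}
    var resourceGroup = pip.id.split('/')[4];
    console.log(resourceGroup, pip.name, pip.ipAddress);
  });
});

The ipConfiguration.id of the matching entry points at the network interface, from which you can walk back to the VM it is attached to.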

Azure blob returns 403 forbidden from Azure function running on portal

I've read several posts regarding similar queries, like this one, but I keep getting 403.
Initially I wrote the code in Visual Studio - an Azure Function accessing a storage blob - and everything runs fine. But when I deploy the very same function, it throws 403! I tried the suggested fixes, moving to x64 etc. and removing additional files, but nothing works.
Please note - I have verified several times - the access key is correct and valid.
So, I did all of the following:
(1) - I wrote a simple Azure Function on the portal itself (to rule out deployment quirks), and voila, the same 403!
var storageConnection = "DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key1];EndpointSuffix=core.windows.net";
var cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
var blobClient = cloudStorageAccount.CreateCloudBlobClient();
var sourceContainer = blobClient.GetContainerReference("landing");
CloudBlockBlob blob = sourceContainer.GetBlockBlobReference("a.xlsx");
using (var inputStream = new MemoryStream())
{
    log.Info($"Current DateTime: {DateTime.Now}");
    log.Info("Starting download of blob...");
    blob.DownloadToStream(inputStream); // <--- 403 thrown here!!
    log.Info("Download Complete!");
}
(2) - I verified the date/time by logging it, and it's UTC on the function server.
(3) - I used an account SAS key generated on the portal, but it still gives 403. I had waited for over 30 seconds after SAS key generation to ensure that the SAS key propagates.
var sasUri = "https://[storageAccount].blob.core.windows.net/?sv=2017-11-09&ss=b&srt=sco&sp=rwdlac&se=2019-07-31T13:08:46Z&st=2018-09-01T03:08:46Z&spr=https&sig=Hm6pA7bNEe8zjqVelis2y842rY%2BGZg5CV4KLn288rCg%3D";
StorageCredentials accountSAS = new StorageCredentials(sasUri);
var cloudStorageAccount = new CloudStorageAccount(accountSAS, "[storageAccount]", endpointSuffix: null, useHttps: true);
// rest of the code same as (1)
(4) - I generated the SAS key on the fly in code, but again 403.
static string GetContainerSasUri(CloudBlobContainer container)
{
    // Set the expiry time and permissions for the container.
    // In this case no start time is specified, so the shared access signature becomes valid immediately.
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5);
    sasConstraints.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(25);
    sasConstraints.Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Add | SharedAccessBlobPermissions.Create;
    // Generate the shared access signature on the container, setting the constraints directly on the signature.
    string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);
    // Return the URI string for the container, including the SAS token.
    return container.Uri + sasContainerToken + "&comp=list&restype=container";
}
and used the above as
var sourceContainer = blobClient.GetContainerReference("landing");
var sasKey = GetContainerSasUri(sourceContainer);
var container = new CloudBlobContainer(new Uri(sasKey));
CloudBlockBlob blob = container.GetBlockBlobReference("a.xlsx");
I completely fail to understand why the code works flawlessly when running from Visual Studio, accessing the storage (not the emulator) in the cloud, but fails when the same code is either deployed or run directly on the portal.
What am I missing here?
Since you have excluded many possible causes, the only way I can reproduce your problem is by configuring the Firewall on the Storage Account.
Locally the code works because you may have added your local IP to the whitelist, while this step was omitted for the Function. On the portal, go to Resource Explorer under Platform features, search for outboundIpAddresses, and add those (usually four) IPs to the Storage Account whitelist.
If you have added the Function IPs but still get a 403 error, check the locations of the Storage account and the Function app. If they live in the same region (like both in Central US), the two communicate internally without going through the outboundIpAddresses. The workaround I can offer is to create a Storage account in a different region if the firewall is necessary in your plan. Otherwise just allow all networks to the Storage account.
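For reference, adding those IPs can also be scripted with the Azure CLI; a minimal sketch with placeholder resource names, run once per IP listed under outboundIpAddresses:

az storage account network-rule add --resource-group <storage-resource-group> --account-name <storage-account-name> --ip-address <one-outbound-ip>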

Google Cloud Storage access without providing credentials?

I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly. Example in the screenshot below. Yet I was able to retrieve a file from a local server using Node.js without supplying any service account keys or authentication tokens.
I can't access the files from a browser via these URL formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from Node.js without credentials, surprisingly it could download the file to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS variable or any .pem keys, and even ran gcloud auth revoke --all on the command line just to make sure I was logged out, and I was still able to download the file. Does this mean that the files in my GCS bucket are not properly secured? Or am I somehow authenticating myself with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Your Google Cloud Platform project ID
const projectId = [projectId];

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name of the bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res) {
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err) {
    if (err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate to Google APIs, so when you don't explicitly use a specific service account via GOOGLE_APPLICATION_CREDENTIALS, the library will fall back to the default credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you could always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether you've pointed the variable at a service account key's path.
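If you want the client pinned to one specific service account instead of whatever Application Default Credentials happen to be found, a minimal sketch (project ID and key path are placeholders) is to pass keyFilename explicitly:

const Storage = require('@google-cloud/storage');

// Explicitly pin the client to a single service account key instead of relying on
// Application Default Credentials picked up from the environment.
const storage = new Storage({
  projectId: 'your-project-id',                 // placeholder
  keyFilename: '/path/to/service-account.json'  // placeholder
});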
Create a new service account in GCP for the project and download the JSON key file. Then set the environment variables like the following:
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"

Azure blob service, random 404 Not Found errors

We have a web app which processes external callbacks. To isolate our app from the external service, we use an Azure blob to store the callback data (.json) and put a message on Azure Service Bus for a processing service to pick up later.
The following code is used to write the data to blob storage:
var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("MyStorage"));
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("containername");

var dataFileName = Guid.NewGuid().ToString();
var blockBlob = container.GetBlockBlobReference(dataFileName);
blockBlob.UploadText(data);
blockBlob.Properties.ContentType = "application/json";
blockBlob.SetProperties();

var connectionString = CloudConfigurationManager.GetSetting("serviceBusCS");
var queueName = "MyQueue";
var client = QueueClient.CreateFromConnectionString(connectionString, queueName);

var payload = new MyCustomMessage {
    Id = dataFileName
};
var message = new BrokeredMessage(payload);
client.Send(message);
On the processing side, we have a web job reading the messages one at a time and retrieving the data as follows:
var storageAccount = CloudStorageAccount.Parse("MyCS");
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("containername");
var blockBlob = container.GetBlockBlobReference(message.Id);
if (blockBlob == null) {
return;
}
if (!blockBlob.Exists()) {
return; ==> FAILS HERE
}
// Process the message here...
// Once the processing is done, delete the blob
This design works well most of the time, but we get a 404 Not Found now and then (marked FAILS HERE above).
QUESTION
The only way this code can fail is by having two messages with the same file name, i.e. the same GUID, which is nearly impossible - or am I missing something?
Any idea why the blob data can't be found?
EDIT 1
Searching for the missing blob from the Azure portal shows that the blob is actually not there.
Shouldn't blockBlob.UploadText(data); throw if it fails to write data to the blob container?
EDIT 2
Thanks to Jason for indicating where to look. We crawled our logs and found that the blob is written successfully. The web job kicks off and processes the message, but exactly one minute later we see the web job kick off again and try to process the very same message; it won't find the blob, which is expected, as the first run already processed and deleted it.
If the client application receives an HTTP 404 (Not Found) message from the server, this implies that the object the client was attempting to use (such as an entity, table, blob, container, or queue) does not exist in the storage service. There are a number of possible reasons for this, such as:
• The client or another process previously deleted the object
• A Shared Access Signature (SAS) authorization issue
• Client-side JavaScript code does not have permission to access the object
• Network failure
See Storage Monitoring, Diagnosing and Troubleshooting Guide for more information.

Azure mobile service and Azure storage integration

We use Azure Mobile Services with a JavaScript (Node.js) back-end. The front-end is HTML/JavaScript and runs as an Azure Web App. We want to use Azure Blob storage to store files uploaded from the front-end. I searched for a working example implementing this scenario, but I can't find one. There are examples with a .NET back-end or Android/Windows Phone front-ends. As a workaround it's possible to post the file to the mobile service and do the storage from there, but the mobile service API body has a 1 MB limit. I know I have to use a Shared Access Signature (SAS), but I don't know how to implement that. Generating the URL from the mobile service works, but it's not accepted when I use it in the client.
This guide is not working anymore: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
Thanks in advance for your help!
Usually, custom APIs on mobile services are used for handling logic workflows or event triggers, so Azure Mobile Services limits the body size of custom API requests for better performance. To upload files from clients to Azure Storage, we recommend leveraging SAS URIs.
Many samples use the back-end project to generate the SAS URI and return it to the front-end. We can leverage the Azure Node.js SDK in mobile service custom API scripts to generate the SAS URI.
Here is the code snippet:
exports.get = function(request, response) {
  var azure = require('azure');
  var qs = require('querystring');

  var accountName = { accountName };
  var accountKey = { accountKey };
  var host = accountName + '.blob.core.windows.net';
  var blobService = azure.createBlobService(accountName, accountKey, host);

  var startDate = new Date();
  var expiryDate = new Date(startDate);
  expiryDate.setMinutes(startDate.getMinutes() + 30);
  startDate.setMinutes(startDate.getMinutes() - 30);

  var sharedAccessPolicy = {
    AccessPolicy: {
      Permissions: azure.Constants.BlobConstants.SharedAccessPermissions.WRITE,
      Start: startDate,
      Expiry: expiryDate
    }
  };

  // You can send the container name and blob name via the HTTP GET request, e.g.
  //   var containerName = request.query.container,
  //       blobName = request.query.blob;
  // On the client side, use the invokeApi method, e.g.
  //   client.invokeApi('getSAS', {
  //     method: 'GET',
  //     parameters: { container: 'mycontainer', blob: 'myblob' }
  //   });

  var blobSAS = blobService.generateSharedAccessSignature('mycontainer', 'myblob', sharedAccessPolicy);
  var sasQueryString = qs.stringify(blobSAS.queryString);
  var sasUri = blobSAS.baseUrl + blobSAS.path;

  response.send(sasUri + "?" + sasQueryString);
};
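On the client side, the returned SAS URI can then be used to PUT the file straight to Blob storage. A minimal browser-side sketch (the function and variable names are illustrative, not taken from the linked samples):

function uploadWithSas(sasUri, file, done) {
  var xhr = new XMLHttpRequest();
  xhr.open('PUT', sasUri, true);
  // Required when creating a block blob with a single Put Blob request
  xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
  xhr.onload = function () { done(null, xhr.status); };
  xhr.onerror = function () { done(new Error('Upload failed')); };
  xhr.send(file);
}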
You can refer to Upload images to Azure Storage from an Android device and Work with a JavaScript backend mobile service.
Furthermore, for a deeper understanding of generating SAS URIs, you can refer to Constructing a Service SAS and Shared Access Signatures, Part 1: Understanding the SAS Model.
Additionally, here is a similar example built on this architecture: Upload files to Microsoft Azure Storage from JavaScript.
