Google Cloud Storage - how to retrieve a specific generation of a file in Node.js

I am using @google-cloud/storage to access the GCS API. After enabling versioning, I am trying to fetch a specific generation of a file with the following code:
bucket.file(`${name}#${generationId}`).download({
  destination
}).then(() => {
  res.set({
    'Content-Type': 'image/jpeg'
  });
  return res.status(200).sendFile(destination);
})
Here name is the complete object name and generationId is the generation number of the file.
However, when I execute the above code in Google Cloud Functions, I receive the following error message:
ApiError: No such object: my-bucket/image-mxmg569kdvnby6z85mi#1522107649220516
I am sure that the file and the generation exist, as I have checked both with the JSON API explorer. My guess is that @google-cloud/storage does not support file versioning (yet); the documentation does not contain any information about it. Does anyone have experience with this?

If you set options.generation when instantiating the File object, it should work! https://github.com/googleapis/nodejs-storage/blob/4f4f1b4043c4f70ee99f051499ac62e893abdde0/src/file.js#L82
bucket.file(name, { generation: generationId })
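For reference, a minimal sketch of a download using that option, assuming a recent version of @google-cloud/storage; the bucket name, destination path and the object name/generation (taken from the error message above) are placeholders:

// Minimal sketch: download a specific generation of an object.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('my-bucket');

// Pass the generation as an option instead of appending "#<generation>" to the name.
bucket
  .file('image-mxmg569kdvnby6z85mi', { generation: 1522107649220516 })
  .download({ destination: '/tmp/image.jpg' })
  .then(() => {
    // The downloaded file now contains exactly that generation of the object.
  })
  .catch(console.error);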

Related

Chrome Extension Manifest V3 Loading Remote JSON Configuration

I am working on transitioning from Manifest V2 to Manifest V3. In Manifest V2 I was pulling in a remotely hosted JSON file and using it to enable and disable settings. It has to be dynamic; otherwise I'd include it in the extension itself. Looking at the migration guide from Google, I see this:
In Chrome extension Manifest V3, remotely hosted code is no longer allowed. The migration documentation offers two solutions.
Configuration-driven features and logic—In this approach, your extension loads a remote configuration (for example a JSON file) at runtime and caches the configuration locally. The extension then uses this cached configuration to decide which features to enable.
Externalize logic with a remote service—Consider migrating application logic from the extension to a remote web service that your extension can call. (Essentially a form of message passing.) This provides you the ability to keep code private and change the code on demand while avoiding the extra overhead of resubmitting to the Chrome Web Store.
The first option sounds like what I am looking for, but I can't find any documentation on how to achieve this approach. If anyone can point me in the right direction it would be greatly appreciated.
For reference this is how I was previously pulling it into the extension:
async function readJson() {
  const response = await fetch('https://www.example.com/config.json');
  console.log(response.ok);
  if (!response.ok) {
    const message = `An error has occurred: ${response.status}`;
    throw new Error(message);
  }
  const json = await response.json();
  return json;
}
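A minimal sketch of the configuration-driven approach quoted above, assuming a Manifest V3 service worker with the "storage" permission; the config URL and the storage key name are placeholders:

// Fetch the remote JSON, cache it in chrome.storage.local, and fall back to
// the cached copy if the fetch fails.
const CONFIG_URL = 'https://www.example.com/config.json';

async function loadConfig() {
  try {
    const response = await fetch(CONFIG_URL);
    if (!response.ok) {
      throw new Error(`An error has occurred: ${response.status}`);
    }
    const config = await response.json();
    // Cache the latest configuration (promise-based API in MV3).
    await chrome.storage.local.set({ remoteConfig: config });
    return config;
  } catch (err) {
    // Fall back to the last cached configuration, if any.
    const { remoteConfig } = await chrome.storage.local.get('remoteConfig');
    return remoteConfig; // may be undefined if nothing was cached yet
  }
}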

Trying to use HttpClient.GetStreamAsync straight to the ADLS FileClient.UploadAsync

I have an Azure Function that will call an external API via HttpClient. The external API returns a JSON response. I want to save the response directly to an ADLS File.
My simplistic code is:
public async Task UploadFileBulk(Stream contentToUpload)
{
    await this._theClient.FileClient.UploadAsync(contentToUpload);
}
this._theClient is a simple wrapper class around the various Azure Data Lake classes such as DataLakeServiceClient, DataLakeFileSystemClient, DataLakeDirectoryClient and DataLakeFileClient.
I'm happy that this wrapper works as I expect: I spin one up, set the service, file system, directory and then a file name to create. I've used this wrapper class to create directories etc., so it behaves as expected.
I am calling the above method as follows:
await dlw.UploadFileBulk(await this._httpClient.GetStreamAsync("<endpoint>"));
I see the file getting created in the Lake directory with the name I want. However, if I then download the file using Storage Explorer and try to open it in, say, VS Code, it's not in a recognisable format (I can "force" VS Code to open it, but it looks like binary content to me).
If I sniff the traffic with Fiddler I can see that the content from the external API is JSON: the content type is application/json and the body shows in Fiddler as JSON.
If I look at the calls to the ADLS endpoint I can see a PUT call followed by two PATCH calls.
The first PATCH call looks like it is the one sending the content; it has a content type header of application/octet-stream and the request body is the "binary looking content".
I am using HttpClient.GetStreamAsync as I don't want my Function to have to load the entire API payload into memory (some of the external API endpoints return very large files, over 100 MB). I am thinking I can "stream the response from the external API straight into ADLS".
Is there a way to change how the ADLS FileClient.UploadAsync(Stream stream) method works so I can tell it to upload the file as a JSON file with a content type of application/json?
EDIT:
It turns out the external API was sending back zipped content, so once I added the following AutomaticDecompression code to my Function's startup, the files were uploaded to ADLS as expected.
public override void Configure(IFunctionsHostBuilder builder)
{
    builder.Services.AddHttpClient("default", client =>
    {
        client.DefaultRequestHeaders.Add("Accept-Encoding", "gzip, deflate");
    }).ConfigurePrimaryHttpMessageHandler(() => new HttpClientHandler
    {
        AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
    });
}
@Gaurav Mantri has given me some pointers on whether the pattern of "streaming from an output to an input" is actually correct; I will research this further.
Regarding the issue, please refer to the following code:
var uploadOptions = new DataLakeFileUploadOptions();
uploadOptions.HttpHeaders = new PathHttpHeaders();
uploadOptions.HttpHeaders.ContentType = "application/json";
await fileClient.UploadAsync(stream, uploadOptions);

How to store files in Firebase using Node.js

I have a small assignment where I will have a URL to a document or a file, like a Google Drive link or a Dropbox link.
I have to use this link to store that file or doc in Firebase using Node.js. How should I start?
A little heads-up might help. What should I use? Please help, I'm stuck here.
The documentation for using the admin SDK is mostly covered in GCP documentation.
Here's a snippet of code that shows how you could upload an image directly to Cloud Storage if you have a URL for it. Any public link works, whether it's shared from Dropbox or somewhere else on the internet.
Edit 2020-06-01: The option to upload directly from a URL was dropped in v2.0 of the SDK (4 September 2018): https://github.com/googleapis/nodejs-storage/releases/tag/v2.0.0
const fileUrl = 'https://www.dropbox.com/some/file/download/link.jpg';
const opts = {
  destination: 'path/to/file.jpg',
  metadata: {
    contentType: 'image/jpeg'
  }
};
firebase.storage().bucket().upload(fileUrl, opts);
This example uses the default bucket in your application, and the opts object provides file upload options for the API call.
destination is the path that your file will be uploaded to in Google Cloud Storage.
metadata should describe the file that you're uploading (see more examples here).
contentType is the MIME type of the file you are uploading.
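Since the upload-from-URL option no longer exists in newer SDK versions, a minimal sketch of a workaround is to download the file yourself and stream it into the bucket. This assumes the Admin SDK is initialized with default credentials and a default storage bucket; the URL, destination path and content type are placeholders:

const admin = require('firebase-admin');
const https = require('https');

admin.initializeApp(); // assumes default credentials and storageBucket are configured

// Stream a publicly accessible URL straight into Cloud Storage.
function uploadFromUrl(fileUrl, destination, contentType) {
  return new Promise((resolve, reject) => {
    const file = admin.storage().bucket().file(destination);
    https.get(fileUrl, (response) => {
      response
        .pipe(file.createWriteStream({ metadata: { contentType } }))
        .on('finish', resolve)
        .on('error', reject);
    }).on('error', reject);
  });
}

uploadFromUrl('https://www.dropbox.com/some/file/download/link.jpg',
              'path/to/file.jpg', 'image/jpeg');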

Trying to insert data into BigQuery fails from container engine pod

I have a simple node.js application that tries to insert some data into BigQuery. It uses the provided gcloud node.js library.
The BigQuery client is created like this, according to the documentation:
google.auth.getApplicationDefault(function(err, authClient) {
  if (err) {
    return cb(err);
  }
  let bq = BigQuery({
    auth: authClient,
    projectId: "my-project"
  });
  let dataset = bq.dataset("my-dataset");
  let table = dataset.table("my-table");
});
With that I try to insert data into BigQuery:
table.insert(someRows).then(...)
This fails, because the BigQuery client returns a 403 telling me that the authentication is missing the required scopes. The documentation tells me to use the following snippet:
if (authClient.createScopedRequired &&
    authClient.createScopedRequired()) {
  authClient = authClient.createScoped([
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/bigquery.insertdata",
    "https://www.googleapis.com/auth/cloud-platform"
  ]);
}
This didn't work either, because the if statement never executes. I skipped the if and set the scopes every time, but the error remains.
What am I missing here? Why are the scopes always wrong regardless of the authClient configuration? Has anybody found a way to get this or a similar gcloud client library (like Datastore) working with the described authentication scheme on a Container Engine pod?
The only working solution I found so far is to create a JSON key file and provide that to the BigQuery client, but I'd rather create the credentials on the fly than have them sitting next to the code.
Side note: The node service works flawlessly without providing the auth option to BigQuery when running on a Compute Engine VM, because there the authentication is negotiated automatically by Google.
Baking JSON key files into the images (containers) is a bad idea (security-wise, as you said).
You should be able to add these kinds of scopes to the Kubernetes cluster during its creation (they cannot be adjusted afterwards).
Take a look at this doc for "--scopes".
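For illustration only (the cluster name is a placeholder and the exact flags should be verified against the linked documentation), cluster creation with the additional scopes from the question might look roughly like this:

gcloud container clusters create my-cluster \
  --scopes=https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/bigquery.insertdata,https://www.googleapis.com/auth/cloud-platform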

Content Type not being set on Azure File Store in Node JS

I'm testing the functionality of uploading a file to Azure File Storage with this github sample: Node Getting Started
I modified line 111 to include an option for the contentSettings:
var options = { contentSettings: { contentType: 'application/pdf' } };
fileService.createFileFromLocalFile(shareName, directoryName, fileName, imageToUpload, options, function (error) {
  if (error) {
    callback(error);
  } else {
... and whether I upload a PDF with a contentType of 'application/pdf' or an image with 'image/png', the file content type is not set once it's posted to Azure Storage.
When I copy the URL to the file in my website, the error comes back saying the content type is incorrect.
What am I doing wrong? How do I set the content types of the uploaded files to make them work in my website?
What version of the azure-storage package are you using? I tried the code you pasted and the content type is set successfully in Azure Storage (latest version).
After uploading successfully, try calling getFileProperties and you can see the properties stored on the Azure Storage server side.
And I'm not very clear about the scenario of "copying the URL to the file in my website" and the error.
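A minimal sketch of that check, assuming the same fileService, shareName, directoryName and fileName as in the question:

// Read back the stored properties to confirm the content type was applied.
fileService.getFileProperties(shareName, directoryName, fileName, function (error, result) {
  if (error) {
    console.error(error);
  } else {
    // Expect 'application/pdf' (or 'image/png') here if the upload options took effect.
    console.log(result.contentSettings && result.contentSettings.contentType);
  }
});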
