I am working on creating my first PWA. I created a Chrome Extension in the past and used the chrome.storage.sync API to store data in the cloud (to my knowledge, it's stored using Google Drive):
new Promise((resolve, reject) => {
  chrome.storage.sync.set({ [key]: val }, function () {
    if (chrome.runtime.lastError) {
      reject(chrome.runtime.lastError);
    } else {
      resolve();
    }
  });
});
Is there a way to use this API when creating PWAs? The API is documented here:
https://developer.chrome.com/extensions/storage
It might be possible to use the Google Drive REST API to store key/value data, but I can't figure out how yet:
https://developers.google.com/drive/api/v3/appdata
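For what it's worth, storing key/value pairs in the hidden appDataFolder could look roughly like the sketch below, using the googleapis package. This is only a hedged sketch: it assumes an OAuth2 client already authorized with the drive.appdata scope, and saveKeyValue is a made-up helper name.

const { google } = require('googleapis');

// Hypothetical helper: persists one key/value pair as a small JSON file
// in the app-private appDataFolder of the user's Drive.
async function saveKeyValue(auth, key, val) {
  const drive = google.drive({ version: 'v3', auth });
  await drive.files.create({
    requestBody: {
      name: key + '.json',
      parents: ['appDataFolder'], // hidden, app-scoped storage area
    },
    media: {
      mimeType: 'application/json',
      body: JSON.stringify({ [key]: val }),
    },
    fields: 'id',
  });
}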
Related
Trying out the Transloadit API: the template works when I use the testing mode on the Transloadit website, but when I try to use it in Node.js with the SDK I'm getting an error:
INVALID_FORM_DATA - https://api2.transloadit.com/assemblies - INVALID_FORM_DATA: The form contained bad data, which cannot be parsed.
The relevant code (_asset.content is a Buffer object):
import { Readable } from 'stream';

async function getThumbnailUrl(_assetkey: string, _asset: I.FormFile): Promise<string> {
  const tOptions = {
    waitForCompletion: true,
    params: {
      template_id: process.env.THUMB_TRANSLOADIT_TEMPLATE,
    },
  };
  // Wrap the Buffer in a Readable stream so the SDK can upload it
  const stream = new Readable({
    read() {
      this.push(_asset.content);
      this.push(null); // signal end of stream
    },
  });
  console.log(_asset.content);
  util.transloadit.addStream(_assetkey, stream);
  return new Promise((resolve, reject) => {
    util.transloadit.createAssembly(tOptions, (err, status) => {
      if (err) {
        return reject(err); // return so we don't also resolve after an error
      }
      console.log(status);
      resolve(status);
    });
  });
}
I noticed that you also posted this question on the Transloadit forums, so in case anyone else runs into this problem, you can find more information on this topic here.
Here's a workaround that the OP found that may be useful:
Just to provide some closure to this topic, I just tested my
workaround (upload to S3, then use the S3 import robot to grab the file)
and got it to work with the Node.js SDK, so I should be good using that.
I have a suspicion the error I was getting was not to do with the
Transloadit API, but rather with the form-data library for Node.js
(https://github.com/form-data/form-data), and that's somehow not
inputting the form data in the way that the Transloadit API is
expecting.
But as there aren't alternatives to that library that I could find, I
wasn't really able to test that hypothesis.
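Sketched out, that workaround might look something like the following. This is a hedged sketch, not a confirmed solution: it assumes an aws-sdk v2 S3 client, a placeholder staging bucket name, and a Transloadit template whose first step is an S3 import robot that reads the object path from a 'path' field.

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Hypothetical variant of getThumbnailUrl: stage the Buffer in S3 first,
// then let the template's S3 import step pull the file in.
async function getThumbnailViaS3(key, buffer) {
  await s3.putObject({ Bucket: 'my-staging-bucket', Key: key, Body: buffer }).promise();
  return new Promise((resolve, reject) => {
    util.transloadit.createAssembly({
      waitForCompletion: true,
      params: {
        template_id: process.env.THUMB_TRANSLOADIT_TEMPLATE,
        fields: { path: key }, // assumed to be consumed by the template's S3 import step
      },
    }, (err, status) => (err ? reject(err) : resolve(status)));
  });
}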
The Transloadit core team also gave this response regarding the issue:
It may try to set his streams to be Tus streams, which would mean that
they're not uploaded as multipart/form-data.
In either case, it seems like the error in his callback would be
originating from the error out of _remoteJson.
These could be the problem areas:
https://github.com/transloadit/node-sdk/blob/master/src/TransloaditClient.js#L146
https://github.com/transloadit/node-sdk/blob/master/src/TransloaditClient.js#L606
https://github.com/transloadit/node-sdk/blob/master/src/TransloaditClient.js#L642
It is also possible that the form-data library could be the source of
the error.
To really test this further, we're going to need to try using the
library he was using, make sure the output of it is good, and then
debug the node-sdk to see where the logic failure is in it, or if the
logic failure is on the API side.
Here is my scenario:
I have placed a config file (.xml) into an Azure Blob Storage container.
I want to edit that XML file and update/add content to it.
I want to deploy an API to an Azure App Service that will do that.
I built an API that runs locally and handles this, but that isn't exactly going to cut it as a cloud application. This particular iteration is a Node.js API that uses the Cheerio and File System modules to manipulate and read the file, respectively.
How can I retool this to work with a file that lives in Azure Blob Storage?
Note: Are Azure blobs even the best place to start with the file? Is there a better place to put it?
I found this, but it isn't exactly what I am after: Azure Edit blob
Considering the data stored in the blob is XML (in other words, string data), instead of using the getBlobToStream method you can use the getBlobToText method, manipulate the string, and then upload the updated string using createBlockBlobFromText.
Here's the pseudo code:
blobService.getBlobToText('mycontainer', 'taskblob', function(error, result, response) {
  if (error) {
    console.log('Error in reading blob');
    console.error(error);
  } else {
    var blobText = result; // result is the blob's content as a string
    var xmlContent = someMethodToConvertStringToXml(blobText); // convert string to XML if it is easier to manipulate
    var updatedBlobText = someMethodToEditXmlContentAndReturnString(xmlContent);
    // Re-upload the blob
    blobService.createBlockBlobFromText('mycontainer', 'taskblob', updatedBlobText, function(error, result, response) {
      if (error) {
        console.log('Error in updating blob');
        console.error(error);
      } else {
        console.log('Blob updated successfully');
      }
    });
  }
});
Simply refactor your code to use the Azure Storage SDK for Node.js: https://github.com/Azure/azure-storage-node
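Putting the two pieces together with the Cheerio module the OP already uses, the round trip could look roughly like this. A minimal sketch, assuming the azure-storage and cheerio packages are installed, AZURE_STORAGE_CONNECTION_STRING is set, and the container/blob names and the <settings> edit are placeholders:

var azure = require('azure-storage');
var cheerio = require('cheerio');

// Reads the connection string from AZURE_STORAGE_CONNECTION_STRING
var blobService = azure.createBlobService();

blobService.getBlobToText('mycontainer', 'config.xml', function (err, xmlText) {
  if (err) return console.error(err);
  // Parse the XML in memory; xmlMode keeps Cheerio from treating it as HTML
  var $ = cheerio.load(xmlText, { xmlMode: true });
  $('settings').append('<setting name="newKey">newValue</setting>'); // example edit
  blobService.createBlockBlobFromText('mycontainer', 'config.xml', $.xml(), function (err2) {
    if (err2) return console.error(err2);
    console.log('config.xml updated');
  });
});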
I have created an API endpoint with Firebase Functions using Node.js. This API endpoint collects JSON data from the client browser, and I am saving that JSON data to the Firebase Firestore database using Firebase Functions.
While this works fine, when I look at the Firestore usage tab it shows a really high number of read operations, even though I have not created any read function yet.
My API is in production and the current usage data is: Reads 9.7K, Writes 1K, Deletes 0.
I have already checked the Firebase Firestore documentation and pricing pages but couldn't find anything about this issue.
I am using Firestore's add function to create documents with an auto-generated document ID. ValidateSubscriberData() is a simple function that validates the client's req.body input, which is JSON data.
app.post('/subscribe', (req, res) => {
  let subscriber = {};
  ValidateSubscriberData(req.body)
    .then(data => {
      subscriber = data;
      //console.log(data);
      subscriber.time = Date.now();
      return subscriber;
    })
    .then(subscriber => {
      //console.log(subscriber);
      // noinspection JSCheckFunctionSignatures
      return db.collection(subscriber.host).add(subscriber);
    })
    .then(document => {
      console.log(document.id);
      res.json({id: document.id, iid: subscriber.iid});
      return 0;
    })
    .catch(error => {
      console.log({SelfError: error});
      res.json(error);
    });
});
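For reference, ValidateSubscriberData could be as simple as the following hypothetical sketch; the real implementation is not shown here, and the field names are assumptions:

function ValidateSubscriberData(body) {
  return new Promise((resolve, reject) => {
    // Hypothetical checks; the real validator may differ
    if (body && typeof body.host === 'string' && typeof body.iid === 'string') {
      resolve({ host: body.host, iid: body.iid, email: body.email });
    } else {
      reject(new Error('Invalid subscriber data'));
    }
  });
}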
I don't know whether this is an issue with Firestore or whether I am doing something that triggers read operations internally, but I want to find a way to optimize my code.
English is not my first language and I am trying my best to explain my issue.
I think Firestore is working perfectly fine, and my code too. I assume Firebase is counting the reads I made through the Firebase console.
To verify this, I clicked on the Data tab on the Firestore page and scrolled down to make all document names/IDs visible. After that, I saw 1K reads added to my old stats. So it's confirmed that Firestore counts all reads, even those made from the Firebase console. It's obvious in hindsight, but I hadn't thought about it before.
This may not seem relevant, but maybe people like me will find it helpful before posting a similar question on this helpful platform.
I want to create a Google Sheets spreadsheet from within my Alexa skill, which is written in Node.js. I have enabled the Google API, set the required scope in the Amazon dev portal, and I can actually log into the Google account (so the first few lines of the posted code seem to work), and I do not get any error messages. But the sheet is never created.
Now the main question is whether anyone can see the problem in my code.
But I also have an additional question I would be very interested in: since I use account linking, I cannot try this code in the Alexa test simulator, but have to upload it to Alexa before running it, where I cannot get any debug messages. How does one best debug in that situation?
if (this.event !== undefined) {
  if (this.event.session.user.accessToken === undefined) {
    this.emit(':tellWithLinkAccountCard', 'to start using this skill, please use the companion app to authenticate on Google');
    return;
  }
} else {
  this.emit(':tellWithLinkAccountCard', 'to start using this skill, please use the companion app to authenticate on Google');
  return;
}
var oauth2Client = new google.auth.OAuth2('***.apps.googleusercontent.com', '***', '***');
oauth2Client.setCredentials({
  access_token: this.event.session.user.accessToken,
  refresh_token: this.event.session.user.refreshToken
});
var services = google.sheets('v4');
services.spreadsheets.create({
  resource: { properties: { title: "MySheet" } },
  auth: oauth2Client
}, function (err, response) {
  if (err) {
    console.log('Error: unable to create file, ' + err);
    return;
  } else {
    console.dir(response);
  }
});
Edit: I tried just the lower part manually and could create a spreadsheet. So the problem does indeed seem to be retrieving the access token with "this.event.session.user.accessToken".
I find it is much easier to debug issues like this using unit tests, which allow rerunning code locally. I use npm and Mocha, and that makes it easier to debug both custom and smart home skills. There is quite a bit of information available online about how to use npm and Mocha to test Node.js code, so I won't repeat that here; for example, refer to the Big Nerd Ranch article. It makes it a bit more complex to set up your project initially, but you'll be glad you did every time you hit a bug.
In this example, I would divide the code in half:
The first half would handle the request coming from Alexa and extract the token.
The second half would use the token to create the Google doc. I would also pass the name of the doc to create.
I would test the 2nd part first, passing in a valid token (for testing only) and a test doc name. When that is working, at least you'd know that the doc creation code was working, and any issues would have to be with the token or how you're getting it.
Once that was working, I would then create a test for the first part.
I would use a hardcoded JSON object to pass in as the 'event', with event.session.user.accessToken set to the working test token used in the first test:
'use strict';
var token = '<valid token obtained from google account>';
let testEvent = {
  'session': {
    'user': {
      'accessToken': token
    }
  }
};
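To make that concrete, a minimal Mocha sketch might look like the following. Everything here is an assumption for illustration: createSheet is a hypothetical module that wraps the doc-creation half described above, and TEST_GOOGLE_TOKEN is a valid token supplied for testing only.

'use strict';
const assert = require('assert');
const { createSheet } = require('../lib/sheets'); // hypothetical wrapper around the sheet-creation half

describe('createSheet', function () {
  this.timeout(10000); // allow time for the live Google API call

  it('creates a spreadsheet with the given title', async function () {
    const token = process.env.TEST_GOOGLE_TOKEN; // valid token, for testing only
    const response = await createSheet(token, 'MySheet');
    assert.ok(response); // at minimum, the call should succeed and return a response
  });
});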
I want to be able to dynamically create buckets in Google Cloud Storage. I am using the MEAN.io stack, so I want to do this in Node.js on my backend.
I'm having trouble finding APIs to help me; is there a way to do this?
Thanks!
UPDATE: Here I've tried using the Google APIs Node.js client, but nothing is happening. I then tried to authenticate it in case that was the problem, but I'm not really sure where to go from here.
googleAuth.authenticate(
  credentials.jwt,
  function (err, token) {
    if (err) console.log(err);
    console.log(token);
    googleapis
      .discover('storage', 'v1')
      .execute(function (err, client) {
        if (err) console.log(err);
        client.storage.buckets.insert(
          { 'project': credentials.project },
          { 'name': bucketname }
        ).execute();
      });
  }
);
The Google Cloud Storage JSON API is probably the API you want to use. You can code against it directly, or you could use a library like the Google APIs Node.js Client.
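For what it's worth, the standalone @google-cloud/storage client also wraps this API and is simpler than the discovery-based client shown above. A minimal sketch, assuming Application Default Credentials are configured and the project/bucket names are placeholders:

const { Storage } = require('@google-cloud/storage');

// Assumes Application Default Credentials; 'my-project' and 'my-new-bucket'
// are placeholders.
const storage = new Storage({ projectId: 'my-project' });

storage.createBucket('my-new-bucket')
  .then(([bucket]) => console.log('Created bucket ' + bucket.name))
  .catch(err => console.error(err));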