Why does Firebase Service Account credential JSON not have the form of a JSON? - node.js

I have not attempted to use the Firebase Admin SDK for some time, so I apologize if this ends up being trivial, but I have spent two days on this.
I am creating a new web and mobile app using Firebase, for which I have a data model layer shared between the web and mobile client apps. I want to set up automated testing of the data models using the Firebase Node.js Admin SDK.
So, I followed the instructions here https://firebase.google.com/docs/admin/setup
However, although the service account credentials file I download from Firebase is indeed delivered as a .json file, it does not have the form of a JSON document. It is just a long alphanumeric string with an '=' at the end.
As expected, exporting the environment variable ($ export GOOGLE_APPLICATION_CREDENTIALS=...) and then calling useApplicationDefault() results in an unexpected token error.
If I attempt to reconstruct the structure I think is expected, wrapping the string from the file in a properly formatted JSON object under the key "privateKey", then I get this error:
FirebaseAppError: Invalid contents in the credentials file
If I attempt to use the code snippet provided by Firebase on the Service Account page of my project with the raw, unedited non-JSON .json file, I still get an unexpected token error, as expected. But if I use the edited .json file containing correctly formatted JSON, I get a PEM error:
FirebaseAppError: Failed to parse private key: Error: Invalid PEM formatted message.
As stated, the .json file Firebase provides to me is not JSON and only contains an alphanumeric string terminated by an '=' sign.
My edited version has the form
{
"projectId": "myprojectid-id123",
"clientEmail": "email#domain.com",
"privateKey": "abcde1234567890="
}
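For comparison, a service account key exported from the Firebase console is normally a multi-field JSON document whose private_key is a full PEM block, and the Admin SDK is typically initialized from it with admin.credential.cert(). A minimal sketch, with all values and file names as placeholders rather than anything from this post:

// serviceAccountKey.json normally looks roughly like this (placeholder values):
// {
//   "type": "service_account",
//   "project_id": "myprojectid-id123",
//   "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
//   "client_email": "firebase-adminsdk-xxxxx@myprojectid-id123.iam.gserviceaccount.com",
//   ...
// }
const admin = require("firebase-admin");
const serviceAccount = require("./serviceAccountKey.json");

// cert() accepts projectId/clientEmail/privateKey (or the snake_case fields above),
// with the private key holding the full PEM string, not a bare base64 value.
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});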

Related

BIM360 Issues editor Forge Node JS App - trying to run on localhost 3000

I am trying to work with the BIM 360 Issue Editor created by Petr and available on github https://github.com/petrbroz/bim360-issue-editor/tree/develop
I have added all the dependencies, etc., but seem to be stuck with the configuration.
I am testing on localhost and I am getting an invalid URI error. What would be the correct configuration variables in the launch.json file for
"HOST_URL": "http://localhost:3000", "SERVER_SESSION_SECRET", "CLI_CONFIG_PASSWORD"?
Also, SENDGRID_API_KEY is required and throws an error on the console; when I add the key from SendGrid in config.js, the error goes away. Is that correct?
Please suggest. Thanks
Here are more details about the env. variables:
HOST_URL is just the host/port the app is listening on (for example, http://localhost:3000)
This value is used to build the callback URL for the 3-legged OAuth workflow; for example, if the host URL is http://localhost:3000, the callback URL will be http://localhost:3000/auth/callback
Note that the same callback URL must be configured for your Forge app on https://forge.autodesk.com/myapps
SERVER_SESSION_SECRET is an arbitrary string that will be used to encrypt/decrypt browser cookies
CLI_CONFIG_PASSWORD is only needed when you want to use the command-line utility that's part of the sample code; in that case the configuration for the CLI utility will be zipped in a password-protected *.zip file using this env. variable as the password
SENDGRID_API_KEY is also optional and only needed if you want the app to send email notifications to users who triggered the Excel export
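As a rough sketch, these variables can be supplied through the env block of a launch.json configuration; the program path and every placeholder value below are assumptions rather than settings taken from the repository:

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Launch bim360-issue-editor",
      "program": "${workspaceFolder}/server.js",
      "env": {
        "HOST_URL": "http://localhost:3000",
        "SERVER_SESSION_SECRET": "any-long-random-string",
        "CLI_CONFIG_PASSWORD": "only-needed-for-the-cli-utility",
        "SENDGRID_API_KEY": "only-needed-for-email-notifications"
      }
    }
  ]
}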

Document AI unsupported input file format

Since the last update to the Document AI Node.js API, I'm not able to send JPEG files any more. I received the following message first:
Error: 3 INVALID_ARGUMENT: At this time, the only MIME types supported are 'application/pdf','application/json', 'image/gif' and 'image/tiff'.
When I changed my code to handle TIFF images, I got the following message:
"(node:15782) UnhandledPromiseRejectionWarning: Error: 3 INVALID_ARGUMENT: Unsupported input file format."
I'm sure the file is a TIFF; I store it in Cloud Storage first and the content type is described as "image/tiff".
I attached some images for clarification.
The Document AI API has been updated since this post was originally made. I recommend using the v1 REST API and Node.js Client Libraries.
The Supported Files page in the documentation also lists the supported file types with the appropriate MIME types.
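As a minimal sketch of the v1 Node.js client library processing a TIFF (the project, location, processor ID, and file name are placeholders):

const { DocumentProcessorServiceClient } = require('@google-cloud/documentai').v1;
const fs = require('fs');

async function processTiff() {
  const client = new DocumentProcessorServiceClient();
  // Full processor resource name; all segments are placeholders.
  const name = 'projects/my-project/locations/us/processors/my-processor-id';
  const [result] = await client.processDocument({
    name,
    rawDocument: {
      content: fs.readFileSync('sample.tiff').toString('base64'),
      mimeType: 'image/tiff',
    },
  });
  console.log(result.document.text);
}

processTiff().catch(console.error);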
I once had a similar problem with a PDF file.
I uploaded PDF files to Google Cloud Storage and was going to run the Document AI Node.js API on the files, but I got the same error as you:
"(node:15782) UnhandledPromiseRejectionWarning: Error: 3 INVALID_ARGUMENT: Unsupported input file format."
In my code, the mimeType was set to 'application/pdf'.
But the problem was fixed after the mimeType was set to 'PDF'.
I wonder if this helps you even a little bit.

Passing Dropbox file contents to an Azure Function from a Logic App

I am trying a sample app with the workflow
Wait for new file (csv) in dropbox folder
Load the file contents
Pass the file contents to an azure function to further process
I am getting stuck on how to pass the file contents to the Azure Function. I keep getting an UnsupportedMediaType error with "Message": "The WebHook request must contain an entity body formatted as JSON".
How do I get the output of the second stage into a function?
What I typically do in those scenarios is create a JSON body for the Function and add the message content I want to send to the function as a Base64 string as part of the JSON body (e.g. Payload, or Body).
This is similar to how Logic Apps handles certain media types at runtime.
{"OriginalFileName" : "myfile.csv", "PayLoad" : "ContentBase64String"}

generateSharedAccessSignature not adding sv parameter?

I'm trying to generate a Shared Access Signature and am using the code here (http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx) for a custom API to generate the SAS.
It seems to be missing the sv=2014-02-14 parameter when calling "generateSharedAccessSignature()".
The SAS URL doesn't seem to work when I try it (I get a 400 "xml not valid" error), but if I try a SAS generated from Azure Management Studio, the URL contains the "sv" parameter and works when I attempt to upload with it.
Any ideas?
Based on the Storage Service REST API documentation, the sv parameter in a Shared Access Signature was introduced in storage service version 2014-02-14. My guess is that Azure Mobile Services is using an older version of the storage service API, and this is the reason you don't see the sv parameter in your SAS token.
You could be getting the 400 error (invalid XML) because of this. In earlier versions of the storage service API, the XML syntax for committing a block list was different from what is used currently. I have had one more user come to my blog post complaining about the same error. Please try the following XML syntax when performing a Commit Block List operation and see if the error is gone:
<?xml version="1.0" encoding="utf-8"?>
<BlockList>
<Block>[base64-encoded-block-id]</Block>
<Block>[base64-encoded-block-id]</Block>
...
<Block>[base64-encoded-block-id]</Block>
</BlockList>
Please notice that we're not using the Latest node. Instead we're using the Block node.
Leaving the sv parameter out and setting the version as part of the PUT request header worked, using:
xhr.setRequestHeader('x-ms-version','2014-02-14');
You can check out this example of an Azure file upload script: http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/
...which will work with the generated SAS from the question's original blog link - http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx
Add the request header in the beforeSend like so:
beforeSend: function(xhr) {
xhr.setRequestHeader('x-ms-version','2014-02-14');
},
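For context, here is a hedged sketch of how that header can fit into a full jQuery PUT upload against the SAS URL; the x-ms-blob-type header and the variable names are assumptions, not part of the original answer:

$.ajax({
  url: sasUrl,       // SAS URL returned by the custom API (placeholder variable)
  type: 'PUT',
  data: fileData,    // e.g. a Blob or ArrayBuffer read from a file input (placeholder)
  processData: false,
  beforeSend: function (xhr) {
    xhr.setRequestHeader('x-ms-version', '2014-02-14');
    xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob'); // needed for a simple Put Blob upload
  },
  success: function () { console.log('Upload complete'); },
  error: function (xhr) { console.log('Upload failed: ' + xhr.status); }
});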

Setting Metadata in Google Cloud Storage (Export from BigQuery)

I am trying to update the metadata (programmatically, from Python) of several CSV/JSON files that are exported from BigQuery. The application that exports the data is the same as the one modifying the files (thus using the same server certificate). The export goes all well, that is, until I try to use the objects.patch() method to set the metadata I want. The problem is that I keep getting the following error:
apiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/storage/v1/b/<bucket>/<file>?alt=json returned "Forbidden">
Obviously, this has something to do with bucket or file permissions, but I can't manage to get around it. How come, if the same certificate is being used for writing files and for updating file metadata, I'm unable to update it? The bucket is created with the same certificate.
If that's the exact URL you're using, it's a URL problem: you're missing the /o/ between the bucket name and the object name.
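For example, the objects.patch request should target a URL of this form (with the object name URL-encoded), rather than the /b/<bucket>/<file> form shown in the error:

https://www.googleapis.com/storage/v1/b/<bucket>/o/<file>?alt=json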
