How to get Firebase Function request size? - node.js

What is the best way to read size of a request object sent to Firebase Callable Function?
I would like to take action based on the size of the request sent from a client, but the only options I see are:
calculating size of the object in a custom function as suggested here.
using the object-sizeof npm package.
Is there any built-in Firebase solution for this?
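For reference, a minimal sketch of the first option (a custom calculation inside the callable function). The handler name and the 100 KB threshold are illustrative, and the Content-Length header reflects the whole callable envelope rather than just the data field.

// Sketch: approximate the payload size inside a callable function (firebase-functions v1 API)
const functions = require('firebase-functions');

exports.processData = functions.https.onCall((data, context) => {
  // Option A: size of the deserialized payload, re-serialized to JSON
  const approxBytes = Buffer.byteLength(JSON.stringify(data), 'utf8');
  // Option B: Content-Length of the underlying HTTP request (includes the callable wrapper)
  const headerBytes = Number(context.rawRequest.get('content-length') || 0);
  if (approxBytes > 100 * 1024) { // illustrative threshold
    throw new functions.https.HttpsError('invalid-argument', 'Payload too large');
  }
  return { approxBytes, headerBytes };
});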

Related

How to convert Base64 to String in logic app inline code (Javascript)

Summary: Logic App inline code (which uses Node.js) is missing the Buffer class.
Detailed: I am trying to trigger a logic app when some content is pushed to SFTP. I want to add some metadata and save the details in Cosmos DB.
The issue is that the name of the file is received as a base64-encoded string in the inline code, and Buffer is not available to parse it.
I even tried to create a Set Variable step (and decode the filename there), but I am unable to pass this variable to the inline code step (not supported).
The final option would be to use cloud functions instead of inline code, which I am trying to avoid.
Looking for a workaround for conversion.
Logic App error image
link to ms doc
Doesn't support require() statements
Doesn't work with variables
Inline code can only perform the simplest JavaScript operations, so we may not be able to use Buffer.
As for passing the base64-encoded string, you can put it in a Compose action first and then pass it into the inline code.
I suggest you first try the base64-related expressions built into Azure Logic Apps (such as base64ToString()).
If that does not meet your needs, you can create an Azure Function and call it from the logic app.
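If the decoding really has to happen inside the inline code step, a plain-JavaScript decoder that needs neither require() nor Buffer can serve as a fallback. This is only a sketch: it assumes valid standard base64 and ASCII content (fine for most filenames); the sample string is illustrative.

// Minimal base64 decoder without Buffer or require(); ASCII content only
const B64 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function base64Decode(input) {
  let bits = 0, acc = 0, out = '';
  for (const ch of input.replace(/=+$/, '')) {
    acc = (acc << 6) | B64.indexOf(ch); // accumulate 6 bits per character
    bits += 6;
    if (bits >= 8) {
      bits -= 8;
      out += String.fromCharCode((acc >> bits) & 0xff); // emit one byte
    }
  }
  return out;
}

// base64Decode('bXlmaWxlLnR4dA==') === 'myfile.txt'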

Is it alright to store user authentication token as a global variable (process.env) in a nodejs lambda function?

We have a BFF built with AWS Lambda (Node.js) and API Gateway that interfaces with an API requiring user authentication. The way we've built it, we have a separate module/file for the API services. Something like this:
src
  handlers
    users.js     // with function getMe()
  apiServices
    usersApi.js  // with function getUser(id)
So what happens is that getMe() receives the event whose request headers contain the authentication token, but we need to use that auth token in getUser(id). I've thought of two options to do this:
update getUser(id) to accept an authToken param.
store the auth token in the global variable
I prefer #2 because it requires fewer changes, but I'm worried this might not be a good idea because there's no way of knowing for sure when a Lambda container will be reused (or if it will be reused at all): https://aws.amazon.com/blogs/compute/container-reuse-in-lambda
Has someone tried the 2nd approach before? Or should I just go with #1? The thing with #1 is that we have a lot of files under apiServices with a lot of functions, so I would like to apply as little change as possible.
You can do it either way, but be careful and double-check that context does not leak between users: a Lambda container persists for a short period and can serve multiple invocations, so anything you stash in process.env or module scope may still be there on the next request, possibly for a different user.
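For illustration, a sketch of option #1, which sidesteps container reuse entirely by threading the token through explicitly; the header name, path parameter, and response shape below are assumptions, not taken from the question's code.

// handlers/users.js -- pass the caller's token down instead of storing it globally
const usersApi = require('../apiServices/usersApi');

exports.getMe = async (event) => {
  // header casing can vary between API Gateway payload formats
  const authToken = event.headers.Authorization || event.headers.authorization;
  const userId = event.pathParameters.id; // illustrative; use whatever identifies the user
  // usersApi.getUser would change its signature to getUser(id, authToken)
  const user = await usersApi.getUser(userId, authToken);
  return { statusCode: 200, body: JSON.stringify(user) };
};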

apollo graphql query an uploaded file

Apollo Server 2.0 has the ability to receive file uploads as described in this blog post.
However, all the tutorials and blog posts I found only showed how to upload a file. Nobody demonstrated how to actually retrieve the file back to display it onscreen.
Does anybody know how to properly query the file contents for display onscreen?
Also, there's the possibility that maybe there is no way of querying a file and you have to build a separate rest endpoint to retrieve the contents?
Some thoughts:
I imagine the query to be something like
query {
  fetchImage(id: "someid")
}
with the respective server-side definition
type Query {
  fetchImage(id: ID!): Upload # maybe also a custom type, but how do I include the actual file contents?
}
Hint: Upload is a scalar type that apollo-server automatically adds to your type definitions. It is used for the upload, so I imagine it also being usable for the download/query. Please read the blog post mentioned above for more information.
The response from a GraphQL service is always serialized as a JSON object. Technically, a format other than JSON could be used in serialization but in practice only JSON is used because it meets the serialization requirements in the spec. So, the only way to send a file through GraphQL would be to convert the file into some format that's JSON-compatible. For example, you could convert a Buffer to a byte array and send that as an array of integers. You would also have to send the appropriate mime type. It would be up to the client to convert the byte array back into a usable format on receiving the response.
If you go this route, you'd have to use your own scalar or object type -- the Upload scalar does not support serialization so it will throw if you try to use it as an output type (and it's not really suitable for this sort of thing anyway).
However, while doing this is technically possible, it's also inadvisable. Serializing a larger file could cause you to run out of memory since there's no way to stream data through GraphQL (the entire response has to be in memory before it can be sent). It's much better to serve the file statically (ideally using nginx instead of Node). If your API needs to refer to the file, it can then just return the file's path.
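For illustration, a rough sketch of that approach: the resolver returns metadata plus a path to the statically served file rather than the bytes. The Image type, the images map, and the field names are illustrative, not part of any Apollo API.

// Sketch: return a reference to the statically served file instead of its contents
const { gql } = require('apollo-server-express');

const images = new Map(); // filled when files are uploaded, e.g. { mimeType, filename }

const typeDefs = gql`
  type Image {
    id: ID!
    mimeType: String!
    url: String!
  }
  type Query {
    fetchImage(id: ID!): Image
  }
`;

const resolvers = {
  Query: {
    fetchImage: (_, { id }) => {
      const image = images.get(id);
      if (!image) return null;
      return { id, mimeType: image.mimeType, url: `/uploads/${image.filename}` };
    },
  },
};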
You can do this by using Express together with Apollo Server:
apollo-server-express
Install the above package and instantiate an Express app with Apollo Server as explained in the package docs.
Then set up the static folder using Express like this:
app.use("/uploads", express.static("uploads")); // serve static files over HTTP
uploads is my static folder and /uploads will serve GET requests to that path.
//Now I can access static files like this
http://localhost:4000/uploads/test.jpg
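Putting it together, a minimal wiring sketch with the Apollo Server 2 / apollo-server-express API; typeDefs and resolvers are assumed to be defined elsewhere (for example as in the sketch above).

// Minimal sketch: Apollo Server 2 mounted on Express plus a static uploads folder
const express = require('express');
const { ApolloServer } = require('apollo-server-express');

const server = new ApolloServer({ typeDefs, resolvers }); // your schema, defined elsewhere

const app = express();
server.applyMiddleware({ app }); // GraphQL served at /graphql by default
app.use('/uploads', express.static('uploads')); // e.g. http://localhost:4000/uploads/test.jpg

app.listen({ port: 4000 }, () =>
  console.log(`Server ready at http://localhost:4000${server.graphqlPath}`)
);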

Independent NPM library that validates request based on swagger file

We are building APIs using Swagger, AWS API gateway and Lambda functions with NodeJS. The API gateway will do the request validation, however as per the design, the lambda functions need to re-validate the request object as an API Gateway Proxy Request Event. This makes sense as in theory we can reuse the lambda functions by invoking them via other event source (e.g. SNS).
Therefore we need a NodeJS tool which can validate the request (not only the body but also params, etc.) based on the swagger spec - exactly what swagger-tools and a few other tools (e.g. swagger-request-validator) are doing, but not as a middleware.
I did some searching but could not find one; I also looked into the swagger-tools source code and reckon its validation component was written in a way that cannot easily be used separately.
Any suggestion is welcome. Thanks in advance.
You can use swagger-model-validator.
var Validator = require('swagger-model-validator');
var swaggerFile = require("./swagger.json");
const validator = new Validator(swaggerFile);

console.log(validator.validate({
  name: 'meg'
}, swaggerFile.definitions.Pet, swaggerFile.definitions, true).GetErrorMessages())
This outputs:
[ 'photoUrls is a required field' ]
validator.validate returns an object, so you can also check if the returned object contains anything under the errors attribute. It should be as simple as
if (validator.validate({
  name: 'meg'
}, swaggerFile.definitions.Pet, swaggerFile.definitions, true).errors) {
  // do something with error
}
I have used Swagger's sample JSON for this answer.
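Since the question asks for something usable outside a middleware, here is a sketch of wrapping the same call in a plain function invoked from an API Gateway proxy handler; the Pet definition name, event shape, and error response format are illustrative.

// Sketch: reuse the swagger definitions to validate a proxy event body, no middleware involved
const Validator = require('swagger-model-validator');
const swaggerFile = require('./swagger.json');
const validator = new Validator(swaggerFile);

function validateBody(body, definitionName) {
  const result = validator.validate(
    body, swaggerFile.definitions[definitionName], swaggerFile.definitions, true);
  return result.GetErrorMessages() || [];
}

exports.handler = async (event) => {
  const errors = validateBody(JSON.parse(event.body || '{}'), 'Pet');
  if (errors.length > 0) {
    return { statusCode: 400, body: JSON.stringify({ errors }) };
  }
  // ...proceed with the actual work
  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};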

Foxx/ArangoDB: Can you create a response that adhere to JSON API specification?

I am currently writing some microservices with Foxx to be consumed by Ember.js. Ember Data plays very nicely with JSON API (http://jsonapi.org) responses, so I tried to serialize the Foxx responses with json-api-serializer (https://www.npmjs.com/package/json-api-serializer) - but with no luck. I only found the forClient method, but that only allows me to operate on the JSON representation of single objects, not the whole response. So my question: is it possible to implement JSON API with Foxx/ArangoDB?
You can return arbitrary responses from Foxx routes, so it's entirely possible to generate JSON responses that conform to JSON API.
However there's no built-in way to do this automatically.
I don't see anything in json-api-serializer that shouldn't work in Foxx, so I'm not sure what problems you are encountering. You should be able to simply return the output object with res.json(outputFromSerializer) and set the content type with res.set('content-type', 'application/vnd.api+json').
If everything else fails you can just write your own helper functions to generate the boilerplate and metadata JSON API expects.
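For illustration, a sketch of a Foxx route that does this with json-api-serializer's register/serialize API, assuming the package is bundled in the service's node_modules; the 'articles' collection and 'article' resource type are illustrative.

// Sketch: a Foxx route returning a JSON API document built by json-api-serializer
'use strict';
const createRouter = require('@arangodb/foxx/router');
const { db } = require('@arangodb');
const JSONAPISerializer = require('json-api-serializer');

const serializer = new JSONAPISerializer();
serializer.register('article', { id: '_key' }); // map ArangoDB's _key to the JSON API id

const router = createRouter();
module.context.use(router);

router.get('/articles', (req, res) => {
  const docs = db._collection('articles').all().toArray();
  res.set('content-type', 'application/vnd.api+json');
  res.json(serializer.serialize('article', docs));
});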
