How to fix "Error: /home/site/wwwroot/node_modules/canvas/build/Release/canvas.node: invalid ELF header" on NodeJs Azure Functions in Linux? - node.js

I am trying to deploy an Azure Functions app in Node.js, but it doesn't work on Azure.
My application is a v3 function app running on Linux.
When the deploy is completed, I get this 500 error:
Error:
/home/site/wwwroot/node_modules/canvas/build/Release/canvas.node:
invalid ELF header
It happens only when I have these imports:
import ChartDataLabels from 'chartjs-plugin-datalabels';
const { CanvasRenderService } = require('chartjs-node-canvas'); // CanvasRenderService comes from this package
const chartCallback = (ChartJS) => {
  ChartJS.register(require('chartjs-plugin-datalabels'));
};
const canvasRenderService = new CanvasRenderService(width, height, chartCallback); // width/height defined elsewhere
const jsdom = require("jsdom");
const { JSDOM } = jsdom;
const { document } = (new JSDOM(`...`)).window;
Would someone help me please?
It works (only) on my machine :(
Edit: It works when I deploy from the Windows Subsystem for Linux. I hope this helps somebody. My guess is that canvas ships a native binary (canvas.node) built for the OS that ran npm install, so modules installed on Windows won't load on Linux.
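A quick way to confirm the mismatch (assuming the file utility is available) is to inspect the binary the error points at:

file node_modules/canvas/build/Release/canvas.node
# a build from Windows shows up as a PE32+ DLL; the Linux host expects an ELF binary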

Azure Functions does not include node_modules when deploying to Azure, because the node_modules directory can be very large. You can include your package.json in your function directory and run npm install as you normally would with Node.js projects, using Kudu (https://<function_app_name>.scm.azurewebsites.net) or the Console in the Azure portal.
Check Dependency management for more information.
Refer here: Link 1 & Link 2
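As a minimal sketch, in the Kudu debug console that would be (path assumes the default Linux function app layout):

cd /home/site/wwwroot
npm install   # rebuilds native modules such as canvas against the Linux host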

Any updates on this topic?
Doesn't seem like a valid option for me to manually run npm install via Kudu or some other terminal in a Cloud Function App, especially with Continuous Deployment etc.
Got the same problem while using canvas for barcode generation...

Related

Dockerfile setup for Deploying a Puppeteer Nodejs App on Droplet

I have exhausted all the probable solutions on Stack Overflow and beyond, all to no avail.
My use case is a very simple Node.js app that uses puppeteer ^19.7.1.
My directory structure has the file .puppeteerrc.cjs with this content:
const { join } = require('path');

/**
 * @type {import("puppeteer").Configuration}
 */
module.exports = {
  // Changes the cache location for Puppeteer.
  cacheDirectory: join(__dirname, '.cache', 'puppeteer'),
};
However, when the server starts I am constantly greeted with the error message:
/workspace/.cache/puppeteer/chrome/linux-1069273/chrome-linux/chrome: error while loading shared libraries: libnss3.so: cannot open shared object file: No such file or directory
Everything works well on my localhost. The issue only started when I hosted it on Digital Ocean Droplets.
I tried copying the Dockerfile setup from https://pptr.dev/troubleshooting#running-puppeteer-in-the-cloud as-is into the root of my project to see if the issue would be resolved, to no avail.
So, I would really appreciate it if anyone could help me with a working Dockerfile configuration to address this, as I've spent all day on it without success. I intend to host the app on Digital Ocean's Droplets.
Thanks in anticipation for your time.
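For reference, the error means the image is missing Chrome's shared-library dependencies (libnss3.so among them); installing them inside the image is the usual fix. A minimal Dockerfile sketch along the lines of the pptr.dev guide, where the chromium package choice and the index.js entry point are assumptions rather than a verified Droplet setup:

FROM node:18-slim

# Debian's chromium package pulls in the shared libraries Chrome needs,
# including the libnss3.so named in the error message.
RUN apt-get update \
    && apt-get install -y --no-install-recommends chromium \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
# copy the rc file too, so puppeteer's install step caches Chrome under /app/.cache
COPY package*.json .puppeteerrc.cjs ./
RUN npm ci

COPY . .
CMD ["node", "index.js"]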

Gatsby Source Drupal not fetching data when trying to deploy to netlify/heroku

I have a site running Gatsby with gatsby-source-drupal7, a plugin that makes an axios GET request to https://stagingsupply.htm-mbs.com/restws_resource.json and exposes the JSON data to GraphQL queries. I am able to run it just fine on my computer by going to localhost:8000, and it creates over 200k nodes, but when I try to deploy on any cloud service provider like Gatsby Cloud or Netlify, it doesn't fetch any nodes or data at all from the site.
Warning from console
Starting to fetch data from Drupal
warn The gatsby-source-drupal7 plugin has generated no Gatsby nodes. Do you need it?
Code
Code from gatsby-config.js:
module.exports = {
  siteMetadata: {
    title: `new`,
    siteUrl: `https://www.yourdomain.tld`,
  },
  plugins: [
    {
      resolve: `gatsby-source-drupal7`,
      options: {
        baseUrl: `https://stagingsupply.htm-mbs.com/`,
        apiBase: `restws_resource.json`, // optional, defaults to `restws_resource.json`
      },
    },
  ],
}
Excerpt from gatsby-node.js in node_modules/gatsby-source-drupal7:
const createNode = actions.createNode; // Default apiBase to `jsonapi`
apiBase = apiBase || `restws_resource.json`; // Fetch articles.
// console.time(`fetch Drupal data`)
console.log(`Starting to fetch data from Drupal`);
const data = yield axios.get(`${baseUrl}/${apiBase}`, {
  auth: basicAuth
});
const allData = yield Promise.all(_.map(data.data.list,
Link to a repo that works on my local computer: https://github.com/nicholastorr/gatsby-d7
Any and all help will be appreciated.
As you pointed out, you've played around with the Node versions using NODE_ENV and engines workarounds. My guess also relies on a mismatched Node version between environments, but as the Netlify docs suggest, there are only two ways of customizing the Node version used to manage dependencies:
Set a NODE_VERSION environment variable.
Add a .node-version or .nvmrc file to the site's base directory in your repository. This will also tell any other developer using the repository which version of Node.js it depends on.
I can't see your Netlify build command (to check NODE_VERSION), but there's no .node-version or .nvmrc in your repository. I'd try creating one at the root of the project with v14.17.1 in it and doing a fresh install.
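For instance, a .nvmrc at the base of the repo containing nothing but the version:

14.17.1

Netlify reads this on the next build.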
In addition, double-check other server-related conflicts like IP-blocking, etc.
The error was nothing Gatsby- or Node-related; my site was blocking the IP of the server :>

Folders missing after packaging with electron-packager

Introduction
We have an Electron app, which uses azure-storage for getting documents from our Azure blob.
Everything seems to work when we run the app in debug mode, but when we run the packaged app on its own (packaged with electron-packager . --platform=win32 --overwrite),
some folders of the azure-storage node_modules are missing (md5-wrapper and request-wrapper).
The problem
The app throws an error
Uncaught Error: Cannot find module '../md5-wrapper'
in module.js.
If we simply insert the two folders with copy and paste into the standalone app, everything works fine.
Why are the two folders missing? Every other package is complete; just these two are missing.
To reproduce the error, just use the sample project from here and reference a JavaScript file in the index.html which has the line
var azure = require('azure-storage');
EDIT:
The code I use to download the blob is:
var fs = require('fs'); // needed for createWriteStream
var azure = require('azure-storage');
var blobService = azure.createBlobServiceWithSas(blobUri, SAS_TOKEN);
blobService.getBlobToStream('folder', 'file.zip',
  fs.createWriteStream(DESTINATION_PATH + '\\file.zip'),
  function (error, result, response) {
    // finished
  });
node -v prints v6.4.0

azure-storage not working with webpack (azure-functions-pack) server side

I made an Azure Functions microservice using Node.js, and I'm using the npm module azure-storage to insert files into Blob Storage.
Locally it works fine, but on deployment to the development environment a script runs azure-functions-pack and generates a bundle with the service code and all the required npm modules. Then, when making a request to the microservice, it returns status code 500, and the logs show the following error:
System.Exception : Error: Cannot find module "."
at webpackMissingModule (D:\home\site\wwwroot\.funcpack\index.js:238044:68)
at Object.<anonymous> (D:\home\site\wwwroot\.funcpack\index.js:238044:147)
at __webpack_require__ (D:\home\site\wwwroot\.funcpack\index.js:21:30)
...
I only know that the problem is the azure-storage module, because if I comment out the azureStorage = require('azure-storage'); line, it starts working. I also tried the npm module fast-azure-storage without success, and so far I have not been able to find a workaround. The code that uses this module is the following:
const blobSvc = azureStorage.createBlobService(storageConnectionString);
const writeStream = blobSvc.createWriteStreamToBlockBlob('containerName', fileName);
return new Promise(function (resolve) {
  writeStream.write(svgString);
  writeStream.on('close', () => {
    resolve('https://' + storageAccount + '.blob.core.windows.net/containerName/' + fileName);
  });
  writeStream.end();
});
The version of azure-storage is 2.6.0. Thanks for any help.
Not a direct answer to your question, but you should use the output binding feature of Azure Functions to insert blobs instead of doing it manually with library calls.
If you do that, you won't have to import the package, so it will also solve your problem.
Read more about output bindings in the docs; there is a Node example there too.
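A minimal sketch of what that could look like; the binding name, container path, and connection setting are placeholders:

// function.json (sketch)
{
  "bindings": [
    { "type": "httpTrigger", "direction": "in", "name": "req", "authLevel": "function" },
    { "type": "http", "direction": "out", "name": "res" },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputBlob",
      "path": "containername/{rand-guid}.svg",
      "connection": "AzureWebJobsStorage"
    }
  ]
}

// index.js: no azure-storage require needed
module.exports = function (context, req) {
  context.bindings.outputBlob = req.body; // the Functions runtime writes the blob for you
  context.done();
};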
Actually the problem wasn't the azure-storage module but the node-chartist module, which for some reason was also causing problems in other modules. After removing node-chartist, all the modules started working perfectly.

Serverless Framework with AWS Lambda error "Cannot find module"

I'm trying to use the Serverless Framework to create a Lambda function that uses the OpenWeather npm module. However, I'm getting the following exception, even though my node_modules contains the specific library.
I have managed to run the sample (https://github.com/serverless/examples/tree/master/aws-node-rest-api-with-dynamodb) successfully, and am now hacking on it to add a node module that integrates the OpenWeather API.
Endpoint response body before transformations: {"errorMessage":"Cannot find module 'Openweather-Node'","errorType":"Error","stackTrace":["Module.require (module.js:353:17)","require (internal/module.js:12:17)","Object.<anonymous> (/var/task/todos/weather.js:4:17)","Module._compile (module.js:409:26)","Object.Module._extensions..js
My code
'use strict';

const AWS = require('aws-sdk'); // eslint-disable-line import/no-extraneous-dependencies
var weather = require('Openweather-Node');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.weather = (event, context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: event.pathParameters.id,
    },
  };
  weather.setAPPID("mykey");
  // set the culture
  weather.setCulture("fr");
  // set the forecast type
  weather.setForecastType("daily");
  const response = {
    statusCode: 200,
    body: "{test response}",
  };
  callback(null, response);
};
Did you run npm install in your working directory before doing your serverless deploy? The aws-sdk node module is available to all Lambda functions, but all other node dependencies must be installed so that they are packaged with your Lambda when you deploy.
You may find this issue on the serverless repository helpful (https://github.com/serverless/serverless/issues/948).
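That is, from the service directory (a sketch of the usual flow):

npm install        # installs everything in package.json into node_modules
serverless deploy  # packages node_modules along with your handler code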
I fixed this error by moving everything from devDependencies to dependencies in my package.json. Serverless excludes devDependencies when packaging, so anything required at runtime has to be in dependencies.
Cheers
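Roughly, with the runtime module under dependencies so it gets packaged, and dev tooling left in devDependencies (versions illustrative):

{
  "dependencies": {
    "Openweather-Node": "^1.0.0"
  },
  "devDependencies": {
    "serverless-offline": "^8.0.0"
  }
}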
I don't know if it applies to this answer, but in case someone just needs a brain refresh: I forgot to reference my exported handler and was pointing at the file instead, which made it look for a default export that didn't exist...
changed from this...
handler: foldername/exports
to this...
handler: foldername/exports.handler
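That is, the handler string is <path>/<file>.<export name>, so foldername/exports.handler expects something like this sketch in foldername/exports.js:

// foldername/exports.js
module.exports.handler = async (event) => {
  return { statusCode: 200, body: 'ok' };
};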
I had the same problem using the Serverless Framework to deploy multiple Lambda functions. I fixed it with the following steps:
Keep whatever path you have in the handler, e.g. handler: foldername/exports.handler
Name the file inside the folder exports.js (whatever you named in the handler)
Run serverless deploy
This should solve your problem.
I went to CloudWatch and looked for the missing packages,
then ran npm i <missing package> and sls deploy.
The missing packages need to be listed under dependencies; in my case some were under devDependencies and another was missing entirely.
You need to create a deployment package if you have external dependencies.
Please see this answer:
AWS Node JS with Request
Reference
http://docs.aws.amazon.com/lambda/latest/dg/nodejs-create-deployment-pkg.html
In several cases, don't forget to check your global serverless installation.
Mine was solved by reinstalling:
npm install -g serverless
In my case, what worked was switching to Node 10 (via nvm). I was using a newer version of Node (v15.14.0) than was probably supported by the packages.
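Assuming nvm is installed, that switch looks like:

nvm install 10
nvm use 10
node -v    # should print v10.x
rm -rf node_modules && npm install   # reinstall so engine-specific deps match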
My case was configuring params for creating an AWS Lambda function. The right string for the handler was (last row):
Resources:
  StringResourceName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: myFileName.handler
Where myFileName is the name of the file inside the zip package.
You have an issue with your .ts files: since the serverless-offline plugin cannot find an equivalent .js file, it throws a "module not found" error.
The workaround is to install serverless-plugin-typescript. The only issue with that plugin is that it creates a new .build/.dist folder with the transpiled .js files.
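The usual setup is just listing it in serverless.yml; if you also use serverless-offline, the TypeScript plugin should come first:

plugins:
  - serverless-plugin-typescript
  - serverless-offline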
I was doing something silly, but I still wanted to put it here so any beginner like me doesn't have to struggle with it. I copied the serverless.yml from an example where the handler value was
handler: index.handler
But my index.js was in the src folder, hence I was getting "file not found". It worked after I changed the handler value to
handler: src/index.handler
For anyone developing Python Lambda functions with serverless-offline and using a virtual environment during local development: deactivate your environment, delete it entirely, and recreate it. Install all the Python requirements and try again. This worked for me.
For me, the issue was that the handler file name contained a dot.
main-handler.graphql.js caused serverless to throw Error: Cannot find module 'main'.
When I changed the file name to main-handler-graphql.js, everything worked.
