Error: Unable to read configuration file newrelic.js - node.js

I’m getting this error:
Error: Unable to read configuration file newrelic.js.
A base configuration file can be copied from node_modules/newrelic/newrelic.js and renamed to newrelic.js in the
directory from which you will start your application.
Project is Angular 7 SPA with SSR.

There are two solutions.
The first, by bj97301 on the official New Relic forums:
In your webpack config file, make newrelic external:
externals: ['newrelic']
In your webpack config file, use commonjs as the output library target:
output: { libraryTarget: 'commonjs' }
Change newrelic.js to use exports. instead of export:
was: export const config = {
now: exports.config = {
This method won't work if you want to deploy the whole package to a remote server, since it excludes the newrelic package.
To solve this, you can use the second method.
As a workaround, change newrelic.js to override Environment variable instead of exporting the config.
e.g.
process.env.NEW_RELIC_NO_CONFIG_FILE = 'true';
process.env.NEW_RELIC_APP_NAME = 'YOUR APP NAME';
process.env.NEW_RELIC_LICENSE_KEY = 'YOUR_KEY';
process.env.NEW_RELIC_LOG_LEVEL = 'warn';
process.env.NEW_RELIC_ALLOW_ALL_HEADERS = 'true';
console.log('New Relic has been configured.');
And you can remove the original config that was exported:
'use strict';
exports.config = {
app_name: [],
license_key: '',
logging: {
level: 'warning'
},
allow_all_headers: true
}
Don't forget to require it in your server.ts (before the newrelic agent itself is required, so the variables are already set when it loads); simply add:
require('./newrelic');
I hope this saves someone some hours.

Related

Is it possible to use webpack with a library that uses dependency injection?

I'm pretty new to webpack so apologies if this is an obvious question. I'm currently trying to deploy a TypeScript Lambda API to AWS using this library: https://www.npmjs.com/package/ts-lambda-api
The library uses Inversify for dependency injection: https://github.com/inversify/InversifyJS
The library also advises building by just using zip but I'd like to use webpack if I can.
I currently have two files for my api, a main api.ts file and then a controller.ts file. Currently the api class finds the controllers using the same lines as in the ts-lambda-api docs:
const controllersPath = [path.join(__dirname, "controllers")]
const app = new ApiLambdaApp(controllersPath, appConfig)
As a result I need the controller directory to be available to the api.ts class once it's been bundled. Currently my webpack config looks like this:
import path = require('path')
import webpack = require('webpack')
const config: webpack.Configuration = {
module: {
rules: [
{ test: /\.js\.map$/, use: "ignore-loader" },
{ test: /\.d\.ts$/, use: "ignore-loader" }
]
},
mode: 'production',
entry: {
my_api: ['./src/my-api.js'],
},
output: {
path: path.resolve(__dirname, 'dist'),
filename: '[name]/main.js',
library: '[name]',
libraryTarget: 'commonjs2'
},
target: 'node',
externals: [/aws-sdk.*/, 'aws-lambda']
};
export default config
How can I add to it to allow my api access to the other classes it needs? The webpack docs have a page on multiple entry points (https://webpack.js.org/concepts/entry-points/#multi-page-application) but that doesn't seem to be what I'm looking for as it gives me multiple bundles.
I can put multiple files into one bundle by doing something like:
api: ['./src/api.js', './src/controllers/controller.js'],
but this then doesn't work with the Injectable annotations. Does anyone have any other ideas?
I managed to get around this issue by manually loading the controllers, following the section in the ts-lambda-api docs on how to do so.

@google-cloud/speech - Error: ENOENT: no such file or directory, open 'protos.json'

I'm trying to use the Google Cloud Speech-to-Text API.
I'm using the Google sample code, and when I create the client object I get this error:
{
"errno":-2,
"syscall":"open",
"code":"ENOENT",
"path":"protos.json",
"stack":"Error: ENOENT: no such file or directory, open 'protos.json'\n at Object.openSync (fs.js:440:3)\n at Object.readFileSync (fs.js:342:35)\n at fetch (transcript-server-js/node_modules/protobufjs/src/root.js:160:34)\n at Root.load (/transcript-server-js/node_modules/protobufjs/src/root.js:194:13)\n at Root.loadSync (/transcript-server-js/node_modules/protobufjs/src/root.js:235:17)\n at Object.loadSync (/transcript-server-js/node_modules/#grpc/proto-loader/build/src/index.js:221:27)\n at GrpcClient.loadFromProto /transcript-server-js/node_modules/google-gax/src/grpc.ts:165:40)\n at GrpcClient.loadProto (/transcript-server-js/node_modules/google-gax/src/grpc.ts:199:17)\n at new SpeechClient /transcript-server-js/lib/webpack:/src/v1/speech_client.ts:135:28)\n at createText$ (/transcript-server-js/lib/webpack:/src/transcriptGenerator.js:50:18)"
}
This is the code:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const results = await storage.getBuckets();
const speech = require('@google-cloud/speech');
const client = new speech.SpeechClient();
The Google Cloud Storage API works.
Can someone help me?
Thanks.
I ran into this with @google-cloud/firestore. Both @google-cloud/firestore and @google-cloud/speech use the same mechanism to load protos.json, so my solution should be relevant here.
This happened to me because webpack was building the @google-cloud/firestore package into my bundle. The @google-cloud/firestore package uses __dirname to find protos.json. Since the @google-cloud/firestore code was in my bundle, the __dirname variable was set to my bundle's directory instead of to the node_modules/@google-cloud/firestore/ subdirectory that contains protos.json.
Possible fix #1
Set this in your webpack config to tell webpack to set the value of __dirname:
node: {
__dirname: true,
}
https://webpack.js.org/configuration/node/
Possible fix #2
Update your webpack config to exclude @google-cloud/speech from your bundle.
One way to do this is to use the webpack-node-externals package to exclude all dependencies from the node_modules directory:
var nodeExternals = require('webpack-node-externals')
...
module.exports = {
...
externals: [nodeExternals()],
target: 'node',
...
};
https://webpack.js.org/configuration/externals/
https://www.npmjs.com/package/webpack-node-externals
Thank you so much, Gabriel Deal.
I faced the same issue as you with the firestore package, and your explanation made clear why it occurs. Unfortunately the fixes didn't help me, so I had to take an alternative approach: I copied the protos.json file to the path in which it is searched inside my dist folder.
Copy protos.json from node_modules to a folder (I named it external_files).
In webpack.config.js, copy the protos.json file from the external_files directory to the path in which it is searched (in my case it was node_modules/google-gax/protos). Use the CopyWebpackPlugin plugin to do that job, as shown below.
module.exports = {
  // ...
  plugins: [
    new CopyWebpackPlugin([
      { from: "external_files/protos.json", to: "dist/node_modules/google-gax/protos" }
    ])
  ]
}
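Note that the array argument above is the older copy-webpack-plugin API; in v6 and later the patterns move into an options object. The same copy, sketched with the newer API (paths taken from the answer above):

```javascript
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        { from: 'external_files/protos.json', to: 'dist/node_modules/google-gax/protos' },
      ],
    }),
  ],
};
```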

Swagger jsdoc does not instantly update API docs on changes?

I am using swagger-jsdoc.
I have set up the Swagger JS docs like below in my app.js:
//include swagger js doc
var swaggerJSDoc = require('swagger-jsdoc');
const swaggerUi = require('swagger-ui-express');
const pathToSwaggerUi = require('swagger-ui-dist').absolutePath()
const swaggerDefinition = {
swagger: '2.0',
info: {
// API information (required)
title: 'API', // Title (required)
version: '1.0.0', // Version (required)
description: 'Used for api documentation', // Description (optional)
},
host: `localhost:3000`, // Host (optional)
basePath: '/app/v1', // Base path (optional)
};
// Options for the swagger docs
const options = {
// Import swaggerDefinitions
swaggerDefinition,
// Path to the API docs
// Note that this path is relative to the current directory from which the Node.js is ran, not the application itself.
apis: ['./app/v1/docs/*.yaml']
};
// Initialize swagger-jsdoc -> returns validated swagger spec in json format
const swaggerSpec = swaggerJSDoc(options);
app.use('/v1/docs', swaggerUi.serve, swaggerUi.setup(swaggerSpec));
I have certain YAML files which I have written to document the API. I hit this URL from the browser:
localhost:3000/v1/docs
This shows me the documented API in Swagger UI. But when I update any of the YAML files and refresh the page, I don't see the changes. I have to stop the nodemon process and restart it again, which I don't want to do. How can I fix this?
By default, nodemon looks for files with the .js, .mjs, .coffee, .litcoffee, and .json extensions.
To add other extensions please use the following command:
nodemon -e yaml
For more details, refer to the official docs: https://www.npmjs.com/package/nodemon
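Alternatively, the extra extension can be configured once in a nodemon.json file next to package.json instead of on the command line (a sketch; adjust the list to your project, since ext replaces the default watch list):

```json
{
  "ext": "js,mjs,json,yaml"
}
```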

Node-Red custom node_modules location

I'm using Node-RED embedded in my Express.js application like this:
https://nodered.org/docs/embedding. When embedded like this, Node-RED can't load new nodes from npm.
The issue is that when defining a custom user dir in settings.js, for example userDir: 'node-red-data/', Node-RED adds loaded nodes to a node_modules folder inside this directory.
So I have two node_modules folders:
myapp/node_modules => this contains node-red
myapp/node-red-data/node_modules => this contains the node-red extra nodes
Somehow Node-RED can't load the modules inside myapp/node-red-data/node_modules.
Are there any solutions?
The issue was in the settings file.
My settings in the user dir:
var settings = {
httpAdminRoot: '/admin',
httpNodeRoot: '/ap',
nodesDir: '/nodes',
flowFile: "flows.json",
userDir: './data/'
}
Correct setup:
var path = require('path');
var dir = path.dirname(__filename);
var settings = {
httpAdminRoot: '/admin',
httpNodeRoot: '/ap',
nodesDir: dir + '/nodes',
flowFile: "flows.json",
userDir: dir+'/data/'
}
So using absolute paths for the user dir and nodes dir makes it work.
I had a similar problem.
I used process.execPath:
userdir = path.resolve(process.execPath, '..'); // better than __dirname
because the directory is different when the application is compiled.
// Create the settings object - see default settings.js file for other options
var settings = {
verbose: true,
httpAdminRoot:"/admin",
httpNodeRoot: "/",
userDir: userdir, // problem with dir...
flowFile: 'flows.json',
};

How do I serve static files using Sails.js only in development environment?

On production servers, we use nginx to serve static files for our Sails.js application, however in development environment we want Sails to serve static files for us. This will allow us to skip nginx installation and configuration on dev's machines.
How do I do this?
I'm going to show you how you could solve this using the serve-static module for Node.js/Express.
1). First of all install the module for development environment: npm i -D serve-static.
2). Create serve-static directory inside of api/hooks directory.
3). Create the index.js file in the serve-static directory, created earlier.
4). Add the following content to it:
module.exports = function serveStatic (sails) {
let serveStaticHandler;
if ('production' !== sails.config.environment) {
// Only initializing the module in non-production environment.
const serveStatic = require('serve-static');
var staticFilePath = sails.config.appPath + '/.tmp/public';
serveStaticHandler = serveStatic(staticFilePath);
sails.log.info('Serving static files from: «%s»', staticFilePath);
}
// Adding middleware, make sure to enable it in your config.
sails.config.http.middleware.serveStatic = function (req, res, next) {
if (serveStaticHandler) {
serveStaticHandler.apply(serveStaticHandler, arguments);
} else {
next();
}
};
return {};
};
5). Edit config/http.js file and add the previously defined middleware:
module.exports.http = {
middleware: {
order: [
'serveStatic',
// ...
]
}
};
6). Restart/run your application, e.g. node ./app.js, and try to fetch one of the static files. It should work.
