Error ENOENT when assigning path to ffmpeg - Node.js

I am trying to set up a relay from my RTMP server using an npm module called node-media-server. I have set up the RTMP protocol successfully, but the relay expects me to supply a path to my ffmpeg library, so I npm installed ffmpeg and supplied the node_modules path to the relay. However, I keep getting the error uncaughtException Error: spawn C:\blah\blah\node_modules\ffmpeg ENOENT, even though the ffmpeg library definitely exists at the specified location. Why is this happening? Thanks in advance.
Link to the module I'm using: https://www.npmjs.com/package/node-media-server
const NodeMediaServer = require('node-media-server');
const path = require('path');

const ffmpegPath = path.join(__dirname, '..', 'node_modules', 'ffmpeg');

const config = {
  rtmp: {
    port: 1935,
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 60
  },
  http: {
    port: 8000,
    allow_origin: '*'
  },
  relay: {
    ffmpeg: ffmpegPath,
    tasks: [
      {
        app: 'live',
        mode: 'push',
        edge: 'rtmp://localhost:1936',
      },
      {
        app: 'live',
        mode: 'push',
        edge: 'rtmp://localhost:1937',
      }
    ]
  }
};

var nms = new NodeMediaServer(config);
nms.run();

It seems like the RTMP server is not finding an actual ffmpeg executable at that path. I had to install @ffmpeg-installer/ffmpeg, a package that ships the ffmpeg binary as a node module, and then assigned ffmpeg.path to the ffmpeg key in the relay config. Hope this helps.
const NodeMediaServer = require('node-media-server');
const ffmpeg = require('@ffmpeg-installer/ffmpeg');

const config = {
  rtmp: {
    port: 1935,
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 60
  },
  http: {
    port: 8000,
    allow_origin: '*'
  },
  relay: {
    ffmpeg: ffmpeg.path,
    tasks: [
      {
        app: 'live',
        mode: 'push',
        edge: 'rtmp://localhost:1936',
      },
      {
        app: 'live',
        mode: 'push',
        edge: 'rtmp://localhost:1937',
      }
    ]
  }
};

var nms = new NodeMediaServer(config);
nms.run();
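As a quick sanity check (a minimal sketch, assuming @ffmpeg-installer/ffmpeg is installed), you can log the resolved binary path and version before handing it to the relay config:

// Print where the bundled ffmpeg binary actually lives and which version it is,
// so you can confirm the path node-media-server will spawn points at a real executable.
const ffmpeg = require('@ffmpeg-installer/ffmpeg');
console.log(ffmpeg.path);
console.log(ffmpeg.version);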

Related

Use Redis with AWS SAM (Redis Client Error)

Right now what I'm trying to do is that every time a request is made, a query is made to the Redis service. The problem is that, with a basic configuration, it isn't working. The error is the following:
INFO  Redis Client Error Error: connect ECONNREFUSED 127.0.0.1:6379
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net) {
  port: 6379,
  address: '127.0.0.1'
}
I have redis-server running as always, with its corresponding credentials, listening on 127.0.0.1:6379. I know that AWS SAM runs in a container, and the issue is probably due to network configuration, but the only related option the AWS SAM CLI gives me is --host. How could I fix this?
My code is the following, although it is not very relevant:
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { createClient } from 'redis';
import processData from './src/lambda-data-dictionary-read/core/service/controllers/processData';

export async function lambdaHandler(event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> {
  const body: any = await processData(event.queryStringParameters);
  const url = process.env.REDIS_URL || 'redis://127.0.0.1:6379';

  const client = createClient({
    url,
  });
  client.on('error', (err) => console.log('Redis Client Error', err));
  await client.connect();
  await client.set('key', 'value');
  const value = await client.get('key');
  console.log('----', value, '----');

  const response: APIGatewayProxyResult = {
    statusCode: 200,
    body,
  };
  if (body.error) {
    return {
      statusCode: 404,
      body,
    };
  }
  return response;
}
My template.yaml:
Transform: AWS::Serverless-2016-10-31
Description: >
  lambda-data-dictionary-read
  Sample SAM Template for lambda-data-dictionary-read

Globals:
  Function:
    Timeout: 0

Resources:
  IndexFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: app/
      Handler: index.lambdaHandler
      Runtime: nodejs16.x
      Timeout: 10
      Architectures:
        - x86_64
      Environment:
        Variables:
          ENV: !Ref develope
          REDIS_URL: !Ref redis://127.0.0.1:6379
      Events:
        Index:
          Type: Api
          Properties:
            Path: /api/lambda-data-dictionary-read
            Method: get
    Metadata:
      BuildMethod: esbuild
      BuildProperties:
        Minify: true
        Target: 'es2020'
        Sourcemap: true
        UseNpmCi: true
I'm using:
"scripts": {
  "dev": "sam build --cached --beta-features && sam local start-api --port 8080 --host 127.0.0.1"
}
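No answer is recorded here, but one thing worth checking (my assumption, not from the thread): inside the Lambda container that sam local start-api spins up, 127.0.0.1 is the container itself, not the host machine where redis-server is listening. A minimal sketch of one workaround is to point REDIS_URL at the Docker host through an environment override file (a hypothetical env.json) passed with --env-vars:

{
  "IndexFunction": {
    "REDIS_URL": "redis://host.docker.internal:6379"
  }
}

and then run something like sam local start-api --port 8080 --env-vars env.json. host.docker.internal resolves to the host on Docker Desktop; on Linux you may need the Docker bridge IP instead. Separately, REDIS_URL: !Ref redis://127.0.0.1:6379 in the template looks invalid, since !Ref expects a parameter or resource logical name, so the variable may never reach the handler as intended.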

How to use pino.transport in Node.js for logs?

const pino = require('pino')

const transport = pino.transport({
  targets: [{
    level: 'info',
    target: 'pino-pretty' // must be installed separately
  }, {
    level: 'trace',
    target: 'pino/file',
    options: { destination: '/path/to/store/logs' }
  }]
})
pino(transport)
Why am I getting the error pino.transport is not a function?
It is because your project is using pino version 6.x. pino.transport was introduced in v7, so you need to upgrade to pino v7 or later to use it.
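A quick way to confirm and fix the version mismatch (assuming upgrading is an option for your project):

# Check which pino version is installed, then upgrade to v7+ so pino.transport exists;
# pino-pretty is needed separately for the 'pino-pretty' target.
npm ls pino
npm install pino@^7 pino-pretty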

Prevent ffmpeg from opening a console window

I have a Node/Express server which is used to serve streams from IP cameras to a website. Everything is working well. I run that web server with PM2 on a Windows server.
The problem: for each stream, a Windows console window opens with nothing logged in it, and it reopens when I try to close it.
Is there a way to prevent those console windows from opening?
Here is the related Node.js code:
const { NodeMediaServer } = require('node-media-server');
private _initiate_streams(): void {
  DatabaseProvider.instance.camerasDao.getCamerasList().pipe(
    take(1)
  ).subscribe(
    (databaseReadOperationResult: DatabaseReadOperationResult<ICamera[]>) => {
      if (databaseReadOperationResult.successful === true) {
        const cameras = databaseReadOperationResult.result;
        const tasks = [];
        cameras.forEach(camera => {
          tasks.push(
            {
              app: config.get('media_server.app_name'),
              mode: 'static',
              edge: camera.rtsp_url,
              name: camera.stream_name,
              rtsp_transport: 'tcp'
            }
          );
        });
        const configMediaServer = {
          logType: 3, // 3 - Log everything (debug)
          rtmp: {
            port: 1935,
            chunk_size: 60000,
            gop_cache: true,
            ping: 60,
            ping_timeout: 30
          },
          http: {
            port: config.get('media_server.port'),
            allow_origin: '*'
          },
          auth: {
            play: true,
            api: true,
            publish: true,
            secret: config.get('salt'),
            api_user: 'user',
            api_pass: 'password',
          },
          relay: {
            ffmpeg: 'C:\\FFmpeg\\bin\\ffmpeg.exe',
            tasks: tasks
          }
        };
        var nms = new NodeMediaServer(configMediaServer);
        nms.run();
      } else {
        // catch exception
      }
    }
  );
}
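No fix is recorded here, but for reference, Node's child_process.spawn accepts a windowsHide option that suppresses the console window a spawned .exe would otherwise get on Windows. node-media-server spawns ffmpeg internally, so whether this helps depends on whether your version passes that option; the snippet below is only a general illustration of the mechanism, not the library's own code:

// General illustration of windowsHide: spawn ffmpeg without creating a visible
// console window on Windows. Applying this to node-media-server's relay would
// mean patching or forking the code that calls spawn.
const { spawn } = require('child_process');

const ffmpegProcess = spawn('C:\\FFmpeg\\bin\\ffmpeg.exe', ['-version'], {
  windowsHide: true // do not pop up a console window for the child process
});

ffmpegProcess.stdout.on('data', chunk => process.stdout.write(chunk));
ffmpegProcess.on('close', code => console.log(`ffmpeg exited with code ${code}`));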

Webpacking a NodeJS Express API with MySQL throws connection error in mode '-p', not in '-d'

I have a simple Express API where I use MySQL to retrieve my data. I use Webpack 4 to bundle it with a very simple configuration:
'use strict';
const path = require('path');

module.exports = {
  entry: './src/main.js',
  target: 'node',
  output: {
    filename: 'gept_api.js',
    path: path.resolve(__dirname, 'dist'),
  },
  node: {
    __dirname: true,
  },
};
When I use webpack --config webpack.config.js -d for development everything works just fine.
However, when I run webpack --config webpack.config.js -p for production it suddenly doesn't work anymore, and throws an error when it's getting a connection from the pool.
TypeError: Cannot read property 'query' of undefined
at Object.getItem (C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:154359)
at t.db_pool.getConnection (C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:154841)
at c._callback (C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:68269)
at c.end (C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:8397)
at C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:322509
at Array.forEach (<anonymous>)
at C:\Users\freek\Dropbox\Code\Apps\GEPT\GEPTv2_API\dist\gept_api.js:1:322487
at process._tickCallback (internal/process/next_tick.js:112:11)
So somehow this is broken by using production mode in webpack 4: the connection object is undefined, while it isn't in development mode.
I have no idea how to fix this, since I'm a noob with Webpack. I tried searching on Google, but couldn't find anything relevant.
How I create my pool:
'use strict';
var mysql = require('mysql');
var secret = require('./db-secret');

module.exports = {
  name: 'gept_api',
  hostname: 'https://api.toxsickproductions.com/gept',
  version: '1.3.0',
  port: process.env.PORT || 1910,
  db_pool: mysql.createPool({
    host: secret.host,
    port: secret.port,
    user: secret.user,
    password: secret.password,
    database: secret.database,
    ca: secret.ca,
  }),
};
How I consume the connection:
pool.getConnection((err, connection) => {
  PlayerRepository.getPlayer(req.params.username, connection, (statusCode, player) => {
    connection.release();
    res.status(statusCode);
    res.send(player);
    return next();
  });
});
and
/** Gets the player, and logs to HiscoreSearch if it exists.
 *
 * Has callback with statusCode and player. Status code can be 200, 404 or 500.
 * @param {string} username The player's username.
 * @param {connection} connection The mysql connection object.
 * @param {(statusCode: number, player: { username: string, playerType: string }) => void} callback Callback with statusCode and the player if found.
 */
function getPlayer(username, connection, callback) {
  const query = 'SELECT p.*, pt.type FROM Player p JOIN PlayerType pt ON p.playerType = pt.id WHERE username = ?';
  connection.query(query, [username.toLowerCase()], (outerError, results, fields) => {
    if (outerError) callback(500);
    else if (results && results.length > 0) {
      logHiscoreSearch(results[0].id, connection, innerError => {
        if (innerError) callback(500);
        else callback(200, {
          username: results[0].username,
          playerType: results[0].type,
          deIroned: results[0].deIroned,
          dead: results[0].dead,
          lastChecked: results[0].lastChecked,
        });
      });
    } else callback(404);
  });
}
I found what was causing the issue. Apparently the mysql package relies on Function.prototype.name, because setting keep_fnames: true fixed the production build (https://github.com/mishoo/UglifyJS2/tree/harmony#mangle-options).
I disabled Webpack 4's standard minification and used custom UglifyJsPlugin settings:
'use strict';
const path = require('path');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');

module.exports = {
  entry: './src/main.js',
  target: 'node',
  output: {
    filename: 'gept_api.js',
    path: path.resolve(__dirname, 'dist'),
  },
  node: {
    __dirname: true,
  },
  optimization: {
    minimize: false,
  },
  plugins: [
    new UglifyJsPlugin({
      parallel: true,
      uglifyOptions: {
        ecma: 6,
        mangle: {
          keep_fnames: true,
        },
      },
    }),
  ],
};
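For what it's worth, an equivalent setup (a sketch, assuming the same uglifyjs-webpack-plugin) registers the plugin under webpack 4's optimization.minimizer instead of disabling minimize and adding it to plugins, so production mode still decides when minification runs:

// webpack.config.js -- same idea, but keeping webpack 4's minification pipeline
// and only swapping in an UglifyJS instance that preserves function names
// (which the mysql package apparently relies on).
'use strict';
const path = require('path');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');

module.exports = {
  entry: './src/main.js',
  target: 'node',
  output: {
    filename: 'gept_api.js',
    path: path.resolve(__dirname, 'dist'),
  },
  node: { __dirname: true },
  optimization: {
    minimizer: [
      new UglifyJsPlugin({
        parallel: true,
        uglifyOptions: { ecma: 6, mangle: { keep_fnames: true } },
      }),
    ],
  },
};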

res.render doesn't render correctly after 1023 characters

I have a parent Express app, and a Ghost app as a child app, using Ghost as an npm module here.
I routed Ghost to be rendered at http://localhost:9000/blog. All the configuration works fine (Ghost will throw an error if the basic configuration isn't being provided correctly).
Here is my Ghost startup code
ghost({
  config: path.join(__dirname, '/config/ghost.config.js')
}).then(function (ghostServer) {
  app.use(ghostServer.config.paths.subdir, ghostServer.rootApp);
  ghostServer.start(app);
});
Here is my Ghost config:
// # Ghost Configuration
// Setup your Ghost install for various [environments](http://support.ghost.org/config/#about-environments).
// Ghost runs in `development` mode by default. Full documentation can be found at http://support.ghost.org/config/

var path = require('path'),
    config;

config = {
  // ### Production
  // When running Ghost in the wild, use the production environment.
  // Configure your URL and mail settings here
  production: {
    url: 'http://my-ghost-blog.com',
    mail: {},
    database: {
      client: 'sqlite3',
      connection: {
        filename: path.join(__dirname, '/content/data/ghost.db')
      },
      debug: false
    },
    server: {
      host: '127.0.0.1',
      port: '2368'
    }
  },

  // ### Development **(default)**
  development: {
    // The url to use when providing links to the site, E.g. in RSS and email.
    // Change this to your Ghost blog's published URL.
    url: 'http://localhost:9000/blog/',

    // Example mail config
    // Visit http://support.ghost.org/mail for instructions
    // ```
    //  mail: {
    //    transport: 'SMTP',
    //    options: {
    //      service: 'Mailgun',
    //      auth: {
    //        user: '', // mailgun username
    //        pass: ''  // mailgun password
    //      }
    //    }
    //  },
    // ```

    // #### Database
    // Ghost supports sqlite3 (default), MySQL & PostgreSQL
    database: {
      client: 'sqlite3',
      connection: {
        filename: path.join(__dirname, '../blog/data/ghost-dev.db')
      },
      debug: false
    },
    // #### Server
    // Can be host & port (default), or socket
    server: {
      // Host to be passed to node's `net.Server#listen()`
      host: '127.0.0.1',
      // Port to be passed to node's `net.Server#listen()`, for iisnode set this to `process.env.PORT`
      port: '9000'
    },
    // #### Paths
    // Specify where your content directory lives
    paths: {
      contentPath: path.join(__dirname, '../blog/')
    }
  },

  // **Developers only need to edit below here**

  // ### Testing
  // Used when developing Ghost to run tests and check the health of Ghost
  // Uses a different port number
  testing: {
    url: 'http://127.0.0.1:2369',
    database: {
      client: 'sqlite3',
      connection: {
        filename: path.join(__dirname, '/content/data/ghost-test.db')
      }
    },
    server: {
      host: '127.0.0.1',
      port: '2369'
    },
    logging: false
  },

  // ### Testing MySQL
  // Used by Travis - Automated testing run through GitHub
  'testing-mysql': {
    url: 'http://127.0.0.1:2369',
    database: {
      client: 'mysql',
      connection: {
        host: '127.0.0.1',
        user: 'root',
        password: '',
        database: 'ghost_testing',
        charset: 'utf8'
      }
    },
    server: {
      host: '127.0.0.1',
      port: '2369'
    },
    logging: false
  },

  // ### Testing pg
  // Used by Travis - Automated testing run through GitHub
  'testing-pg': {
    url: 'http://127.0.0.1:2369',
    database: {
      client: 'pg',
      connection: {
        host: '127.0.0.1',
        user: 'postgres',
        password: '',
        database: 'ghost_testing',
        charset: 'utf8'
      }
    },
    server: {
      host: '127.0.0.1',
      port: '2369'
    },
    logging: false
  }
};

module.exports = config;
So basically, when I go to http://localhost:9000/blog, nothing is rendered at all. I was using Chrome, also tested in Safari, and tried both with JavaScript turned off as well.
Then I tried curl http://localhost:9000/blog and a request app (like Postman), and they returned the correct HTML string. I also tried curl with the user agent set to Chrome and to Safari, and it also returned the correct HTML.
I traced it down into the ghost node_modules; the renderer is in ghost > core > server > controllers > frontend > index.js, at the line res.render(view, result).
I changed the res.render call to this:
res.render(view, result, function (err, string) {
  console.log("ERR", err);
  console.log("String", string);
  res.send(string);
});
There is no error, and it logs the string, but nothing renders in the browser.
I tried curl and Postman, and they work, but the browser doesn't.
Then I tried sending a plain hello-world string, and the browser rendered it.
I then increased the string length one character at a time, and it turns out that any string with length < 1023 is rendered by the browser, but once it gets past that, nothing is.
My parent Express app is able to send strings longer than 1023 characters, and if I use the ghost module standalone it can also send strings longer than 1023.
So something must be happening between the two, but I don't know how to debug this.
Please help
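No answer is shown here, but one hedged observation: the 1023-character cutoff is suspiciously close to the 1 kB default threshold of Express's compression middleware, so the response may be getting gzip-encoded somewhere in the parent/child chain without ever being flushed to the browser. Comparing the headers curl gets with and without a browser-like Accept-Encoding is one way to test that theory:

# The first request is what plain curl sends (no Accept-Encoding);
# the second mimics a browser. Compare Content-Encoding, Content-Length and
# Transfer-Encoding between the two responses.
# (-D - dumps the response headers, -o /dev/null discards the body;
# use NUL instead of /dev/null on Windows.)
curl -s -D - -o /dev/null http://localhost:9000/blog
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip, deflate" http://localhost:9000/blog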
