node-media-server: session.reject() not working - node.js

I am trying to create an RTMP server with the npm package http://github.com/illuspas/Node-Media-Server. The server itself works fine, but I need to implement authentication in it. I am checking authentication on the "prePublish" event: I query the database for the user, and if the user is found I want to let them stream; otherwise the session should be rejected. The problem is that the session is not rejected cleanly. Instead it disconnects, the stream automatically reconnects, gets disconnected again, and the loop goes on. How do I fix this?
Here is the code for the event:
const NodeMediaServer = require('node-media-server');
const config = require('./config').rtmp_server;
const db = require('./db');

const nms = new NodeMediaServer(config);

const getStreamKeyFromStreamPath = (path) => {
  const parts = path.split('/');
  return parts[parts.length - 1];
};

nms.on('prePublish', async (id, StreamPath, args) => {
  const session = nms.getSession(id);
  try {
    const streamKey = getStreamKeyFromStreamPath(StreamPath);
    const validStream = (
      await db.query('SELECT * FROM public."People" WHERE stream_key = $1', [streamKey])
    ).rows[0];
    console.log(validStream);

    if (validStream) {
      // do stuff
    } else {
      session.reject((reason) => {
        console.log(reason);
      });
    }

    console.log(
      '[NodeEvent on prePublish]',
      `id=${id} StreamPath=${StreamPath} args=${JSON.stringify(args)}`
    );
  } catch (err) {
    session.reject();
  }
});

module.exports = nms;
Here is the code of the entry point of the server:
require("dotenv").config();
const db = require("./db");
const nms = require("./nms");
// database connection
db.connect()
.then(() => {
console.log("Connected to database");
// start the rtmp server
nms.run();
})
.catch((err) => console.log(err.message));
Here is the db file:
const { Pool } = require('pg');

const connectionString = process.env.PG_CONNECTION_STRING;

const poolOptions = {
  host: process.env.PG_HOST,
  user: process.env.PG_USER,
  port: process.env.PG_PORT,
  password: process.env.PG_PASSWORD,
  database: process.env.PG_DATABASE,
};

// pg's Pool takes a config object, so the connection string has to be
// wrapped in { connectionString } rather than passed as a bare string
const pool = new Pool(
  process.env.NODE_ENV === 'production' ? { connectionString } : poolOptions
);

module.exports = pool;
What I have tried so far:
1. Instead of the async function, I tried handling the database query with a callback (see the sketch below), but it didn't work.
2. Where I previously called session.reject() with no arguments, I now pass a callback, but the behavior is still the same.
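For reference, the callback variant was along these lines (a sketch from memory, not the exact code; it uses node-postgres' callback API instead of await):
nms.on('prePublish', (id, StreamPath, args) => {
  const session = nms.getSession(id);
  const streamKey = getStreamKeyFromStreamPath(StreamPath);
  // same lookup as above, but via the pg callback API
  db.query('SELECT * FROM public."People" WHERE stream_key = $1', [streamKey], (err, result) => {
    if (err || !result.rows[0]) {
      session.reject();
    }
  });
});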
If you have any solution for that, please let me know.
Thanks in advance

Related

How to properly connect to MongoDB using Cloud functions?

I would like to connect to my Atlas cluster only once per instance running Cloud Functions.
Here is my code for an instance:
const MongoClient = require("mongodb").MongoClient;
const client = new MongoClient("myUrl", {
useNewUrlParser: true,
useUnifiedTopology: true,
});
exports.myHttpMethod = functions.region("europe-west1").runWith({
memory: "128MB",
timeoutSeconds: 20,
}).https.onCall((data, context) => {
console.log("Data is: ", data);
client.connect(() => {
const testCollection = client.db("myDB").collection("test");
testCollection.insertOne(data);
});
});
And I would like to avoid calling client.connect() in each function invocation, which seems like far too much overhead.
I would like to do something like this:
const MongoClient = require("mongodb").MongoClient;
const client = await MongoClient.connect("myUrl", {
useNewUrlParser: true,
useUnifiedTopology: true,
});
const db = client.db("myDB");
exports.myHttpMethod = functions.region("europe-west1").runWith({
memory: "128MB",
timeoutSeconds: 20,
}).https.onCall((data, context) => {
console.log("Data is: ", data);
const testCollection = db.collection("test");
testCollection.insertOne(data);
});
But I can't use await like this at the top level.
In my AWS Lambda functions (written in Python) I don't have this issue and am able to connect only once per instance, so I guess there is an equivalent here, but I don't know much JS / Node.js.
You can store your database client as a global variable. From the documentation,
Cloud Functions often recycles the execution environment of a previous invocation. If you declare a variable in global scope, its value can be reused in subsequent invocations without having to be recomputed.
Try refactoring the code as shown below:
import * as functions from "firebase-functions";
import { MongoClient } from "mongodb";

let client: MongoClient | null;

const getClient = async () => {
  if (!client) {
    const mClient = new MongoClient("[MONGODB_URI]", {});
    client = await mClient.connect();
    functions.logger.log("Connected to MongoDB");
  } else {
    functions.logger.log("Using existing MongoDB connection");
  }
  functions.logger.log("Returning client");
  return client;
};

export const helloWorld = functions.https.onRequest(
  async (request, response) => {
    const db = (await getClient()).db("[DATABASE]");
    const result = await db.collection("[COLLECTION]").findOne({});
    response.send("Hello from Firebase!");
  }
);
This should reuse the connection for that instance.

Best practice running queries in Node.js with MongoDB driver 3.6?

The official documentation of the Node.js Driver version 3.6 contains the following example for the .find() method:
const { MongoClient } = require("mongodb");
// Replace the uri string with your MongoDB deployment's connection string.
const uri = "mongodb+srv://<user>:<password>#<cluster-url>?w=majority";
const client = new MongoClient(uri);
async function run() {
try {
await client.connect();
const database = client.db("sample_mflix");
const collection = database.collection("movies");
// query for movies that have a runtime less than 15 minutes
const query = { runtime: { $lt: 15 } };
const options = {
// sort returned documents in ascending order by title (A->Z)
sort: { title: 1 },
// Include only the `title` and `imdb` fields in each returned document
projection: { _id: 0, title: 1, imdb: 1 },
};
const cursor = collection.find(query, options);
// print a message if no documents were found
if ((await cursor.count()) === 0) {
console.log("No documents found!");
}
await cursor.forEach(console.dir);
} finally {
await client.close();
}
}
To me this somewhat implies that I would have to create a new connection for each DB request I make.
Is this correct? If not, what is the best practice for keeping the connection alive across various routes?
You can use mongoose to set up a connection with your database:
mongoose.connect('mongodb://localhost:27017/myapp', {useNewUrlParser: true});
Then you define the models you will use to communicate with your DB in your routes:
const MyModel = mongoose.model('Test', new Schema({ name: String }));
MyModel.findOne(function(error, result) { /* ... */ });
https://mongoosejs.com/docs/connections.html
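Since the question is specifically about keeping the connection alive across routes: mongoose maintains an internal connection pool, so the single connect() call at startup is reused by every request. A minimal sketch of that in an Express app (the app setup, route path, and Test model are illustrative assumptions):
const express = require('express');
const mongoose = require('mongoose');

// connect once at startup; mongoose pools and reuses this connection
mongoose.connect('mongodb://localhost:27017/myapp', { useNewUrlParser: true });

const MyModel = mongoose.model('Test', new mongoose.Schema({ name: String }));

const app = express();

// every request reuses the pooled connection; no reconnect per route
app.get('/tests/:name', async (req, res) => {
  try {
    const doc = await MyModel.findOne({ name: req.params.name });
    res.json(doc);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000);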
It's 2022 and I stumbled upon your post because I've been running into the same issue. All the tutorials and guides I've found so far have setups that require reconnecting in order to do anything with the database.
I found one solution from someone on GitHub that creates a class which creates, saves, and checks for an existing client connection, so it only creates a new client connection if one doesn't already exist:
const MongoClient = require('mongodb').MongoClient;

class MDB {
  static async getClient() {
    if (this.client) {
      return this.client;
    }
    this.client = await MongoClient.connect(this.url);
    return this.client;
  }
}

MDB.url = '<your_connection_url>';

app.get('/yourroute', async (req, res) => {
  try {
    const client = await MDB.getClient();
    const db = client.db('your_db');
    const collection = db.collection('your_collection');
    const results = await collection.find({}).toArray();
    res.json(results);
  } catch (error) {
    console.log('error:', error);
  }
});

Google Cloud Function returns vague 'Connection error' when trying to connect to Google CloudSQL (mysql) instance

I was trying out this codelab code snippet by Google. I wanted to alter it so that the Vision API metadata would be stored in a relational MySQL database using the Cloud SQL service, following their examples on how to connect Cloud Functions to Cloud SQL.
The code I ended up with deploys, but upon triggering the function (by uploading a new image) I get a vague 'connection error' in the logs with no further information. This is my code at the moment:
const vision = require('@google-cloud/vision');
const Storage = require('@google-cloud/storage');
// note: the original snippet uses mysql.createPool below but never requires
// the module; Google's Cloud Functions / Cloud SQL samples use promise-mysql
const mysql = require('promise-mysql');
const client = new vision.ImageAnnotatorClient();

const winston = require('winston');
const {LoggingWinston} = require('@google-cloud/logging-winston');
const loggingWinston = new LoggingWinston();
const logger = winston.createLogger({
  level: 'info',
  transports: [new winston.transports.Console(), loggingWinston],
});

const createUnixSocketPool = async (config) => {
  const dbSocketPath = "/cloudsql";
  return await mysql.createPool({
    user: 'root',
    password: 'mypassword',
    database: 'mydatabase',
    socketPath: `${dbSocketPath}/cs-03-282615:europe-west1:mydatabase`,
    ...config
  });
};

const createPool = async () => {
  const config = {
    connectionLimit: 5,
    connectTimeout: 10000,
    acquireTimeout: 10000,
    waitForConnections: true,
    queueLimit: 0,
  };
  return await createUnixSocketPool(config);
};

let pool;
const poolPromise = createPool()
  .then(async (pool) => {
    return pool;
  })
  .catch((err) => {
    logger.error(err);
    process.exit(1);
  });
exports.vision_analysis = async (event, context, pool) => {
  console.log(`Event: ${JSON.stringify(event)}`);
  const filename = event.name;
  const filebucket = event.bucket;
  console.log(`New picture uploaded ${filename} in ${filebucket}`);

  const request = {
    image: { source: { imageUri: `gs://${filebucket}/${filename}` } },
    features: [
      { type: 'LABEL_DETECTION' },
      { type: 'IMAGE_PROPERTIES' },
      { type: 'SAFE_SEARCH_DETECTION' }
    ]
  };

  // invoking the Vision API
  const [response] = await client.annotateImage(request);
  console.log(`Raw vision output for: ${filename}: ${JSON.stringify(response)}`);

  if (response.error === null) {
    // listing the labels found in the picture
    const labels = response.labelAnnotations
      .sort((ann1, ann2) => ann2.score - ann1.score)
      .map(ann => ann.description);
    console.log(`Labels: ${labels.join(', ')}`);

    // retrieving the dominant color of the picture
    const color = response.imagePropertiesAnnotation.dominantColors.colors
      .sort((c1, c2) => c2.score - c1.score)[0].color;
    const colorHex = decColorToHex(color.red, color.green, color.blue);
    console.log(`Colors: ${colorHex}`);

    // determining if the picture is safe to show
    const safeSearch = response.safeSearchAnnotation;
    const isSafe = ["adult", "spoof", "medical", "violence", "racy"].every(k =>
      !['LIKELY', 'VERY_LIKELY'].includes(safeSearch[k]));
    console.log(`Safe? ${isSafe}`);

    if (isSafe) {
      const pool = await poolPromise();
      const stmt = 'INSERT INTO sc_03_metadata_schema (labels, color, created) VALUES (?, ?, ?)';
      await pool.query(stmt, [labels, colorHex, NOW()]);
      console.log("Stored metadata in CloudSQL");
    }
  } else {
    throw new Error(`Vision API error: code ${response.error.code}, message: "${response.error.message}"`);
  }
};

function decColorToHex(r, g, b) {
  return '#' + Number(r).toString(16).padStart(2, '0') +
    Number(g).toString(16).padStart(2, '0') +
    Number(b).toString(16).padStart(2, '0');
}
The full error is shown in the attached screenshot (Error Logs).
I understand you wish to connect to your Cloud SQL MySQL instance via a Cloud Function. Referencing the public documentation, Connecting from Cloud Functions to Cloud SQL, it seems that "process.env.DB_SOCKET_PATH || " was not included when defining your dbSocketPath.
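In other words, the documentation's sample derives the socket path from an environment variable, with /cloudsql as the fallback. A minimal sketch of that pattern (the DB_* and CLOUD_SQL_CONNECTION_NAME environment variable names follow the docs' sample; treat them as placeholders):
const mysql = require('promise-mysql');

const createUnixSocketPool = async (config) => {
  // let DB_SOCKET_PATH override the default directory that
  // Cloud Functions mounts for Cloud SQL unix sockets
  const dbSocketPath = process.env.DB_SOCKET_PATH || '/cloudsql';
  return mysql.createPool({
    user: process.env.DB_USER,
    password: process.env.DB_PASS,
    database: process.env.DB_NAME,
    // CLOUD_SQL_CONNECTION_NAME looks like 'project:region:instance'
    socketPath: `${dbSocketPath}/${process.env.CLOUD_SQL_CONNECTION_NAME}`,
    ...config,
  });
};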
Regarding the vague error message, this is a known issue which is currently being investigated by the Cloud Functions specialists. I would suggest that you take a look at the Public Tracker, click the star icon to subscribe to the issue for further updates, and click the bell icon to be notified of updates via email. We do not have an ETA on the resolution of this issue at this time.

pool.request is not a function

I would like to set up my prepared statements with the mssql module. I created a query file for all user-related requests.
const db = require('../databaseManager.js');

module.exports = {
  getUserByName: async username => db(async pool => await pool.request()
    .input('username', dataTypes.VarChar, username)
    .query(`SELECT
              *
            FROM
              person
            WHERE
              username = @username;`))
};
This approach allows me to require the query file and access the database by executing the query that is needed:
const userQueries = require('../database/queries/users.js');
const userQueryResult = await userQueries.getUserByName(username); // call this somewhere in an async function
My database manager handles the database connection and executes the query:
const sql = require('mssql');
const config = require('../config/database.js');

const pool = new sql.ConnectionPool(config).connect();

module.exports = async request => {
  try {
    const result = await request(pool);
    return {
      result: result.recordSet,
      err: null
    };
  } catch (err) {
    return {
      result: null,
      err
    };
  }
};
When I run the code I get the following error
UnhandledPromiseRejectionWarning: TypeError: pool.request is not a function
Does someone know what is wrong with the code?
I think this happens because the pool is not initialized yet... but I used async/await to handle this...
Here is how I made your code work (I did some drastic simplifications):
const sql = require("mssql");
const { TYPES } = require("mssql");
const CONN = "";
(async () => {
const pool = new sql.ConnectionPool(CONN);
const poolConnect = pool.connect();
const getUserByName = async username => {
await poolConnect;
try {
const result = await pool.request()
.input("username", TYPES.VarChar, username)
.query(`SELECT
*
FROM
person
WHERE
username = #username;`);
return {
result: result.recordset,
err: null
};
} catch (err) {
return {
result: null,
err
};
}
};
console.log(await getUserByName("Timur"));
})();
In short, first read this.
You probably smiled when you saw that the PR was created just two months before your question and is still not reflected in the docs.
Basically, instead of:
const pool = new sql.ConnectionPool(config).connect();
you do this:
const pool = new sql.ConnectionPool(config);
const poolConnection = pool.connect();
// later, when you need the connection, wait for the promise to resolve
await poolConnection;

Return the specific contents from a field of data in the result-set of a select query

I have a select query that pulls a row of data from a largeObject stored in a PostgreSQL table. The particular piece of data in that field is an HTML file.
I can output the name of the field of data into the console, where it appears as 16543 (for 16543kB).
So, my burning question is: how can I return the actual contents (the HTML) so that I can subsequently export it as one object and send it to the browser?
I am using Node and Express; here's my source code so far:
var database = require('../database/postgresDB.js');
var pg = require('pg');

var html = {};

var connectionString = "postgres://dabladmin:dabldem@localhost:5432/dablpatient";
var client = new pg.Client(connectionString);
client.connect();

var query = client.query('SELECT * FROM htmlfiles WHERE id = 1', function(err, result){
  console.log(JSON.stringify(result));
  console.log(result.rows[0].htmlfile);
  html = result.rows[0].htmlfile;
  //return result.rows[0].htmlfile;
  //console.dir(html);
});

module.exports = html;
This cannot be done directly. You need to export a function which will return the promise.
Following is an idea of how it can be done. Note: The code is not tested.
// htmlfile.model.js
const promise = require('bluebird'); // or any other Promise/A+ compatible library;

const initOptions = {
  promiseLib: promise // overriding the default (ES6 Promise);
};

const pgp = require('pg-promise')(initOptions);

// Database connection details;
const cn = {
  host: 'localhost', // 'localhost' is the default;
  port: 5432, // 5432 is the default;
  database: 'myDatabase',
  user: 'myUser',
  password: 'myPassword'
};

const db = pgp(cn); // database instance;

const getHtml = id => db.oneOrNone('SELECT * FROM htmlfiles WHERE id = $1', id);

module.exports = getHtml;
Inside some.controller.js:
// the model exports the function itself, so require it directly
const getHtml = require('./htmlfile.model.js');

getHtml(1)
  .then(data => {
    if (data) {
      // record found
      res.send(data.htmlfile);
    } else {
      // record not found, do something else
    }
  })
  .catch(error => {
    // an error occurred
  });
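To show where res comes from, here is a sketch of that controller mounted in an Express route (the route path and app setup are assumptions for illustration):
const express = require('express');
const getHtml = require('./htmlfile.model.js');

const app = express();

// hypothetical route; the id comes from the URL instead of being hard-coded
app.get('/html/:id', (req, res) => {
  getHtml(req.params.id)
    .then(data => {
      if (data) {
        // send the stored HTML to the browser
        res.send(data.htmlfile);
      } else {
        res.status(404).send('Not found');
      }
    })
    .catch(error => res.status(500).send(error.message));
});

app.listen(3000);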
