I'm trying to build a Node.js/Express project that communicates between server and client with the gRPC protocol, in order to perform simple user CRUD actions and display the data on the front end.
Currently, the server side works fine with the mysql library and a MariaDB database.
As far as I can see in the console, the server side sends the data as expected, but I can't manage to retrieve it on the client side.
Here is the relevant code:
In the server.js file, server side:
server.addService(customersProto.CustomerService.service, {
  getAll: async (_, callback) => {
    const allUsers = await sql.getAll(db);
    callback(null, { allUsers });
    console.log(allUsers);
  },
});
The output of the console.log:
[{"id":"a68b823c-7ca6-44bc-b721-fb4d5312cafc","name":"David","age":33,"adress":"227th Baker Street"},{"id":"a68b823c-7ca6-44bc-b721-fb4d5312cahg","name":"Peter","age":99,"adress":"38th King Street"}]
In the index.js file, client side:
async function getAll() {
  return new Promise((resolve, reject) => {
    client.getAll(null, (err, data) => {
      if (!err) {
        console.log("function getAll: " + JSON.stringify(data));
        resolve(JSON.stringify(data));
      } else {
        reject(err);
      }
    });
  });
}
app.get("/", async(req, res) => {
const data = await getAll();
console.log(data);
res.render("customers", {
results: JSON.stringify(data),
});
});
The output of the console.log:
function getAll: {"customers":[]}
{"customers":[]}
I don't understand why the array is empty...
Do you have any idea how I can retrieve my data?
Never mind, I've found my problem.
Because the protocol buffer message expects a field named "customers", I hadn't realised that I couldn't give just any name to the variable in the callback of the getAll function (server side).
That solves the problem. In the server.js file, server side:
server.addService(customersProto.CustomerService.service, {
  getAll: async (_, callback) => {
    const customers = await sql.getAll(db);
    callback(null, { customers });
  },
});
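For context, the field name is dictated by the service's .proto definition, which presumably looks something like the sketch below; the message names and field numbers here are assumptions reconstructed from the output above, not the asker's actual file:

syntax = "proto3";

message Customer {
  string id = 1;
  string name = 2;
  int32 age = 3;
  string adress = 4;
}

message CustomerList {
  // The response field is named "customers", so the object passed to
  // callback() must use exactly that key: callback(null, { customers }).
  repeated Customer customers = 1;
}

message Empty {}

service CustomerService {
  rpc getAll (Empty) returns (CustomerList);
}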
Related
I have a Node.js API using Express that connects to an SQL Server, and I want to use it in an Angular project. I use two files, a route file and a controllers file. My route file is as follows:
module.exports = (app) => {
const UsrContrllr = require('../Controllers/users.controllers');
//1. GET ALL USERS
app.get('/api/users', UsrContrllr.func1);
//2. POST NEW USER
app.post('/api/user/new', UsrContrllr.func2);
};
And my controllers file is given below:
const mssql = require('mssql');
exports.func1 = (req, res) => {
  // Validate request
  console.log(`Fetching RESPONSE`);
  // create Request object
  var request = new mssql.Request();
  // query to the database and get the records
  const queryStr = `SELECT * FROM USERS`;
  request.query(queryStr, function (err, recordset) {
    if (err) console.log(err);
    else {
      if (recordset.recordset.toString() === '') {
        res.send('Oops!!! Required data not found...');
      } else {
        // send records as a response
        res.send(recordset);
      }
    }
  });
};
exports.func2 = (req, res) => {
  // Validate request
  console.log(`INSERTING RECORD ${req}`);
  // create Request object
  var request = new mssql.Request();
  // query to the database and get the records
  const queryStr = `INSERT INTO GDUSERS (USERCODE, PASSWORD, LANGUAGE, USERCLASS, FIRSTNAME, LASTNAME, CONTACTNO) VALUES ('${req.body.usercode}', '${req.body.password}', 'EN', '0', '${req.body.firstname}', '${req.body.lastname}', '${req.body.contactno}');`;
  request.query(queryStr, function (err, recordset) {
    if (err) console.log(err);
    else {
      if (recordset.recordset.toString() == '') {
        res.send('Oops!!! Required data not found...');
      } else {
        // Send records as response
        res.send(recordset);
      }
    }
  });
};
The GET request works well, but when I try to run the POST request directly from the Angular application, I get an error stating:
Cannot GET URL/api/user/new
The Angular code in my Angular project is:
signup() {
  let headers = new Headers({ 'Content-Type': 'application/json' });
  let options = new RequestOptions({ headers: headers });
  console.log(this.user); // User details come from a form
  this.http.post("URL", this.user, options)
    .subscribe(
      (err) => {
        if (err) console.log(err);
        console.log("Success");
      });
}
I'm not sure whether the Angular code I'm using is right or not, and I don't know where I'm going wrong. How exactly does one send an HTTP POST request from an Angular project?
This is the way I handled my user signup with http.post calls. My approach is slightly different when signing up a user because I am using a promise instead of an observable (which I normally use for my service calls), but I will show you both ways.
createUser(user: User): Promise<string> {
  const promise = new Promise<string>((resolve, reject) => {
    const userForPost = this.createUserForPost(user);
    this.http.post(environment.backendUrl + '/api/user/signup', userForPost, this.config)
      .toPromise<HttpConfig>()
      .then(createdUser => {
        // resolve the promise with the server's response here,
        // e.g. resolve(createdUser.message)
      })
      .catch(error => {
        console.log(error);
        reject(error);
      });
  });
  return promise;
}
Here is another example with an observable:
createForumPost(forumPost: ForumPost) {
  this.http.post<{ message: string, forumPostId: string }>(environment.backendUrl + '/api/forumPosts', forumPost)
    .subscribe((responseData) => {
      const id = responseData.forumPostId;
      forumPost.id = id;
    });
}
I defined my URL somewhere else and then just use environment.backendUrl + 'path' to define my path (the same as the path in your backend controller); a sketch of that environment file follows.
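For illustration only, a minimal sketch of such an environment file, assuming a standard Angular CLI layout (the path and URL below are assumptions, not the answerer's actual values):

// src/environments/environment.ts (path assumed)
export const environment = {
  production: false,
  // point this at the Express backend from the question
  backendUrl: 'http://localhost:3000'
};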
This is one of my first answers here on SO, so I am sorry if it is a bit messy.
I hope I was able to help with my examples :)
I have a Socket.IO instance in an Express app that listens to a React client's requests. A user can send private messages to a specific person. The server receives the private message and should dispatch it back to both sender and recipient thanks to the io.to(socketId).emit(content) method.
How do I listen to this event in React and update the message array? To ease the process, I have created a connectedUsers object, whose keys are MongoDB's user._id values and whose values are the unique socket IDs generated by Socket.IO. This way, I can easily address messages to specific persons on the client. Once sent, the messages are stored in a MongoDB database.
Here is the back-end. The point of interest is io.on("privateMessage"):
const connectedUsers = {};
const socketManager = (io) => {
io.on("identifyUser", (user) => {
if (!([user.id] in connectedUsers)) {
connectedUsers[user.id] = io.id;
}
});
io.on("privateMessage", (data) => {
io.to(connectedUsers[data.recipientId]).emit(data.message);
io.to(connectedUsers[data.senderId]).emit(data.message);
});
io.on("disconnect", () => console.log("user disconnected!"));
};
Here is the listening function in React. Everything works but the "privateMessage" part.
async function getUser(socketId) {
try {
const res = await ax.get(`${serverUrl}/login`);
const socket = io(serverUrl);
socketId.current = socket;
socket.on("connect", () => {
socket.emit("identifyUser", { id: res.data._id });
socket.on("privateMessage", (data) =>
console.log("private message received!", data)
);
});
} catch (err) {
throw new Error(err);
}
}
Thanks for your help!
I think you need to put the socket.on("privateMessage") part outside the socket.on("connect") scope.
React must register all of its listeners up front, when the socket is first created.
The backend side must be responsible for the authorization.
Note that connection is the server-side event; the client-side socket emits connect.
The subscription to the privateMessage event should be outside the connect callback, so it is registered regardless of when the socket connects.
This code should work. Hope this helps:
import io from 'socket.io-client'

async function getUser(socketId) {
  try {
    const res = await ax.get(`${serverUrl}/login`);
    const socket = io(serverUrl);
    socketId.current = socket;
    socket.on("connect", () => {
      socket.emit("identifyUser", { id: res.data._id });
    });
    socket.on("privateMessage", (data) =>
      console.log("private message received!", data)
    );
  } catch (err) {
    throw new Error(err);
  }
}
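For completeness, the server code from the question likely needs a matching restructuring before this client can receive anything: per-socket listeners belong inside io.on("connection", ...), the lookup table should store socket.id (io.id does not exist), and emit() takes an event name as its first argument. A sketch under those assumptions, reusing the question's connectedUsers map:

const connectedUsers = {};

const socketManager = (io) => {
  io.on("connection", (socket) => {
    // register per-socket listeners inside the connection handler
    socket.on("identifyUser", (user) => {
      if (!(user.id in connectedUsers)) {
        connectedUsers[user.id] = socket.id; // socket.id, not io.id
      }
    });

    socket.on("privateMessage", (data) => {
      // the first argument to emit() is the event name the client listens for
      io.to(connectedUsers[data.recipientId]).emit("privateMessage", data.message);
      io.to(connectedUsers[data.senderId]).emit("privateMessage", data.message);
    });

    socket.on("disconnect", () => console.log("user disconnected!"));
  });
};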
I'm developing a simple app with Node/Hapi/MongoDB, but running into a strange issue. Below is the route that handles adding/updating scores. When I send some data to this endpoint through Insomnia/Postman, it works as expected. However, when the POST comes from a different app, I get strange results: the value is always null for every field. (Again, this only happens when the POST comes from another site; I've logged the request payload and can see that the data is correct, but it becomes null when assigned to an object or used in a query.)
server.route({
  method: 'POST',
  path: '/update-score',
  handler: (request, h) => {
    var scores = db.collection('scores');
    var updateScore = new Promise((resp, rej) => {
      console.log('payload ', request.payload);
      scores.findOneAndUpdate(
        { customerID: request.payload.customerID },
        { $set: {
            customerID: request.payload.customerID,
            customerName: request.payload.customerName,
            highScore: request.payload.highScore
        } },
        { upsert: true },
        (err, res) => {
          if (err) {
            return rej(err);
          } else {
            return resp(res);
          }
        }
      );
    });
    return updateScore;
  }
});
The console logs the request payload correctly, but it's null/undefined when the query tries to use it. I have also tried creating the two objects outside the Mongo method call (like below); after console logging these pre-defined objects, the values were null there as well, even though I can console.log the request.payload after defining these objects and the data is good.
server.route({
  method: 'POST',
  path: '/update-score',
  handler: (request, h) => {
    var scores = db.collection('scores');
    var queryObj = {
      customerID: request.payload.customerID
    };
    var updateObj = {
      $set: {
        customerName: request.payload.customerName,
        highScore: request.payload.highScore
      }
    };
    var updateScore = new Promise((resp, rej) => {
      console.log('again ', request.payload);
      scores.findOneAndUpdate(queryObj, updateObj, { upsert: true }, (err, res) => {
        if (err) {
          return rej(err);
        } else {
          return resp(res);
        }
      });
    });
    return updateScore;
  }
});
Logging queryObj and updateObj shows the values are all null, even though I can log request.payload and see the data correctly. Why can't I use the request.payload values anywhere?
Long story short: Insomnia/Postman sends an object as the POST body, but I was JSON-encoding the POST from the app; I just needed to parse it on the server!
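A minimal sketch of that fix applied to the route above, assuming the payload may arrive either as an object or as a JSON string, and using the MongoDB driver's promise form of findOneAndUpdate:

server.route({
  method: 'POST',
  path: '/update-score',
  handler: (request, h) => {
    // When the client JSON-encodes the body (e.g. sends it with a
    // text/plain content type), request.payload arrives as a string
    // and must be parsed before its fields can be used.
    const payload = typeof request.payload === 'string'
      ? JSON.parse(request.payload)
      : request.payload;

    const scores = db.collection('scores');
    // Without a callback, findOneAndUpdate returns a promise,
    // which a Hapi handler can return directly.
    return scores.findOneAndUpdate(
      { customerID: payload.customerID },
      { $set: {
          customerID: payload.customerID,
          customerName: payload.customerName,
          highScore: payload.highScore
      } },
      { upsert: true }
    );
  }
});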
I've been using node-oracledb for a few months and I've managed to achieve what I've needed to so far.
I'm currently working on a search app that could potentially return about 2m rows of data from a single call. To ensure I don't get a disconnect between the browser and the server, I thought I would try queryStream so that there is a constant flow of data back to the client.
I implemented the queryStream example as-is, and this worked fine for a few hundred thousand rows. However, when the number of returned rows is greater than one million, Node runs out of memory. By logging and watching both client and server log events, I can see that the client is way behind the server in terms of rows sent and received. So it looks like Node is falling over because it's buffering so much data.
It's worth noting that at this point, my selectstream implementation is within a req/res function called via Express.
To return the data, I do something like this:
stream.on('data', function (data) {
  rowcount++;
  let obj = new myObjectConstructor(data);
  res.write(JSON.stringify(obj.getJson()));
});
I've been reading about how streams and pipe can help with flow, so what I'd like to do is pipe the results from the query in order to a) help with flow and b) pass the results through other functions before sending them back to the client.
E.g.:
function getData(req, res) {
  var stream = myQueryStream(connection, query);
  stream
    .pipe(toSomeOtherFunction)
    .pipe(yetAnotherFunction)
    .pipe(res);
}
I've spent a few hours trying to find a solution or an example that allows me to pipe results, but I'm stuck and need some help.
Apologies if I'm missing something obvious, but I'm still getting to grips with Node and especially streams.
Thanks in advance.
There's a bit of an impedance mismatch here. The queryStream API emits rows as JavaScript objects, but what you want to stream to the client is a JSON array. You basically have to add an open bracket at the beginning, a comma between rows, and a close bracket at the end.
I'll show you how to do this in a controller that uses the driver directly as you have done, instead of using separate database modules as I advocate in this series.
const oracledb = require('oracledb');

async function get(req, res, next) {
  try {
    const conn = await oracledb.getConnection();
    const stream = await conn.queryStream('select * from employees', [], { outFormat: oracledb.OBJECT });

    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.write('[');

    let firstRow = true;

    stream.on('data', (row) => {
      // write the comma before every row except the first,
      // so the array doesn't end with a trailing comma
      if (!firstRow) {
        res.write(',');
      }
      firstRow = false;
      res.write(JSON.stringify(row));
    });

    stream.on('end', () => {
      res.end(']');
    });

    stream.on('close', async () => {
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });

    stream.on('error', async (err) => {
      next(err);
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });
  } catch (err) {
    next(err);
  }
}

module.exports.get = get;
Once you get the concepts, you can simplify things a bit with a reusable Transform class which allows you to use pipe in the controller logic:
const oracledb = require('oracledb');
const { Transform } = require('stream');

class ToJSONArray extends Transform {
  constructor() {
    super({ objectMode: true });
    this.push('[');
  }

  _transform(row, encoding, callback) {
    if (this._prevRow) {
      this.push(JSON.stringify(this._prevRow));
      this.push(',');
    }
    this._prevRow = row;
    callback(null);
  }

  _flush(done) {
    if (this._prevRow) {
      this.push(JSON.stringify(this._prevRow));
    }
    this.push(']');
    delete this._prevRow;
    done();
  }
}
async function get(req, res, next) {
  try {
    const toJSONArray = new ToJSONArray();
    const conn = await oracledb.getConnection();
    const stream = await conn.queryStream('select * from employees', [], { outFormat: oracledb.OBJECT });

    res.writeHead(200, { 'Content-Type': 'application/json' });

    stream.pipe(toJSONArray).pipe(res);

    stream.on('close', async () => {
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });

    stream.on('error', async (err) => {
      next(err);
      try {
        await conn.close();
      } catch (err) {
        console.log(err);
      }
    });
  } catch (err) {
    next(err);
  }
}

module.exports.get = get;
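For reference, wiring either version of this controller into an Express app could look something like the sketch below; the module path and route are assumptions for illustration:

const express = require('express');
// path to the controller module above is an assumption
const employees = require('./controllers/employees.js');

const app = express();

// each request streams the employees table back as a single JSON array
app.get('/api/employees', employees.get);

app.listen(3000, () => console.log('listening on port 3000'));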
Rather than writing your own logic to create a JSON stream, you can use JSONStream to convert an object stream to (stringified) JSON before piping it to its destination (res, process.stdout, etc.). This saves the need to muck around with .on('data', ...) events.
In the example below, I've used pipeline from Node's stream module rather than the .pipe method: the effect is similar (with better error handling, I think). To get objects from oracledb.queryStream, you can specify the option {outFormat: oracledb.OUT_FORMAT_OBJECT} (docs). Then you can make arbitrary modifications to the stream of objects produced. This can be done using a transform stream, made perhaps using through2-map, or, if you need to drop or split rows, through2. Below, the stream is sent to process.stdout after being stringified as JSON, but you could equally send it to Express's res.
require('dotenv').config() // config from .env file
const JSONStream = require('JSONStream')
const oracledb = require('oracledb')
const { pipeline } = require('stream')
const map = require('through2-map') // see https://www.npmjs.com/package/through2-map
oracledb.getConnection({
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
connectString: process.env.CONNECT_STRING
}).then(connection => {
pipeline(
connection.queryStream(`
select dual.*,'test' as col1 from dual
union select dual.*, :someboundvalue as col1 from dual
`
,{"someboundvalue":"test5"} // binds
,{
prefetchRows: 150, // for tuning
fetchArraySize: 150, // for tuning
outFormat: oracledb.OUT_FORMAT_OBJECT
}
)
,map.obj((row,index) => {
row.arbitraryModification = index
return row
})
,JSONStream.stringify() // false gives ndjson
,process.stdout // or send to express's res
,(err) => { if(err) console.error(err) }
)
})
// [
// {"DUMMY":"X","COL1":"test","arbitraryModification":0}
// ,
// {"DUMMY":"X","COL1":"test5","arbitraryModification":1}
// ]
I want to use gridfs-stream in a Node.js application.
A simple example is given in the documentation:
var mongoose = require('mongoose');
var Grid = require('gridfs-stream');
Grid.mongo = mongoose.mongo;
mongoose.connect('mongodb://localhost:27017/test');
// make sure the db instance is open before passing into `Grid`
mongoose.connection.once('open', function () {
var gfs = Grid(mongoose.connection);
// all set!
})
My problem is described by the comment:
make sure the db instance is open before passing into Grid
I'm trying to use gfs in a POST request, but when the code gets initialized, the gfs variable is not defined yet.
api.post('/upload', function(req, res) {
req.pipe(gfs.createWriteStream({
filename: 'test'
}).on('close', function(savedFile){
console.log('file saved', savedFile);
return res.json({file: savedFile});
}));
})
Initializing my route from a callback seems kind of odd.
I read in this post (Asynchronous initialization of Node.js module) that require('') is performed synchronously, and since I rely on the connection being established, I'm kind of forced to wait.
Basically, I'm not sure if I should use an async pattern on startup now, or if I'm just missing a more elegant way to solve this.
I have a very similar problem with my server. In my case, I am reading HTTPS certs asynchronously and the software version from git asynchronously, and I want to make sure I have it all together by the time the user comes to log in, so I can pass the software version back as a reply to the login.
The solution is to use promises. Create the promises on server startup, one for each activity. Then, in the code where you want to be sure it's all ready, just call .then() on either the promise itself or on Promise.all([array of promises]).
Here is an example of what I am doing to read the SSL certs to start the server:
const fs = require('fs');
const path = require('path');

class Web {
  constructor(manager, logger) {
    var self = this;
    this.server = false;
    this.logger = logger;
    var key = new Promise((resolve, reject) => {
      fs.readFile(path.resolve(__dirname, 'key.pem'), (err, data) => {
        if (err) {
          reject(err);
        } else {
          resolve(data);
        }
      });
    });
    var cert = new Promise((resolve, reject) => {
      fs.readFile(path.resolve(__dirname, 'certificate.pem'), (err, data) => {
        if (err) {
          reject(err);
        } else {
          resolve(data);
        }
      });
    });
    Promise.all([key, cert]).then(values => {
      var certs = {
        key: values[0],
        cert: values[1],
      };
      return certs;
    }).then(certs => {
      // createSecureServer (rather than createServer) so the key/cert are actually used
      self.server = require('http2').createSecureServer(certs, (req, res) => {
        // NOW started and can do the rest of the stuff
      });
      self.server.listen(...);
    });
  }
}
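Applied to the gridfs-stream case from the question, the same promise pattern might look like the sketch below, assuming the same mongoose/Grid setup and the api router from the question:

const mongoose = require('mongoose');
const Grid = require('gridfs-stream');
Grid.mongo = mongoose.mongo;

mongoose.connect('mongodb://localhost:27017/test');

// Wrap the one-time 'open' event in a promise so routes can wait on it.
const gfsReady = new Promise((resolve, reject) => {
  mongoose.connection.once('open', () => resolve(Grid(mongoose.connection)));
  mongoose.connection.on('error', reject);
});

api.post('/upload', function (req, res) {
  // The route is registered immediately, but the handler waits for gfs.
  gfsReady.then(function (gfs) {
    req.pipe(gfs.createWriteStream({ filename: 'test' })
      .on('close', function (savedFile) {
        console.log('file saved', savedFile);
        return res.json({ file: savedFile });
      }));
  }).catch(function (err) {
    res.status(500).json({ error: err.message });
  });
});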