While following this YouTube video to learn about Node.js, I got this error:
$ nodemon index.js
(node:18129) [DEP0096] DeprecationWarning: timers.unenroll() is deprecated. Please use clearTimeout instead.
The app server is still able to run. However, when I try to add a router to add an item as in the video, the request never goes through successfully. Actually, I would call it half-successful: I do see the data logged in the terminal, but it never makes it back to the database. As you can see, xxx shows up as the name and 40 as the price in the terminal after I queried
http://localhost:4000/products/add?name=xxx&price=40
in the browser, as in the video around 10:52. (I got it working after hard-coding the values in the index.js file. Why can't I query it the way the video did?) I am assuming timers.unenroll() is causing this, but when I Googled it, all I found was this one, and this one (on the https://nodejs.org/en/blog/release/v10.0.0/ page),
but I don't know what to do with either, and I can't find a solution. Please help.
$ node -v
v10.0.0
const express = require('express');
const cors = require('cors');
const mysql = require('mysql');
const app = express();
app.use(cors());
const SELECT_ALL_PRODUCT_QUERY = 'SELECT * FROM products';
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'genius',
  password: 'genius',
  database: 'react_sql'
});
connection.connect(err => {
  if (err) {
    return err;
  }
});
console.log(connection);
app.get('/', (req, res) => {
  res.send('go to /products to see the genius gyms')
});
app.get('/products/add', (req, res) => {
  const { name, price } = req.query;
  //console.log(name, price);
  const INSERT_PRODUCTS_QUERY = `INSERT INTO products(name, price) VALUES('${name}', ${price})`;
  connection.query(INSERT_PRODUCTS_QUERY, (err, results) => {
    if (err) {
      return res.send(err)
    } else {
      return res.send('successfully added product')
    }
  });
});
app.get('/products', (req, res) => {
  connection.query(SELECT_ALL_PRODUCT_QUERY, (err, results) => {
    if (err) {
      return res.send(err)
    } else {
      return res.json({
        data: results
      })
    }
  });
});
app.listen(4000, () => {
  console.log(`Genius gyms listening on port 4000`)
});
npm install mysql@2.16.0 --save
Just update the mysql library to the latest version.
The offender seems to be the mysql module at this spot in the code where you see a call to Timers.unenroll(sequence):
https://github.com/mysqljs/mysql/blob/61b173cbc3e207c5497c6c45c98a4871c01701f3/lib/protocol/Protocol.js#L325
Protocol.prototype._dequeue = function(sequence) {
  Timers.unenroll(sequence);
  // No point in advancing the queue, we are dead
  if (this._fatalError) {
    return;
  }
  this._queue.shift();
  var sequence = this._queue[0];
  if (!sequence) {
    this.emit('drain');
    return;
  }
  this._parser.resetPacketNumber();
  this._startSequence(sequence);
};
From various comments on the Node.js GitHub site, it sounds like the API has been deprecated but won't be removed any time soon. Hopefully, mysql will switch to a substitute at some point; you could inquire on their GitHub site if you want to know more. The specific issue has already been filed on the mysql site here: https://github.com/mysqljs/mysql/issues/2003 and the current response is that the mysql module does not yet support Node.js 10.x.
That presumably means you could roll back your Node version to 8.x and the issue would probably be gone, or just wait for a mysql version that directly supports Node.js 10.x.
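If you use nvm, for example, the rollback is quick (a sketch; nvm resolves 8 to the latest 8.x release):
$ nvm install 8    # install the latest Node 8.x
$ nvm use 8        # switch the current shell to it
$ node -v          # should now print a v8.x version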
Related
I was learning to build a weather app using Node (Express) + React. I successfully fetched weather data from the OpenWeather API.
However, I was using the OpenWeather API key directly in my React app, like this: const weatherURL = 'http://api.openweathermap.org/data/2.5/weather?q=london,uk&APPID=1234567qwerty';. Obviously this is not safe, as it exposes the API key to the client. I thought about storing the API key in a .env file, but according to [this answer][1], I should never store an API key in a .env file or .gitignore. The right way is to make a request to a backend API, have the backend make the API call, and send the data back. I could not figure out how to do that. Can anyone help?
Following is my Node.js code:
const express = require('express');
const cors = require('cors');
const app = express();
const SELECT_ALL_QUERY = 'SELECT * FROM `mySchema`.`myTable`;';
app.use(cors());
app.get('/', (req, res) => {
  res.send('go to /myTable to see content')
});
const pool = require('./awsPool');
pool.getConnection((err, connection) => {
  if (err) {
    return console.log('ERROR! ', err);
  }
  if (!connection) {
    return console.log('No connection was found');
  }
  app.get('/myTable', (req, res) => {
    console.log(connection);
    connection.query(SELECT_ALL_QUERY, (err, results) => {
      if (err) {
        return res.send(err)
      } else {
        return res.json({
          data: results
        })
      }
    });
  });
});
let port = process.env.PORT || 4000;
app.listen(port, () => {
  console.log(`App running on port ${port}`);
});
[1]: https://stackoverflow.com/a/57103663/8720421
What the linked answer suggests is to create a route in your Node/Express backend API that makes the call to the weather API for you, instead of the front end. That way the request and your API key are never exposed to the public when your front end makes a call.
The method for doing this is essentially the same as what you have done in React: make an HTTP request using a built-in or third-party library. This resource I just found has some information on how to do both.
The simplest pure HTTP request in Node looks like this:
const http = require('http')
const url = 'http://api.openweathermap.org/data/'
http.request(url, callback).end()

function callback (weatherResponse) {
  let jsonString = ''
  weatherResponse.on('data', chunk => {
    jsonString += chunk
  })
  weatherResponse.on('end', () => {
    // Now you have the complete response and can do whatever you want with it,
    // like return it to your user: `res.send(jsonString)`
    console.log(jsonString)
  })
}
Many people find it bulky to have to handle chunks and the whole asynchronous thing, so there are many popular npm modules, like https://www.npmjs.com/package/axios. (And here's a list of other contenders: https://github.com/request/request/issues/3143.)
Also, it is normal to store API keys in environment variables on the backend. It makes things easy if you ever dockerize your app, or just scale up to using two backend servers instead of one.
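For example, with the dotenv package (a minimal sketch; the variable name OPENWEATHER_API_KEY is just an illustration):
// .env (keep this file out of version control)
//   OPENWEATHER_API_KEY=your-key-here

// at the top of the server entry point
require('dotenv').config(); // loads .env into process.env
const apiKey = process.env.OPENWEATHER_API_KEY;
const url = `http://api.openweathermap.org/data/2.5/weather?q=london,uk&APPID=${apiKey}`;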
I found a solution based on @ippi's answer; add the following to the original code:
const request = require('request');
const url = 'http://api.openweathermap.org/data/2.5/weather?q=london,uk&APPID=1234567';

app.get('/weather', (req, res) => {
  request(url, (error, response, body) => {
    if (!error && response.statusCode == 200) {
      var info = JSON.parse(body)
      res.send(info);
    }
  })
})
The URL can be stored in a .env file and passed into the above code. The returned weather data can be viewed as JSON at http://localhost:4000/weather, and in React the weather data can then be fetched via this localhost URL.
EDIT: request is deprecated, so here is a solution using axios:
const axios = require('axios');

app.get('/weather', (req, res) => {
  axios.get(url)
    .then(response => { res.json(response.data) })
    .catch(error => {
      console.log(error);
    });
})
Use the Passport middleware for Node.js/Express. It provides a passport-headerapikey strategy with which you can create and authorize API keys: http://www.passportjs.org/packages/passport-headerapikey/
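A minimal sketch of wiring that strategy up, following the package's documented shape (the header choice and the findByApiKey lookup are illustrative assumptions, not part of the package):
const passport = require('passport');
const HeaderAPIKeyStrategy = require('passport-headerapikey').HeaderAPIKeyStrategy;

passport.use(new HeaderAPIKeyStrategy(
  { header: 'Authorization', prefix: 'Api-Key ' }, // where to read the key from
  false, // passReqToCallback
  (apikey, done) => {
    // findByApiKey is a hypothetical lookup against your own key store
    findByApiKey(apikey)
      .then(user => done(null, user || false))
      .catch(err => done(err));
  }
));

// then guard routes with it, e.g.:
// app.get('/weather', passport.authenticate('headerapikey', { session: false }), handler);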
I'm new to Electron and trying to build my first application, in which I need to connect to a SQL Server database for storing/retrieving data. I've installed this package (https://www.npmjs.com/package/mssql#connect-callback) and followed its instructions, but had no success with the connection. The weird part is that I also get no error or anything else showing in the console, so I'm totally lost. Any help would be much appreciated, thank you guys.
PS: I'm sure there's no problem with the database itself, since I can still connect to it with the same config settings below using a database client tool.
Below is the code I used for a simple connection test.
<script type="text/javascript">
  $(document).ready(function () {
    const electron = require('electron');
    const sql = require('mssql');
    const config = {
      user: 'ql*****',
      password: 'qlh****',
      server: '123.20.****',
      database: 'QLHS'
    };
    async () => {
      try {
        await sql.connect(config);
        const result = await sql.query`select * from DM_DONVI`;
        console.dir(result);
      } catch (err) {
        console.log(err);
      }
    };
  });
</script>
The link you provided is working; I tried the same. The error log can be seen in View -> Toggle Developer Tools. The issue is that you need to install mysql:
npm install mysql --save
Then the code works fine.
Thank you :D Actually, the thing that didn't work in my original post was the async part. Changing it to the following, and everything is fine now:
sql.connect(config, function (err) {
  if (err) console.log(err);
  var request = new sql.Request();
  request.query('select * from DM_DONVI', function (err, recordset) {
    if (err) {
      console.log("Something went wrong")
    } else {
      var result = JSON.stringify(recordset);
      console.log(recordset.recordsets[0]);
    }
  });
});
I have a client based on React that I bundle with webpack 2. But the moment I import/require const SpeechToTextV1 = require('watson-developer-cloud/speech-to-text/v1'); I run into trouble. After fixing it so that it no longer breaks the build, it still throws warnings like:
Module not found: Error: Can't resolve '../build/Release/validation' in '/Users/denbox/Desktop/schedulebot/web-interface/node_modules/websocket/lib'
# ./~/websocket/lib/Validation.js 9:21-59
# ./~/websocket/lib/WebSocketConnection.js
# ./~/websocket/lib/websocket.js
# ./~/websocket/index.js
# ./~/watson-developer-cloud/speech-to-text/recognize_stream.js
# ./~/watson-developer-cloud/speech-to-text/v1.js
# ./src/components/chat.jsx
# ./src/components/chat-page.js
# ./src/index.js
# multi (webpack)-dev-server/client?http://localhost:8080 ./src/index.js
Is it even possible to use the watson-developer-cloud Node SDK for the Speech to Text service on the client, or only directly on the Node.js server? Thank you.
The Watson Node.js SDK has growing compatibility for client-side usage, but it's not all the way there yet. However, for speech services, there is a separate SDK targeted at client-side usage: https://www.npmjs.com/package/watson-speech
I just added a Webpack example and confirmed that it works: https://github.com/watson-developer-cloud/speech-javascript-sdk/blob/master/examples/webpack.config.js
Update: I also went and added a Webpack example to the Node.js SDK. With the configuration there, it can build the entire library, and it actually works for a subset of the modules, as documented: https://github.com/watson-developer-cloud/node-sdk/tree/master/examples/webpack
Only in Node.js. The mechanism for using Speech to Text from the browser is to use WebSockets, but to do that you need a token, which requires a server-side request. Once you have the token, you can use the WebSockets interface.
With the answers above I found a solution to my problem, and it might help others who want to get started with the API:
import axios from 'axios';
import recognizeMicrophone from 'watson-speech/speech-to-text/recognize-microphone';

axios.get(`${BACKEND_ROOT_URL}/watsoncloud/stt/token`)
  .then((res) => {
    console.log('res:', res.data);
    const stream = recognizeMicrophone({
      token: res.data.token,
      continuous: false, // false = automatically stop transcription the first time a pause is detected
    });
    stream.setEncoding('utf8');
    stream.on('error', (err) => {
      console.log(err);
    });
    stream.on('data', (msg) => {
      console.log('message:', msg);
    });
  })
  .catch((err) => {
    console.log(`The following gUM error occured: ${err}`);
  });
In the backend I created a proxy service that gets a token for the Watson Speech to Text service, so I don't have to store my credentials on the client:
const watson = require('watson-developer-cloud');
const express = require('express');
const cors = require('cors');

const app = express();
const port = process.env.PORT || 3000; // port value assumed; the original snippet did not define it

app.use(cors());
const stt = new watson.SpeechToTextV1({
  // if left undefined, username and password fall back to the SPEECH_TO_TEXT_USERNAME and
  // SPEECH_TO_TEXT_PASSWORD environment properties, and then to VCAP_SERVICES (on Bluemix)
  username: process.env.STT_SERVICE_USER,
  password: process.env.STT_SERVICE_PW,
});
const authService = new watson.AuthorizationV1(stt.getCredentials());

// Endpoint to retrieve a Watson Speech to Text API token
// Get token using your credentials
app.get('/watsoncloud/stt/token', (req, res, next) => {
  // TODO check jwt at the auth service
  authService.getToken((err, token) => {
    if (err) {
      next(err);
    } else {
      res.send({ token });
    }
  });
});

app.listen(port, (err) => {
  if (err) {
    console.log(`Error: ${err}`);
  }
});
I'm using Node.js, Express, and PostgreSQL as my backend.
This is the approach I used to make a REST API:
exports.schema = function (inputs, res) {
  var query = knex('schema')
    .orderBy('sch_title', 'asc')
    .select();
  query.exec(function (err, schemas) {
    if (err) {
      var response = {
        message: 'Something went wrong when trying to fetch schemas',
        thrownErr: err
      };
      console.error(response);
      res.send(500, response);
      return;
    }
    if (schemas.length === 0) {
      var message = 'No schemas were found';
      console.error(message);
      res.send(400, message);
      return;
    }
    res.send(200, schemas);
  });
};
It works, but after a while Postgres logs an error and it no longer works:
sorry, too many clients already
Do I need to close each request somehow? I could not find anything about this in the Express docs. What could be wrong?
This error only occurs on the production server, not on my development machine.
Update
The app only breaks in one 'module'. The rest of the app works fine, so it's only some queries that give the error.
Just keep one connection open for your whole app. The docs show an example of how to do this.
This code goes in your app.js...
var Knex = require('knex');
Knex.knex = Knex.initialize({
  client: 'pg',
  connection: {
    // your connection config
  }
});
And when you want to query in your controllers/middlewares...
var knex = require('knex').knex;
exports.schema = function (req, res) {
  var query = knex('schema')
    .orderBy('sch_title', 'asc')
    .select();
  // more code...
};
If you place Knex.initialize inside an app.use or app.VERB handler, it gets called repeatedly for each request, so you end up connecting to PG multiple times. For most cases, you don't need to do an open+query+close for every HTTP request.
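Note that Knex.initialize comes from an old Knex release; current versions use a factory call and let you cap the connection pool, which is also the usual guard against 'too many clients' (a sketch with assumed connection details):
// db.js - created once at startup, then required everywhere else
const knex = require('knex')({
  client: 'pg',
  connection: process.env.DATABASE_URL, // assumed: connection string kept in an env var
  pool: { min: 0, max: 10 }             // caps concurrent PG connections
});

module.exports = knex;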
I am new to AngularJS and Node.js/Express. I am working on a small application that uses the Angular and Express frameworks. I have an Express app running with a couple of endpoints: one for a POST action and one for a GET action. I am using the node-mysql module to store data in and fetch it from a MySQL database.
This application is running on my laptop.
angular.js client:
controller
function ItemController($scope, storageService) {
  $scope.savedItems = storageService.savedItems();
  alert($scope.savedItems);
}
service
myApp.service('storageService', function($resource) {
  var Item = $resource('http://localhost\\:3000/item/:id',
    {
      id: '#id',
    },
    {
      query: {
        method: 'GET',
        isArray: true
      }
    });

  this.savedItems = function() {
    Item.query(function(data) {
      //alert(data);
      return data;
    });
  };
});
Express server with mysql database:
...
app.get('/item', item.list);
...
items.js
---------
var mysql = require('mysql');

var db = {
  connect: function() {
    var connection = mysql.createConnection({
      host: 'localhost',
      user: 'admin',
      database: 'test'
    });
    return connection;
  },
  query: function(sql) {
    var connection = this.connect();
    return connection.query(sql, function(err, results) {
      if (err) throw err;
      return results;
    });
  }
};

exports.list = function(req, res) {
  var sql = 'select * from item';
  var results = db.query(sql);
  res.send(results);
};
When I send a static array of items (JSON) from the server, $scope.savedItems gets populated,
but when I fetch items from the database, $scope.savedItems on the client stays empty even though the server is returning items. Using $http directly did not help either.
I read about the async nature of $resource and $http in the AngularJS documentation, and I am still missing something or doing something wrong.
Thanks in advance; I appreciate your help.
This has to do with the async nature of Angular's $resource.
$scope.savedItems = storageService.savedItems();
returns an empty array immediately, which is populated after the data comes back. Your alert($scope.savedItems); will therefore show only an empty array. If you looked at $scope.savedItems a little later, you would see that it has been populated. If you would like to use the data just after it has been returned, you can use a callback:
$scope.savedItems = storageService.savedItems(function(result) {alert(result); });
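Note that this only works if the service forwards the callback and returns the resource's result. In the service above, savedItems would need a small change along these lines (a sketch):
this.savedItems = function(callback) {
  // Item.query returns an empty array immediately and fills it in when
  // the GET completes; the optional callback fires at that point.
  return Item.query(callback);
};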
Just as a quick note: you could also watch savedItems.
function ItemController($scope, storageService) {
  $scope.savedItems = storageService.savedItems();
  $scope.$watch(function() {
    return $scope.savedItems;
  }, function(newValue, oldValue) {
    if (typeof newValue !== 'undefined') {
      // Do something cool
    }
  }, true);
}
I suspect Node is not returning the MySQL results. The fact that it works for static data and not for MySQL rules out issues with Angular. Can you add Firebug logs for the HTTP call, or Chrome developer tools logs? That could shed more light on the matter.
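For what it's worth, the usual fix for that suspicion is to send the response from inside the query callback rather than returning out of it (a sketch against the items.js code above; the db helper is the one defined there):
exports.list = function(req, res) {
  var connection = db.connect();
  connection.query('select * from item', function(err, results) {
    if (err) return res.send(err);
    res.send(results); // respond only after the async query has finished
  });
};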