NodeJS SOAP server - node.js

I am new to SOAP and I need to develop a very simple web service in Node.js using SOAP for a college assignment. I have been studying how to do it, and the hardest part for me is creating the WSDL and wiring up the soap library to make it work.
I couldn't find any tutorial on creating the WSDL, or on building a whole SOAP web service in Node, so what I need is to learn how to do these things.
My task is pretty simple: I just need a method that returns the top ten students from my MySQL database. Does anyone know a tutorial or other material that can help me? I already tried to generate the WSDL using Java, but it didn't work, and I have already written some code for the SOAP method that needs to be executed.
My server.js:
var express = require('express');
var soap = require('soap');
var mysql = require('mysql');
var xml = require('fs').readFileSync('topten.wsdl', 'utf8');

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: ''
});

var TopTenService = require('./src/services/TopTen')(connection);

var server = express();
server.listen(5488, () => {
  soap.listen(server, '/wsdl', TopTenService, xml);
  console.log('Server started!');
});
My TopTen.js:
module.exports = (connection) => {
  var topTen = {
    TopTen: {
      TopTenPort: {
        // node-soap handlers receive the call arguments and a callback
        getTopTen: (args, callback) => {
          connection.query('SELECT * FROM `alunos` ORDER BY acertos DESC LIMIT 10', (error, results, fields) => {
            if (error) {
              console.log(error);
              return callback(error);
            }
            callback({ alunos: results });
          });
        }
      }
    }
  };
  return topTen;
};
In server.js, the line that reads topten.wsdl fails because I don't have that file yet. I really need some guidance here. Please help!
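For reference, node-soap matches the keys of the service object (TopTen → TopTenPort → getTopTen) against the service, port, and operation names declared in the WSDL. A minimal hand-written WSDL for that layout could look roughly like this sketch; the target namespace, the message types (here just loose placeholders), and the address are assumptions to adapt:

```xml
<definitions name="TopTen"
    targetNamespace="http://example.com/topten"
    xmlns="http://schemas.xmlsoap.org/wsdl/"
    xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
    xmlns:tns="http://example.com/topten"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <types>
    <xsd:schema targetNamespace="http://example.com/topten">
      <!-- placeholder types; replace with the real request/response shape -->
      <xsd:element name="getTopTenRequest" type="xsd:anyType"/>
      <xsd:element name="getTopTenResponse" type="xsd:anyType"/>
    </xsd:schema>
  </types>
  <message name="getTopTenInput">
    <part name="params" element="tns:getTopTenRequest"/>
  </message>
  <message name="getTopTenOutput">
    <part name="result" element="tns:getTopTenResponse"/>
  </message>
  <portType name="TopTenPortType">
    <operation name="getTopTen">
      <input message="tns:getTopTenInput"/>
      <output message="tns:getTopTenOutput"/>
    </operation>
  </portType>
  <binding name="TopTenBinding" type="tns:TopTenPortType">
    <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/>
    <operation name="getTopTen">
      <soap:operation soapAction="getTopTen"/>
      <input><soap:body use="literal"/></input>
      <output><soap:body use="literal"/></output>
    </operation>
  </binding>
  <!-- service/port names must match the keys of the service object -->
  <service name="TopTen">
    <port name="TopTenPort" binding="tns:TopTenBinding">
      <soap:address location="http://localhost:5488/wsdl"/>
    </port>
  </service>
</definitions>
```

Saved as topten.wsdl next to server.js, this is enough for soap.listen to start serving the WSDL at http://localhost:5488/wsdl?wsdl.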

node-soap service that uses database (dependency issues)

First of all, this is one of my first projects in Node.js, so I'm very new to it.
The project I want to build is a SOAP (I know, SOAP... backwards compatibility, huh?) interface that connects to an Oracle database.
So I have a WSDL describing what these functions look like (validation for addresses and stuff), and I have a connection to the database.
Now, with the SOAP npm module, you create a server and listen with a service object that responds to requests. I have a separate file containing my SOAP service, but this service needs to query the database to produce its results.
How would I go about 'injecting' my database service into my SOAP service, so that whenever a SOAP call comes in, it is dispatched to the correct method of my database service?
This is what my code looks like:
databaseconnection.js
var oracledb = require('oracledb');
var dbConfig = require('../../config/development');

var setup = exports.setup = (callback) => {
  oracledb.createPool(
    {
      user          : dbConfig.user,
      password      : dbConfig.password,
      connectString : dbConfig.connectString
    },
    function (err, pool) {
      if (err) { console.error(err.message); return; }
      pool.getConnection(
        function (err, connection) {
          if (err) {
            console.error(err.message);
            return callback(null);
          }
          return callback(connection);
        }
      );
    }
  );
};
databaseservice.js
var DatabaseService = function (connection) {
  this.database = connection;
};

function doSomething(callback) {
  if (!this.database) { console.log('Database not available.'); return; }
  this.database.execute('SELECT * FROM HELP', function (err, result) {
    callback(result);
  });
}

module.exports = {
  DatabaseService: DatabaseService,
  doSomething: doSomething
};
soapservice.js
var myService = {
  CVService: {
    CVServicePort: {
      countryvalidation: function (args, cb, soapHeader) {
        console.log('Validating Country');
        cb({
          name: args
        });
      }
    }
  }
};
server.js
app.use(bodyParser.raw({ type: function () { return true; }, limit: '5mb' }));

app.listen(8001, function () {
  databaseconnection.setup((callback) => {
    var temp = databaseservice.DatabaseService(callback);
    soapservice.Init(temp);
    var server = soap.listen(app, '/soapapi/*', soapservice.myService, xml);
    databaseservice.doSomething((result) => {
      console.log(result.rows.length, ' results.');
    });
  });
  console.log('Server started');
});
How would I go about adding the databaseservice.doSomething() to the countryvalidation soap method instead of 'name: args'?
Also: I feel like the structure of my code is very, very messy. I tried finding some good examples on how to structure the code online but as for services and database connections + combining them, I didn't find much. Any comments on this structure are very welcome. I'm here to learn, after all.
Thank you
Dieter
The first thing I see that looks a little off is databaseconnection.js. It should be creating the pool, but that's it. Generally speaking, a connection should be obtained from the pool when a request comes in and released when you're done using it to service that request.
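As a rough sketch of that per-request pattern (using the callback-style node-oracledb API; the SQL, the handler name, and the response shape are placeholders, and `close` is `release` in older driver versions):

```javascript
// Sketch: acquire a connection per request, release it on every code path.
// `pool` is an oracledb pool created once at startup.
function countryvalidation(pool, args, cb) {
  pool.getConnection(function (err, connection) {
    if (err) return cb({ error: err.message });
    connection.execute('SELECT * FROM HELP', [], function (err, result) {
      connection.close(function () {}); // give the connection back to the pool
      if (err) return cb({ error: err.message });
      cb({ rows: result.rows });
    });
  });
}
```

The key point is that the pool outlives the request, but the connection does not.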
Have a look at this post: https://jsao.io/2015/02/real-time-data-with-node-js-socket-io-and-oracle-database/ There are some sample apps you could have a look at that might help. Between the two demos, the "employees-cqn-demo" app is better organized.
Keep in mind that the post is a little dated now, we've made enhancements to the driver that make it easier to use now. It's on my list to do a post on how to build a RESTful API with Node.js and Oracle Database but I haven't had a chance to do it yet.
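To answer the injection part directly: one common approach (sketched here; `createSoapService` is a hypothetical name) is to export a factory that builds the node-soap service object around whatever database service you pass in, so each SOAP method closes over that dependency instead of reaching for a global:

```javascript
// soapservice.js sketch: build the service object from an injected
// database service; doSomething is the example method from
// databaseservice.js above, and the response shape is a placeholder.
function createSoapService(databaseService) {
  return {
    CVService: {
      CVServicePort: {
        countryvalidation: function (args, cb, soapHeader) {
          databaseService.doSomething(function (result) {
            cb({ rows: result ? result.rows : [] });
          });
        }
      }
    }
  };
}
```

server.js would then pass `createSoapService(temp)` to soap.listen instead of referencing a fixed soapservice.myService object.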

IBM Bluemix IotFoundation : Iotfclient is offline. Retrying connection

I am trying to follow the IBM Bluemix course on Coursera.
My setup: a Raspberry Pi as a device (client), connected as a registered client to the Watson IoT Platform. It emits a continuous stream of random numbers, one per second.
I have deployed my custom Node.js app (the code is available on Coursera) on IBM Bluemix:
var express = require('express');
var app = express();
var Client = require('ibmiotf');
var appConfig;

var serverPort = process.env.VCAP_APP_PORT || 3000;
var serverHost = process.env.VCAP_APP_HOST || 'localhost';

if (process.env.VCAP_SERVICES) {
  var env = JSON.parse(process.env.VCAP_SERVICES);
  appConfig = {
    'org'        : env["iotf-service"][0].credentials.org,
    'id'         : 'dna-nodeserver',
    'auth-key'   : env["iotf-service"][0].credentials.apiKey,
    'auth-token' : env["iotf-service"][0].credentials.apiToken
  };
} else {
  appConfig = require('./application.json');
}

var responseString = 'Hello Coursera';
var appClient = new Client.IotfApplication(appConfig);

app.get('/', function (req, res) {
  res.send(responseString);
});

var server = app.listen(serverPort, serverHost, function () {
  var host = server.address().address;
  var port = server.address().port;
  console.log('Listening at http://%s:%s', host, port);

  appClient.connect();
  appClient.on('connect', function () {
    appClient.subscribeToDeviceEvents('raspberrypi');
  });
  appClient.on('error', function (err) {
    console.log('Error : ' + err);
  });
  appClient.on('deviceEvent', function (deviceType, deviceId, eventType, format, payload) {
    responseString = 'Device Event from :: ' + deviceType + ' : ' + deviceId + ' of event ' + eventType + ' with payload : ' + payload;
    console.log(responseString);
  });
});
The problem I am facing is the error from the question title: the Iotfclient is reported offline and keeps retrying the connection (I had attached a screenshot of the error here).
Also, since I am receiving continuous events from the Raspberry Pi, the web page (served by res.send(responseString)) should show the changes automatically, without me having to refresh it manually, but this does not seem to happen.
Any help would be appreciated. Thanks!
You probably have multiple applications listening to events. Stop the previous Node-RED application on Bluemix and run only the Node.js app shown above.
Old thread, but thanks to Alexei for pointing me in the right direction.
In my case I saw the same behaviour when testing a local version of an event broker while the production version was still running in IBM Cloud.
You can have multiple listeners this way if you generate an additional API key, save both sets of credentials, and check a suitable environment variable to decide which set to apply.
In my app, I wrap these in a function in Node:
function getEventBrokerCredentials(siteCode) {
  var codeToCredentials = {
    'REMOTE_BROKER': {
      'orgId': 'abcde',
      'id': 'ThisIsABroker',
      'apikey': '111111111111111',
      'authtoken': '2222222222222'
    },
    'LOCAL_BROKER': {
      'orgId': 'abcde',
      'id': 'ThisIsALocalBroker',
      'apikey': '3333333333333333',
      'authtoken': '4444444444444'
    }
  };
  return codeToCredentials[siteCode];
}
var brokerCredentials = getEventBrokerCredentials(process.env.BROKER_HOST || 'REMOTE_BROKER');
var appClient = new IOTClient.IotfApplication({
  'org': brokerCredentials.orgId,
  'id': brokerCredentials.id,
  'auth-key': brokerCredentials.apikey,
  'auth-token': brokerCredentials.authtoken
});
and then use
BROKER_HOST=LOCAL_BROKER nodemon
for local testing and development.
There are many variations on this theme of course.

Ionic Cordova FileTransfer upload options or How to set value to be available on nodejs express middleware using req.value?

I am new to Ionic. I am trying to upload a file to a server. Basically, what I need is to push a JSON object that gets sent along with FileTransfer.upload, and then to be able to recover this object on the server side, in an Express middleware, from the request:
req.value = ${value_sent_by_ionic_client_upload};
I am currently setting the object as a params entry, and I can see it in the FileUploadOptions, but the object is not accessible on the server side as a value of the request.
Current client side:
var options = new FileUploadOptions();
options.fileName = fileURL.substr(fileURL.lastIndexOf('/') + 1);

var params = {};
params.user = StorageService.getUser();
options.params = params;

var ft = new FileTransfer();
ft.upload(fileURL,
          encodeURI("http://192.168.192.62:3000/api/meals/picture"),
          pictureUploaded,
          uploadError,
          options);
In the Express middleware on the server side:
var user = req.user;
but user is undefined on the server side.
How do I pass the user with cordova FileTransfer.upload so that it is available from a req.user call?
This is a bit late, but maybe it can help someone looking for upload code.
By the way, I don't think Cloudinary provides any option to send users.
Here you're using an upload made with jQuery while you're using AngularJS for Ionic, which is not really optimal; I might suggest code like this:
$scope.uploadimage = function () {
  var options = new FileUploadOptions();
  options.fileKey = "image";
  $cordovaFileTransfer.upload('Link-to-your-server', $scope.yourpicture, options).then(function (result) {
    console.log("File upload complete");
    console.log(result);
    $scope.uploadResults = "Upload completed successfully";
  }, function (err) {
    console.log("File upload error");
    console.log(err);
    $scope.uploadResults = "Upload failed";
  }, function (progress) {
    // constant progress updates
    console.log(progress);
  });
};
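On the original req.user question: FileTransfer sends the upload as multipart/form-data, and each entry in options.params arrives as a plain text form field, so an object set as params.user is flattened to a string. One approach (a sketch; the helper names are hypothetical, and it assumes a multipart parser such as multer has already populated req.body from the form fields) is to JSON.stringify the user on the client and parse it back in a small middleware:

```javascript
// Hypothetical helper: multipart form fields are strings, so the client
// would send params.user = JSON.stringify(StorageService.getUser())
// and the server parses it back; returns null on missing/invalid input.
function parseUserField(body) {
  if (!body || typeof body.user !== 'string') return null;
  try {
    return JSON.parse(body.user);
  } catch (e) {
    return null;
  }
}

// Express middleware sketch: attach the parsed user so later handlers
// can read req.user, as the question asks.
function attachUser(req, res, next) {
  req.user = parseUserField(req.body);
  next();
}
```

Mounted after the multipart parser on the upload route, this makes `var user = req.user;` work in the downstream middleware.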

Express keeping connection open?

I'm using node js, express and postgresql as backend.
This is the approach I used to make a rest API:
exports.schema = function (inputs, res) {
  var query = knex('schema')
    .orderBy('sch_title', 'asc')
    .select();

  query.exec(function (err, schemas) {
    if (err) {
      var response = {
        message: 'Something went wrong when trying to fetch schemas',
        thrownErr: err
      };
      console.error(response);
      res.send(500, response);
      return;
    }
    if (schemas.length === 0) {
      var message = 'No schemas were found';
      console.error(message);
      res.send(400, message);
      return;
    }
    res.send(200, schemas);
  });
};
It works, but after a while Postgres logs an error and the endpoint stops working:
sorry, too many clients already
Do I need to close each request somehow? I could not find anything about this in the Express docs. What could be wrong?
This error only occurs on the production server, not on my development machine.
Update
The app only breaks in one 'module'. The rest of the app works fine, so only some queries trigger the error.
Just keep one connection open for your whole app. The docs show an example of how to do this.
This code goes in your app.js...
var Knex = require('knex');

Knex.knex = Knex.initialize({
  client: 'pg',
  connection: {
    // your connection config
  }
});
And when you want to query in your controllers/middlewares...
var knex = require('knex').knex;

exports.schema = function (req, res) {
  var query = knex('schema')
    .orderBy('sch_title', 'asc')
    .select();
  // more code...
};
If you place Knex.initialize inside an app.use or app.VERB, it gets called repeatedly for each request thus you'll end up connecting to PG multiple times.
For most cases, you don't need to do an open+query+close for every HTTP request.
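The same initialize-once idea can be made explicit with a module-level singleton. A generic sketch (`createPool` here stands in for whatever actually opens the pool, e.g. Knex.initialize with your config):

```javascript
// db.js sketch: Node caches a module after its first require(), so this
// module body runs once per process and every caller shares one pool.
var pool = null;

function getPool(createPool) {
  if (!pool) {
    pool = createPool(); // executed only on the first call
  }
  return pool;
}
```

Every controller then calls getPool() and gets the same pool object back, no matter how many requests come in.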

Using angular.js $resource with express framework

I am new to AngularJS and Node.js/Express. I am working on a small application that uses the Angular and Express frameworks. I have an Express app running with a couple of endpoints: one for a POST action and one for a GET action. I am using the node-mysql module to store data in and fetch it from a MySQL database.
This application runs on my laptop.
angular.js client:
controller
function ItemController($scope, storageService) {
  $scope.savedItems = storageService.savedItems();
  alert($scope.savedItems);
}
service
myApp.service('storageService', function ($resource) {
  var Item = $resource('http://localhost\\:3000/item/:id',
    {
      id: '@id'
    },
    {
      query: {
        method: 'GET',
        isArray: true
      }
    });

  this.savedItems = function () {
    Item.query(function (data) {
      //alert(data);
      return data;
    });
  };
});
Express server with mysql database:
...
app.get('/item', item.list);
...
items.js
exports.list = function (req, res) {
  var sql = 'select * from item';
  var results = query(sql);
  res.send(results);
};

function connect() {
  var mysql = require('mysql');
  var connection = mysql.createConnection({
    host     : 'localhost',
    user     : 'admin',
    database : 'test'
  });
  return connection;
}

function query(sql) {
  var connection = connect();
  return connection.query(sql, function (err, results) {
    if (err) throw err;
    return results;
  });
}
When I send a static array of items (JSON) from the server, $scope.savedItems gets populated,
but when I read the items from the database, $scope.savedItems on the client stays empty even though the server is returning items. Using $http directly did not help either.
I have read about the async nature of $resource and $http in the AngularJS documentation, but I am still missing something or doing something wrong.
Thanks in advance; I appreciate your help.
This has to do with the async nature of angular $resource.
$scope.savedItems = storageService.savedItems();
It returns an empty array immediately, which is populated once the data arrives. Your alert($scope.savedItems); will therefore show only an empty array. If you looked at $scope.savedItems a little later, you would see that it has been populated. If you want to use the data right after it is returned, you can use a callback:
$scope.savedItems = storageService.savedItems(function(result) {alert(result); });
Just as a quick note: you could also $watch the savedItems.
function ItemController($scope, storageService) {
  $scope.savedItems = storageService.savedItems();
  $scope.$watch(function () {
    return $scope.savedItems;
  }, function (newValue, oldValue) {
    if (typeof newValue !== 'undefined') {
      // Do something cool
    }
  },
  true);
}
I suspect Node is not returning the MySQL results. The fact that it works for static data and not for MySQL rules out issues with Angular. Can you add Firebug logs for the HTTP call, or Chrome developer-tools logs? That could shed more light on the matter.
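For completeness: the usual fix for a handler like items.js above is to send the response from inside the query callback rather than returning from it. A rough sketch (the factory shape is just one way to pass in a mysql connection created once at startup; the route and SQL come from the question):

```javascript
// Sketch: respond inside the async callback; `return results` inside
// connection.query never reaches the caller, so res.send must go here.
function makeListHandler(connection) {
  return function (req, res) {
    connection.query('select * from item', function (err, results) {
      if (err) {
        res.send(500, { message: 'query failed' });
        return;
      }
      // $resource's query() with isArray: true expects a JSON array
      res.send(200, results);
    });
  };
}

// wiring: app.get('/item', makeListHandler(connection));
```

With the array actually reaching the client, the $resource callback or $watch approaches from the accepted answer then populate $scope.savedItems as expected.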