I am new to nodejs and working on a proof of concept just for fun.
Background:
I have a cloud directory of user information (like username, password and other info). This cloud directory can be used to authenticate a user only via a RESTful API (i.e. no direct connectivity using LDAP, JDBC, etc.).
Aim:
To build an LDAP interface for this cloud directory. To start with, I am interested only in authentication (LDAP bind).
Intended Flow:
LDAPClient initiates a standard LDAP simple BIND request:
Host: host where my nodejs app will run
Port: 1389 (port that my nodejs app will be bound to)
Username: a user from cloud directory
Password: user's password
This request is received by my Node.js app (I am using the ldapjs module).
// process ldap bind operation
myLdapServer.bind(searchBase, function (bindReq, bindRes, next) {
    // bind creds
    var userDn = bindReq.dn.toString();
    var userPw = bindReq.credentials;
    console.log('bind DN: ' + userDn);
    ...
    ...
});
Within the above callback, I must use http.request to fire a RESTful API call (POST) to the cloud directory with the details I received from the BIND request (i.e. username and password).
If the REST API response status is 200 (auth success), I must return success to the LDAP client; otherwise I must return an invalid-credentials error.
Success:
bindRes.end();
return next();
Failure:
console.log("returning error");
return next(new ldap.InvalidCredentialsError());
Questions:
Is this possible using Node.js? I am asking because of the nesting involved, as evident above (calling a REST API from within a callback). Also, since this is an authentication operation, is it meant to be a blocking operation?
Thanks,
Jatin
UPDATE:
Thanks Klvs, my solution is more or less like the one you posted. Please have a look at the snippet below:
// do the POST call from within callback
var postRequest = https.request(postOptions, function(postResponse) {
    console.log("statusCode: ", postResponse.statusCode);
    if (postResponse.statusCode != 200) {
        console.log("cloud authentication failed: " + postResponse.statusCode);
        return next(ldapModule.InvalidCredentialsError());
    } else {
        postResponse.on('data', function(d) {
            console.info('POST result:\n');
            process.stdout.write(d);
            console.info('\n\nPOST completed');
        });
        res.end();
        return next();
    }
});

// write json data
postRequest.write(postData);
postRequest.end();

postRequest.on('error', function(e) {
    console.error("postRequest error occurred: " + e);
});
Successful authentication works fine; however, failed authentication does not send any response back to the LDAP client at all. My client just times out instead of showing an authentication failure error. I do see the "cloud authentication failed:" log message on the Node console, which means the statement below is not doing what I intend it to do:
return next(ldapModule.InvalidCredentialsError());
Note that the above statement works when I remove the REST call and just return the error back to the client.
Am I missing something?
Thanks,
Jatin
Of course it's possible in Node.js. If I understand correctly, you want to make an authenticating request to a server and have it either fail or succeed.
const request = require('request')

// process ldap bind operation
myLdapServer.bind(searchBase, function (bindReq, bindRes, next) {
    // bind creds
    var userDn = bindReq.dn.toString();
    var userPw = bindReq.credentials;
    console.log('bind DN: ' + userDn);

    // placeholder endpoint for the cloud directory's REST authentication call
    request.post({ url: 'https://cloud-directory.example/auth', json: { username: userDn, password: userPw } }, (err, res, body) => {
        if (err) {
            console.log("returning error");
            next(new ldap.InvalidCredentialsError());
        } else {
            bindRes.end();
            next();
        }
    })
});
Is that what you're looking for? If so, you just need to get accustomed to callbacks.
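Note that err above only covers transport-level failures (e.g. DNS errors or refused connections). To treat a non-200 response as a failed bind, as described in the question, the same callback can also check the status code. A rough sketch, with the cloud directory endpoint still a placeholder:

request.post({ url: 'https://cloud-directory.example/auth', json: { username: userDn, password: userPw } }, (err, res, body) => {
    if (err || res.statusCode !== 200) {
        console.log("returning error");
        return next(new ldap.InvalidCredentialsError());
    }
    bindRes.end();
    return next();
})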
I am implementing Google OAuth sign-in using a backend (written in Node.js with the Express framework, hosted on Heroku). The front end is Android; it sends the token ID to the server just fine, and the server receives the token ID correctly.
Here is the code (which is ripped off straight from the Google docs):
var auth = new GoogleAuth;
var client = new auth.OAuth2(CLIENT_ID, '', '');
client.verifyIdToken(
    token,
    CLIENT_ID,
    // Or, if multiple clients access the backend:
    //[CLIENT_ID_1, CLIENT_ID_2, CLIENT_ID_3],
    function(e, login) {
        var payload = login.getPayload();
        var userid = payload['sub'];
        // If request specified a G Suite domain:
        //var domain = payload['hd'];
    });
But at times login is undefined. It's strange that this problem occurs only about 1 time in 10 rather than on every try, so I am not able to track down the source of the issue. The other 9 out of 10 times it works just fine.
Any suggestions on how to solve this?
The problem in your code is that you are not checking whether your callback gets an error.
The standard way in Node.js to use a callback is with two parameters: the error comes first and the actual (success) data second. The convention is that if an error exists you should address it, because you are not guaranteed to get the data; if everything went well, the error will be null and you'll get your data.
So in your code, you are not checking whether there's an error (and, as you say, there isn't always one).
Should be something like:
function(e, login) {
    if (e) {
        // handle error here
        return; // don't continue, you don't have login
    }
    // if we got here, login is defined
    var payload = login.getPayload();
    var userid = payload['sub'];
    // If request specified a G Suite domain:
    //var domain = payload['hd'];
});
The first parameter to the callback function is an error that needs to be handled.
function(error, login) {
    if (error) return console.error(error);
    var payload = login.getPayload();
    var userid = payload['sub'];
    // If request specified a G Suite domain:
    //var domain = payload['hd'];
});
I have been trying to use a service worker within an IIS-hosted web site that caches some of the static content of the site. The site is an internal application that uses Windows Authentication. I have been able to register and run a service worker without too much hassle, but as soon as I open the caches and start adding files to the cache, the promise fails with an authorisation failure. The returned HTTP result is 401 Unauthorised. This is the usual response for the first few requests until the browser and the server are able to negotiate the authorisation.
I will post some code soon that should help with the explanation.
EDIT
var staticCacheName = 'app-static-v1';
console.log("I AM ALIVE");

this.addEventListener('install', function (event) {
    console.log("AND I INSTALLED!!!!");
    var urlsToCache = [
        //...many js files to cache
        '/scripts/numeral.min.js?version=2.2.0',
        '/scripts/require.js',
        '/scripts/text.js?version=2.2.0',
        '/scripts/toastr.min.js?version=2.2.0',
    ];
    event.waitUntil(
        caches.open(staticCacheName).then(function (cache) {
            cache.addAll(urlsToCache);
        }).catch(function (error) {
            console.log(error);
        })
    );
});
This is just a guess, given the lack of code, but if you're doing something like:
caches.open('my-cache').then(cache => {
    return cache.add('page1.html'); // Or cache.addAll(['page1.html', 'page2.html']);
});
you're taking advantage of the implicit Request object creation (see section 6.4.4.4.1) that happens when you pass in a string to cache.add()/cache.addAll(). The Request object that's created uses the default credentials mode, which is 'omit'.
What you can do instead is explicitly construct a Request object containing the credentials mode you'd prefer, which in your case would likely be 'same-origin':
caches.open('my-cache').then(cache => {
    return cache.add(new Request('page1.html', {credentials: 'same-origin'}));
});
If you have a bunch of URLs that you're passing as an array to cache.addAll(), you can .map() them to a corresponding array of Requests:
var urls = ['page1.html', 'page2.html'];

caches.open('my-cache').then(cache => {
    return cache.addAll(urls.map(url => new Request(url, {credentials: 'same-origin'})));
});
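Applied to the install handler from the question, that would look roughly like this (a sketch; staticCacheName and the file list come from the original snippet):

this.addEventListener('install', function (event) {
    var urlsToCache = [
        '/scripts/numeral.min.js?version=2.2.0',
        '/scripts/require.js',
        '/scripts/text.js?version=2.2.0',
        '/scripts/toastr.min.js?version=2.2.0',
    ];
    event.waitUntil(
        caches.open(staticCacheName).then(function (cache) {
            // send the Windows Authentication credentials with every cached request
            return cache.addAll(urlsToCache.map(function (url) {
                return new Request(url, { credentials: 'same-origin' });
            }));
        }).catch(function (error) {
            console.log(error);
        })
    );
});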
I am using Hapi.js as a framework for our API development, and I am getting the following error in a very rare scenario.
2015-02-08T12:32:38.073Z - verbose: err.stack > Error: Already closed
at Object.exports.create (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/node_modules/boom/lib/index.js:21:17)
at Object.exports.internal (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/node_modules/boom/lib/index.js:252:92)
at /var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/lib/request.js:297:34
at iterate (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/node_modules/items/lib/index.js:35:13)
at done (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/node_modules/items/lib/index.js:27:25)
at validate (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/lib/auth.js:283:20)
at finish (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/lib/protect.js:45:21)
at wrapped (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/node_modules/hoek/lib/index.js:798:20)
at root (/var/www/ragchewAppServerSrc/ragchew_prod/ragchews/node_modules/hapi/lib/auth.js:198:50)
at /var/www/ragchewAppServerSrc/ragchew_prod/ragchews/src/middlewares/auth/ragchew_auth_strategy.js:75:28
2015-02-08T12:32:38.073Z - verbose: err > {"isBoom":true,"output":{"statusCode":500,"payload":{"statusCode":500,"error":"Internal Server
I am neither able to reproduce this in our testing environment, nor do I understand the root cause of this error.
It would be a great help if someone could highlight why/when this error is generated by the framework.
In our code, this error occurs when we try to send the reply back from the authentication plugin. We are using a basic authentication scheme.
A sample snippet where the issue occurs is:
exports.register = function (plugin, options, next) {
    plugin.auth.scheme('basic', function (server, options) {
        var settings = options;
        // some code here
        var scheme = {
            authenticate: function (request, reply) {
                // some code here
                // assign access token value to token here.
                settings.validateFunc.call(request, token, function (err, isValid, credentials) {
                    // handle error here.
                    return reply(null, { credentials: credentials }); // error occurs on this line
                });
            }
        };
        return scheme;
    });
    next();
};
Can you try return reply.continue({credentials: credentials}); on success instead of what you have now?
See this line in hapi-auth-basic for reference:
https://github.com/hapijs/hapi-auth-basic/blob/master/lib/index.js#L83
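Adapted to the snippet above, the success and failure paths might look roughly like this (a sketch modelled on hapi-auth-basic; the Boom error choice and message are assumptions):

var Boom = require('boom');

exports.register = function (plugin, options, next) {
    plugin.auth.scheme('basic', function (server, options) {
        var settings = options;
        var scheme = {
            authenticate: function (request, reply) {
                // assign access token value to token here.
                settings.validateFunc.call(request, token, function (err, isValid, credentials) {
                    if (err) {
                        // pass the error through; hapi turns it into the auth failure response
                        return reply(err, null, { credentials: credentials });
                    }
                    if (!isValid) {
                        return reply(Boom.unauthorized('Bad token'), null, { credentials: credentials });
                    }
                    // on success, continue the request lifecycle instead of replying
                    return reply.continue({ credentials: credentials });
                });
            }
        };
        return scheme;
    });
    next();
};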
I am trying to store some data in a Custom Object on Appcelerator ACS, so there will be a service to do that. Each time, it requires authentication to create a new object.
But I sometimes get the error below while logging in with ACS. It does not occur every time; it happens only if I call the service multiple times.
The error I am getting is:
{
    success: false,
    error: true,
    code: 400,
    message: "Invalid request sent."
}
Code used to log in:
ACS.Users.login(userData, function(data) {
    if (data.success) {
        console.log("----------Successful to login.---------------");
        console.log(data);
        res.send(data);
        res.end();
    } else {
        console.log("------------------login failed---------------");
        console.log(data);
        res.send(data);
        res.end();
    }
}, req, res);
Can someone help me understand how to re-use the session ID from a node.ACS web service app (not a web app)?
How can I keep the session / check session validity before pushing something to a Custom Object? Has anyone faced a similar issue?
Thanks
Peter
Since you are passing the req and res parameters into ACS.Users.login, the session is saved in the _session_id cookie:
http://docs.appcelerator.com/cloud/latest/#!/guide/node_acs
When you make subsequent calls to ACS, you pass in the req and res parameters and it will check for this session token.
A session can become invalid after timeout or logout. To check whether a session is still good, one way is to check against this REST API endpoint (GET):
https://api.cloud.appcelerator.com/v1/users/show/me.json?key=(YOUR ACS KEY)&_session_id=(SESSION ID OF THE USER)
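One way to do that check from Node is a plain HTTPS GET against that endpoint and looking at the status code; a minimal sketch (the key and session ID are the values you already have):

var https = require('https');

function isSessionValid(acsKey, sessionId, callback) {
    var url = 'https://api.cloud.appcelerator.com/v1/users/show/me.json' +
        '?key=' + encodeURIComponent(acsKey) +
        '&_session_id=' + encodeURIComponent(sessionId);
    https.get(url, function (res) {
        res.resume(); // discard the body; only the status code matters here
        callback(null, res.statusCode === 200); // non-200 => session expired or logged out
    }).on('error', callback);
}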
Also, for some reason acs-node v0.9.3 appears to be returning the same session ID, even for different users. Some side-effects I've seen include (1) the wrong user attempting to make a change to an object, and (2) objects created by one user are actually owned by the last person who logged in. Making sure acs-node is at v0.9.2 avoids this issue.
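One way to pin the module to that version (and record it in package.json) is:

npm install acs-node@0.9.2 --save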
Now that node-acs has been shut down, everyone is obliged to move to the new ArrowDB Node SDK.
To solve the issue above about the user not being authenticated, make sure to pass the user's session_id before every ArrowDB call, like so:
// Connect Node.ACS to existing ACS
var ArrowDB = require('arrowdb'),
    arrowDBApp = new ArrowDB('XXYYZZZ', { // ArrowDB Key
        apiEntryPoint: 'https://api.cloud.appcelerator.com',
        autoSessionManagement: false, // handle session_id yourself
        prettyJson: true,
        responseJsonDepth: 3
    });

// == Creates the ACS_Event for a logged in User on ArrowDB ==
function createACSEvent(uniqueId, params) {
    arrowDBApp.sessionCookieString = params.session_id; // <-- THIS IS IT!
    arrowDBApp.eventsCreate({
        name: 'someEvent',
        start_time: params.start_time,
        custom_fields: params,
    }, function(err, result) {
        if (err) {
            logger.info('ERROR ACS_Event created ' + err);
        } else {
            logger.info('Success Creating ACSEvent ' + JSON.stringify(result));
        }
    });
}
I am running a Node.js + Express based API server on Heroku and using the dropbox-js library. Here's what I'd like to do:
A user hits a specific API endpoint and kicks off the process.
Generate some text files via a Node process and save them on the server.
Transfer these files to a Dropbox account that I own, using my own credentials (user and Dropbox app).
There will never be a case where a random user needs to do this; it's a team account and this is an internal tool.
The part that is tripping me up is that Dropbox wants to open a browser window and get my permission to connect to the app. The issue is that I obviously can't click the button when the process is running on the Heroku instance.
Is there any way for me to authorize access to the app totally in node?
I feel like I could potentially use a phantomJS process to click the button - but it seems too complicated and I'd like to avoid it if possible.
Here is my authentication code:
// Libraries
var Dropbox = require('dropbox');

var DROPBOX_APP_KEY    = "key";
var DROPBOX_APP_SECRET = "secret";

var dbClient = new Dropbox.Client({
    key: DROPBOX_APP_KEY, secret: DROPBOX_APP_SECRET, sandbox: false
});

dbClient.authDriver(new Dropbox.Drivers.NodeServer(8191));

dbClient.authenticate(function(error, client) {
    if (error) {
        console.log("Some shit happened trying to authenticate with dropbox");
        console.log(error);
        return;
    }

    client.writeFile("test.txt", "sometext", function (error, stat) {
        if (error) {
            console.log(error);
            return;
        }
        console.log("file saved!");
        console.log(stat);
    });
});
Took me a bit of testing, but it's possible.
First, you need to authenticate through the browser and save the token and token secret that are returned by Dropbox:
dbClient.authenticate(function(error, client) {
    console.log('connected...');
    console.log('token ', client.oauth.token);        // THE_TOKEN
    console.log('secret', client.oauth.tokenSecret);  // THE_TOKEN_SECRET
    ...
});
Once you have the token and the secret, you can use them in the Dropbox.Client constructor:
var dbClient = new Dropbox.Client({
    key         : DROPBOX_APP_KEY,
    secret      : DROPBOX_APP_SECRET,
    sandbox     : false,
    token       : THE_TOKEN,
    tokenSecret : THE_TOKEN_SECRET
});
After that, you won't get bothered with having to authenticate through a browser anymore (or at least not until someone runs the code again without the token and the secret, which will make Dropbox generate a new token/secret pair and invalidate the old ones, or until the app's credentials are revoked).
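Since the app runs on Heroku, one place to keep that pair is in config vars; a minimal sketch, assuming hypothetical variable names DROPBOX_TOKEN and DROPBOX_TOKEN_SECRET:

var Dropbox = require('dropbox');

var dbClient = new Dropbox.Client({
    key         : DROPBOX_APP_KEY,
    secret      : DROPBOX_APP_SECRET,
    sandbox     : false,
    token       : process.env.DROPBOX_TOKEN,        // saved from the one-time browser auth
    tokenSecret : process.env.DROPBOX_TOKEN_SECRET  // saved from the one-time browser auth
});

// with a stored token/secret, authenticate() should complete without opening a browser
dbClient.authenticate(function (error, client) {
    if (error) {
        console.log(error);
        return;
    }
    client.writeFile("test.txt", "sometext", function (error, stat) {
        if (error) { return console.log(error); }
        console.log("file saved!");
    });
});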
Or you can just use the implicit grant and get the OAuth token.
var client = new Dropbox.Client({
    key: "xxxxx",
    secret: "xxxxx",
    token: "asssdsadadsadasdasdasdasdaddadadadsdsa", // got from implicit grant
    sandbox: false
});
No need to go to the browser at all. This line is no longer required:
client.authDriver(new Dropbox.AuthDriver.NodeServer(8191));