How to create and manipulate promises in Protractor? - node.js

I want to use the Node Http module to call my server directly in order to set up my Protractor tests. Http is callback-based, and I want to turn that into promises.
For example, I want to have this function return a promise:
function callMyApi() {
  var promise = // somehow create promise;
  http.request({path: '/yada/yada', method: 'POST'}, function(resp) {
    promise.complete(resp);
  });
  return promise;
}
So, the question is: what do I need to require() and put in place of "somehow create promise" for this to work?

Protractor uses WebDriver's promises and exposes that API globally on 'protractor'. So you should be able to do
var deferred = protractor.promise.defer();
return deferred.promise;
For the full WebDriverJS Promise API, see the code at https://code.google.com/p/selenium/source/browse/javascript/webdriver/promise.js
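For example, a minimal sketch of the callMyApi() function from the question using that API (assuming the Deferred's fulfill/reject methods from the WebDriverJS promise module) could look like this:
var http = require('http');

function callMyApi() {
  var deferred = protractor.promise.defer();
  var req = http.request({path: '/yada/yada', method: 'POST'}, function(resp) {
    // fulfill the promise once the server responds
    deferred.fulfill(resp);
  });
  req.on('error', function(err) {
    deferred.reject(err);
  });
  req.end();
  return deferred.promise;
}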

This is not the ideal way to do this, but knowing about the Protractor control flow can help. If you want regular JavaScript to run in Protractor's order, add it through the control flow.
In this case you could use your own promise library if you want, and then just use browser.wait to wait for your promises to complete.
var Promise = require('bluebird');
var promises = [];
browser.controlFlow().execute(function() {
  var p = new Promise(function(resolve, reject) { /* your async work */ });
  promises.push(p);
});
browser.wait(function() { return Promise.all(promises); }, timeoutMs);
I use this not so much for regular promises as for console.log statements, timing parts of a test, or even using fs to write something from a test to a file.
var startTime, duration;
browser.controlFlow().execute(function() {
  startTime = new Date().getTime();
});
// Protractor code you want timed
browser.controlFlow().execute(function() {
  duration = new Date().getTime() - startTime;
  console.log("Duration:", duration);
});

Related

Dialogflow - Reading from database using async/await

It's the first time I am using async/await, and I am having problems using it for a database request inside a Dialogflow intent. How can I fix my code?
What happens?
When I try to use my backend, this is what I get: "Webhook call failed. Error: Request timeout."
What do I suspect?
My helper function getTextResponse() waits for a return value from Airtable but never gets one.
What do I want to do?
The "GetDatabaseField-Intent" gets triggered.
Inside, it sends a request to my Airtable database via getTextResponse().
Because I use "await", the function will wait for the result before continuing.
getTextResponse() will return the returnData, so the var result will be filled with returnData.
getTextResponse() has finished, so the response will be created with its return value.
'use strict';
const {
  dialogflow
} = require('actions-on-google');
const functions = require('firebase-functions');
const app = dialogflow({debug: true});
const Airtable = require('airtable');
const base = new Airtable({apiKey: 'MyKey'}).base('MyBaseID');

///////////////////////////////
/// Helper function - reading Airtable fields.
const getTextResponse = (mySheet, myRecord) => {
  return new Promise((resolve, reject) => {
    // Function for airtable
    base(mySheet).find(myRecord, (err, returnData) => {
      if (err) {
        console.error(err);
        return;
      }
      return returnData;
    });
  });
};

// Handle the Dialogflow intent.
app.intent('GetDatabaseField-Intent', async (conv) => {
  const sheetTrans = "NameOfSheet";
  const recordFirst = "ID_OF_RECORD";
  var result = await getTextResponse(sheetTrans, recordFirst, (callback) => {
    // parse the record => here in the callback
    myResponse = callback.fields.en;
  });
  conv.ask(myResponse);
});

// Set the DialogflowApp object to handle the HTTPS POST request.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
As #Kolban pointed out, you are not resolving or rejecting the Promise you create in getTextResponse().
It also looks like the var result = await getTextResponse(...) call is incorrect. You have defined getTextResponse() to accept two parameters, but you are passing it three (the first two, plus an anonymous arrow function), and that extra function is never used or referenced.
I would generally avoid mixing explicit promises with async/await and definitely avoid mixing async/await with passing callbacks.
I don't know the details of the API you are using, but if the API already supports promises, then you should be able to do something like this:
const getTextResponse = async (mySheet, myRecord) => {
  try {
    return await base(mySheet).find(myRecord);
  }
  catch(err) {
    console.error(err);
    return;
  }
};
...
app.intent('GetDatabaseField-Intent', async (conv) => {
  const sheetTrans = "NameOfSheet";
  const recordFirst = "ID_OF_RECORD";
  var result = await getTextResponse(sheetTrans, recordFirst);
  var myResponse = result.fields.en;
  conv.ask(myResponse);
});
...
Almost all promise-based libraries or APIs can be used with async/await, as they simply use Promises under the hood. Everything after the await effectively becomes a callback that runs once the awaited method resolves successfully. An unsuccessful resolution (a rejection) is thrown as an error, which you handle with a try/catch block.
Looking at the code, it appears that you may have a misunderstanding of JavaScript Promises. When you create a Promise, you are passed two functions, called resolve and reject. Within the body of your promise code (i.e. the code that will complete sometime in the future), you must invoke either resolve(returnData) or reject(returnData). If you don't invoke either, your Promise will never be fulfilled. Looking at your logic, you appear to be performing simple returns without invoking resolve or reject.
I would suggest reading up on JavaScript Promises again with the previous comments in mind and seeing if that clears up the puzzle.
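For reference, a minimal sketch of the callback-based helper with the Promise explicitly resolved or rejected (keeping the Airtable call from the question) could look like this:
const getTextResponse = (mySheet, myRecord) => {
  return new Promise((resolve, reject) => {
    base(mySheet).find(myRecord, (err, returnData) => {
      if (err) {
        // reject so the awaiting caller sees the error
        reject(err);
        return;
      }
      // resolve so "await getTextResponse(...)" receives returnData
      resolve(returnData);
    });
  });
};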

About a local variable in a Node.js promise

I am new to Node.js. I defined an array as a local variable and want to use it in the following then(), where I save some useful data into it. But in the end, the array is empty. Can somebody tell me why? Thanks for your support.
const Device = require("./mongo.js").Device;
const Video = require("./mongo.js").Video;

Device.findOne({id:"11112222"}).exec()
  .then(function(data){
    var videoIds = data.videoIds.split(",");
    var videoId2URL = [];
    console.log(videoIds);
    videoIds.forEach(function(one){
      return Video.findOne({id:one}).exec()
        .then(function(data){
          videoId2URL.push({id:one,url:data.url});
          return videoId2URL;
        });
    });
    console.log(videoId2URL);
  });
The problem is that you are displaying videoId2URL too early.
Device.findOne returns a promise that is executed asynchronously, and Video.findOne also returns promises that are executed asynchronously.
So when you do console.log(videoId2URL);, the promises created by Video.findOne have not resolved yet, and your array is still empty.
You must wait for all your promises to finish. You can use Promise.all for that.
Promise.all(videoIds.map(function(one){
  return Video.findOne({id:one}).exec()
    .then(function(data){
      videoId2URL.push({id:one,url:data.url});
      return videoId2URL;
    });
}))
.then(function() {
  console.log(videoId2URL);
});
You could use Promise.all to solve your problem. Your forEach code contains async code, and your last line does not wait for all the promises to get resolved.
Try with:
var arr = [];
videoIds.forEach(function(one){
  arr.push(Video.findOne({id:one}).exec());
});
Promise.all(arr) // here we are waiting for all async tasks to get resolved
  .then(function(data){
    console.log(data);
    // parse your data here and find array of videoId2URL
  });
When you do console.log(videoId2URL), you're still in the main stack of the script, and none of the push callbacks have been executed yet.
You can use an array to collect the promises returned by Video.findOne, and at the end use Promise.all to wait for all of them and do the logging in its then().
By the way, neither of the two returns is necessary; you can safely remove them.
The first one has no effect because it is inside a synchronous forEach callback.
The second one has no effect because you're relying on the side effect rather than using the resolved value.
Try:
const Device = require("./mongo.js").Device;
const Video = require("./mongo.js").Video;

Device.findOne({id:"11112222"}).exec()
  .then(function(data){
    var videoIds = data.videoIds.split(",");
    var videoId2URL = [];
    var promiseArr = [];
    console.log(videoIds);
    videoIds.forEach(function(one){
      var p = Video.findOne({id:one}).exec()
        .then(function(data){
          videoId2URL.push({id:one,url:data.url});
        });
      promiseArr.push(p);
    });
    Promise.all(promiseArr).then(function() {
      console.log(videoId2URL);
    });
  });

Promisify a synchronous method

Can I make a synchronous method asynchronous by using a promise?
For example, reading a file synchronously (yes, there is fs.readFile, which takes a callback):
// Synchronous read
var data = fs.readFileSync('input.txt');
Should I do this:
function readFileAsync(){
  return new Promise((resolve, reject) => {
    try {
      resolve(fs.readFileSync('input.txt'));
    } catch(err) {
      reject(err);
    }
  });
}
or use async/await:
async function readFileAsync(){
  try {
    let result = await fs.readFileSync('input.txt');
    return result;
  } catch(err) {
    return err;
  }
}
TL;DR: No, purely synchronous functions cannot be promisified in a way that avoids blocking.
No. For a method to be promisifiable it needs to already be asynchronous, i.e. return immediately, and use a callback upon completion.
For example:
function loop1000() {
  for (let i = 0; i < 1000; ++i) { }
}
Is not promisifiable because it does not return immediately and does not use callbacks. But
function loop1000(err, callback) {
  process.nextTick(() => {
    for (let i = 0; i < 1000; ++i) { }
    callback();
  });
}
Is promisifiable as
function loop1000promisified() {
  // the callback is loop1000's second parameter
  return new Promise((resolve, reject) => loop1000(null, resolve));
}
BUT all of these approaches are still going to block on the loop. The original version blocks immediately, and the one using process.nextTick() will block on the next processor tick, making the application unresponsive for the duration of the loop.
If you wanted to make loop1000() asynchronous-friendly, you could rewrite it as:
function loop1000(err, callback) {
  const segmentDuration = 10;
  const loopEnd = 1000;
  let i = 0;
  function computeSegment() {
    for (let segment = 0;
         segment < segmentDuration && i < loopEnd;
         ++segment, ++i) { }
    if (i == loopEnd) {
      callback();
      return;
    }
    process.nextTick(computeSegment);
  }
  computeSegment();
}
So instead of one long blocking period it would have several shorter ones. Then the promisified version loop1000promisified() could make some sense.
disclaimer: code typed directly on SO w/o any test.
Can I make a synchronous method asynchronous by using a promise?
No.
Can I make a synchronous method asynchronous at all?
No. That's why promises don't help here. You need to use the natively asynchronous counterpart, i.e. fs.readFile instead of fs.readFileSync in your case.
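For instance, a minimal sketch wrapping the callback-based fs.readFile in a promise (the file name is just a placeholder) could look like this:
const fs = require('fs');

function readFileAsync(path) {
  return new Promise((resolve, reject) => {
    // fs.readFile performs the IO asynchronously and calls back when done
    fs.readFile(path, 'utf8', (err, data) => {
      if (err) {
        reject(err);
      } else {
        resolve(data);
      }
    });
  });
}

// Usage: readFileAsync('input.txt').then(data => console.log(data.length));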
Regarding your alternatives, you probably should do neither. But if you absolutely need a synchronous function that returns a fulfilled or rejected promise (instead of throwing exceptions), you can do
function readFileSync(){
  return new Promise(resolve => {
    resolve(fs.readFileSync('input.txt'));
  });
}
or
async function readFileSync() {
  return fs.readFileSync('input.txt');
}
I would rephrase the other two answers from "No" to "Not really".
First, a point of clarification: in NodeJS, everything is asynchronous except your code. Specifically, one bit of your code will never run in parallel with another bit of your code, but the NodeJS runtime may manage other tasks (namely IO) at the same time your code is being executed.
The beauty of functions like fs.readFile is that the IO happens in parallel with your code. For example:
fs.readFile("some/file",
function(err,data){console.log("done reading file (or failed)")});
do.some("work");
The second line of code will be executed while NodeJS is busily reading the file into memory. The problem with fs.readFileSync is that when you call it, NodeJS stops evaluating your code (all of it!) until the IO is done (i.e. the file has been read into memory, in this case). So if you mean to ask "can you take a blocking (presumably IO) function and make it non-blocking using promises?", the answer is definitely "no".
Can you use promises to control the order in which a blocking function is called? Of course. Promises are just a fancy way of declaring the order in which callbacks are called, but everything you can do with a promise you can also do with setImmediate() (albeit with a lot less clarity and a lot more effort).
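To illustrate the distinction, here is a small sketch: a promise can control when the blocking call runs, but the call still blocks the event loop for its whole duration.
const fs = require('fs');

Promise.resolve()
  .then(function () {
    // runs after the current stack, but still blocks the event loop while the file is read
    return fs.readFileSync('input.txt');
  })
  .then(function (data) {
    console.log('read', data.length, 'bytes');
  });

console.log('this line runs before the file is read');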
I would disagree slightly with the others who say you should never promisify your function. There ARE cases when you want to promisify a function, for example a legacy code base that uses native processes and similar, where no callbacks and no promises were used, but you can assume the function is async and will execute within a certain time.
Instead of writing a ton of setTimeout() callbacks, you want to use promises.
This is how I do it for testing purposes. Check the Ph helper library below, especially the promisify function, and check how it is used to set up the mocha test in the before function.
// Initial state
var foo = 1;
var xml = "";

// Promise helper library
var Ph = (function(){
  return {
    delay: function (milis){
      milis = milis || 200;
      return function(){
        return new Promise(function(resolve, reject){
          setTimeout(function(){
            resolve();
          }, milis);
        });
      };
    },
    promisify: function(syncFunc){
      // return a function so it can be handed to .then() and run in sequence
      return function(){
        return new Promise(function(resolve, reject){
          syncFunc();
          resolve();
        });
      };
    }
  };
}());

// 'Synchronous' functions to promisify
function setXML(){
  console.log("setting XML");
  xml = "<bar>";
}
function setVars(){
  console.log("setting Vars");
  foo = 2;
}

// Test setup
before(function(done) {
  this.timeout(0);
  Promise.resolve()
    .then(Ph.promisify(setXML))
    .then(Ph.delay(3000))
    .then(Ph.promisify(setVars))
    .then(Ph.delay(3000))
    .then(function(){
      done();
    });
});

// Test assertions
describe("Async setup", function(done){
  it("should have XML set", function(done){
    expect(xml).to.be.not.equal("");
    done();
  });
  it("should have foo not equal 1.", function(done){
    expect(foo).to.be.not.equal(1);
    done();
  });
  it("should have foo equal to 2.", function(done){
    expect(foo).to.be.equal(2);
    done();
  });
});
To make it work in IE, I use Promise CDN:
<script src="https://cdnjs.cloudflare.com/ajax/libs/es6-promise/4.1.1/es6-promise.auto.min.js"></script>

How can I use Socket.IO with promises?

As part of an ongoing effort, I'm converting my current callback technique to promises using the Bluebird promise library.
I would like to implement this technique with Socket.IO as well.
How can I use Socket.IO with promises instead of callbacks?
Is there any standard way of doing it with Socket.IO? Any official solution?
You might look into Q-Connection, which facilitates RPC using promises as proxies for remote objects and can use Socket.IO as a message transport.
Bluebird (and many other promise libraries) provides helper methods to wrap your node-style functions so they return a promise.
var readFile = Promise.promisify(require("fs").readFile);
readFile("myfile.js", "utf8").then(function(contents){ ... });
https://github.com/petkaantonov/bluebird/blob/master/API.md#promisification
Returns a function that will wrap the given nodeFunction. Instead of
taking a callback, the returned function will return a promise whose
fate is decided by the callback behavior of the given node function.
The node function should conform to node.js convention of accepting a
callback as last argument and calling that callback with error as the
first argument and success value on the second argument.
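Note that Promise.promisify targets node-style callbacks (error first), while Socket.IO acknowledgement callbacks usually receive only the response, so a small hand-rolled wrapper is a reasonable sketch here (emitAsync is a hypothetical helper, not part of Socket.IO):
// Hypothetical helper: wrap an emit-with-acknowledgement in a promise.
// Assumes the server answers through the acknowledgement callback.
function emitAsync(socket, event, payload) {
  return new Promise(function (resolve) {
    socket.emit(event, payload, function (response) {
      resolve(response);
    });
  });
}

// Usage:
emitAsync(socket, 'getUser', {id: 42}).then(function (user) {
  console.log(user);
});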
Have a look here: https://www.npmjs.com/package/socket.io-rpc
var io = require('socket.io').listen(server);
var Promise = require('bluebird');
var rpc = require('socket.io-rpc');

var rpcMaster = rpc(io, {channelTemplates: true, expressApp: app})
  // channelTemplates: true is the default; I would recommend leaving it true,
  // false is only good when your channels are dynamic, so there is no point in caching
  .expose('myChannel', {
    // plain JS function
    getTime: function () {
      console.log('Client ID is: ' + this.id);
      return new Date();
    },
    // returns a promise, which when resolved will resolve the promise on the client side
    // with the result (with the middle step in JSON over socket.io)
    myAsyncTest: function (param) {
      var deferred = Promise.defer();
      setTimeout(function(){
        deferred.resolve("String generated asynchronously serverside with " + param);
      }, 1000);
      return deferred.promise;
    }
  });

io.sockets.on('connection', function (socket) {
  rpcMaster.loadClientChannel(socket, 'clientChannel').then(function (fns) {
    fns.fnOnClient("calling client ").then(function (ret) {
      console.log("client returned: " + ret);
    });
  });
});

Returning an Array using Firebase

Trying to find the best example of returning an array of data in Node.js with the Q library (or any similar library, I'm not partial) when using Firebase's .on("child_added").
I've tried using Q.all(), but it never seems to wait for the promises to fill before returning. This is my current example:
function getIndex()
{
  var deferred = q.defer();
  deferred.resolve(new FirebaseIndex( Firebase.child('users').child(user.app_user_id).child('posts'), Firebase.child('posts') ) );
  return deferred.promise;
}

function getPost( post )
{
  var deferred = q.defer();
  deferred.resolve(post.val());
  return deferred.promise;
}

function getPosts()
{
  var promises = [];
  getIndex().then( function (posts) {
    posts.on( 'child_added', function (_post) {
      promises.push( getPost(_post) );
    });
  });
  return q.all(promises);
}
The problem occurs in getPosts(). It pushes a promise into your array inside an async function--that won't work since q.all is called before the promise objects have been added.
Also, child_added is a real-time event notification. You can't use that as a way to grab "all of the data" because there is no such thing as "all"; the data is constantly changing in real-time environments. FirebaseIndex is also using child_added callbacks internally, so that's not going to work with this use case either.
You can grab all of the posts using the 'value' callback (but not a specific subset of records) as follows:
function getPosts() {
  var def = q.defer();
  Firebase.child('users').once('value', function(snap) {
    var records = [];
    snap.forEach(function(ss) {
      records.push( ss.val() );
    });
    def.resolve(records);
  });
  return def.promise;
}
But at this point, it's time to consider things in terms of real-time environments. Most likely, there is no reason "all" data needs to be present before getting to work.
Consider just grabbing each record as they come in and appending them to whatever DOM or Array where they need to be stored, and working from an event driven model instead of a GET/POST centered approach.
With luck, you can bypass this use case entirely.
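As a rough sketch of that event-driven style (assuming an in-memory posts array you append to as records arrive):
var posts = [];

// handle each record as it arrives instead of waiting for "all" of them
Firebase.child('posts').on('child_added', function (snap) {
  posts.push(snap.val());
  // ...or append it to the DOM here
});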
