Alexa - Perform background tasks during audioPlayer.Play - node.js

How is it possible to carry out any background tasks while Alexa is playing something with audioPlayer.Play?
The code below plays an audio stream, but I need to perform other tasks in the background, without user intervention, while the stream is playing. I know it is possible because other Skills can do it.
var handlers = {
  'LaunchRequest': function() {
    this.emit('Play');
  },
  'Play': function() {
    this.response.speak('Sure.').
      audioPlayerPlay(
        'REPLACE_ALL',
        stream.url,
        stream.url,
        null,
        0);
    this.emit(':responseReady');
  }
}
Does anyone know how, or have any suggestions? From what I can see, once it starts playing the stream, I cannot get the skill to do anything unless I interrupt the stream to invoke another intent.

Alexa has a few built-in requests that she sends to your skill throughout the lifecycle of an audio stream for just this purpose! They are as follows:
AudioPlayer.PlaybackStarted - Sent when a new audio item begins playing.
AudioPlayer.PlaybackNearlyFinished - Sent when an audio item is almost over (most commonly used by a skill service to queue the next item).
AudioPlayer.PlaybackFinished - Sent when an audio item ends.
There are a couple of other ones too that you can read about here, but my guess is these will do just fine for what you need.
To use them, just set up handlers in your code for any of these requests and perform any tasks you need to there!
I don't know node.js at all, but my guess is the end result will look relatively close to this:
var handlers = {
  'LaunchRequest': function() { /* your launch request code from above */ },
  'Play': function() { /* your play code from above */ },
  'AudioPlayer.PlaybackNearlyFinished': function() {
    // Perform any background tasks here
  }
}
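For example, here is a rough sketch of a PlaybackNearlyFinished handler that queues up the next stream while the current one keeps playing; nextStream and its fields are hypothetical placeholders, not part of the original code:
'AudioPlayer.PlaybackNearlyFinished': function() {
  // Hypothetical example: enqueue the next item behind the one currently playing.
  // nextStream.url and nextStream.token are placeholders for your own data.
  this.response.audioPlayerPlay(
    'ENQUEUE',
    nextStream.url,
    nextStream.token,
    this.event.request.token, // token of the item currently playing
    0);
  this.emit(':responseReady');
}
Note that responses to AudioPlayer requests cannot include speech output, so keep these handlers to queuing and background work.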

Related

How to get notified when data is actually ready for streaming?

I have two streams:
a source stream, which downloads an audio file from the Internet
a consumer stream, which streams the file to a streaming server
Before streaming to the server there should be a handshake which returns a handle. Then I have a few seconds to really start streaming or the server closes the connection.
Which means that I should
FIRST wait until the source data is ready to be streamed
and only THEN start streaming.
The problem is that there doesn't seem to be a way to get notified when data is ready in the source stream.
The first event that comes to mind is the 'data' event. But it also consumes the data, which is not acceptable and doesn't allow me to use pipes at all.
So how to do something like this:
await pEvent(sourceStream, 'dataIsReady');
// Negotiate with the server about the transmission
sourceStream.pipe(consumerStream);
Thanks in advance.
Answering to myself.
Here is a solution which works for me.
It requires an auxiliary passthrough stream with a custom event:
import { Transform, TransformOptions, TransformCallback } from 'stream';

class DataWaitPassThroughStream extends Transform {
  dataIsReady: boolean = false;

  constructor(opts?: TransformOptions) {
    super(opts);
  }

  _transform(chunk: any, encoding: BufferEncoding, callback: TransformCallback) {
    if (!this.dataIsReady) {
      this.dataIsReady = true;
      this.emit('dataIsReady');
    }
    callback(null, chunk);
  }
}
Usage
import pEvent from 'p-event';
const dataReadyStream = sourceStream.pipe(new DataWaitPassThroughStream());
await pEvent(dataReadyStream, 'dataIsReady');
// Negotiate with the server about the transmission...
dataReadyStream.pipe(consumerStream);

How can I implement a nodeJS worker that streams data from mongo to elasticsearch?

I'm building a CDC-based application that uses Mongo Change Streams to listen for change events and index the changes in elasticsearch in near real-time.
So far, I've implemented a worker that calls a function to capture events, transform them and index them in Elasticsearch, with no issues when implementing the stream for one Mongo collection:
async function syncChangeEvents() {
  const stream = ModelA.watch()
  while (!stream.isClosed()) {
    if (await stream.hasNext()) {
      const event = await stream.next()
      // transform event
      // index to elasticsearch
    }
  }
}
I've implemented it using an infinite loop (probably a bad approach) but I'm not sure what alternatives there are when I have to keep the change stream alive forever.
The problem comes when I have to implement a change stream for another model. Since the first function has a while loop that is blocking, the worker can't call the second function to start the second change stream.
I'm wondering what the best way would be to spin up a worker that can run any number of change streams without impacting the performance of each one. Would worker threads be the right way to go?
There are three primary ways to work with Change Streams in Node.js.
You can monitor the Change Stream using EventEmitter's on() function.
// See https://mongodb.github.io/node-mongodb-native/3.3/api/Collection.html#watch for the watch() docs
const changeStream = collection.watch(pipeline);
// ChangeStream inherits from the Node Built-in Class EventEmitter (https://nodejs.org/dist/latest-v12.x/docs/api/events.html#events_class_eventemitter).
// We can use EventEmitter's on() to add a listener function that will be called whenever a change occurs in the change stream.
// See https://nodejs.org/dist/latest-v12.x/docs/api/events.html#events_emitter_on_eventname_listener for the on() docs.
changeStream.on('change', (next) => {
  console.log(next);
});
// Wait the given amount of time and then close the change stream
await closeChangeStream(timeInMs, changeStream);
You can monitor the Change Stream using hasNext().
// See https://mongodb.github.io/node-mongodb-native/3.3/api/Collection.html#watch for the watch() docs
const changeStream = collection.watch(pipeline);
// Set a timer that will close the change stream after the given amount of time
// Function execution will continue because we are not using "await" here
closeChangeStream(timeInMs, changeStream);
// We can use ChangeStream's hasNext() function to wait for a new change in the change stream.
// If the change stream is closed, hasNext() will return false so the while loop will exit.
// See https://mongodb.github.io/node-mongodb-native/3.3/api/ChangeStream.html for the ChangeStream docs.
while (await changeStream.hasNext()) {
  console.log(await changeStream.next());
}
You can monitor the Change Stream using the Stream API.
// See https://mongodb.github.io/node-mongodb-native/3.3/api/Collection.html#watch for the watch() docs
const changeStream = collection.watch(pipeline);
// See https://mongodb.github.io/node-mongodb-native/3.3/api/ChangeStream.html#pipe for the pipe() docs
changeStream.pipe(
  new stream.Writable({
    objectMode: true,
    write: function (doc, _, cb) {
      console.log(doc);
      cb();
    }
  })
);
// Wait the given amount of time and then close the change stream
await closeChangeStream(timeInMs, changeStream);
If your MongoDB database is hosted on Atlas (https://cloud.mongodb.com), the simplest thing to do is create a Trigger. Atlas handles programming the Change Stream code for you, so you only have to write the code that will transform the event and index them in Elasticsearch.
More information on working with Change Streams and Triggers is available in my blog post. A complete code example for all of the snippets above is available on GitHub.
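For the multi-model case in the question, the event-listener approach above is non-blocking, so a single worker can watch several collections at once. A minimal sketch, assuming Mongoose models ModelA and ModelB as in the question, with hypothetical transform/index helpers:
// Each watch() returns an independent ChangeStream; the 'change' listeners
// are driven by the event loop, so neither stream blocks the other.
const streamA = ModelA.watch();
const streamB = ModelB.watch();

streamA.on('change', async (event) => {
  await indexToElasticsearch(transformEvent(event)); // your own helpers
});

streamB.on('change', async (event) => {
  await indexToElasticsearch(transformEvent(event));
});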

How to manually stop getDisplayMedia stream to end screen capture?

I'm interested in getting a screenshot from the user and I'm using the getDisplayMedia API to capture the user's screen:
const constraints = { video: true, audio: false };
if (navigator.mediaDevices["getDisplayMedia"]) {
  navigator.mediaDevices["getDisplayMedia"](constraints).then(startStream).catch(error);
} else {
  navigator.getDisplayMedia(constraints).then(startStream).catch(error);
}
When executed, the browser prompts the user to ask whether they want to share their display. After the user accepts the prompt, the provided callback receives a MediaStream. For visualization, I'm binding it directly to a <video> element:
const startStream = (stream: MediaStream) => {
  this.video.nativeElement.srcObject = stream;
};
This is simple and very effective so far. Nevertheless, I'm only interested in a single frame and I'd therefore like to manually stop the stream as soon as I've processed it.
What I tried was to remove the video element from the DOM, but Chrome kept displaying its message that the screen is being captured, so this only affected the video element and not the stream itself.
I've looked at the Screen Capture API article on MDN but couldn't find any hints on how to stop the stream.
How do I end the stream properly so that the prompt stops as well?
Rather than stopping the stream itself, you can stop its tracks.
Iterate over the tracks using the getTracks method and call stop() on each of them:
stream.getTracks()
  .forEach(track => track.stop())
As soon as all tracks are stopped, Chrome's capturing prompt disappears as well.
start screen capture sample code:
async function startCapture() {
  logElem.innerHTML = "";
  try {
    videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
    dumpOptionsInfo();
  } catch (err) {
    console.error("Error: " + err);
  }
}
stop screen capture sample code:
function stopCapture(evt) {
  let tracks = videoElem.srcObject.getTracks();
  tracks.forEach(track => track.stop());
  videoElem.srcObject = null;
}
more info: MDN - Stopping display capture
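Since the question only needs a single frame, a minimal sketch (assuming the stream is already playing in videoElem, as in the samples above) is to draw the current frame to a canvas and then stop the tracks straight away:
function captureSingleFrame() {
  const canvas = document.createElement("canvas");
  canvas.width = videoElem.videoWidth;
  canvas.height = videoElem.videoHeight;
  canvas.getContext("2d").drawImage(videoElem, 0, 0);

  // The frame now lives in the canvas, so the capture can end immediately;
  // Chrome's "sharing your screen" prompt disappears at this point.
  videoElem.srcObject.getTracks().forEach(track => track.stop());
  videoElem.srcObject = null;

  return canvas.toDataURL("image/png"); // or canvas.toBlob(...) for further processing
}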

Pausing a non-flowing stream (delaying readable event)

On one end I have a http request (long polling). On the other end is a "game server" dispatching events. These ends are tied together with a duplex non-flowing stream in object mode and that part works fine.
The long poll end listens on readable and drains the stream by calling stream.read repeatedly, then closes the client's connection.
The game server uses stream.write to push events to the clients.
Some actions in the game actually span several events, and here's the problem:
When the game server adds several events at once (calling stream.write repeatedly), the first write triggers readable and the long poll is filled with just that event and closed. That's very inconvenient.
The essence of the problem is that I can't silence readable and then trigger it when I'm done writing.
So my question is; can I somehow "pause" the stream and resume afterwards?
Is there another known solution to this problem?
My best bet is to write an array of events, but that feels like an antipattern.
Here's some code to illustrate my problem:
var stream = require('stream');
var connection = stream.PassThrough({ objectMode: true });

var exhaust = function() {
  console.log('exhausting');
  var chunk;
  while ((chunk = connection.read()) !== null)
    console.log(chunk);
  console.log('exhausting end');
}

connection.on('readable', function() {
  console.log('Ready to read');
  exhaust();
});

for (var i = 0; i < 10; i++)
  connection.write({ test: true });
I ended up writing an array of events, but I'm looking forward to this upcoming feature, which could be the missing solution:
http://strongloop.com/strongblog/performance-node-js-v-0-12-whats-new/
http://nodejs.org/docs/v0.11.10/api/stream.html#stream_writable_cork
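For reference, a rough sketch of how the cork()/uncork() API from those links is intended to be used; it buffers writes until uncork(), so the reader should see the whole batch together rather than only the first event (a sketch only, not verified against the long-poll setup above):
// Batch several writes and release them together.
connection.cork();
for (var i = 0; i < 10; i++)
  connection.write({ test: true });
// The docs recommend deferring uncork() so every write in this tick is batched.
process.nextTick(function() {
  connection.uncork();
});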

Socket.IO server throttling a fast client

I have a server that uses socket.io and I need a way of throttling a client that is sending the server data too quickly. The server exposes both a TCP interface and a socket.io interface - with the TCP server (from the net module) I can use socket.pause() and socket.resume(), and this effectively throttles the client. But with socket.io's socket class there are no pause() and resume() methods.
What would be the easiest way of getting feedback to a client that it is overwhelming the server and needs to slow down? I liked socket.pause() and socket.resume() because they didn't require any additional code on the client side - back up the TCP socket and things naturally slow down. Any equivalent for socket.io?
Update: I provide an API to interact with the server (there is currently a Python version which runs over TCP and a JavaScript version which uses socket.io), so I don't have any real control over what the client does, which is why socket.pause() and socket.resume() are so great - backing up the TCP stream slows the Python client down no matter what it tries to do. I'm looking for an equivalent for a JavaScript client.
With enough digging I found this:
this.manager.transports[this.id].socket.pause();
and
this.manager.transports[this.id].socket.resume();
Granted, this probably won't work if the socket.io connection isn't a WebSocket connection, and it may break in a future update, but for now I'm going to go with it. When I get some time in the future I'll probably change it to the QUOTA_EXCEEDED solution that Pascal proposed.
Here is a dirty way to achieve throttling. Although this is an old post, some people may benefit from it:
First register a middleware:
io.on("connection", function (socket) {
socket.use(function (packet, next) {
if (throttler.canBeServed(socket, packet)) {
next();
}
});
//You other code ..
});
canBeServed is a simple throttler as seen below:
function canBeServed(socket, packet) {
  if (socket.markedForDisconnect) {
    return false;
  }
  var previous = socket.lastAccess;
  var now = Date.now();
  if (previous) {
    var diff = now - previous;
    // Check diff and disconnect if needed.
    if (diff < 50) {
      socket.markedForDisconnect = true;
      setTimeout(function () {
        socket.disconnect(true);
      }, 1000);
      return false;
    }
  }
  socket.lastAccess = now;
  return true;
}
You can use process.hrtime() instead of Date.now().
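For example, a simplified variant of the same check using a monotonic clock; this is a sketch that assumes Node 10.7+ for process.hrtime.bigint() and leaves out the markedForDisconnect handling above:
function canBeServed(socket, packet) {
  var now = process.hrtime.bigint(); // nanoseconds, monotonic
  if (socket.lastAccess && now - socket.lastAccess < 50000000n) { // 50 ms
    return false;
  }
  socket.lastAccess = now;
  return true;
}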
If you have a callback on your server somewhere which normally sends back the response to your client, you could try and change it like this:
before:
var respond = function (res, callback) {
  res.send(data);
};
after:
var respond = function (res, callback) {
  setTimeout(function() {
    res.send(data);
  }, 500); // or whatever delay you want.
};
Looks like you should slow down your clients. If one client can send too fast for your server to keep up, this is not going to go very well with hundreds of clients.
One way to do this would be have the client wait for the reply for each emit before emitting anything else. This way the server can control how fast the client can send by only answering when ready for example, or only answer after a set time.
If this is not enough, when a client exceeds x requests per second, start replying with something like a QUOTA_EXCEEDED error and ignore the data they send. This will force external developers to make their apps behave as you want them to.
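A rough sketch of both ideas combined, using socket.io acknowledgement callbacks; the "data" event name, the 100-messages-per-second limit and the payload shape are made up for illustration:
// Server: acknowledge each message only when it has been handled,
// and reject it when the client exceeds the (made-up) quota.
io.on("connection", function (socket) {
  var count = 0;
  var windowStart = Date.now();
  socket.on("data", function (payload, ack) {
    var now = Date.now();
    if (now - windowStart > 1000) { windowStart = now; count = 0; } // new 1 s window
    if (++count > 100) {
      return ack({ error: "QUOTA_EXCEEDED" });
    }
    // ... handle the payload ...
    ack({ ok: true }); // a well-behaved client emits again only after this arrives
  });
});

// Client: wait for the acknowledgement before emitting the next message.
socket.emit("data", payload, function (reply) {
  if (reply && reply.error === "QUOTA_EXCEEDED") {
    // back off before sending anything else
  } else {
    // safe to emit the next message
  }
});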
As another suggestion, I would propose a solution like this:
It is common for a database like MySQL to receive requests faster than it can apply them.
The server can record incoming requests in a database table (assuming that insert is fast enough to keep up with the incoming rate) and then process that queue at a rate the server can sustain. This buffer system allows the server to run slowly but still process all the requests.
But if you want something sequential, then the request callback should be acknowledged before the client can send another request. In that case, there should be a server-ready flag. If the client sends a request while the flag is still red, the server can reply with a message telling the client to slow down. A bare-bones sketch of the buffering idea is shown below.
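This is an in-memory sketch only (the answer suggests a database table; a plain array stands in here, and handleRequest is a placeholder for your own processing code):
var queue = [];

io.on("connection", function (socket) {
  socket.on("data", function (payload) {
    queue.push(payload); // enqueueing is cheap, so a fast client can't overwhelm the handler
  });
});

// Drain the queue at a rate the server can sustain.
setInterval(function () {
  var next = queue.shift();
  if (next) handleRequest(next); // handleRequest is your own processing code
}, 20); // made-up rate: at most ~50 requests per second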
Simply wrap your client emitter in a function like the one below,
let emit_live_users = throttle(function () {
  socket.emit("event", "some_data");
}, 2000);
using a throttle function like the one below:
function throttle(fn, threshold) {
  threshold = threshold || 250;
  var last, deferTimer;
  return function() {
    var now = +new Date, args = arguments;
    if (last && now < last + threshold) {
      clearTimeout(deferTimer);
      deferTimer = setTimeout(function() {
        last = now;
        fn.apply(this, args);
      }, threshold);
    } else {
      last = now;
      fn.apply(this, args);
    }
  }
}
