I am trying to use StreamingPull from Pub/Sub. Messages are being published, but I don't see any response for the following code:
const { v1 } = require('@google-cloud/pubsub');
const subClient = new v1.SubscriberClient();
const request = {
  subscription: 'projects/<projectId>/subscriptions/Temp',
  streamAckDeadlineSeconds: 600
};
console.log('Pulling Messages...');
const stream = await subClient.streamingPull({});
stream.on('data', response => {
console.log(response);
});
stream.on('error', err => {
console.log(err);
});
stream.on('end', () => {
console.log("end");
});
stream.write(request);
stream.end();
I see the code silently finishing without the response being logged. Is there any attribute I am missing in my request? According to the doc of StreamingPullRequest, nothing else is mandatory. The only usage example is here in the test files.
Related
I am looking at a few currently widely used request libraries and how I could use them to automate file downloads and make them reliable enough.
I stumbled over download (npm), but since it's based on got (npm), I thought I would try got directly first.
Problem
One problem I could encounter while downloading a file is that the source file (on the server) could be overwritten during the download. When I try to reproduce this behaviour with got, got just stops the download process without raising any errors.
What I have so far
The only solution I could come up with was to use got.stream, piping the request into a file write stream, and comparing total with transferred after the request has ended.
const app = require('express')();
const fs = require('fs');
const stream = require('stream');
const { promisify } = require('util');
const got = require('got');
const pipeline = promisify(stream.pipeline);
app.use('/files', require('express').static('files'));
app.listen(8080);
(async () => {
try {
let progress = null;
// Setup Got Request + EventHandlers
const request = got.stream('http://localhost:8080/files/1gb.test')
.on('downloadProgress', (p) => { progress = p; })
.on('end', () => {
console.log("GOT END");
console.log(progress && progress.transferred === progress.total ? "COMPLETE" : "NOTCOMPLETE");
})
// this does not get fired when source file is overwritten
.on('error', (e) => {
console.log("GOT ERROR");
console.log(e.message);
});
// WriteStream + EventHandlers
const writer = fs.createWriteStream('./downloaded/1gb_downloaded.test')
.on('finish', () => {
console.log("WRITER FINISH");
})
.on('error', (error) => {
console.log("WRITER ERROR", error.message);
})
.on('end', () => {
console.log("WRITER END");
})
.on('close', () => {
console.log("WRITER CLOSE");
});
await pipeline(request, writer);
} catch (e) {
console.error(e.name, e.message);
}
})();
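The final check in the end handler can be pulled out into a small helper (a sketch; the progress shape mirrors got's downloadProgress payload, where total is undefined when the server sends no Content-Length):

```javascript
// Sketch: decide whether a download finished, based on the last
// downloadProgress payload ({ percent, transferred, total }) from got.
function isDownloadComplete(progress) {
  // Without a known total the check is inconclusive, so report incomplete.
  if (!progress || typeof progress.total !== 'number') return false;
  return progress.transferred === progress.total;
}

console.log(isDownloadComplete({ transferred: 10, total: 10 })); // true
console.log(isDownloadComplete({ transferred: 5, total: 10 }));  // false
```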
Where do the files come from
In the real world, the files I am trying to download come from a server which I do not have access to and don't own. I don't have any information about how this server is set up. However, I added a simple local express server to the example code above to try things out.
const app = require('express')();
app.use('/files', require('express').static('files'));
app.listen(8080);
Question
Is this solution reliable enough to detect a non-finished download (for the case where the source file gets overwritten during the download)? Or are there any other events I could listen to which I missed?
The got request stream emits an error event whenever something goes wrong.
const request = got.stream('http://localhost:8080/files/1gb.test')
  .on('downloadProgress', (p) => { progress = p; })
  .on('end', () => {
    console.log("GOT END");
  })
  .on('error', (err) => {
    // Handle error here
  });
Various properties in the error object are available here
progress.total will not be available unless the server explicitly sets the Content-Length header (most servers do, but you might want to look out for that).
It seems there is no built-in way to safely check that a download has completed 100% using got. I came to the conclusion that my best option for now is the Node.js http module, whose response (IncomingMessage) object includes an aborted property. When the read stream emits an end event, I can check whether aborted is true or false.
I tested this method for the case when the source file gets overwritten during the download, and it works!
const http = require('http');
const app = require('express')();
const fs = require('fs');
app.use('/files', require('express').static('files'));
app.listen(8080);
http.get('http://localhost:8080/files/1gb.test', function (response) {
// WriteStream + EventHandlers
const writer = fs.createWriteStream('./downloaded/1gb_downloaded.test')
.on('finish', () => {
console.log("WRITER FINISH");
})
.on('error', (error) => {
console.log("WRITER ERROR", error.message);
})
.on('end', () => {
console.log("WRITER END");
})
.on('close', () => {
console.log("WRITER CLOSE");
});
// ReadStream + EventHandlers
response
.on('error', (e) => {
console.log("READER ERROR", e.message)
})
.on('end', () => {
console.log("READER END")
console.log(response.aborted ? "DOWNLOAD NOT COMPLETE" : "DOWNLOAD COMPLETE")
})
.on('close', () => {
console.log("READER CLOSE")
})
response.pipe(writer);
});
On the upside, this gives me one dependency less :), since I don't need got.
On the downside, this only assures me that a running download was not aborted because the source file was overwritten. When using the http module I need to add more error handling, for example for the case where the file was not found to begin with, which would have been more convenient with a request library like axios or got.
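That extra handling can be as small as a status guard (a sketch; assertOk is a hypothetical helper, not part of the http module):

```javascript
// Sketch: with the bare http module, a 404 still delivers a body, so the
// status code has to be checked by hand before piping the response.
function assertOk(res) {
  if (res.statusCode < 200 || res.statusCode >= 300) {
    throw new Error(`Unexpected status code ${res.statusCode}`);
  }
  return res;
}
```

Request libraries like got or axios do this for you by rejecting with an error on non-2xx responses.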
UPDATE
Realizing that the readable stream from http has something like the aborted property made me wonder why none of the request wrapper libraries like got offers something similar. So I tried axios again, with:
axios({
method: 'get',
url: 'http://localhost:8080/files/1gb.test',
responseType: 'stream'
}).then( function ( response ) {
});
Here the readable stream comes in response.data, and it has the same aborted property! 🎉
I have 2 calls to the YouTube v3 API in my Node.js code: channels and playlistItems. Both return JSON, and the response to the first call is parsed just fine, but parsing the response to the second call causes a syntax error. I am uncertain whether it's an error on my side or in the playlistItems API endpoint.
Here is my code (irrelevant parts taken out):
// At start of the bot, fetches the latest video which is compared to if an announcement needs to be sent
function setLatestVideo () {
fetchData().then((videoInfo) => {
if (videoInfo.error) return;
latestVideo = videoInfo.items[0].snippet.resourceId.videoId;
});
}
// Fetches data required to check if there is a new video release
async function fetchData () {
let path = `channels?part=contentDetails&id=${config.youtube.channel}&key=${config.youtube.APIkey}`;
const channelContent = await callAPI(path);
path = `playlistItems?part=snippet&maxResults=1&playlistId=${channelContent.items[0].contentDetails.relatedPlaylists.uploads}&key=${config.youtube.APIkey}`;
const videoInfo = await callAPI(path);
return videoInfo;
}
// Template HTTPS get function that interacts with the YouTube API, wrapped in a Promise
function callAPI (path) {
return new Promise((resolve) => {
const options = {
host: 'www.googleapis.com',
path: `/youtube/v3/${path}`
};
https.get(options, (res) => {
if (res.statusCode !== 200) return;
const rawData = [];
res.on('data', (chunk) => rawData.push(chunk));
res.on('end', () => {
try {
resolve(JSON.parse(rawData));
} catch (error) { console.error(`An error occurred parsing the YouTube API response to JSON, ${error}`); }
});
}).on('error', (error) => console.error(`Error occurred while polling YouTube API, ${error}`));
});
}
Examples of errors I'm getting: Unexpected token , in JSON and Unexpected number in JSON
Until ~2 weeks ago this code worked just fine without throwing any errors. I have no clue what has changed and can't seem to figure it out. What could possibly be causing this?
10 minutes later I figured out the fix! The variable rawData holds an array of Buffers, and looking more into that, I figured I should probably call Buffer.concat() on rawData before calling JSON.parse() on it. Turns out that's exactly what was needed.
I'm still unsure why this only started causing problems 6 months after writing this code, but that tends to happen.
Changed code:
res.on('end', () => {
try {
resolve(JSON.parse(Buffer.concat(rawData)));
} catch (error) { console.error(`An error occurred parsing the YouTube API response to JSON, ${error}`); }
});
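The failure mode is easy to reproduce without any network (a small demonstration; the chunk split is made up): JSON.parse coerces an array of Buffers to a string by joining the elements with commas, which would also explain why the code kept working as long as responses still arrived in a single chunk.

```javascript
// Two 'data' chunks that together hold one JSON document:
const rawData = [Buffer.from('{"items": [1'), Buffer.from(', 2]}')];

// Array-to-string coercion joins the chunks with a comma, producing
// '{"items": [1,, 2]}' — invalid JSON, hence "Unexpected token ," errors.
try {
  JSON.parse(rawData);
} catch (error) {
  console.error(`${error}`);
}

// Concatenating the buffers first restores the original document:
console.log(JSON.parse(Buffer.concat(rawData))); // { items: [ 1, 2 ] }
```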
I'm trying to read an http/2 push stream, but the code below fails with an error. When I use the same URL in the Chrome browser, I receive the 'heartbeat' generated by the stream server and can see data coming in as well. Any pointers? My goal is simply to read all the JSON push responses generated by the server.
Error:
{ Error [ERR_HTTP2_ERROR]: Protocol error
at Http2Session.onSessionInternalError [as error] (internal/http2/core.js:637:26)
code: 'ERR_HTTP2_ERROR',
name: 'Error [ERR_HTTP2_ERROR]',
errno: -505 }
Node.JS Code:
const http2 = require('http2');
const client = http2.connect('http://api.service.com/absolute/path/subscribe?api_key={key}');
const req = client.request();
req.setEncoding('utf8');
req.on('response', (headers, flags) => {
console.log(headers);
});
let data = '';
req.on('data', (d) => data += d);
req.on('end', () => {
console.log('end');
console.log(data);
client.destroy()
});
req.end();
Side note: I'm new to Node.js, and http/2 push streams are a new topic for me as well, so consider me a beginner struggling with this.
Resolved with some help, using the 'request' library:
https://www.npmjs.com/package/request
const request = require("request");
const getPushData = () => {
let completeResponse = "";
request
.get('http://api.service.com/absolute/path/subscribe?api_key={key}')
.on('response', (response)=>{
console.log("Response received successfully");
})
.on('data', (chunk)=>{
console.log("Receiving the chunk...");
console.log(chunk);
let dataReceived = Buffer.from(chunk).toString("utf8"); // new Buffer() is deprecated
console.log(dataReceived);
completeResponse += chunk;
})
.on('error', (err)=>{
console.log("Something went wrong:");
console.log(err.message);
})
.on("end", ()=>{
console.log("The complete data received is:");
console.log(completeResponse);
let jsonObj = JSON.parse(completeResponse);
console.log(jsonObj);
});
}
getPushData();
Side Note: The problem with this script is that if connection drops, the stream doesn't resume.
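One way to work around that (a sketch, not something the request library provides out of the box): wrap the request in a factory and resubscribe whenever the stream ends or errors. makeStream here is any factory returning an emitter with 'data', 'end' and 'error' events; with request it could be () => request.get(url).

```javascript
// Sketch: restart a push stream when the connection drops, up to a
// limited number of retries.
function keepAlive(makeStream, onData, retriesLeft = 5) {
  const stream = makeStream();
  stream.on('data', onData);
  let restarted = false;
  const restart = () => {
    // Guard against 'error' followed by 'end' triggering two restarts.
    if (restarted || retriesLeft <= 0) return;
    restarted = true;
    keepAlive(makeStream, onData, retriesLeft - 1);
  };
  stream.once('end', restart);
  stream.once('error', restart);
  return stream;
}
```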
We're doing some experimenting with Dialogflow and we've run into a complete stop for the time being. We're trying to set up a browser client that streams audio in chunks to Dialogflow via the node v2beta1 version of the dialogflow npm package. We followed the example to get it running and it works fine when we use the node server to pick up the sound via extra software (sox), but we want to stream from the browser. So we've set up a small code snippet that picks up the MediaStream from the mic.
When the data event is triggered, we get a chunk (an ArrayBuffer) that we pass, in chunks, to our node server.
On the server we've followed this example: https://cloud.google.com/dialogflow-enterprise/docs/detect-intent-stream#detect-intent-text-nodejs. The only thing we do differently is that instead of using pump to chain streams, we just write our chunks to the sessionsClient:
streamingDetectIntent().write({ inputAudio: [chunk] })
During experimentation we ran into several errors that we solved. But at this point we pass our chunks and receive empty responses, both during the stream and at the end.
Is this a valid way of passing audio to dialogflow, or do we really need to set up a stream? We do not want to use the node server as an entry, it needs to be the browser. We will have full control.
Client
import getUserMedia from 'get-user-media-promise';
import MicrophoneStream from 'microphone-stream';
export const startVoiceStream = () => {
const microphoneStream = new MicrophoneStream();
getUserMedia({ video: false, audio: true })
.then(function(micStream) {
microphoneStream.setStream(micStream);
socket.emit('startMicStream');
state.streamingMic = true;
setTimeout(() => {
// Just closing the stream on a timer for now
socket.emit('endMicStream');
}, 5000);
})
.catch(function(error) {
console.log(error);
});
microphoneStream.on('data', function(chunk) {
if (state.streamingMic) {
socket.emit('micStreamData', chunk);
}
});
};
Server code is much longer so I think I'll spare the details, but these are the main parts.
const initialStreamRequest = {
session: sessions.sessionPath,
queryParams: {
session: sessions.sessionPath, //TODO: try to delete
},
queryInput: {
audioConfig: {
audioEncoding: 'AUDIO_ENCODING_LINEAR_16',
sampleRateHertz: '16000',
languageCode: 'en-US',
},
singleUtterance: false
},
};
const startRecognitionStream = socketClient => {
streamIntent = sessions.sessionClient
.streamingDetectIntent()
.on('error', error => {
console.error({ error });
socketClient.emit('streamError', error);
})
.on('data', data => {
socketClient.emit('debug', { message: 'STREAM "ON DATA"', data });
if (data.recognitionResult) {
socketClient.emit(
'playerTranscript',
data.recognitionResult.transcript,
);
console.log(
`#Intermediate transcript : ${data.recognitionResult.transcript}`,
);
} else {
socketClient.emit('streamAudioResponse', data);
}
});
streamIntent.write(initialStreamRequest);
};
socket.on('micStreamData', data => {
if (streamIntent !== null) {
stop = true;
streamIntent.write({ inputAudio: data });
}
});
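One thing worth checking in this setup (an assumption on my part, not confirmed by the post): microphone-stream emits raw 32-bit float PCM samples by default, while the config above declares AUDIO_ENCODING_LINEAR_16. If the browser chunks really are Float32 data, they would need a conversion like this sketch before being written to the intent stream:

```javascript
// Sketch: convert raw Float32 PCM samples (range -1..1) into 16-bit
// signed little-endian PCM, which is what LINEAR16 encoding expects.
function floatTo16BitPCM(float32Samples) {
  const buffer = Buffer.alloc(float32Samples.length * 2);
  for (let i = 0; i < float32Samples.length; i++) {
    // Clamp to [-1, 1], then scale to the signed 16-bit range.
    const s = Math.max(-1, Math.min(1, float32Samples[i]));
    buffer.writeInt16LE(Math.round(s < 0 ? s * 32768 : s * 32767), i * 2);
  }
  return buffer;
}
```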
I am trying to call a REST API from a Firebase function which serves as the fulfillment for Actions on Google.
I tried the following approach:
const { dialogflow } = require('actions-on-google');
const functions = require('firebase-functions');
const http = require('https');
const host = 'wwws.example.com';
const app = dialogflow({debug: true});
app.intent('my_intent_1', (conv, {param1}) => {
// Call the rate API
callApi(param1).then((output) => {
console.log(output);
conv.close(`I found ${output.length} items!`);
}).catch(() => {
conv.close('Error occurred while trying to get vehicles. Please try again later.');
});
});
function callApi (param1) {
return new Promise((resolve, reject) => {
// Create the path for the HTTP request to get the vehicle
let path = '/api/' + encodeURIComponent(param1);
console.log('API Request: ' + host + path);
// Make the HTTP request to get the vehicle
http.get({host: host, path: path}, (res) => {
let body = ''; // var to store the response chunks
res.on('data', (d) => { body += d; }); // store each response chunk
res.on('end', () => {
// After all the data has been received parse the JSON for desired data
let response = JSON.parse(body);
let output = {};
//copy required response attributes to output here
console.log(response.length.toString());
resolve(output);
});
res.on('error', (error) => {
console.log(`Error calling the API: ${error}`)
reject();
});
}); //http.get
}); //promise
}
exports.myFunction = functions.https.onRequest(app);
This is almost working. The API is called and I get the data back. The problem is that without async/await the function does not wait for callApi to complete, and I get an error from Actions on Google that there was no response. After the error, I can see the console.log output in the Firebase log, so everything is working; it is just out of sync.
I tried using async/await but got an error, which I think is because Firebase uses an old version of Node.js that does not support async.
How can I get around this?
Your function callApi returns a promise, but you don't return that promise in your intent handler. Make sure you add the return so that the handler knows to wait for the response.
app.intent('my_intent_1', (conv, {param1}) => {
// Call the rate API
return callApi(param1).then((output) => {
console.log(output);
conv.close(`I found ${output.length} items!`);
}).catch(() => {
conv.close('Error occurred while trying to get vehicles. Please try again later.');
});
});
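Why the return matters can be shown without any Google libraries (a minimal simulation; frameworkRun is hypothetical and only mimics a framework that awaits whatever the handler returns):

```javascript
// Simulation: the "framework" awaits the handler's return value, then
// reads the response that the handler was supposed to set.
async function frameworkRun(handler) {
  const conv = { response: null, close(msg) { this.response = msg; } };
  await handler(conv);
  return conv.response; // what would be sent back to the user
}

// Stand-in for the real API call; resolves asynchronously.
const callApi = () =>
  new Promise((resolve) => setTimeout(() => resolve(['a', 'b']), 10));

// Without `return`, the framework reads the response too early:
const withoutReturn = (conv) => {
  callApi().then((output) => conv.close(`I found ${output.length} items!`));
};

// With `return`, the framework waits for the promise chain to settle:
const withReturn = (conv) =>
  callApi().then((output) => conv.close(`I found ${output.length} items!`));

frameworkRun(withoutReturn).then((r) => console.log(r)); // null
frameworkRun(withReturn).then((r) => console.log(r));    // I found 2 items!
```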