function fridayNight() {
  const videoPath = "C:\\GitHub\\DivasLive\\DivasLive\\nsync.mp4";
  console.log("It's Friday night and I just Got Paid!");

  var b64content = fs.readFileSync(videoPath, { encoding: 'base64' });
  var mediaType = MIM.getMIMEType(videoPath);

  T.post('media/upload', { media_data: b64content, media_type: mediaType }, function (err, data, response) {
    if (err) {
      console.log(err);
    } else {
      console.log(data);
      var mediaIdStr = data.media_id_string;
      var params = { status: "Just got paid!", media_id: [mediaIdStr] };
      T.post('statuses/update', params, function (err, data, response) {
        console.log(data);
        console.log(err);
      });
    }
  });
}
I keep getting a 400: Media type unrecognized, even though I'm explicitly defining the media type on line 88. Here's the full gist as well: https://gist.github.com/MetzinAround/25b5771564aa7de183391398db52dbef
For videos and GIFs, you need to use the chunked media upload method - you cannot do it in a "single shot"; the upload has multiple stages (INIT, APPEND, FINALIZE), per the Twitter docs.
Please note that to upload videos or GIFs (tweet_video, amplify_video, and tweet_gif), you need to use the chunked upload end-point.
It turns out that the twit module has a helper method for doing this, postMediaChunked - this also saves you from having to tell Twitter the MIME type of the data, which means you can remove the import of the mim module.
Here's a minimal example of just doing the media part - you just need to extract the media_id_string and use that in the statuses/update call:
// Create a Twit object to connect to the Twitter API
const Twit = require('twit')
const T = new Twit(config)

var filePath = '/Users/myuser/Downloads/robot.mp4'
T.postMediaChunked({
  file_path: filePath
}, function (err, data, response) {
  console.log(data)
})
output:
{
media_id: 1379414276864151600,
media_id_string: '1379414276864151557',
media_key: '7_1379414276864151557',
size: 924669,
expires_after_secs: 86400,
processing_info: { state: 'pending', check_after_secs: 1 }
}
(note that, in JavaScript, you should always use the string versions of Twitter IDs - as you can see here, the numeric version in media_id does not match media_id_string, because JavaScript cannot represent integers that large exactly and has mangled the numeric value)
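To finish the flow, here is a minimal sketch of posting the tweet with the returned ID (assuming the same T object and filePath as above; for video you may also need to wait until processing_info reports success before tweeting):

T.postMediaChunked({ file_path: filePath }, function (err, data, response) {
  if (err) return console.log(err)

  // use the string version of the ID, per the note above
  var params = { status: "Just got paid!", media_ids: [data.media_id_string] }

  T.post('statuses/update', params, function (err, data, response) {
    console.log(err || data)
  })
})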
I have an application built using MongoDB as the database, Node.js on the back-end, and AngularJS on the front-end.
I need to export a large amount of data from MongoDB to a CSV file and make it possible for the user to download it. My first implementation retrieved the data from MongoDB using a stream (Mongoose), saved the file using a writable stream, and then returned the path so AngularJS could download it.
This approach fails for bigger files, as they take longer to write and the original request in AngularJS times out. I changed the back-end so that, instead of writing the file to disk, it streams the data to the response, so AngularJS can receive it in chunks and download the file.
Node.js
router.get('/log/export', async (req: bloo.BlooRequest, res: Response) => {
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="download-' + Date.now() + '.csv"');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Pragma', 'no-cache');

  const query = req.query;
  const service = new LogService();

  service
    .exportUsageLog(req.user, query)
    .pipe(res);
});
public exportUsageLog(user: User, query: any) {
  let criteria = this.buildQuery(query);
  let stream = this.dao.getLogs(criteria);

  return stream
    .pipe(treeTransform)
    .pipe(csvTransform);
}
public getLogs(criteria: any): mongoose.QueryStream {
  return this.logs
    .aggregate([
      {
        '$match': criteria
      }, {
        '$lookup': {
          'from': 'users',
          'localField': 'username',
          'foreignField': '_id',
          'as': 'user'
        }
      }
    ])
    .cursor({})
    .exec()
    .stream();
}
This Node.js implementation works, as I can see the data using the Postman application.
My issue is how to receive these chunks in AngularJS, write them to a file, and start the download on the client side, now that the back-end streams the data instead of only returning the file path.
My current AngularJS code:
cntrl.export = () => {
  LogService
    .exportLog(cntrl.query)
    .then((res: any) => {
      if (res.data) {
        console.dir(res.data);

        let file = res.data;
        let host = location.protocol + "//" + window.location.host.replace("www.", "");
        let path = file.slice(7, 32);
        let fileName = file.slice(32);
        let URI = host + "/" + path + fileName;
        let link = document.createElement("a");

        link.href = URI;
        link.click();
      }
    })
    .catch((err: any) => console.error(err));
}
angular
  .module('log', [])
  .factory('LogService', LogService);

function LogService(constants, $resource, $http) {
  const ExportLog = $resource(constants.api + 'modules/log/export');

  return {
    exportLog(query: any) {
      return ExportLog.get(query, res => res).$promise;
    }
  };
}
With the above code, "res" is undefined.
How could I achieve that?
Thanks in advance.
I am making an app where the user's browser records the user speaking and sends the audio to the server, which then passes it on to the Google Speech-to-Text interface. I am using MediaRecorder to get 1-second blobs, which are sent to the server. On the server side, I send these blobs over to the Google Speech-to-Text interface. However, I am getting empty transcriptions.
I know what the issue is. MediaRecorder's default MIME type is audio/webm;codecs=opus, which is not accepted by Google's Speech-to-Text API. After doing some research, I realize I need to use ffmpeg to convert the blobs to LINEAR16. However, ffmpeg only accepts audio files, and I want to be able to convert blobs. Then I can send the resulting converted blobs over to the API interface.
server.js
wsserver.on('connection', socket => {
  console.log("Listening on port 3002")

  audio = {
    content: null
  }

  socket.on('message', function (message) {
    // const buffer = new Int16Array(message, 0, Math.floor(data.byteLength / 2));
    // console.log(`received from a client: ${new Uint8Array(message)}`);
    // console.log(message);
    audio.content = message.toString('base64')
    console.log(audio.content);

    livetranscriber.createRequest(audio).then(request => {
      livetranscriber.recognizeStream(request);
    });
  });
});
livetranscriber
module.exports = {
  createRequest: function (audio) {
    const encoding = 'LINEAR16';
    const sampleRateHertz = 16000;
    const languageCode = 'en-US';

    return new Promise((resolve, reject, err) => {
      if (err) {
        reject(err)
      } else {
        const request = {
          audio: audio,
          config: {
            encoding: encoding,
            sampleRateHertz: sampleRateHertz,
            languageCode: languageCode,
          },
          interimResults: false, // If you want interim results, set this to true
        };
        resolve(request);
      }
    });
  },
  recognizeStream: async function (request) {
    const [response] = await client.recognize(request)
    const transcription = response.results
      .map(result => result.alternatives[0].transcript)
      .join('\n');
    console.log(`Transcription: ${transcription}`);
    // console.log(message);
    // message.pipe(recognizeStream);
  },
}
client
recorder.ondataavailable = function (e) {
  console.log('Data', e.data);
  var ws = new WebSocket('ws://localhost:3002/websocket');
  ws.onopen = function () {
    console.log("opening connection");
    // const stream = websocketStream(ws)
    // const duplex = WebSocket.createWebSocketStream(ws, { encoding: 'utf8' });
    var blob = new Blob(e, { 'type': 'audio/wav; base64' });
    ws.send(blob.data);
    // e.data).pipe(stream);
    // console.log(e.data);
    console.log("Sent the message")
  };
  // chunks.push(e.data);
  // socket.emit('data', e.data);
}
I wrote a similar script several years ago. However, I used a JS front-end and a Python back-end instead of Node.js. I remember using a sox transformer to transform the audio input into an output that the Google Speech API could use.
Perhaps this might be useful for you.
https://github.com/bitnahian/speech-transcriptor/blob/9f186e5416566aa8a6959fc1363d2e398b902822/app.py#L27
TLDR:
Converted from a .wav format to .raw format using ffmpeg and sox.
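In Node, a similar conversion could be done without touching the disk by piping each chunk through ffmpeg via stdin/stdout. This is only a sketch, assuming ffmpeg is installed and the incoming buffer is a self-contained webm/opus chunk (note that MediaRecorder only puts the container header in the first blob, so later blobs may not decode on their own):

const { spawn } = require('child_process');

// Convert a webm/opus buffer to raw LINEAR16 (s16le) PCM, 16 kHz mono
function webmToLinear16(inputBuffer) {
  return new Promise((resolve, reject) => {
    const ffmpeg = spawn('ffmpeg', [
      '-i', 'pipe:0',         // read input from stdin
      '-f', 's16le',          // raw signed 16-bit little-endian PCM (LINEAR16)
      '-acodec', 'pcm_s16le',
      '-ar', '16000',         // 16 kHz sample rate
      '-ac', '1',             // mono
      'pipe:1'                // write output to stdout
    ]);

    const chunks = [];
    ffmpeg.stdout.on('data', chunk => chunks.push(chunk));
    ffmpeg.on('error', reject);
    ffmpeg.on('close', code => {
      code === 0 ? resolve(Buffer.concat(chunks)) : reject(new Error('ffmpeg exited with code ' + code));
    });

    ffmpeg.stdin.write(inputBuffer);
    ffmpeg.stdin.end();
  });
}

// Hypothetical use inside the socket handler from the question:
// const pcm = await webmToLinear16(message);
// audio.content = pcm.toString('base64');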
For an API, I need to create a byte array from a local PDF file.
I've already checked Stack Overflow, but the existing solutions do not work. Whenever I try to execute the code, I get a bad request from the SOAP API. If I send zero bytes, I get an error message saying that zero bytes were sent. The credentials also work. The only thing I cannot solve is how to generate a byte array from a PDF file and send it via node-soap.
Here is my code:
var fs = require('fs');
var zlib = require('zlib');
var soap = require('soap');

var path_to_wsdl = "url to wsdl";

var data = {};
data['Authentication'] = {
  'Username': "myusername",
  'Password': "mypass"
}
data['Options'] = {
  'Testmode': true
}
data['Recipient'] = {
  'Company': "Firma",
  'FirstName': 'Vorname',
  'Name': 'Name',
  'Address': 'Street Nr.',
  'Zip': "68753",
  'City': 'Stadt',
  'Country': 'DE'
}

var smabuf = fs.createReadStream('test.pdf');

data['File'] = {
  'Filename': "test.pdf",
  'Bytes': smabuf,
  'Color': false,
  'Duplex': false,
  'Enveloping': 0
}

soap.createClient(path_to_wsdl, function (err, client) {
  if (err) {
    console.log(err);
  }
  client.BriefManager.BriefManagerSoap.CreateJob(data, function (err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
  });
});
I ran into this issue, and my solution was to use the fs library to read the target file with base64 encoding and send that within the request.
/**
 * Get a file buffer as a bytes array string
 * @param filePath Path to the file
 */
getFileAsBytesArray(filePath) {
  return fs.readFileSync(filePath, { encoding: 'base64' });
}
Then use it in the file bytes field.
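Applied to the code from the question, that replaces the read stream; a sketch, assuming the Bytes field of the WSDL expects base64-encoded content:

// Read the PDF as a base64 string instead of passing a ReadStream
var fileBytes = fs.readFileSync('test.pdf', { encoding: 'base64' });

data['File'] = {
  'Filename': "test.pdf",
  'Bytes': fileBytes, // base64 string, not a stream
  'Color': false,
  'Duplex': false,
  'Enveloping': 0
}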
I'm trying to write a script that will cancel all my orders on GDAX. According to the documentation for Cancel an Order I need to send a DELETE request to /delete. But I assume before I can do that I need to sign the message first.
When I submit the request using fetch in Node, I get this response: { message: 'Invalid API Key' }
Here is the code sample I am working on, with the confidential stuff replaced of course:
var crypto = require('crypto');
var fetch = require('fetch');

const coinbaseSecret = 'abc...';
const coinbaseAPIKey = 'abc...';
const coinbasePassword = 'abc...';
const coinbaseRestAPIURL = "https://api-public.sandbox.gdax.com";

function start() {
  getTime(function (time) {
    cancelAll(time, function () {
      console.log('done');
    });
  });
}

function getTime(callback) {
  fetch.fetchUrl(coinbaseRestAPIURL + '/time', null, function (error, meta, body) {
    var response = JSON.parse(body.toString());
    console.log('response', response);
    var timeStamp = response.epoch;
    callback(timeStamp);
  });
}

function cancelAll(timeStamp, callback) {
  // Refer to https://docs.gdax.com/#cancel-an-order
  var signature = getSignature('DELETE', '/delete', "");
  console.log('signature', signature);

  var headers = {
    'Content-Type': 'application/json',
    'CB-ACCESS-KEY': coinbaseAPIKey,
    'CB-ACCESS-SIGN': signature,
    'CB-ACCESS-TIMESTAMP': timeStamp, //Date.now() / 1000,
    'CB-ACCESS-PASSPHRASE': coinbasePassword
  };
  console.log('headers', headers);

  fetch.fetchUrl(coinbaseRestAPIURL + '/delete', {
    method: 'DELETE',
    headers: headers
  }, function (error, meta, body) {
    var response = JSON.parse(body.toString());
    console.log('response', response);
    callback();
  })
}

function getSignature(method, requestPath, body) {
  // Refer to https://docs.gdax.com/#signing-a-message
  const secret = coinbaseSecret;
  const timestamp = Date.now() / 1000;
  const what = timestamp + method + requestPath + body;
  const key = Buffer(secret, 'base64');
  const hmac = crypto.createHmac('sha256', key);
  const signature = hmac.update(what).digest('base64');
  return signature;
}

start();
Go to the Gdax-Node Github repo and take a look at their code and examples.
1) Create an authenticatedClient by configuring it with your API details,
2) Then simply use the authedClient object and the cancelAllOrders method:
authedClient.cancelAllOrders({product_id: 'BTC-USD'}, callback);
You could wrap this in a function and call it 'x' amount of times (as it states in the documentation), or you could think of something fancier if you'd like.
Note: make sure you pull the GitHub repo and do not install from npm directly, as there are a few bugs and issues that have been fixed in the git repo but NOT pushed to npm.
...so use npm install coinbase/gdax-node when downloading your gdax package.
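Putting those pieces together, a sketch of what that could look like (assuming the gdax package installed as above, and the same sandbox URL and credentials from the question):

const Gdax = require('gdax');

// Authenticated client pointed at the sandbox REST API
const authedClient = new Gdax.AuthenticatedClient(
  coinbaseAPIKey,
  coinbaseSecret,
  coinbasePassword,
  'https://api-public.sandbox.gdax.com'
);

// Cancel everything for one product; as noted above, you may need to
// repeat the call until no orders remain
authedClient.cancelAllOrders({ product_id: 'BTC-USD' }, (error, response, data) => {
  if (error) return console.error(error);
  console.log('cancelled order ids:', data);
});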
Hope that helps a little...
I am trying to send a POST request from a Node + Express server to my Foxx service on ArangoDB.
On the Node side:
var route = arangopi + '/edge/' + col.name ;
var body = {data: data, from: fromId, to: toId} ;
console.log('|| body :', route, body) ;
>> || body : http//XXX/_db/my-DB/my-foxx-service/path/to/visitedBy { data: { isBackup: true, text: '', isHint: true, continuance: 3441.5 }, from: 'Drop/27237133', to: 'Bot/41116378' }
return requestify.post (route, body)
On the Foxx side, I receive the request, but the logs tell me it has no body:
router.post('/path/to/:param', function (req, res) {
  console.log('|| body :', req.body)
  var data = req.body;
  var result = api.DoSomething(req.stateParams.param, data)
  res.send(result)
})
.response(joi.object().required(), 'Entry stored in the collection.')
.summary('Summary')
.description('Description')
>> || body : [Object { "binarySlice" : function binarySlice() { [native code] }, "asciiSlice" : function asciiSlice() { [native code] }, "base64Slice" : function base64Slice() { [native code] }, "ucs2Slice" : function ucs2Slice() { [native code] }, "hexSlice" : f...
On the node side I also tried the 'request' module.
return request.post(route, { form: body }, function (error, response, body) {
  console.log('error:', error);
  console.log('statusCode:', response && response.statusCode);
  console.log('body:', body);
  return response;
});
And I get the same logs from Foxx.
What am I doing wrong?
Here is a screenshot of my operation in the Foxx interface. Is it normal that I cannot specify a request body for testing?
I think the reason is that you haven't specified in the Foxx endpoint that a body is expected as part of the .post.
It took me a while to work out a way of defining Foxx MicroServices, and I read through a number of ArangoDB code examples before I settled on a pattern.
To help you get started, I've provided how I would quickly mock up the Foxx MicroService code in a way that is extensible, allowing you to separate your Routes from your Models.
Use these as examples to get your example working.
I've made assumptions that there are two document collections, 'Drop' and 'Bot' with an edge collection that joins them called 'VisitedBy'.
All these files are stored on your Foxx MicroService:
main.js
'use strict';
module.context.use('/v1/visitedBy', require('./routes/visitedBy'), 'visitedBy');
routes/visitedBy.js
'use strict';
const request = require('@arangodb/request');
const joi = require('joi');
const createRouter = require('@arangodb/foxx/router');
const VisitedBy = require('../models/visitedBy');

const visitedDataSchema = joi.object().required().description('Data that tracks a visited event');

const router = createRouter();
module.exports = router;

/*********************************************
 * saveVisitedBy
 * Path Params:
 *   none
 * Query Params:
 *   none
 * Body Params:
 *   body (required) The data that is used to record when something is visited
 */
router.post('/', function (req, res) {
  const visitedData = req.body;
  const savedData = VisitedBy.saveVisitedByData(VisitedBy.fromClient(visitedData));
  if (savedData) {
    res.status(200).send(VisitedBy.forClient(savedData));
  } else {
    res.status(500).send('Data not saved, internal error');
  }
}, 'saveVisitedBy')
.body(visitedDataSchema, 'visited data')
.response(VisitedBy.savedDataSchema, 'The response after the data is saved')
.summary('Save visited data')
.description('Save visited data');
models/visitedBy.js
'use strict';
const _ = require('lodash');
const joi = require('joi');
const db = require('@arangodb').db;

const visitedByEdgeCollection = 'VisitedBy';

/*
  Schema for a response after saving visitedBy data
*/
const savedDataSchema = {
  id: joi.string(),
  data: joi.object(),
  _from: joi.string(),
  _to: joi.string()
};

module.exports = {
  savedDataSchema: savedDataSchema,

  forClient(obj) {
    // Implement outgoing transformations here
    // Remove keys on the base object that do not need to go through to the client
    if (obj) {
      obj = _.omit(obj, ['_id', '_rev', '_oldRev', '_key']);
    }
    return obj;
  },

  fromClient(obj) {
    // Implement incoming transformations here
    return obj;
  },

  saveVisitedByData(visitedData) {
    const q = db._createStatement({
      "query": `
        INSERT {
          _from: @from,
          _to: @to,
          data: @data,
          date: DATE_NOW()
        } IN @@col
        RETURN MERGE ({ id: NEW._id }, NEW)
      `
    });
    q.bind('@col', visitedByEdgeCollection);
    q.bind('from', visitedData.from);
    q.bind('to', visitedData.to);
    q.bind('data', visitedData.data);
    const res = q.execute().toArray();
    return res[0];
  }
};
Your service should look like this in the Swagger interface:
You can learn more about using joi to define data structures here.
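For instance, a slightly stricter body schema for the visitedBy route could look like the following; this is just a sketch, and the field names simply mirror the { data, from, to } payload from the question rather than anything Foxx requires:

const joi = require('joi');

// Stricter alternative to joi.object().required() for the visitedBy body
const visitedDataSchema = joi.object({
  from: joi.string().required().description('_id of the source document, e.g. Drop/27237133'),
  to: joi.string().required().description('_id of the target document, e.g. Bot/41116378'),
  data: joi.object().required().description('Arbitrary payload stored on the edge')
}).required();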
It takes a bit of getting used to joi, but once you have some good working examples you can define great data definitions for incoming and outgoing data.
I hope this helps. It was difficult for me to get a basic MicroService code model that made it clear how things operated. I'm sure a lot more could be done with this example, but it should be a good starting point.
As David Thomas explained in his answer, I needed to specify a body format in my router code (Foxx side).
In short:
const bodySchema = joi.object().required().description('Data Format');

router.post('/path/to/:param', function (req, res) {
  var data = req.body;
  var result = api.DoSomething(req.stateParams.param, data)
  res.send(result)
})
.body(bodySchema, 'Body data')
.response(joi.object().required(), 'Entry stored in the collection.')
.summary('Summary')
.description('Description')