I'm trying to write a script that will cancel all my orders on GDAX. According to the documentation for Cancel an Order I need to send a DELETE request to /delete. But I assume before I can do that I need to sign the message first.
When I submit the request using fetch in Node, I get this response: { message: 'Invalid API Key' }
Here is the code sample I am working on, with the confidential stuff replaced, of course:
var crypto = require('crypto');
var fetch = require('fetch');

const coinbaseSecret = 'abc...';
const coinbaseAPIKey = 'abc...';
const coinbasePassword = 'abc...';
const coinbaseRestAPIURL = "https://api-public.sandbox.gdax.com";

function start() {
    getTime(function(time) {
        cancelAll(time, function() {
            console.log('done');
        });
    });
}

function getTime(callback) {
    fetch.fetchUrl(coinbaseRestAPIURL + '/time', null, function(error, meta, body) {
        var response = JSON.parse(body.toString());
        console.log('response', response);
        var timeStamp = response.epoch;
        callback(timeStamp);
    });
}

function cancelAll(timeStamp, callback) {
    // Refer to https://docs.gdax.com/#cancel-an-order
    var signature = getSignature('DELETE', '/delete', "");
    console.log('signature', signature);
    var headers = {
        'Content-Type': 'application/json',
        'CB-ACCESS-KEY': coinbaseAPIKey,
        'CB-ACCESS-SIGN': signature,
        'CB-ACCESS-TIMESTAMP': timeStamp, // Date.now() / 1000,
        'CB-ACCESS-PASSPHRASE': coinbasePassword
    };
    console.log('headers', headers);
    fetch.fetchUrl(coinbaseRestAPIURL + '/delete', {
        method: 'DELETE',
        headers: headers
    }, function(error, meta, body) {
        var response = JSON.parse(body.toString());
        console.log('response', response);
        callback();
    });
}

function getSignature(method, requestPath, body) {
    // Refer to https://docs.gdax.com/#signing-a-message
    const secret = coinbaseSecret;
    const timestamp = Date.now() / 1000;
    const what = timestamp + method + requestPath + body;
    const key = Buffer(secret, 'base64');
    const hmac = crypto.createHmac('sha256', key);
    const signature = hmac.update(what).digest('base64');
    return signature;
}

start();
Go to the GDAX-Node GitHub repo and take a look at their code and examples.
1) Create an AuthenticatedClient by configuring it with your API details.
2) Then simply use the authedClient object and its cancelAllOrders method:
authedClient.cancelAllOrders({product_id: 'BTC-USD'}, callback);
You could wrap this in a function that calls it 'x' number of times (as the documentation suggests), or you could come up with something fancier if you'd like.
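For reference, here is a minimal sketch of that approach (the sandbox URL, the BTC-USD product, and the retry count of 3 are assumptions for illustration; the callback signature is what the gdax-node client typically passes):

var Gdax = require('gdax');

// API credentials as in the question (placeholders here).
var key = 'abc...';
var b64secret = 'abc...';
var passphrase = 'abc...';

// Configure the authenticated client with your API details (sandbox URL assumed).
var authedClient = new Gdax.AuthenticatedClient(
    key, b64secret, passphrase,
    'https://api-public.sandbox.gdax.com'
);

// The docs note cancel-all may need to be repeated, so call it a few times
// until no order ids come back (3 attempts is an arbitrary choice here).
function cancelEverything(attemptsLeft, done) {
    authedClient.cancelAllOrders({ product_id: 'BTC-USD' }, function (err, response, data) {
        if (err) { return done(err); }
        if (!data || data.length === 0 || attemptsLeft <= 1) { return done(null); }
        cancelEverything(attemptsLeft - 1, done);
    });
}

cancelEverything(3, function (err) {
    console.log(err || 'done');
});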
Note: make sure you pull the GitHub repo and do not install from npm directly, as there are a few bugs and issues that have been fixed on the Git repo but NOT pushed to npm.
...so use npm install coinbase/gdax-node when downloading your gdax package.
Hope that helps a little...
Related
I could not figure out why the Coinbase v2 REST endpoint returns an invalid signature error; maybe someone can see what I'm doing wrong. Everything I have found relates to the use of an old npm package that is no longer maintained. There is still a Coinbase Pro package, but I don't want to communicate with the Pro API.
const { createHmac } = require('crypto');
const axios = require('axios');

(async () => {
  const cbApiKey = 'xxx';
  const apiSecret = 'xxx';

  const method = 'GET';
  const path = '/v2/user';
  const body = '';
  const timestamp = Math.floor(new Date().getTime() * 1e-3);
  const message = timestamp + method + path + body;
  const key = Buffer.from(apiSecret, 'base64');
  const cbAccessSign = createHmac('sha256', key).update(message).digest('base64');

  const instance = axios.create();
  try {
    const user = await instance.request({
      method,
      url: `https://api.coinbase.com${path}`,
      headers: {
        'CB-ACCESS-KEY': `${cbApiKey}`,
        'CB-ACCESS-SIGN': `${cbAccessSign}`,
        'CB-ACCESS-TIMESTAMP': `${timestamp}`,
        'Content-Type': 'application/json',
      },
    });
    console.log(user);
  } catch (error) {
    console.log(error);
  }
})();
I'm going to add something here because it was driving me crazy earlier: trying crypto-js to no avail, then having to fight quite a bit and put several workarounds in place to make 'crypto', as it's used in the tutorial, work with all the impediments it currently has.
As far as I can tell, most of the invalid signatures go back to the CB-ACCESS-SIGN header, and the biggest challenge was figuring out what a working equivalent in crypto-js looked like and getting all of this working properly in Angular 10.
A stripped-down version of the API call and the hash string creation for the access sign:
import * as CryptoJS from 'crypto-js';

async getUserCreds(apk: string, aps: string): Promise<any> {
  let access_sign = Access_Sign(getUnixTimestamp(), 'GET', '/v2/user', '', aps);
  let httpOptions = {
    headers: new HttpHeaders({
      "CB-ACCESS-KEY": apk,
      "CB-ACCESS-SIGN": access_sign,
      "CB-ACCESS-TIMESTAMP": getUnixTimestamp().toString(),
      "Content-Type": "application/json"
    })
  };
  return this.http.get<any>('https://api.coinbase.com/v2/user', httpOptions)
    .pipe(shareReplay(), catchError((x) => { return this.handleErrorLog(x); }))
    .toPromise();
}

export function Access_Sign(timestamp: number, method: string, requestPath: string, body: string, secret: string) {
  let prehash = timestamp + method.toUpperCase() + requestPath + body;
  return CryptoJS.HmacSHA256(prehash, secret).toString(CryptoJS.enc.Hex);
}

export function getUnixTimestamp() {
  return Math.floor(Date.now() / 1000);
}
I found the answer here https://github.com/coinbase/coinbase-node/blob/master/lib/ClientBase.js#L101
The correct code for the signature is
var signature = crypto.createHmac('sha256', this.apiSecret).update(message).digest('hex');
If you're still facing this issue, make sure the path includes a leading /.
I was facing this issue because the path portion used in the message was accounts instead of /accounts.
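Putting those two points together, here is a minimal sketch of how the signing in the axios example above might look (assuming the v2 API wants the raw secret string, a hex digest, and a path with a leading slash such as /v2/user):

const { createHmac } = require('crypto');

const apiSecret = 'xxx';                          // used as-is, not base64-decoded
const timestamp = Math.floor(Date.now() / 1000);
const method = 'GET';
const path = '/v2/user';                          // note the leading slash
const body = '';

// hex digest, matching the coinbase-node client linked above
const message = timestamp + method + path + body;
const cbAccessSign = createHmac('sha256', apiSecret).update(message).digest('hex');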
I am trying to invoke Lambda through a CloudFront viewer request. Here is my Lambda code:
'use strict';

const AWS = require("aws-sdk");
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    /* Get request */
    const request = event.Records[0].cf.request;
    const requestbody = Buffer.from(request.body.data, 'base64').toString();
    const data = JSON.parse(requestbody);
    const Id = data.Name;
    console.log(Id);

    /* Generate body for response */
    const body =
        '<html>\n'
        + '<head><title>Hello From Lambda@Edge</title></head>\n'
        + '<body>\n'
        + '<h1>You clicked more than 10 Times </h1>\n'
        + '</body>\n'
        + '</html>';

    var params = {
        TableName: "Test",
        ProjectionExpression: "#V,#N",
        KeyConditionExpression: "#N = :v1",
        ExpressionAttributeNames: {
            "#N": "Name",
            "#V": "Value"
        },
        ExpressionAttributeValues: {
            ":v1": Id
        }
    };

    var querydb = docClient.query(params).promise();

    querydb.then(function(data) {
        console.log(data.Items[0].Value);
        if (data.Items[0].Value >= 11) {
            const response = {
                status: '200',
                body: body,
            };
            callback(null, response);
        } else {
            callback(null, request);
        }
    }).catch(function(err) {
        console.log(err);
    });
};
When I trigger the same Lambda through the console it gives the correct response, but when I deploy it through CloudFront it gives a 503 error. The same code without the DynamoDB call worked perfectly fine. Here is the working one:
'use strict';

const AWS = require("aws-sdk");
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    /* Get request */
    const request = event.Records[0].cf.request;
    const requestbody = Buffer.from(request.body.data, 'base64').toString();
    const data = JSON.parse(requestbody);

    /* Generate body for response */
    const body =
        '<html>\n'
        + '<head><title>Hello From Lambda@Edge</title></head>\n'
        + '<body>\n'
        + '<h1>You clicked more than 10 Times </h1>\n'
        + '</body>\n'
        + '</html>';

    if (data.Value >= 10) {
        const response = {
            status: '200',
            body: body,
        };
        callback(null, response);
    } else {
        callback(null, request);
    }
};
I had given full DynamoDB permissions to the Lambda@Edge function.
Any help is appreciated
Thanks
Where have you specified the region for DynamoDB?
It is possible that Lambda@Edge is executing in a region where your DDB table is missing.
Have a look at the AWS doc on the region order of precedence. You can also look at this Lambda@Edge workshop code and documentation for more details on calling DDB.
On a side note: a viewer-facing Lambda function making a call to a cross-region DynamoDB table will have negative effects on your latency. Not sure about your use case, but see if it is possible to move this call to an origin-facing event or to make an async call to DDB.
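For illustration, a minimal sketch of pinning the DocumentClient to the region that actually hosts the table (us-east-1 is only an assumption; use whichever region your Test table lives in):

const AWS = require("aws-sdk");

// Point the client explicitly at the table's region instead of relying on
// whichever edge location the function happens to execute in.
const docClient = new AWS.DynamoDB.DocumentClient({ region: "us-east-1" });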
I need to download or process a file from a SOAP-based web service in Node.js.
Can someone suggest how to handle this in Node.js?
I tried the 'node-soap' / 'soap' npm module. It worked for a normal SOAP web service, but not for a binary stream or MTOM-based SOAP web service.
I want to try to answer this... It's quite interesting that, 2 years and 2 months later, I cannot figure out how to easily solve the same problem.
I'm trying to get the attachment from a response like:
...
headers: { 'cache-control': 'no-cache="set-cookie"',
'content-type': 'multipart/related;boundary="----=_Part_61_425861994.1525782562904";type="application/xop+xml";start="";start-info="text/xml"',
...
body: '------=_Part_61_425861994.1525782562904\r\nContent-Type:
application/xop+xml; charset=utf-8;
type="text/xml"\r\nContent-Transfer-Encoding: 8bit\r\nContent-ID:
\r\n\r\n....\r\n------=_Part_61_425861994.1525782562904\r\nContent-Type:
application/octet-stream\r\nContent-Transfer-Encoding:
binary\r\nContent-ID:
\r\n\r\n�PNG\r\n\u001a\n\u0000\u0000\u0000\rIHDR\u0000\u0000\u0002,\u0000\u0000\u0005�\b\u0006\u0........binary....
I tried ws.js but found no solution.
My solution:
var request = require("request");
var bsplit = require('buffer-split')
// it will extract "----=_Part_61_425861994.1525782562904" from the response
function getBoundaryFromResponse(response) {
var contentType = response.headers['content-type']
if (contentType && contentType.indexOf('boundary=') != -1 ) {
return contentType.split(';')[1].replace('boundary=','').slice(1, -1)
}
return null
}
function splitBufferWithPattern(binaryData, boundary) {
var b = new Buffer(binaryData),
delim = new Buffer(boundary),
result = bsplit(b, delim);
return result
}
var options = {
method: 'POST',
url: 'http://bla.blabal.../file',
gzip: true,
headers: {
SOAPAction: 'downloadFile',
'Content-Type': 'text/xml;charset=UTF-8'
},
body: '<soapenv: ... xml request of the file ... elope>'
};
var data = [];
var buffer = null;
var filename = "test.png"
request(options, function (error, response, body) {
if (error) throw new Error(error);
if (filename && buffer) {
console.log("filename: " + filename)
console.log(buffer.toString('base64'))
// after this, we can save the file from base64 ...
}
})
.on('data', function (chunk) {
data.push(chunk)
})
.on('end', function () {
var onlyPayload = splitBufferWithPattern(Buffer.concat(data), '\r\n\r\n') // this will get from PNG
buffer = onlyPayload[2]
buffer = splitBufferWithPattern(buffer, '\r\n-')[0]
console.log('Downloaded.');
})
I am not sure it will work in most cases. It looks like unstable code to my eyes, so I'm looking for something better.
Use ws.js
Here is how to fetch the file attachments:
const fs = require('fs') // needed for writeFileSync below
const ws = require('ws.js')
const { Http, Mtom } = ws

var handlers = [ new Mtom(), new Http() ];

var request = '<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">' +
                '<s:Body>' +
                  '<EchoFiles xmlns="http://tempuri.org/">' +
                    '<File1 />' +
                  '</EchoFiles>' +
                '</s:Body>' +
              '</s:Envelope>'

var ctx = { request: request
          , contentType: "application/soap+xml"
          , url: "http://localhost:7171/Service/mtom"
          , action: "http://tempuri.org/IService/EchoFiles"
          }

ws.send(handlers, ctx, function(ctx) {
  // read an attachment from the soap response
  var file = ws.getAttachment(ctx, "response", "//*[local-name(.)='File1']")
  // work with the file
  fs.writeFileSync("result.jpg", file)
})
Two limitations:
No basic auth is provided out of the box; a patch is required: https://github.com/yaronn/ws.js/pull/40
If the file name is a URL, you need to apply another patch in mtom.js. Replace:
xpath = "//*[#href='cid:" + encodeURIComponent(id) + "']//parent::*"
with:
xpath = "//*[#href='cid:" + id + "']//parent::*"
This is a function in Node.js, which reads data from Analytics:
function getDataFromGA(Dimension, Metric, StartDate, EndDate, MaxResults) {
    var fs = require('fs'),
        crypto = require('crypto'),
        request = require('request'); // This is an external module

    var authHeader = {
            'alg': 'RS256',
            'typ': 'JWT'
        },
        authClaimSet = {
            'iss': '***t@developer.gserviceaccount.com', // Service account email
            'scope': 'https://www.googleapis.com/auth/analytics.readonly',
            // We MUST tell them we just want to read data
            'aud': 'https://accounts.google.com/o/oauth2/token'
        },
        SIGNATURE_ALGORITHM = '**',
        SIGNATURE_ENCODE_METHOD = '**',
        GA_KEY_PATH = '**',
        // finds current directory then appends private key to the directory
        gaKey;

    function urlEscape(source) {
        return source.replace(/\+/g, '-').replace(/\//g, '_').replace(/\=+$/, '');
    }

    function base64Encode(obj) {
        var encoded = new Buffer(JSON.stringify(obj), 'utf8').toString('base64');
        return urlEscape(encoded);
    }

    function readPrivateKey() {
        if (!gaKey) {
            gaKey = fs.readFileSync(GA_KEY_PATH, 'utf8');
        }
        return gaKey;
    }

    var authorize = function (callback) {
        var self = this,
            now = parseInt(Date.now() / 1000, 10), // Google wants us to use seconds
            cipher,
            signatureInput,
            signatureKey = readPrivateKey(),
            signature,
            jwt;

        // Setup time values
        authClaimSet.iat = now;
        authClaimSet.exp = now + 60; // Token valid for one minute

        // Setup JWT source
        signatureInput = base64Encode(authHeader) + '.' + base64Encode(authClaimSet);

        // Generate JWT
        cipher = crypto.createSign('RSA-SHA256');
        cipher.update(signatureInput);
        signature = cipher.sign(signatureKey, 'base64');
        jwt = signatureInput + '.' + urlEscape(signature);

        // Send request to authorize this application
        request({
            method: 'POST',
            headers: {
                'Content-Type': 'application/x-www-form-urlencoded'
            },
            uri: 'https://accounts.google.com/o/oauth2/token',
            body: 'grant_type=' + escape('urn:ietf:params:oauth:grant-type:jwt-bearer') +
                  '&assertion=' + jwt
        }, function (error, response, body) {
            if (error) {
                console.log(error);
                callback(new Error(error));
            } else {
                var gaResult = JSON.parse(body);
                if (gaResult.error) {
                    callback(new Error(gaResult.error));
                } else {
                    callback(null, gaResult.access_token);
                    // console.log(gaResult);
                    console.log("Authorized");
                }
            }
        });
    };

    var request = require('request'),
        qs = require('querystring');

    authorize(function (err, token) {
        if (!err) {
            // Query the number of total visits for a month
            var requestConfig = {
                'ids': 'ga:72333024',
                'dimensions': Dimension,
                'metrics': Metric,
                // 'sort': '-ga:users',
                'start-date': StartDate,
                'end-date': EndDate,
                'max-results': MaxResults
            };

            request({
                method: 'GET',
                headers: {
                    'Authorization': 'Bearer ' + token // Here is where we use the auth token
                },
                uri: 'https://www.googleapis.com/analytics/v3/data/ga?' + qs.stringify(requestConfig)
            }, function (error, resp, body) {
                console.log(body);
                var data = JSON.parse(body);
                console.log(data.totalsForAllResults);
                console.log(data.rows);
            });
        }
    });
}
Here I try to access it from outside:
var gaJSON = utils.getDataFromGA("ga:country", "ga:pageviews", "2011-08-04", "2014-09-12", "50");
res.send(gaJSON);
My question is: how can I access the variable data at the end of the first function? How can I call it from outside of the function?
You can assign data to a variable declared in the first function, but since the authorize method is asynchronous, that variable will still be undefined at the end of the first function. The best way to handle this is with callbacks.
I think you want to return something related to this variable, right? Try adding a callback parameter to the first function and then call that callback with the result:
callback(variable)
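For example, a minimal sketch of that change (getDataFromGA and the call site come from the question; the callback wiring is illustrative):

// Add a callback parameter to the outer function...
function getDataFromGA(Dimension, Metric, StartDate, EndDate, MaxResults, callback) {
    // ...keep the existing body, and inside the final request handler replace
    // the console.log calls with:
    //     var data = JSON.parse(body);
    //     callback(data);
}

// ...then consume the result asynchronously instead of expecting a return value:
utils.getDataFromGA("ga:country", "ga:pageviews", "2011-08-04", "2014-09-12", "50", function (data) {
    res.send(data);
});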
Why do you want to access it from outside?
If you desperately want to, then you need to create a function, pass the "data" as an argument, and then invoke that function:
console.log(body);
var data = JSON.parse(body);
myNewFunction(data);
Write all your logic that uses data inside "myNewFunction".
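For instance, a minimal sketch of such a function (the body is only illustrative):

// Receives the parsed Analytics result and does whatever needs doing with it.
function myNewFunction(data) {
    console.log(data.totalsForAllResults);
    console.log(data.rows);
    // e.g. send it back to the client from inside a request handler:
    // res.send(data);
}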
I'm trying to fetch an image from the web and encode it with base64.
What I have so far is this:
var request = require('request');
var BufferList = require('bufferlist').BufferList;

bl = new BufferList(),
request({ uri: 'http://tinypng.org/images/example-shrunk-8cadd4c7.png', responseBodyStream: bl }, function (error, response, body) {
    if (!error && response.statusCode == 200) {
        var type = response.headers["content-type"];
        var prefix = "data:" + type + ";base64,";
        var base64 = new Buffer(bl.toString(), 'binary').toString('base64');
        var data = prefix + base64;
        console.log(data);
    }
});
This seems to be pretty close to the solution, but I can't quite get it to work. It recognizes the data type and gives out this output:
data:image/png;base64
However, the bufferlist 'bl' seems to be empty.
BufferList is obsolete, as its functionality is now in Node core. The only tricky part here is setting request not to use any encoding:
var request = require('request').defaults({ encoding: null });

request.get('http://tinypng.org/images/example-shrunk-8cadd4c7.png', function (error, response, body) {
    if (!error && response.statusCode == 200) {
        data = "data:" + response.headers["content-type"] + ";base64," + Buffer.from(body).toString('base64');
        console.log(data);
    }
});
If anyone encounters the same issue while using axios as the HTTP client, the solution is to add the responseType property to the request options with the value 'arraybuffer':
let image = await axios.get('http://aaa.bbb/image.png', {responseType: 'arraybuffer'});
let returnedB64 = Buffer.from(image.data).toString('base64');
Hope this helps
Latest, as of the end of 2017
Well, after reading the above answers and a bit of research, I found a new way which doesn't require any package installation; the http module (which is built-in) is enough!
NOTE: I have used it with Node version 6.x, so I guess it is also applicable to later versions.
var http = require('http');

http.get('http://tinypng.org/images/example-shrunk-8cadd4c7.png', (resp) => {
    resp.setEncoding('base64');
    body = "data:" + resp.headers["content-type"] + ";base64,";
    resp.on('data', (data) => { body += data; });
    resp.on('end', () => {
        console.log(body);
        //return res.json({result: body, status: 'success'});
    });
}).on('error', (e) => {
    console.log(`Got error: ${e.message}`);
});
I hope it helps!
Also, check out more about http.get(...) here!
Another way, using node-fetch, which breaks down the steps per variable:
const fetch = require('node-fetch');

// (assumes this code runs inside an async function)
const imageUrl = "Your URL here";

const imageUrlData = await fetch(imageUrl);
const buffer = await imageUrlData.arrayBuffer();
const stringifiedBuffer = Buffer.from(buffer).toString('base64');
const contentType = imageUrlData.headers.get('content-type');
// contentType already includes the "image/..." part, so don't prepend it twice
const imageBase64 = `data:${contentType};base64,${stringifiedBuffer}`;
If you know the image type, it's a one-liner with the node-fetch package. Might not suit everyone, but I already had node-fetch as a dependency, so in case others are in a similar boat:
await fetch(url).then(r => r.buffer()).then(buf => `data:image/${type};base64,`+buf.toString('base64'));
If you are using axios, then you can follow the steps below:
var axios = require('axios');

const url = "put your url here";
const image = await axios.get(url, { responseType: 'arraybuffer' });
const raw = Buffer.from(image.data).toString('base64');
const base64Image = "data:" + image.headers["content-type"] + ";base64," + raw;
You can check the result by decoding the Base64 string back into an image.
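For instance, a quick sanity check is to decode the string back into a file and open it (check.png is an arbitrary file name, and base64Image comes from the snippet above):

const fs = require('fs');

// Strip the data-URL prefix, decode the remainder back into binary,
// then open check.png in an image viewer to confirm it matches the original.
const decoded = Buffer.from(base64Image.split(';base64,').pop(), 'base64');
fs.writeFileSync('check.png', decoded);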
You can use the base64-stream Node.js module, which is a streaming Base64 encoder / decoder. The benefit of this method is that you can convert the image without having to buffer the whole thing into memory, and without using the request module.
var http = require('http');
var base64encode = require('base64-stream').Encode;

http.get('http://tinypng.org/images/example-shrunk-8cadd4c7.png', function(res) {
    if (res.statusCode === 200)
        res.pipe(base64encode()).pipe(process.stdout);
});
To download and encode an image into a Base64 string, I use the node-base64-image npm module.
Download and encode an image:
var base64 = require('node-base64-image');

var options = { string: true };

base64.base64encoder('www.someurl.com/image.jpg', options, function (err, image) {
    if (err) {
        console.log(err);
    }
    console.log(image);
});
Encode a local image:
var base64 = require('node-base64-image');

var path = __dirname + '/../test.jpg',
    options = { localFile: true, string: true };

base64.base64encoder(path, options, function (err, image) {
    if (err) { console.log(err); }
    console.log(image);
});
One-liner:
Buffer.from(
    (
        await axios.get(image, {
            responseType: "arraybuffer",
        })
    ).data,
    "utf-8"
).toString("base64")
Old post, but it could help someone.
This builds on Dmytro's response, which helped me.
const base64FromUrl = async (url: string) => {
    try {
        return Buffer.from((await axios.get(url, { responseType: "arraybuffer" })).data, "utf-8").toString("base64");
    } catch (error) {
        return "";
    }
};
I've just added error handling.
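A possible usage, assuming you are inside an async function and using a placeholder URL:

const b64 = await base64FromUrl("https://example.com/image.png");
console.log(b64 ? "encoded, length " + b64.length : "failed to encode");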
You can use the image-to-base64 Node.js module.
The benefit of using this module is that you can convert your image hassle-free.
const imageToBase64 = require('image-to-base64');

const imageLink = 'Your image link';

imageToBase64(imageLink)
    .then((response) => {
        const base64Image = `data:image/png;base64,${response}`;
        console.log(base64Image);
    })
    .catch((error) => {
        console.log(error);
    });