UrlFetchApp (Google Apps Script) vs node-fetch (Node.js)

I'm having some problems making a simple POST request with UrlFetchApp in Google Apps Script.
This code works fine in Node.js with the node-fetch library.
const fetch = require('node-fetch');
const URL = "https://login.XXXXXXXXXXXXX.com.br/api/login"
fetch(URL, {
"body": "{'my_json_data': 'login data'}",
"method": "POST",
}).then(res => res.text())
.then(body => console.log(JSON.parse(body)));
The same request in a Google Apps Script project using UrlFetchApp gives me a 403 Forbidden HTTP error.
var url = 'https://login.XXXXXXXXX.com.br/api/login';
var data = {
'email':'EMAIL',
'password':'PASS'
}
var options = {
method: 'POST',
payload: JSON.stringify(data)
}
var response = UrlFetchApp.fetch(url, options);
Logger.log(response)
What am I missing here?
Edit: Already tried with payload.

I believe your goal is as follows.
You want to convert the following Node.js script to Google Apps Script.
const fetch = require('node-fetch');
const URL = "https://login.XXXXXXXXXXXXX.com.br/api/login"
fetch(URL, {
"body": "{'my_json_data': 'login data'}",
"method": "POST",
}).then(res => res.text())
.then(body => console.log(JSON.parse(body)));
Modification points:
In your Node.js script, the body "{'my_json_data': 'login data'}" is sent as text/plain. So when UrlFetchApp is used, the content type needs to be set explicitly, because when the content type is not set, the data is sent as form data.
When the above points are reflected in your script, it becomes as follows.
Modified script:
From:
var options = {
method: 'POST',
payload: JSON.stringify(data)
}
To:
var options = {
method: 'POST',
payload: JSON.stringify(data),
contentType: "text/plain"
}
If contentType: "text/plain" causes an error, please modify it to contentType: "text/plain;charset=UTF-8".
Although I'm not sure about the specification of your API, I thought that contentType: "application/json" might also work. But this is just my guess.
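For completeness, here is a minimal sketch of the whole Apps Script call with the JSON content type, assuming your endpoint really does expect JSON credentials (the URL and field names are the ones from your question):

function login() {
  var url = 'https://login.XXXXXXXXX.com.br/api/login';
  var data = { 'email': 'EMAIL', 'password': 'PASS' };
  var options = {
    method: 'POST',
    contentType: 'application/json', // or 'text/plain;charset=UTF-8' if the API expects plain text
    payload: JSON.stringify(data),
    muteHttpExceptions: true // return error responses instead of throwing, so you can inspect them
  };
  var response = UrlFetchApp.fetch(url, options);
  Logger.log(response.getResponseCode());
  Logger.log(response.getContentText());
}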
Reference:
Class UrlFetchApp

Related

Grabbing an image from a URL, then passing it to an API as a file in multipart form data

So I have a URL that contains an image, and I want to pass that image as part of multipart form data to an API (to be specific, if it matters, the ClickUp API). I'm doing all of this inside of a Figma plugin, which is a browser environment.
The url looks something like https://s3-alpha-sig.figma.com....
The request works perfectly for a local image that I add manually, such as in Postman. Here is the code for a successful Postman request to this endpoint:
var axios = require('axios');
var FormData = require('form-data');
var fs = require('fs');
var data = new FormData();
data.append('attachment', fs.createReadStream('7UI7S5-pw/fdb54856-9c05-479f-b726-016ef252d9f5.png'));
data.append('filename', 'example.png');
var config = {
method: 'post',
url: 'https://api.clickup.com/api/v2/task/2phh5bf/attachment',
headers: {
'Authorization': '(my auth token)',
...data.getHeaders()
},
data : data
};
axios(config)
.then(function (response) {
console.log(JSON.stringify(response.data));
})
However, I don't have access to local files and need to upload from a URL, so here's what I've done so far:
var data = new FormData();
data.append('attachment', open(imgURL));
data.append('filename', 'screenshot.png');
fetch(`(the URL)`, {
"method": "POST",
"muteHttpExceptions": true,
"headers": {
'Authorization': '(my auth token)',
...data.headers
},
data: data
}).then(response => {
console.log(response)
})
How should I be converting the URL into something I can input as Form Data? Thanks so much in advance!
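One possible approach, sketched here under the assumption that the image URL is fetchable from the plugin's browser environment and that the endpoint accepts the same attachment field as in the Postman example: download the image first, then append the resulting Blob and let fetch generate the multipart boundary itself.

// Hedged sketch: fetch the image into a Blob, then attach it to the form data.
const imgResponse = await fetch(imgURL);
const imgBlob = await imgResponse.blob();

const data = new FormData();
data.append('attachment', imgBlob, 'screenshot.png');

const uploadResponse = await fetch('https://api.clickup.com/api/v2/task/2phh5bf/attachment', {
  method: 'POST',
  headers: {
    // Don't set Content-Type by hand; fetch adds the multipart boundary for FormData bodies.
    'Authorization': '(my auth token)'
  },
  body: data
});
console.log(await uploadResponse.json());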

POST request to DeepL api through node.js not working

Can anyone spot any problems that may explain why the API is giving me the forbidden error? I know the credentials are correct, as GET requests with the same info in the URL work fine.
Thank you in advance
app.get('/translate', (req, res) => {
var textToTranslate = "Hello friend"
const targetLanguage = "ES"
var link = `https://api-free.deepl.com/v2/translate`
var options =
{
method: 'POST',
headers: {
"Host": 'api-free.deepl.com',
"Content-Length": 54,
"Content-Type": 'application/x-www-form-urlencoded',
"User-Agent": "YourApp",
"Accept": "*/*",
},
body: JSON.stringify({
'auth_key': deeplAccessCode,
'text': textToTranslate,
'target_lang': targetLanguage
}),
}
return fetch(link, options)
.then((response) => {
console.log(response)
return response.json(); //Transform http body to json
})
.then((json)=> {
res.send(json) //return json to browser
})
.catch(e => {
console.log(e)
return res.sendStatus(400);
});
})
It's probably failing because you're setting the Content-Type of your body to application/x-www-form-urlencoded (which is correct as per the DeepL API specification) but then you provide a JSON body (which would require the content type to be application/json).
You need to provide a URL-encoded body instead, like the part you can also append to the URL after the ?. See also this answer on SO.
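A minimal sketch of that change using URLSearchParams (keeping the variable names from the question; for a URLSearchParams body, fetch sets the application/x-www-form-urlencoded content type and the correct Content-Length automatically):

const body = new URLSearchParams({
  auth_key: deeplAccessCode,
  text: textToTranslate,
  target_lang: targetLanguage
});

return fetch(link, { method: 'POST', body: body })
  .then(response => response.json()) // transform HTTP body to JSON
  .then(json => res.send(json)) // return JSON to the browser
  .catch(e => {
    console.log(e);
    return res.sendStatus(400);
  });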

Making a POST request using puppeteer with JSON payload

I'm trying to make a POST request using puppeteer and send a JSON object in the request; however, I'm getting a timeout. If I try to send normal URL-encoded form data instead, I at least get an "invalid request" reply from the server.
here is the relevant part of the code
await page.setRequestInterception(true);
const request = {"mac": macAddress, "cmd": "block"};
page.on('request', interceptedRequest => {
var data = {
'method': 'POST',
'postData': request
};
interceptedRequest.continue(data);
});
const response = await page.goto(configuration.commandUrl);
let responseBody = await response.text();
I'm using the same code to make a GET request (without payload) and it's working.
postData needs to be encoded as form data (in the format key1=value1&key2=value2).
You can create the string on your own or use the built-in module querystring:
const querystring = require('querystring');
// ...
var data = {
'method': 'POST',
'postData': querystring.stringify(request)
};
In case you need to submit JSON data:
'postData': JSON.stringify(request)
If you are sending JSON, you need to add the "content-type": "application/json" header. If you don't send it, you may receive an empty response.
var data = {
method : 'POST',
postData: '{"test":"test_data"}',
headers: { ...interceptedRequest.headers(), "content-type": "application/json"}
};
interceptedRequest.continue(data);
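Putting the two suggestions together, a sketch of the full interception flow (using the URL and payload from the question, and assuming the server expects JSON) might look like this:

await page.setRequestInterception(true);
const request = { mac: macAddress, cmd: 'block' };

page.on('request', interceptedRequest => {
  interceptedRequest.continue({
    method: 'POST',
    postData: JSON.stringify(request),
    headers: {
      ...interceptedRequest.headers(),
      'content-type': 'application/json'
    }
  });
});

const response = await page.goto(configuration.commandUrl);
const responseBody = await response.text();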

NodeJS, Axios - post file from local server to another server

I have an API endpoint that lets the client post their CSV to our server, which then posts it to someone else's server. I have done our server's part, which saves the uploaded file to our server, but I can't get the other part done. I keep getting the error { message: 'File not found', code: 400 }, which may mean the file never reaches the other server. I'm using axios as the agent; does anyone know how to get this done? Thanks.
// file = uploaded file
const form_data = new FormData();
form_data.append("file", fs.createReadStream(file.path));
const request_config = {
method: "post",
url: url,
headers: {
"Authorization": "Bearer " + access_token,
"Content-Type": "multipart/form-data"
},
data: form_data
};
return axios(request_config);
Update
The axios docs state the following, and the API I'm trying to call requires a file:
// data is the data to be sent as the request body
// Only applicable for request methods 'PUT', 'POST', and 'PATCH'
// When no transformRequest is set, must be of one of the following types:
// - string, plain object, ArrayBuffer, ArrayBufferView, URLSearchParams
// - Browser only: FormData, File, Blob
// - Node only: Stream, Buffer
Is there any way to make axios send a file as a whole? Thanks.
The 2 oldest answers did not work for me. This, however, did the trick:
const FormData = require('form-data'); // npm install --save form-data
const form = new FormData();
form.append('file', fs.createReadStream(file.path));
const request_config = {
headers: {
'Authorization': `Bearer ${access_token}`,
...form.getHeaders()
}
};
return axios.post(url, form, request_config);
form.getHeaders() returns an Object with the content-type as well as the boundary.
For example:
{ "content-type": "multipart/form-data; boundary=-------------------0123456789" }
I'm thinking the createReadStream is your issue because it's async. Try this.
Since createReadStream extends the event emitter, we can "listen" for when it finishes/ends.
var newFile = fs.createReadStream(file.path);
// personally I'd function out the inner body here and just call
// to the function and pass in the newFile
newFile.on('end', function() {
const form_data = new FormData();
form_data.append("file", newFile, "filename.ext");
const request_config = {
method: "post",
url: url,
headers: {
"Authorization": "Bearer " + access_token,
"Content-Type": "multipart/form-data"
},
data: form_data
};
return axios(request_config);
});
This is what you really need:
const form_data = new FormData();
form_data.append("file", fs.createReadStream(file.path));
const request_config = {
headers: {
"Authorization": "Bearer " + access_token,
"Content-Type": "multipart/form-data"
},
data: form_data
};
return axios
.post(url, form_data, request_config);
In my case, fs.createReadStream(file.path) did not work.
I had to use a buffer instead.
const form = new FormData();
form.append('file', fs.readFileSync(filePath), fileName);
const config = {
headers: {
Authorization: `Bearer ${auth.access_token}`,
...form.getHeaders(),
},
};
axios.post(api, form.getBuffer(), config);
I have made an interceptor you can connect to axios to handle this case in node: axios-form-data. Any feedback would be welcome.
npm i axios-form-data
example:
import axiosFormData from 'axios-form-data';
import axios from 'axios';
import { createReadStream } from 'fs';
// connect axiosFormData interceptor to axios
axios.interceptors.request.use(axiosFormData);
// send request with a file in it, it automatically becomes form-data
const response = await axios.request({
method: 'POST',
url: 'http://httpbin.org/post',
data: {
nonfile: 'Non-file value',
// if there is at least one streamable value, the interceptor wraps the data into FormData
file: createReadStream('somefile'),
},
});
// response should show "files" with file content, "form" with other values
// and multipart/form-data with random boundary as request header
console.log(response.data);
I had a similar issue: I had a "pdf-creator-service" for generating PDF documents from HTML.
I use the mustache template engine to create the HTML document - https://www.npmjs.com/package/mustache
Mustache.render returns the HTML as a string, so what do I need to do to pass it to the pdf-generator-service? See my suggestion below.
//...
async function getPdfDoc(props: { foo: string, bar: string }): Promise<Buffer> {
  const templateFile = readFileSync(joinPath(process.cwd(), 'file.html'), 'utf8')
  const htmlString = mustache.render(templateFile, props)
  const readableStream = this.getReadableStreamFromString(htmlString)
  const formData = new FormData() // from 'form-data'
  formData.append('file', readableStream, { filename: 'file.html' })
  const formHeaders = formData.getHeaders()
  const response = await axios.request<Buffer>(
    {
      method: 'POST',
      url: 'https://pdf-generator-service-url/pdf',
      data: formData,
      headers: {
        ...formHeaders,
      },
      responseType: 'arraybuffer', // ! important
    },
  )
  return response.data
}
getReadableStreamFromString(str: string): Readable {
  const bufferHtmlString = Buffer.from(str)
  const readableStream = new Readable() // from 'stream'
  readableStream._read = () => null // no-op _read; data is pushed manually below
  readableStream.push(bufferHtmlString)
  readableStream.push(null) // mark end of stream
  return readableStream
}
For anyone who wants to upload files from their local filesystem (actually from anywhere with the right streams architecture) with axios and doesn't want to use any external packages (like form-data):
Just create a readable stream and plug it right into axios request function like so:
await axios.put(
url,
fs.createReadStream(path_to_file)
)
Axios accepts a data argument of type Stream in a Node context.
Works fine for me, at least in Node v16.13.1 and with axios v0.27.2.

Node JS upload file streams over HTTP

I'm switching one of my projects from request over to something a bit more lightweight (such as got, axios, or fetch). Everything is going smoothly; however, I'm having an issue when attempting to upload a file stream (PUT and POST). It works fine with the request package, but any of the other three returns a 500 from the server.
I know that a 500 generally means an issue on the server's end, but it is consistent only with the HTTP packages that I'm testing out. When I revert my code to use request, it works fine.
Here is my current Request code:
Request.put(`http://endpoint.com`, {
headers: {
Authorization: `Bearer ${account.token.access_token}`
},
formData: {
content: fs.createReadStream(localPath)
}
}, (err, response, body) => {
if (err) {
return callback(err);
}
return callback(null, body);
});
And here is one of the attempts using another package (in this case, got):
got.put(`http://endpoint.com`, {
headers: {
'Content-Type': 'multipart/form-data',
Authorization: `Bearer ${account.token.access_token}`,
},
body: {
content: fs.createReadStream(localPath)
}
})
.then(response => {
return callback(null, response.body);
})
.catch(err => {
return callback(err);
});
Per the got documentation, I've also tried using the form-data package in conjunction with it according to its example and I still get the same issue.
The only difference between these two that I can gather is that with got I have to manually specify the Content-Type header, otherwise the endpoint gives me a proper error about it. Beyond that, I'm not sure how the two packages construct the body with the stream, but as I said, fetch and axios are also producing the exact same error as got.
If you want any of the snippets using fetch or axios I'd be happy to post them as well.
I know this question was asked a while ago, but I too am missing the simple pipe support from the request package
const request = require('request');
request
.get("https://res.cloudinary.com/demo/image/upload/sample.jpg")
.pipe(request.post("http://127.0.0.1:8000/api/upload/stream"))
// Or any readable stream
fs.createReadStream('/Users/file/path/localFile.jpeg')
.pipe(request.post("http://127.0.0.1:8000/api/upload/stream"))
and had to do some experimenting to find similar features in current libraries.
Unfortunately, I haven't worked with "got", but I hope the following two examples help anyone else who is interested in working with the native http/https modules or the popular axios library.
HTTP/HTTPS
Supports piping!
const http = require('http');
const https = require('https');
console.log("[i] Test pass-through: http/https");
// Note: http/https must match URL protocol
https.get(
"https://res.cloudinary.com/demo/image/upload/sample.jpg",
(imageStream) => {
console.log(" [i] Received stream");
imageStream.pipe(
http.request("http://localhost:8000/api/upload/stream/", {
method: "POST",
headers: {
"Content-Type": imageStream.headers["content-type"],
},
})
);
}
);
// Or any readable stream
const fs = require('fs');
fs.createReadStream('/Users/file/path/localFile.jpeg')
  .pipe(
    http.request("http://localhost:8000/api/upload/stream/", {
      method: "POST",
      headers: {
        // No upstream response to copy a content type from here,
        // so set it explicitly for the local .jpeg file
        "Content-Type": "image/jpeg",
      },
    })
  )
Axios
Note the usage of imageStream.data and that it's being attached to data in the Axios config.
const axios = require('axios');
(async function selfInvokingFunction() {
console.log("[i] Test pass-through: axios");
const imageStream = await axios.get(
"https://res.cloudinary.com/demo/image/upload/sample.jpg",
{
responseType: "stream", // Important to ensure axios provides stream
}
);
console.log(" [i] Received stream");
const upload = await axios({
method: "post",
url: "http://127.0.0.1:8000/api/upload/stream/",
data: imageStream.data,
headers: {
"Content-Type": imageStream.headers["content-type"],
},
});
console.log("Upload response", upload.data);
})();
Looks like this was a headers issue. If I use the headers directly from FormData (i.e., headers: form.getHeaders()) and just add in my additional headers afterwards (Authorization), then this ends up working just fine.
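As a sketch, the got version of that fix could look something like the following (same endpoint, token, and field name as in the question; the form-data instance provides both the body stream and the boundary header):

const fs = require('fs');
const got = require('got');
const FormData = require('form-data');

const form = new FormData();
form.append('content', fs.createReadStream(localPath));

got.put('http://endpoint.com', {
  headers: {
    ...form.getHeaders(), // multipart/form-data with the generated boundary
    Authorization: `Bearer ${account.token.access_token}`
  },
  body: form
})
  .then(response => callback(null, response.body))
  .catch(err => callback(err));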
For me, it only worked when I added the extra filename parameter to FormData.
before
const form = new FormData();
form.append('file', fileStream);
after
const form = new FormData();
form.append('file', fileStream, 'my-whatever-file-name.mp4');
That way I can send a stream from my backend to another backend in Node that is waiting for a multipart/form-data file field called 'file'.
