Attaching a file using resumable upload with the Gmail API

I am attempting to use Gmail's Resumable option for uploading attachments to an email. Documentation reference: https://developers.google.com/gmail/api/guides/uploads#resumable.
Currently I am able to send the email with the resumable URI, but without an attachment (using Postman). The documentation doesn't provide very clear examples of what the request should specifically look like, and there don't seem to be many examples elsewhere after scouring the internet.
My requests are in two parts:
Initial request -
Request URL:
POST /upload/gmail/v1/users/me/messages/send?uploadType=resumable
Host: www.googleapis.com
Headers:
Authorization: Bearer my_token_here
Content-Length: 113
Content-Type: application/json
X-Upload-Content-Length: 67
X-Upload-Content-Type: message/rfc822
Body:
{"raw":"VG86IG5pcnZhbmEucm9ja2VyQGdtYWlsLmNvbQpTdWJqZWN0OiBUZXN0RW1haWxTdWJqZWN0MwoKTWVzc2FnZSBjb250ZW50cyAjMy4"}
The body is a base64url-encoded string that includes the To, Subject, and email message contents. Gmail then returns a response with an empty body and a Location header that looks like the following: https://www.googleapis.com/upload/gmail/v1/users/me/messages/send?uploadType=resumable&upload_id=BRnB2UoAsKwzNMoQAy-JtmP6mu5agltqOWZ9uerI3k-KNTDJ73PWEjKuAHpko4RN6weSEysddH2kjj4G24uFw6E9oPv1XP69l7_KcmNuW-RAoz_5oS1T_4_E.
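For reference, here is a minimal Node.js sketch of how a raw value like the one above can be built; the recipient and subject come from decoding the string shown, and the Gmail API expects URL-safe base64 (base64url) of the full RFC 822 message:

// Build the RFC 822 message and base64url-encode it (illustrative sketch).
const message = [
  'To: nirvana.rocker@gmail.com',
  'Subject: TestEmailSubject3',
  '',
  'Message contents #3.'
].join('\n');

const raw = Buffer.from(message)
  .toString('base64')    // standard base64
  .replace(/\+/g, '-')   // make it URL-safe
  .replace(/\//g, '_')
  .replace(/=+$/, '');   // strip the padding

console.log(JSON.stringify({ raw }));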
I then follow up with a PUT request to that URL returned in the location header.
The second request looks like the following:
Request URL:
PUT /upload/gmail/v1/users/me/messages/send?uploadType=resumable&upload_id=BRnB2UoAsKwzNMoQAy-JtmP6mu5agltqOWZ9uerI3k-KNTDJ73PWEjKuAHpko4RN6weSEysddH2kjj4G24uFw6E9oPv1XP69l7_KcmNuW-RAoz_5oS1T_4_E
Host: www.googleapis.com
Headers:
Content-Length: 67
Content-Type: message/rfc822
Body:
{"raw":"VG86IG5pcnZhbmEucm9ja2VyQGdtYWlsLmNvbQpTdWJqZWN0OiBUZXN0RW1haWxTdWJqZWN0MwoKTWVzc2FnZSBjb250ZW50cyAjMy4"}
--- OR ---
I choose the binary option, and attach the file I am looking to upload via Postman.
I receive a response from Gmail with an object like this:
{
  "id": "159d7ded3125e255",
  "threadId": "159d7ded3125e255",
  "labelIds": [
    "SENT"
  ]
}
An email is sent successfully; however, there is no attachment with it. When I show the original message in Gmail, there is no evidence of an attachment either. The original looks like the following:
Received: from 325276275830 named unknown by gmailapi.google.com with HTTPREST; Wed, 25 Jan 2017 15:03:33 -0800
To: some.name@gmail.com
Subject: TestEmailSubject3
Date: Wed, 25 Jan 2017 15:03:33 -0800
Message-Id: <CEROA6F=0ohk33RD9XyC_gW1DZO88xYF4bXYqrCSct62MUuytDw@mail.gmail.com>
From: name_here@gmail.com
Message contents #3.
What am I missing? Do I need to encode some particular contents in a different way, or put some data in a different location? I'm not receiving any errors. I've been working on this for a few days now and I just can't figure it out.

I ran into the same problem. I made it work by using Nodemailer to create the email with the attachments, saving the result to a file, and then uploading that with this approach.

Instead of sending the email data in two parts, you can send all of your email data and attachment data as a single MIME message, like this.
I am using the JavaScript client (gapi), so it looks like this:
// MIME Mail Message data.
let mail = [
  'Content-Type: multipart/mixed; boundary="foo_bar_baz"\r\n',
  "MIME-Version: 1.0\r\n",
  "to: to@gmail.com\r\n",
  "from: from@gmail.com\r\n",
  "subject: i am subject\r\n\r\n",

  "--foo_bar_baz\r\n",
  'Content-Type: text/plain; charset="UTF-8"\r\n',
  "MIME-Version: 1.0\r\n",
  "Content-Transfer-Encoding: 7bit\r\n\r\n",
  "The actual message text goes here\r\n",

  "--foo_bar_baz\r\n",
  "Content-Type: application/json; name=package.json\r\n",
  "Content-Transfer-Encoding: base64\r\n",
  "Content-Disposition: attachment; filename=package.json\r\n\r\n",
  "<base64 file data, encoded according to the content type>",
  "\r\n",
  "--foo_bar_baz--",
].join("");
// Get the resumable upload link, then send the email once the link is available
// (the second request must wait for the first, so it is chained inside .then).
gapi.client
  .request({
    path: "/upload/gmail/v1/users/me/messages/send?uploadType=resumable",
    headers: {
      "X-Upload-Content-Type": "message/rfc822",
    },
    method: "post",
  })
  .then(
    (res) => {
      console.log(res);
      const resumableURL = res.headers.location;
      // send email to the resumable session URL
      return gapi.client.request({
        path: resumableURL,
        headers: {
          "Content-Type": "message/rfc822",
        },
        method: "post",
        body: mail,
      });
    },
    (err) => console.log(err)
  )
  .then(
    (res) => console.log(res),
    (err) => console.log(err)
  );
To convert the gapi.client.request calls to Fetch API calls, you just need to add an Authorization: Bearer <access_token> header. I tried using the Fetch API from the browser, but the response was blocked by a CORS error, so an API client like Postman should be used.
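For example, a rough fetch equivalent of the two requests above could look like the sketch below, assuming accessToken and the mail string from earlier already exist, and keeping the CORS caveat in mind when running in a browser:

// Inside an async function. Step 1: open the resumable upload session (illustrative sketch).
const initRes = await fetch(
  "https://www.googleapis.com/upload/gmail/v1/users/me/messages/send?uploadType=resumable",
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "X-Upload-Content-Type": "message/rfc822",
    },
  }
);
const resumableURL = initRes.headers.get("location");

// Step 2: upload the MIME message to the session URL.
const sendRes = await fetch(resumableURL, {
  method: "PUT", // the resumable upload docs use PUT for the upload request
  headers: {
    Authorization: `Bearer ${accessToken}`,
    "Content-Type": "message/rfc822",
  },
  body: mail,
});
console.log(await sendRes.json());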
To do more with the resumable upload method, check the documentation: Upload Attachment

Related

post request successful with Postman - unsuccessful with fetch-api

I have been bashing my head against the wall for the last 2 days with the following problem.
This is the scenario: when I browse to a particular website (a GET request), it sets a cookie called PHPSESSID="xyz", then prompts the user for a password and subsequently makes a POST request to the same URL, sending that cookie and a hidden form element alongside for verification; upon success it sends a PDF.
I can successfully replicate this in Postman.
I make a GET request, it sets the cookie, I fill the password into my form-data request body, manually add the secret string from the hidden form field for verification, hit send... and I get the PDF. So far so good.
However, I would like to automate this process so that I don't have to painstakingly extract the value of the hidden form field by hand; instead I want Node.js to make these requests, so I wrote the following code:
// Assumed imports for Node.js (node-fetch, form-data and a DOM parser such as xmldom)
const fetch = require('node-fetch')
const FormData = require('form-data')
const { DOMParser } = require('xmldom')

// making the GET request to the URL above
const response = await fetch('https://url...')
// extract the PHPSESSID value from the set-cookie header
const sessionString = String(response.headers.get('set-cookie')).substring(10, 36)
// parse the body
const htmlBody = await response.text()
let doc = new DOMParser().parseFromString(htmlBody)
// extract the verification token from the hidden form field
const formToken = doc.getElementById('verification__token').getAttribute('value')
let formData = new FormData();
formData.append('verification[char_1]',0)
formData.append('verification[char_2]',6)
formData.append('verification[char_3]',4)
formData.append('verification[char_4]',5)
formData.append('verification[char_5]',8)
formData.append('verification[char_6]',1)
formData.append('verification[char_7]',7)
formData.append('verification[char_8]',6)
formData.append('verification[_token]',formToken)
const obj = {
  headers: {
    "Cookie": `PHPSESSID=${sessionString};`,
    "Content-Type": "application/x-www-form-urlencoded",
    "User-Agent": "PostmanRuntime/7.29.2",
    "Accept-Encoding": "gzip, deflate, br",
    "credentials": "include"
  },
  method: "POST",
  body: formData
}
const postResponse = await fetch("https://url...",obj)
const r = await postResponse.text()
Unfortunately, the POST request fails in Node.js; the website simply redirects me back to the form in which I have to type in the password.
I suspect it has something to do with the headers / cookie, but I simply don't know.
Does anyone spot an obvious mistake?
Thank you
Solved... after sacrificing the entire weekend to this lovely task.
If anyone comes across a similar problem, here is the solution, or at least what helped me.
https://reqbin.com/curl
https://curlconverter.com
So basically make your request work with curl and then port it.
In my case that looked like this:
const x = await fetch('https://yourURL', {
  method: 'POST',
  headers: {
    'Cookie': 'PHPSESSID=lfjdd2uba1bmecr064rt7chvu3; Path=/; Secure; HttpOnly;',
    'Content-Type': 'application/x-www-form-urlencoded'
  },
  body: 'verification[_token]=5d5e4d8783daf952d5.UZ661yMyOUtJSQeG1Td7cUtxWqnI2Oaot-xMQevly4o.acH9hXR9SRkwGm30kE9WIggDNpqdl6Ln2rQnOIG9pcEp1tOiYnNLJggZcA&verification[char_1]=0&verification[char_2]=6&verification[char_3]=4&verification[char_4]=5&verification[char_5]=8&verification[char_6]=1&verification[char_7]=7&verification[char_8]=6'
});

How to set Content-Type of Supertest attachment

I came across a problem in my app where the Content-Type associated with an uploaded file was wrong if the file was sent from Microsoft Windows (application/octet-stream), while it was correct if sent from a Mac (text/csv). After fixing my code to not rely on the MIME type from the request, I would like to simulate that condition in one of my tests.
Given the following request, that includes JSON-stringified form fields and a file attachment, using supertest:
request(app)
  .post('/some/where')
  .field('someFormData', JSON.stringify(formData))
  .attach('someFile', 'someFile.csv')
  .expect(400)
  .end(done);
How can I change the Content-Type for the attached file? Looking at Edge's Network tab, I would like to see the following for the request above:
-----------------------------7e13121340602
Content-Disposition: form-data; name="someFormData"
{.............}
-----------------------------7e13121340602
Content-Disposition: form-data; name="someFile"; filename="someFile.csv"
Content-Type: application/octet-stream
ÿØÿà
(The JSON string has been omitted with dots)
(Instead of showing text/csv, I want to simulate the wrong content type.)
Using attach's contentType option works for me:
request(app)
  .post('/validateCertificate')
  .set('Content-type', 'multipart/form-data')
  .field('passphrase', '1234')
  .attach(
    'pfx',
    './src/build-request/test-certs/correct.pfx',
    { contentType: 'application/x-pkcs12' },
  )
  .expect(200);
Or if you're sending a buffer as the file:
request(app)
  .post('/validateCertificate')
  .set('Content-type', 'multipart/form-data')
  .field('passphrase', '1234')
  .attach(
    'pfx',
    buffer,
    { contentType: 'application/x-pkcs12', filename: 'correct.pfx' },
  )
  .expect(200);
request(app)
  .post('/some/where')
  .field('someFormData', JSON.stringify(formData))
  .set('Content-Type', 'application/octet-stream')
  .attach('someFile', 'someFile.csv')
  .expect(400)
  .end(done);
Use .set() to set whatever Content-Type you want, or any other header.

mailgun incoming mail event fetch attachment url

I have a Node endpoint that receives an incoming email as JSON from Mailgun, complete with any attachments.
The attachments are in a JSON array (xxx.com is used for privacy):
attachments: '[{"url": "https://sw.api.mailgun.net/v3/domains/xxx.com/messages/eyJwIjpmYWxzZSwiayI6ImZhMTU0NDkwLWVmYzgtNDVlNi1hYWMyLTM4M2EwNDY1MjJlNCIsInMiOiI2NmU1NmMzNTIwIiwiYyI6InRhbmtiIn0=/attachments/0", "content-type": "image/png", "name": "ashfordchroming_logo.png", "size": 15667}]'
But if I type the URL in the browser:
https://sw.api.mailgun.net/v3/domains/xxx.com/messages/eyJwIjpmYWxzZSwiayI6ImZhMTU0NDkwLWVmYzgtNDVlNi1hYWMyLTM4M2EwNDY1MjJlNCIsInMiOiI2NmU1NmMzNTIwIiwiYyI6InRhbmtiIn0=/attachments/0
I get
{
  "message": "Domain not found: xxx.com"
}
I wanted the simplest way to show the image attachment in HTML; I was hoping the URL would just work, since Mailgun stores the attachment.
So I was just trying to render the URL in a template from Node.
Do I need to attach auth / API key credentials to the URL to test this and make it work?
If you want to access the raw json, go to
https://sw.api.mailgun.net/v3/domains/xxx.com/messages/eyJwIjpmYWxzZSwiayI6ImZhMTU0NDkwLWVmYzgtNDVlNi1hYWMyLTM4M2EwNDY1MjJlNCIsInMiOiI2NmU1NmMzNTIwIiwiYyI6InRhbmtiIn0=/attachments/0
using username 'api' and password 'your-mailgun-privatekey'.
To do this programmatically, use the request package to read the buffer.
const rp = require("request-promise");
const fs = require("fs");

// Request the attachment using basic auth: username "api", password = your private API key.
let file = rp.get({
  uri: "attachment-url", // the attachment URL from the webhook payload
  headers: {
    "Accept": "message/rfc2822"
  }
}).auth("api", "your private key");

// Access the buffer here
file.on('data', (s) => {
  console.log(s)
})
file.pipe(fs.createWriteStream("./my-image.jpg"))
You can also pipe the file to S3 or any cloud bucket.
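If the goal is simply to show the image in an HTML template, as the question asks, one option is to fetch the attachment server-side with those credentials and embed it as a data URI. This is an illustrative sketch; the helper name and environment variable are made up for the example:

const rp = require("request-promise");

// Download the attachment with basic auth and return it as a data URI (sketch).
async function attachmentToDataUri(attachmentUrl, contentType) {
  const buffer = await rp.get({
    uri: attachmentUrl,
    encoding: null, // return a Buffer instead of a string
    auth: { user: "api", pass: process.env.MAILGUN_PRIVATE_KEY },
  });
  return `data:${contentType};base64,${buffer.toString("base64")}`;
}

// Usage in a template: <img src="${dataUri}"> where
// dataUri = await attachmentToDataUri(att.url, att["content-type"])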

Getting 401 uploading file into a table with a service account

I am using Node.js and the REST API to interact with BigQuery, and the google-oauth-jwt module for JWT signing.
I granted a service account write permission. So far I can list projects, list datasets, create a table, and delete a table. But when it comes to uploading a file via multipart POST, I ran into two problems:
a gzipped JSON file doesn't work; I get an error saying "end boundary missing"
when I use an uncompressed JSON file, I get a 401 Unauthorized error
I don't think this is related to my machine's time being out of sync, since other REST API calls worked as expected.
var url = 'https://www.googleapis.com/upload/bigquery/v2/projects/' + projectId + '/jobs';
var request = googleOauthJWT.requestWithJWT();

var jobResource = {
  jobReference: {
    projectId: projectId,
    jobId: jobId
  },
  configuration: {
    load: {
      sourceFormat: 'NEWLINE_DELIMITED_JSON',
      destinationTable: {
        projectId: projectId,
        datasetId: datasetId,
        tableId: tableId
      },
      createDisposition: '',
      writeDisposition: ''
    }
  }
};

request(
  {
    url: url,
    method: 'POST',
    jwt: jwtParams,
    headers: {
      'Content-Type': 'multipart/related'
    },
    qs: {
      uploadType: 'multipart'
    },
    multipart: [
      {
        'Content-Type': 'application/json; charset=UTF-8',
        body: JSON.stringify(jobResource)
      },
      {
        'Content-Type': 'application/octet-stream',
        body: fileBuffer.toString()
      }
    ]
  },
  function(err, response, body) {
    console.log(JSON.parse(body).selfLink);
  }
);
Can anyone shine some light on this?
P.S. The documentation on the BigQuery REST API is not up to date on many things; I wish Google would keep it updated.
Update 1:
Here is the full HTTP request:
POST /upload/bigquery/v2/projects/239525534299/jobs?uploadType=multipart HTTP/1.1
content-type: multipart/related; boundary=71e00bd1-1c17-4892-8784-2facc6998699
authorization: Bearer ya29.AHES6ZRYyfSUpQz7xt-xwEgUfelmCvwi0RL3ztHDwC4vnBI
host: www.googleapis.com
content-length: 876
Connection: keep-alive
--71e00bd1-1c17-4892-8784-2facc6998699
Content-Type: application/json
{"jobReference":{"projectId":"239525534299","jobId":"test-upload-2013-08-07_2300"},"configuration":{"load":{"sourceFormat":"NEWLINE_DELIMITED_JSON","destinationTable":{"projectId":"239525534299","datasetId":"performance","tableId":"test_table"},"createDisposition":"CREATE_NEVER","writeDisposition":"WRITE_APPEND"}}}
--71e00bd1-1c17-4892-8784-2facc6998699
Content-Type: application/octet-stream
{"practiceId":2,"fanCount":5,"mvp":"Hello"}
{"practiceId":3,"fanCount":33,"mvp":"Hello"}
{"practiceId":4,"fanCount":71,"mvp":"Hello"}
{"practiceId":5,"fanCount":93,"mvp":"Hello"}
{"practiceId":6,"fanCount":92,"mvp":"Hello"}
{"practiceId":7,"fanCount":74,"mvp":"Hello"}
{"practiceId":8,"fanCount":100,"mvp":"Hello"}
{"practiceId":9,"fanCount":27,"mvp":"Hello"}
--71e00bd1-1c17-4892-8784-2facc6998699--
You are most likely sending duplicate content-type headers to the Google API.
I can't easily make a request to Google BigQuery to test, but I'd start by removing the headers property of your options object to request().
Remove this:
headers: {
'Content-Type': 'multipart/related'
},
The Node.js request module automatically detects that you have passed in a multipart array, and it adds the appropriate content-type header. If you provide your own content-type header, you most likely end up with a "duplicate" one, which does not contain the multipart boundary.
If you modify your code slightly to print out the actual headers sent:
var req = request({...}, function(..) {...});
console.log(req.headers);
You should see something like this for your original code above (I'm using the Node REPL):
> req.headers
{ 'Content-Type': 'multipart/related',
'content-type': 'multipart/related; boundary=af5ed508-5655-48e4-b43c-ae5be91b5ae9',
'content-length': 271 }
And the following if you remove the explicit headers option:
> req.headers
{ 'content-type': 'multipart/related; boundary=49d2371f-1baf-4526-b140-0d4d3f80bb75',
'content-length': 271 }
Some servers don't deal well with multiple headers having the same name. Hopefully this solves the end boundary missing error from the API!
I figured this out myself. This is one of those silly mistakes that keeps you stuck for a whole day, and when you finally find the solution you want to kick yourself.
I got the 401 by typing the selfLink URL in the browser. Of course it's not authorized.
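For anyone else hitting this, a sketch of requesting the job's selfLink with the same JWT-signed request used for the upload (rather than a plain browser GET) might look like the following; jobSelfLink and jwtParams are assumed to already exist:

var googleOauthJWT = require('google-oauth-jwt');
var request = googleOauthJWT.requestWithJWT();

// Check the load job status at its selfLink with an authorized request (sketch).
request(
  {
    url: jobSelfLink,   // the selfLink returned by the jobs.insert call
    method: 'GET',
    jwt: jwtParams      // same service-account JWT options as the upload
  },
  function (err, response, body) {
    if (err) return console.error(err);
    console.log(JSON.parse(body).status); // job state / errors
  }
);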

IronMQ empty message body from push queue when read from Node.JS / Express.JS

I'm playing with node + express + IronMQ and I'm encountering a little problem.
In my Express.js POST callback I'm getting {} as the request body, but I'm sure that the message content is being pushed from my IronMQ message queue.
Any hint?
OK, I've found both the cause of my problem and its solution, so to answer my own question:
Problem:
1) I'm receiving POST messages from an IronMQ push queue (http://dev.iron.io/mq/reference/push_queues/); their content type is text/plain.
2) I'm using the Connect bodyParser middleware (express/connect), and it parses only application/json, application/x-www-form-urlencoded, and multipart/form-data.
http://www.senchalabs.org/connect/bodyParser.html
So the body gets parsed, and because its content type is not supported, the result is {}.
Solution:
In order to get the body of my text/plain request, I had to parse it myself, as in https://stackoverflow.com/a/9920700 (see the sketch below).
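For reference, a small sketch of what that manual parsing can look like (with a recent Express you could instead use express.text({ type: 'text/plain' }); the route path below is just a placeholder):

// Collect text/plain bodies into req.rawBody, similar to the linked answer (sketch).
app.use(function (req, res, next) {
  if (req.is('text/plain')) {
    req.rawBody = '';
    req.setEncoding('utf8');
    req.on('data', function (chunk) { req.rawBody += chunk; });
    req.on('end', function () { next(); });
  } else {
    next();
  }
});

app.post('/ironmq/push', function (req, res) {
  console.log('IronMQ message body:', req.rawBody);
  res.end();
});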
IronMQ has now updated its push queues to send custom headers. If you set the headers to 'Content-Type': 'application/json' in the list of subscribers when creating the queue, the body gets parsed correctly, e.g.:
# update groups queue
payload =
  subscribers: [
    {
      url: "#{process.env.ROOT_URL}/groups/update"
      headers:
        'Content-Type': 'application/json' # this fixes request parsing issue
    }
  ]
  push_type: 'multicast'
  retries: 3
  retries_delay: 10
  error_queue: 'groups_errors'

url = "https://mq-aws-us-east-1.iron.io/1/projects/#{process.env.IRON_MQ_PROJECT_ID}/queues/groups"
headers =
  'Authorization': "OAuth #{process.env.IRON_MQ_TOKEN}"
  'Content-Type': 'application/json'
result = HTTP.post url, {headers: headers, content: JSON.stringify(payload)}
Here's the relevant change on github
