I'm trying to upload a remote file via the Slack API (https://slack.com/api/files.upload) using Node with the axios library.
https://api.slack.com/methods/files.upload
async filesUpload(token, channel, content, filename) {
  const form = new FormData()
  form.append('token', token)
  form.append('channels', channel)
  form.append('content', content)
  form.append('filename', filename)
  form.append('filetype', 'auto')
  const { data } = await axios.post(
    'https://slack.com/api/files.upload',
    form,
    {
      headers: form.getHeaders(),
    }
  )
}
// url is a publicly available remote jpg image
const { data } = await axios.get(url, {
  responseType: 'blob',
})
filesUpload('XXXX', 'XXXXX', data, 'foo.jpg')
The Slack API says all is good and posts the content (some gibberish) to the channel, but in the response I get a plain-text filetype:
...
mimetype: 'text/plain',
filetype: 'text',
...
I'm pretty sure it's about the encoding I'm sending, but I'm out of options. I tried downloading the file with responseType: 'blob' and responseType: 'arraybuffer', but no luck.
Please help.
I didn't manage to fix it using axios, but I switched the request to https://github.com/request/request-promise and everything works fine.
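For anyone who wants a concrete starting point, here is a minimal sketch of what that request-promise version might look like; the multipart file field, the arraybuffer download, and the jpeg content type are my assumptions, since the original answer doesn't show its code:

const rp = require('request-promise')
const axios = require('axios')

async function filesUpload(token, channel, buffer, filename) {
  // Send the binary through the multipart `file` field instead of `content`,
  // so Slack can detect the real file type rather than treating it as text.
  return rp({
    method: 'POST',
    uri: 'https://slack.com/api/files.upload',
    formData: {
      token,
      channels: channel,
      filename,
      file: {
        value: buffer,
        options: { filename, contentType: 'image/jpeg' },
      },
    },
    json: true,
  })
}

;(async () => {
  // url is a publicly available remote jpg image, as in the question
  const { data } = await axios.get(url, { responseType: 'arraybuffer' })
  await filesUpload('XXXX', 'XXXXX', Buffer.from(data), 'foo.jpg')
})()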
The Scenario
I am running a VueJs client, a NodeJs Restify API Server, and a Tika-server out of the official Docker Image. A user makes a POST call with formData containing a PDF file to be parsed. The API server receives the POST call and I save the PDF on the server. The API server should PUT the file to the unpack/all endpoint on the Tika-server and receive a zip containing a text file, a metadata file, and the set of images in the PDF. I would then process the zip and pass some data back to the client.
The Problem
I read the file to be parsed using let parsingData = fs.createReadStream(requestFilename); or let parsingData = fs.readFileSync(requestFilename);, set the axios data field to parsingData, and then make my request. When I get the response from the Tika-server, it seems the Tika-server has treated the request as empty: within the zip there are no images, the TEXT file is empty, and the METADATA file likewise describes an empty document.
When I make the following request to the Tika-server via cURL, I get a response zip file containing accurate text, metadata, and the stripped images:
curl -T pdf_w_images_and_text.pdf http://localhost:9998/unpack/all -H "X-Tika-PDFExtractInlineImages: true" -H "X-Tika-PDFExtractUniqueInlineImagesOnly: true" > tika-response.zip
The Code
let parsingData = fs.createReadStream('pdf_w_images_and_text.pdf');
axios({
  method: 'PUT',
  url: 'http://localhost:9998/unpack/all',
  data: parsingData,
  responseType: 'arraybuffer',
  headers: {
    'X-Tika-PDFExtractInlineImages': 'true',
    'X-Tika-PDFExtractUniqueInlineImagesOnly': 'true'
  },
})
.then((response) => {
  console.log('Tika-server response received');
  const outputFilename = __dirname + '\\output.zip';
  console.log('Attempting to convert Tika-server response data to ' + outputFilename);
  fs.writeFileSync(outputFilename, response.data);
  if (fs.existsSync(outputFilename)) {
    console.log('Tika-server response data saved at ' + outputFilename);
  }
})
.catch(function (error) {
  console.error(error);
});
The Question
How do I encode and attach my file to my PUT request in Node.js so that the Tika-server treats it the way it does when I make the request through cURL?
Axios is sending the request with a content type of application/x-www-form-urlencoded and therefore the file content isn't being detected and parsed.
You can change this by passing either the known content type of the file, or a content type of application/octet-stream to allow Apache Tika Server to auto-detect.
Below is a sample based on your question's code that illustrates this:
#!/usr/bin/env node
const fs = require('fs')
const axios = require('axios')

let parsingData = fs.createReadStream('test.pdf');
axios({
  method: 'PUT',
  url: 'http://localhost:9998/unpack/all',
  data: parsingData,
  responseType: 'arraybuffer',
  headers: {
    'X-Tika-PDFExtractInlineImages': 'true',
    'X-Tika-PDFExtractUniqueInlineImagesOnly': 'true',
    'Content-Type': 'application/octet-stream'
  },
})
.then((response) => {
  console.log('Tika-server response received');
  const outputFilename = __dirname + '/output.zip';
  console.log('Attempting to convert Tika-server response data to ' + outputFilename);
  fs.writeFileSync(outputFilename, response.data);
  if (fs.existsSync(outputFilename)) {
    console.log('Tika-server response data saved at ' + outputFilename);
  }
})
.catch(function (error) {
  console.error(error);
});
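As the answer notes, the known content type works just as well as application/octet-stream; a minimal variant of the sample above, assuming the input really is a PDF:

const fs = require('fs')
const axios = require('axios')

axios({
  method: 'PUT',
  url: 'http://localhost:9998/unpack/all',
  data: fs.createReadStream('test.pdf'),
  responseType: 'arraybuffer',
  headers: {
    'X-Tika-PDFExtractInlineImages': 'true',
    'X-Tika-PDFExtractUniqueInlineImagesOnly': 'true',
    // Declare the exact type up front instead of letting Tika auto-detect
    'Content-Type': 'application/pdf'
  },
})
  .then((response) => fs.writeFileSync(__dirname + '/output.zip', response.data))
  .catch((error) => console.error(error));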
I have two microservices: 1) one that generates a PDF using Puppeteer, which is essentially a Buffer object, and 2) one that receives the PDF in a request and attaches it to an email using Mailgun. I want to send the PDF from the first service to the second (once I am able to do that, attaching it to an email won't be difficult).
The way I am sending the PDF with request-promise is this:
import requestPromise from "request-promise";
import {Readable} from "stream";

// pdfBuffer is the result of 'await page.pdf({format: "a4"});' (Puppeteer method).
const stream = Readable.from(pdfBuffer);
/* also tried DUPLEX and Readable.from(pdfBuffer.toString()) and this code too:
const readable = new Readable();
readable._read = () => {};
readable.push(pdf);
readable.push(null);
*/
requestPromise({
  method: "POST",
  url: `${anotherServiceUrl}`,
  body: {data},
  formData: {
    media: {
      value: stream,
      options: {
        filename: "file.pdf",
        knownLength: pdfBuffer.length,
        contentType: "application/pdf"
      }
    }
  },
  json: true
});
But doing so, I get an "ERR_STREAM_WRITE_AFTER_END" error. How can I send this PDF from one service to the other, so that the other service can email it to the user?
I've done it from the frontend with:
fetch(url, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/pdf'
  },
  body: pdfData
})
In this case pdfData is a Blob, so in Node.js you would need a polyfill for that plus node-fetch.
const buffer = Buffer.from(pdfBuffer).toString("base64");
Then send that base64 string in the request body. On the receiving service:
const pdf = Buffer.from(body.buffer, "base64");
fs.writeFile("file.pdf", pdf, () => {});
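For completeness, a self-contained sketch of that base64 round trip; the Express receiver, the /email route, and the payload field names are my assumptions, not part of the original answer:

// Sending service: serialize the Puppeteer buffer as base64 and POST it as JSON
const axios = require('axios');

async function sendPdf(pdfBuffer, anotherServiceUrl) {
  await axios.post(`${anotherServiceUrl}/email`, {
    filename: 'file.pdf',
    pdf: pdfBuffer.toString('base64'),
  });
}

// Receiving service (assumed to be Express): decode the string back into a Buffer
const express = require('express');
const fs = require('fs');
const app = express();
app.use(express.json({ limit: '25mb' })); // base64 inflates the payload by roughly a third

app.post('/email', (req, res) => {
  const pdf = Buffer.from(req.body.pdf, 'base64');
  // hand `pdf` to Mailgun as an attachment, or write it to disk:
  fs.writeFileSync(req.body.filename, pdf);
  res.sendStatus(200);
});

app.listen(3001);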
I have a React component where I ask the user to add an image using react-dropzone. On drop, I save the image into an image state.
Like this:
const handleOnDrop = (files) => {
  setimage(files[0]);
}
Once I submit, I send a request to my back-end in order to get the URL with this function:
export const generateUploadURL = async () => {
  const rawBytes = await randomBytes(16);
  const imageName = rawBytes.toString('hex');
  const params = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: imageName,
    ContentType: 'image/*',
    Expires: 60
  };
  const uploadUrl = await s3.getSignedUrlPromise('putObject', params);
  return uploadUrl;
}
I get the URL and finally execute a PUT to S3 with it:
await axios.put(url, {
  headers: {
    "Content-Type": "multipart/form-data"
  },
  body: image
});
And then I save the data into my database, but that's not important.
The thing is, afterwards I can't render the image from the link I stored, so I opened the link and found this:
{"headers":{"Content-Type":"multipart/form-data"},"body":{"path":"asdasdsadtest.jpg"}}
I tried setting the Content-Type to the image's type, but that didn't work either. I have no clue how to make it work.
Why are you using the s3.getSignedUrlPromise('putObject') API? Using s3.upload would allow you to send the file in one go and would make it much simpler in my opinion. See https://stackabuse.com/uploading-files-to-aws-s3-with-node-js/ for an example of this solution.
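For reference, a rough sketch of that server-side s3.upload approach with the v2 AWS SDK; the function name and parameters here are placeholders, not taken from the question:

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Upload the raw file bytes directly from the backend instead of handing the
// browser a pre-signed URL.
async function uploadImage(buffer, imageName, mimeType) {
  const result = await s3.upload({
    Bucket: process.env.S3_BUCKET_NAME,
    Key: imageName,
    Body: buffer,          // the file bytes received from the client
    ContentType: mimeType, // e.g. 'image/jpeg'
  }).promise();
  return result.Location;  // URL of the uploaded object
}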
Solved it the following way:
url = await getS3Url(image);
await axios.put(url, image, {
  headers: {
    "Content-Type": image.type
  }
});
Passing the image as the second argument instead of placing it inside the config object is what worked: axios.put(url, data, config) takes the request body as its second parameter, which is why the first attempt uploaded the serialized config object rather than the image.
What I'm trying to accomplish is using a Firebase Cloud Function (Node.js) to:
First, download an image from a URL (e.g. from unsplash.com) using an axios.get() request
Second, take that image and upload it to a WordPress site using the WordPress REST API
The problem seems (to me) to be that the formData doesn't actually append any data, although the axios.get() request does indeed seem to retrieve a buffered image. Maybe it's something I'm doing wrong with the Node.js form-data library, or maybe I'm getting the image in the wrong encoding? This is my best (but unsuccessful) attempt:
async function uploadMediaToWordpress() {
  var FormData = require("form-data");
  var formData = new FormData();
  var response = await axios.get(
    "https://images.unsplash.com/photo-1610303785445-41db41838e3e?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=634&q=80",
    { responseType: "arraybuffer" }
  );
  formData.append("file", response.data);
  try {
    var uploadedMedia = await axios.post("https://wordpresssite.com/wp-json/wp/v2/media",
      formData, {
        headers: {
          "Content-Disposition": 'form-data; filename="example.jpeg"',
          "Content-Type": "image/jpeg",
          Authorization: "Bearer <jwt_token>",
        },
      });
  } catch (error) {
    console.log(error);
    throw new functions.https.HttpsError("failed-precondition", "WP media upload failed");
  }
  return uploadedMedia.data;
}
I have previously successfully uploaded an image to Wordpress with Javascript in a browser like this:
async function uploadMediaToWordpress() {
  let formData = new FormData();
  const response = await fetch("https://images.unsplash.com/photo-1610303785445-41db41838e3e?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=634&q=80");
  const blob = await response.blob();
  const file = new File([blob], "image.jpeg", { type: blob.type });
  formData.append("file", file);
  var uploadedMedia = await axios.post("https://wordpresssite.com/wp-json/wp/v2/media",
    formData, {
      headers: {
        "Content-Disposition": 'form-data; filename="example.jpeg"',
        "Content-Type": "image/jpeg",
        Authorization: "Bearer <jwt_token>",
      },
    });
  return uploadedMedia.data;
}
I have tried for the last couple of days to get this to work but cannot for the life of me get it right. Any pointer in the right direction would be greatly appreciated!
The "regular" JavaScript code (used in a browser) works because the image is sent as a file (see the new File in your code), but your Node.js code is not really doing that, e.g. the Content-Type value is wrong which should be multipart/form-data; boundary=----...... Nonetheless, instead of trying (hard) with the arraybuffer response, I suggest you to use stream just as in the axios documentation and form-data documentation.
So in your case, you'd want to:
Set stream as the responseType:
axios.get(
  'https://images.unsplash.com/photo-1610303785445-41db41838e3e?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=634&q=80',
  { responseType: 'stream' }
)
Use formData.getHeaders() in the headers of your file upload request (to the /wp/v2/media endpoint):
axios.post('https://wordpresssite.com/wp-json/wp/v2/media', formData, {
  headers: {
    ...formData.getHeaders(),
    Authorization: 'Bearer ...'
  },
})
And because the remote image from Unsplash.com does not use a static name (e.g. image-name.jpg), you'll need to set the name when you call formData.append():
formData.append( 'file', response.data, 'your-custom-image-name.jpeg' );
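Putting those three changes together, the whole function might look roughly like this (the Unsplash URL, endpoint, and token placeholder are taken from the question; error handling is left out for brevity):

const axios = require('axios');
const FormData = require('form-data');

async function uploadMediaToWordpress() {
  // 1. Download the remote image as a stream
  const response = await axios.get(
    'https://images.unsplash.com/photo-1610303785445-41db41838e3e?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=634&q=80',
    { responseType: 'stream' }
  );

  // 2. Append the stream with an explicit file name
  const formData = new FormData();
  formData.append('file', response.data, 'your-custom-image-name.jpeg');

  // 3. Let form-data set the multipart Content-Type (including the boundary)
  const uploadedMedia = await axios.post(
    'https://wordpresssite.com/wp-json/wp/v2/media',
    formData,
    {
      headers: {
        ...formData.getHeaders(),
        Authorization: 'Bearer <jwt_token>',
      },
    }
  );
  return uploadedMedia.data;
}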
I hope that helps; this worked fine for me (running it with the node command on Node.js 14.15.4, the latest release as of writing).
I need to upload an image/video to LinkedIn through its API. I am using axios with Content-Type set to multipart/form-data, and all of the images/videos that need to be uploaded are referenced by a URL to the file. All files are stored remotely on Cloudinary.
let bodyFormData = new FormData();
bodyFormData.append(
  "fileupload",
  request(file.url).pipe(fs.createWriteStream("video.mp4"))
);
axios
  .post("https://api.linkedin.com/media/upload", bodyFormData, {
    headers: {
      Authorization: "Bearer " + account.accessToken,
      "Content-Type": "multipart/form-data"
    }
  })
  .then(linkedinImageResult => {
I am following this documentation here:
https://learn.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/rich-media-shares#
One of the common errors I have gotten is:
UnhandledPromiseRejectionWarning: TypeError: source.pause is not a function
If I change
request(file.url).pipe(fs.createWriteStream("video.mp4"))
to just
file.url
I get this error:
'java.io.IOException: Missing initial multi part boundary'
If I remove
"Content-Type": "multipart/form-data"
I get this error:
"Unable to parse form content"
Note:
file.url is a URL to a Cloudinary file, for example: "https://res.cloudinary.com/dnc1t9z9o/video/upload/v1555527484/mn3tyjcpg1u4anlma2v7.mp4"
Any help is greatly appreciated :)
Please note that using Rich Media is being deprecated:
Uploading image using https://api.linkedin.com/media/upload is being deprecated. Partners are recommended to use assets API that returns response such as urn:li:digitalmediaAsset:C5522AQHn46pwH96hxQ to post shares.
If you need help with the asset API, please see my answer here
I found the solution for this specific situation!
Here is my example code.
const postRichMedia = async (mediaURL, accessToken, fileName) => {
  const formData = new FormData();
  const key = S3Service.getSocialKeyFromUrl(mediaURL);
  const file = await S3Service.downloadFileFromS3(key);
  formData.append("fileupload", file.Body, fileName);
  try {
    const {data} = await axios.post('https://api.linkedin.com/media/upload', formData, {
      headers: {
        ...formData.getHeaders(),
        Authorization: `Bearer ${accessToken}`
      },
    });
    const {location} = data;
    return location;
  } catch (err) {
    console.error(err);
  }
};
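If the media lives at a public URL instead of S3 (as with the Cloudinary files in the question), the same pattern should work by streaming the download straight into the form. A sketch, assuming a publicly reachable URL like the one in the question:

const axios = require('axios');
const FormData = require('form-data');

const postRichMediaFromUrl = async (mediaURL, accessToken, fileName) => {
  // Download the remote file as a stream rather than writing it to disk first
  const download = await axios.get(mediaURL, { responseType: 'stream' });

  const formData = new FormData();
  formData.append('fileupload', download.data, fileName);

  // form-data supplies the multipart Content-Type with the correct boundary
  const { data } = await axios.post('https://api.linkedin.com/media/upload', formData, {
    headers: {
      ...formData.getHeaders(),
      Authorization: `Bearer ${accessToken}`,
    },
  });
  return data.location;
};

// e.g. postRichMediaFromUrl('https://res.cloudinary.com/dnc1t9z9o/video/upload/v1555527484/mn3tyjcpg1u4anlma2v7.mp4', accessToken, 'video.mp4')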