Telegram Bot - How to upload local files with absolute/dynamic URL - node.js

I'm trying to send photos through a Telegram bot using the 'sendPhoto' method with a relative URL (the image sits alongside the file). I'm not using any library; here is my call function:
let axiosImage = async (chatId, caption, res) => {
  try {
    await axios.post(`${TELEGRAM_API}/sendPhoto`,
      {
        headers: { 'Content-Type': 'multipart/form-data' }
      }, {
        body: {
          'chat_id': chatId,
          'caption': caption,
          'photo': './image.jpeg'
        }
      })
    return res.send()
  } catch (e) {
    console.log('\nSTATUS RESPONSE: ' + e.response.status)
    console.log('\nMESSAGE RESPONSE: ' + e.response.statusText)
  }
}
but I'm getting this message back: {"ok":false,"error_code":400,"description":"Bad Request: there is no photo in the request"}
I tried with a web URL and it sends normally.
What could I be missing? Do I have to upload the local images to some repository?

I had a similar issue recently and managed to solve it using the form-data npm package and the built-in fs module.
const FormData = require('form-data');
const fs = require('fs');
const axios = require('axios');

const axiosImage = async (chatId, caption, res) => {
  try {
    // Build a multipart/form-data body and stream the local file into it
    const formData = new FormData();
    formData.append('chat_id', chatId);
    formData.append('photo', fs.createReadStream('./image.jpeg'));
    formData.append('caption', caption);
    // form-data generates the correct multipart headers (including the boundary)
    const response = await axios.post(`${TELEGRAM_API}/sendPhoto`, formData, {
      headers: formData.getHeaders(),
    });
    return res.send();
  } catch (err) {
    console.log(err);
  }
};

From the Telegram API docs, there are three ways to send a file:
1) If the file is already stored somewhere on the Telegram servers, you don't need to re-upload it: each file object has a file_id field; simply pass this file_id as a parameter instead of uploading. There are no limits for files sent this way.
2) Provide Telegram with an HTTP URL for the file to be sent. Telegram will download and send the file. 5 MB max size for photos and 20 MB max for other types of content.
3) Post the file using multipart/form-data in the usual way that files are uploaded via the browser. 10 MB max size for photos, 50 MB for other files.
What you want is a file upload (option 3). This answer covers what you are trying to achieve:
https://stackoverflow.com/a/59177066/4668136
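For completeness, options 1 and 2 need no multipart upload at all: sendPhoto accepts a file_id or a public HTTP URL as the photo value in a plain JSON body. A minimal sketch (assuming TELEGRAM_API is defined as in the question):
const axios = require('axios');

// Sketch: sending a photo by file_id (option 1) or by public HTTP URL (option 2).
// photoRef is either a file_id from a previous upload or an https:// URL.
const sendPhotoByReference = async (chatId, caption, photoRef) => {
  const { data } = await axios.post(`${TELEGRAM_API}/sendPhoto`, {
    chat_id: chatId,
    caption: caption,
    photo: photoRef,
  });
  return data;
};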

Related

Send blob-data along with a string to backend

I've got a weird problem.
Using Node, React, Express, MongoDB -> MERN stack.
My page generates a PDF file which then gets sent to the backend (as blob data) and is stored there.
The problem: I now need to send a payment ID along with that blob data to save the order in the database. I need both in one POST request, to make it as smooth as possible:
await axios
  .post(process.env.REACT_APP_SERVER_API + '/payment/cash', {
    blobData,
    paymentId
  })
  .then(async (res) => ...
like so.
Before, when I just sent the blob data, I could simply access it in the backend by writing:
exports.createCashOrder = async (req, res) => {
  const { filename } = req.file; // THIS RIGHT HERE
  const fileHash = await genFileHash(filename);
  try {
    await saveOrder(filename, fileHash, "cash", paymentId);
    //await sendOrderCreatedEmail(req.body, fileHash);
    //await sendOrderReceivedConfirmEmail(req.body, fileHash);
    res.send({ filename: filename });
  }
But that doesn't work anymore. I don't have access to that file object anymore when sending that request object.
None of these give me access either:
req.body.blobData
req.body.blobData.file
req.file
Any idea how to achieve this, apart from making two separate POST requests?
Glad for any help, cheers!
Send the data as a form
await axios
  .postForm(process.env.REACT_APP_SERVER_API + '/payment/cash', {
    blobData,
    paymentId
  })
  .then(async (res) => ...
And then use the multer middleware to handle the form in Express, as sketched below.
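A minimal sketch of that Express side, assuming the field names blobData and paymentId from the snippet above and a hypothetical /payment/cash route:
const express = require('express');
const multer = require('multer');

const upload = multer({ dest: 'uploads/' }); // writes the uploaded blob to disk
const router = express.Router();

// 'blobData' must match the field name used in axios.postForm on the client
router.post('/payment/cash', upload.single('blobData'), async (req, res) => {
  const { filename, path } = req.file; // the uploaded PDF blob saved by multer
  const { paymentId } = req.body;      // plain text fields end up in req.body
  // ... save the order using the file and paymentId ...
  res.send({ filename, paymentId });
});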

Facebook API: cannot upload video thumbnail

I'm calling the Facebook API from a Node.js server in order to upload a video. The video has a thumbnail hosted on another server. I want to read the file from that server and pass it as FormData to /{page-id}/videos.
I'm able to get the file and convert it to base64 data; however, every time I call the API I get the following error:
(#100) Invalid image format. It should be an image file data.
Here's my code:
try {
  const data = await httpUtilBase.get(thumbnailUrl, { responseType: 'arraybuffer' });
  thumbnailData = `data:${data.headers['content-type']};base64,${Buffer.from(data.data).toString('base64')}`;
} catch {
  return next(new ErrorResponse('Cannot fetch thumbnail url', httpStatus.BAD_REQUEST));
}
const formData = new FormData();
formData.append('url', url);
formData.append('title', title);
formData.append('published', 'false');
formData.append('thumb', thumbnailData);
formData.append('access_token', facebookUtil.decryptToken(page));
I make the call to /{page-id}/videos just after and it always fails. I don't understand what the format should be.
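For what it's worth, the form-data package can append the raw image bytes with an explicit filename and content type, which is how a browser-style file upload looks on the wire; this is only a hedged sketch, not a verified fix for this particular Graph API error:
const FormData = require('form-data');

// Sketch: append the thumbnail as raw binary instead of a base64 data URI.
// Assumes `data` is the arraybuffer response fetched from thumbnailUrl above.
const formData = new FormData();
formData.append('url', url);
formData.append('title', title);
formData.append('published', 'false');
formData.append('thumb', Buffer.from(data.data), {
  filename: 'thumbnail.jpg', // hypothetical name, only used in the multipart headers
  contentType: data.headers['content-type'],
});
formData.append('access_token', facebookUtil.decryptToken(page));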

Trying to retrieve an mp3 file stored in AWS S3 and load it into my React client as a Blob...it's not working

I have a React web app that allows users to record mp3 files in the browser. These mp3 files are saved in an AWS S3 bucket and can be retrieved and loaded back into the React app during the user's next session.
Saving the file works just fine, but when I try to retrieve the file with getObject() and create an mp3 blob on the client side, I get a small, unusable blob.
Here's the journey the recorded mp3 file goes on:
1) Saving to S3
In my Express/Node server, I receive the uploaded mp3 file and save to the S3 bucket:
//SAVE THE COMPLETED AUDIO TO S3
router.post("/", [auth, upload.array('audio', 12)], async (req, res) => {
  try {
    //get file
    const audioFile = req.files[0];
    //create object key
    const userId = req.user;
    const projectId = req.cookies.currentProject;
    const { sectionId } = req.body;
    const key = `${userId}/${projectId}/${sectionId}.mp3`;
    const fileStream = fs.createReadStream(audioFile.path);
    const uploadParams = {
      Bucket: bucketName,
      Body: fileStream,
      Key: key,
      ContentType: "audio/mp3"
    };
    const result = await s3.upload(uploadParams).promise();
    res.send(result.key);
  } catch (error) {
    console.error(error);
    res.status(500).send();
  }
});
As far as I know, there are no problems at this stage. The file ends up in my S3 bucket with "type: mp3" and "Content-Type: audio/mp3".
2) Loading file from S3 Bucket
When the React app is loaded, an HTTP GET request is made to my Express/Node server to retrieve the mp3 file from the S3 bucket:
//LOAD A FILE FROM S3
router.get("/:sectionId", auth, async (req, res) => {
  try {
    //create key from user/project/section IDs
    const sectionId = req.params.sectionId;
    const userId = req.user;
    const projectId = req.cookies.currentProject;
    const key = `${userId}/${projectId}/${sectionId}.mp3`;
    const downloadParams = {
      Key: key,
      Bucket: bucketName
    };
    s3.getObject(downloadParams, function (error, data) {
      if (error) {
        console.error(error);
        res.status(500).send();
      }
      res.send(data);
    });
  } catch (error) {
    console.error(error);
    res.status(500).send();
  }
});
The "data" returned here is as such:
3) Making a Blob URL on the client
Finally, in the React client, I try to create an 'audio/mp3' blob from the returned array buffer
const loadAudio = async () => {
  const res = await api.loadAudio(activeSection.sectionId);
  const blob = new Blob([res.data.Body], { type: 'audio/mp3' });
  const url = URL.createObjectURL(blob);
  globalDispatch({ type: "setFullAudioURL", payload: url });
}
The created blob is severely undersized and appears to be completely unusable. Downloading the file results in a 'Failed - No file' error.
I've been stuck on this for a couple of days now with no luck. I would seriously appreciate any advice you can give!
Thanks
EDIT 1
Just some additional info here: in the upload parameters, I set the Content-Type as audio/mp3 explicitly. This is because when not set, the Content-Type defaults to 'application/octet-stream'. Either way, I encounter the same issue with the same result.
EDIT 2
At the request of a commenter, here is the res.data available on the client-side after the call is complete:
Based on the output of res.data on the client, there are a couple of things that you'd need to do:
Replace uses of res.data.Body with res.data.Body.data (as the actual data array is in the data attribute of res.data.Body)
Pass a Uint8Array to the Blob constructor, as the existing array is of a larger type, which will create an invalid blob
Putting that together, you would end up replacing:
const blob = new Blob([res.data.Body], {type: 'audio/mp3' });
with:
const blob = new Blob([new Uint8Array(res.data.Body.data)], {type: 'audio/mp3' });
Having said all that, the underlying issue is that the NodeJS server is sending the content over as a JSON encoded serialisation of the response from S3, which is likely overkill for what you are doing. Instead, you can send the Buffer across directly, which would involve, on the server side, replacing:
res.send(data);
with:
res.set('Content-Type', 'audio/mp3');
res.send(data.Body);
and on the client side (likely in the loadAudio method) processing the response as a blob instead of JSON. If using the Fetch API then it could be as simple as:
const blob = await fetch(<URL>).then(x => x.blob());
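Since the client in the question uses axios (via api.loadAudio), the equivalent would be to ask axios for a blob response; a sketch, assuming a hypothetical /api/audio/:sectionId route in front of the Express handler above:
// Sketch: fetch the audio as a Blob with axios instead of JSON
const res = await axios.get(`/api/audio/${sectionId}`, { responseType: 'blob' });
const url = URL.createObjectURL(res.data); // res.data is already a Blob in the browser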
Your server-side code seems alright to me, but I'm not super clear about the client-side approach. Do you load this blob into the HTML5 audio player?
I have a few approaches, assuming you're trying to load this into an audio tag in the UI.
<audio controls src="data:audio/mpeg;base64,blahblahblah or html src" />
Assuming that the file you uploaded to S3 is valid, here are two approaches:
1) Return the data as a base64 string instead of a buffer directly from S3. You can do this on your server side by returning:
const base64MP3 = data.Body.toString('base64');
You can then pass this to the MP3 player's src property and it will play the audio. Prefix it with data:audio/mpeg;base64.
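On the client, that string can then be dropped straight into an audio source; a sketch, assuming api.loadAudio is changed to return the base64 string:
// Sketch: play the base64 MP3 returned by the server via a data URI
const base64MP3 = await api.loadAudio(activeSection.sectionId);
const audio = new Audio(`data:audio/mpeg;base64,${base64MP3}`);
audio.play();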
2) Instead of returning the entire MP3 file, have your sectionId route return a presigned S3 URL. Essentially, this is a direct link to the object in S3 that is authorized for, say, 5 minutes.
Then you should be able to use this URL directly to stream the audio and set it as the src. Keep in mind that it will expire.
const url = s3.getSignedUrl('getObject', {
  Bucket: myBucket,
  Key: myKey,
  Expires: signedUrlExpireSeconds
});
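The route could then return that URL instead of the object body, and the client uses it directly as the audio source; a sketch reusing the names from the question:
// Server: send the presigned URL back instead of the file contents
res.send({ url });

// Client: point the player at the URL (remember it expires after signedUrlExpireSeconds)
const res = await api.loadAudio(activeSection.sectionId);
globalDispatch({ type: "setFullAudioURL", payload: res.data.url });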
You stated: "The created blob is severely undersized and appears to be completely unusable."
That suggests an encoding issue to me. Once you read the MP3 from the Amazon S3 bucket, you need to encode it properly so it functions in a web page.
I did a similar multimedia use case that involved MP4 and a Java app: I wanted an MP4 obtained from a bucket to play in the web page, as shown in this example web app.
Once I read the byte stream from the S3 bucket, I had to encode it so it would play in an HTML video tag. Here is a good reference on how to properly encode an MP3 file.

nodejs: Retrieving base64 Images from Mongodb using Postman

Looking for help on uploading and retrieving images from MongoDB using multer.
My front end is React Native. (Not sure if this is needed, but just to be safe.)
Multer
Problem: After following some tutorials I'm able to encode my file path to base64 and upload it to my DB, but now I'm confused about how to retrieve the file from my DB. I saw some tutorials about decoding from base64, but I don't quite understand how to go about retrieving an image and displaying it in Postman. (I tried looking but haven't found anything that gives me an answer. I'm sorry if this is a duplicated question. If you could point me in a direction or give me some advice I would be really grateful.)
**POST**
route.post("/sad", upload.single("image"), (req, res, next) => {
console.log(req.file);
const img = fs.readFileSync(req.file.path);
const img_enc = img.toString('base64');
const obj = {
usrImage: {
data: new Buffer.from(img_enc, 'base64'),
contentType: "image/jpg",
},
};
console.log(obj);
const newAccout = new account(obj);
newAccout.save();
});
**RETRIEVE**
route.get('/sad', (req, res) => {
  img.find({}).then((img) => {
    res.json(img)
    // How do I decode my buffer to show an image in Postman?
  })
})
I am trying to create a user profile where a username, password and image are saved. Any help with saving an image and then retrieving it from my accounts collection would be appreciated.
Hey, I would advise that you start using a 3rd-party service for file uploads like Cloudinary; it's a very good way of managing files, i.e. images or video.
I am not that well versed with multer, but I can give a quick code example using Formidable, which does the same work as multer.
Before you can start you'd need to make an account on cloudinary.com (don't worry, it's free).
The code below shows how you could handle the file upload:
const Formidable = require("formidable"); // meant for body parsing
const cloudinary = require("cloudinary").v2; // file uploader

// This is your connection/configuration to get access to your cloudinary account;
// cloud_name, api_key and api_secret are shown in your home dashboard (Cloudinary)
cloudinary.config({
  cloud_name: process.env.CLOUD_NAME,
  api_key: process.env.API_KEY,
  api_secret: process.env.API_SECRET,
});

router.post('/api/file-upload', (req, res) => {
  const form = new Formidable.IncomingForm();
  form.parse(req, (error, fields, files) => {
    const { file } = files;
    cloudinary.uploader.upload(file.path, { folder: "/" }, (err, result) => {
      const file_url = result.secure_url; // the URL for your file given back by cloudinary
    });
  });
});
This script should upload your file, and file_url will contain the URL of the uploaded file (served over SSL); after that you can continue saving it to MongoDB.
Cloudinary docs for NodeJS (nice, clear and understandable docs):
https://cloudinary.com/documentation/node_integration
Shameless plug: if you get lost, you can check out this video I made on YouTube, which handles file upload with Cloudinary and then saves the returned URL to MongoDB:
https://youtu.be/mlu-tbr2uUk
First, call the API and find one record.
You will need the fs module to complete the following query:
const fs = require('fs');

let data = await db.user.findOne({
  where: {
    id: req.body.id
  }
});

// data.image holds the base64 string returned by the findOne query
let buff = Buffer.from(data.image, 'base64');
let name = 'name.jpeg';
let path = `tmp/${name}`; // <--- destination and file name you want to give to your file
fs.writeFileSync(path, buff); // <--- this will write the file to the given path
fs.readFile(path, function (err, content) { // <--- to send the file in the postman response
  if (err) {
    res.writeHead(400);
    console.log(err);
    res.end("No such image");
  } else {
    // specify in the response headers that the content will be an image
    res.writeHead(200, { 'Content-Type': 'image/jpeg' });
    res.end(content);
    fs.unlink(path, (err) => { // <--- delete the file from the tmp directory once sent
      if (err) {
        console.log(err);
      }
    });
  }
});
Try this and switch to the preview tab in Postman.
I haven't tried it, but maybe it helps.
route.get('/sad', (req, res) => {
  img.find({}).then((img) => {
    res.setHeader('Content-Type', 'image/jpg');
    res.send(img);
  })
})
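If the preview still doesn't render, a variation that sends only the stored image buffer with its content type might work; a sketch, assuming the account model and usrImage schema from the question and that at least one document exists:
route.get('/sad', (req, res) => {
  account.findOne({}).then((acc) => {
    // acc.usrImage.data is the Buffer stored at upload time
    res.set('Content-Type', acc.usrImage.contentType);
    res.send(acc.usrImage.data);
  });
});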

Upload images into a file server and store the url to the image using nodejs

I am implementing a web app using the MEAN stack and Angular 6. I want to submit a form with a file upload; only '.png' files should be uploaded.
I want to save the file on a different file server and store the URL to the image. Currently I upload files into a folder in my project and save the image in the DB (I used ng2fileupload and multer for that). It is then saved like this:
"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAV4AAAFUCAYAAABssFR8AAAK..."
But I want to save the image URL instead, and the image should be retrieved via that URL. Can anyone explain a proper method for that?
I faced the same problem a month ago and found a solution to it, though I haven't used multer in the app.
From my frontend, I send an object to the Node API endpoint /event which looks like:
let img = {
  content: "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...",
  filename: 'yourfile.png'
}
At the backend, I'm using Cloudinary to store my images (its free plan allows 10 GB of storage) and it returns secure HTTPS URLs. So install it using npm i cloudinary and require it in your api.js file.
Then add the configuration below:
cloudinary.config({
  cloud_name: 'yourapp',
  api_key: 'YOUR_KEY',
  api_secret: 'YOUR_SECRET_KEY'
});
Last step (not so optimized code):
Let's say I have an Event schema which has an images array, where I'll store the URLs returned by Cloudinary.
app.post('/event', (req, res) => {
  try {
    if (req.body.images.length > 0) {
      // Creating new Event instance
      const event = new Event({
        images: [],
      });
      // Looping over every image coming in the request object from the frontend
      req.body.images.forEach((img) => {
        const base64Data = img.content.split(',')[1];
        // Writing the image into the upload folder for the time being
        // (writeFileSync is synchronous and takes no callback; errors are caught by the surrounding try/catch)
        fs.writeFileSync(`./uploads/${img.filename}`, base64Data, 'base64');
        /* Now that the image is saved in the upload folder, Cloudinary picks
           the image from the upload folder and stores it in their cloud space. */
        cloudinary.uploader.upload(`./uploads/${img.filename}`, async (result) => {
          // Cloudinary returns the id & URL of the image, which is pushed into the event.images array.
          event.images.push({
            id: result.public_id,
            url: result.secure_url
          });
          // Once the image is pushed into the array, remove it from the server's upload folder using unlinkSync
          fs.unlinkSync(`./uploads/${img.filename}`);
          // When all the images are uploaded, send back the response
          if (req.body.images.length === event.images.length) {
            await event.save();
            res.send({
              event,
              msg: 'Event created successfully'
            });
          }
        });
      });
    }
  } catch (e) {
    res.status(400).send(e);
  }
});
P.S. Feel free to suggest optimizations for this code.
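One possible optimization (worth double-checking against the Cloudinary docs): cloudinary.uploader.upload also accepts a base64 data URI directly, which would remove the temporary write/unlink round trip; a sketch reusing img and event from the code above:
// Sketch: upload the data URI straight to Cloudinary, skipping the ./uploads temp file
const result = await cloudinary.uploader.upload(img.content, { folder: '/' });
event.images.push({ id: result.public_id, url: result.secure_url });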
