Here are the images coming in the request (from the React app):
req.body = {
imageData: [ 'blob:http://localhost:3000/0bdca87f-47dc-4e56-94ae-ea24e86b6530' ]
}
Now I need to save this to ImageKit (Node.js):
imagekit
  .upload({
    file: req.body.imageData[0], // not working
    // file: 'convert_blob_to_base64' // not working either
    fileName: 'my_file_name1.jpg'
  })
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
It does save an image, but a corrupted one. I also tried converting the blob to base64, but that didn't work either.
If I convert the image to base64 directly using the https://www.base64encode.org/ site and pass that string as file: 'base_64_directly_encoded_image', then it works.
Do I need to store the images first on the Node.js side, or do something else? Can anyone guide me, please?
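For context on why the current payload cannot work: a blob: URL is only resolvable inside the browser tab that created it, so the Node.js server receives nothing it can read from that string. A minimal client-side sketch of reading the blob into a base64 data URL before posting it (the /upload endpoint name and the fetch wiring are assumptions for illustration, not part of the original code):
// Browser side (React app): fetch the blob that the blob: URL points to
// and read it as a base64 data URL before sending it to the server.
// '/upload' is a hypothetical endpoint used only for illustration.
async function sendImage(blobUrl) {
  const blob = await fetch(blobUrl).then(r => r.blob());
  const base64 = await new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result); // "data:image/jpeg;base64,..."
    reader.onerror = reject;
    reader.readAsDataURL(blob);
  });
  await fetch('/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ imageData: [base64] })
  });
}
As far as I know, ImageKit's upload API accepts a base64 string (or a URL that its servers can reach, which a blob: URL is not), so req.body.imageData[0] should then work as the file value without storing anything on the Node.js side first.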
I have been working on a project using Next.js and Sanity, but I can't seem to find a way to upload images to Sanity. I have been following the Sanity tutorial on how to upload assets, and it works only if I set the file path manually or if the assets are in the same project folder.
Below is the Sanity method I have been using to upload the files.
import { createReadStream } from 'fs'
import { basename } from 'path'

const filePath = '/Users/mike/images/bicycle.jpg'
client.assets
.upload('image', createReadStream(filePath), {
filename: basename(filePath)
})
.then(imageAsset => {
return client
.patch('some-document-id')
.set({
theImageField: {
_type: 'image',
asset: {
_type: "reference",
_ref: imageAsset._id
}
}
})
.commit()
})
.then(() => {
console.log("Done!");
})
The main issue for me is that the onChange handler returns a fake path, which I understand is due to browser security, and for images to upload successfully there should be an actual file path like C:/Users/user/downloads/image.jpg instead of C:/fakepath/image.jpg.
I have also attempted to change the onChange handler to get the filename only, but I still can't upload the images because of the file path issue.
const [image, setImage] = React.useState(null)
function handleImage(e){
const selectedFile = e.target.files[0]
if(selectedFile){
return setImage(selectedFile.name)
}
}
I have also tried to use formidable, but I didn't succeed with that either. Please suggest a method for uploading the images.
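For what it's worth, the browser never exposes a real file path, so the createReadStream(filePath) approach can only work on the server. A sketch of a browser-only alternative, assuming the @sanity/client instance is configured in the frontend with a token that allows asset uploads, and reusing the hard-coded 'some-document-id' from the snippet above (assets.upload accepts a File or Blob in the browser, so no path is needed):
// Keep the File object itself in state, not just its name
const [imageFile, setImageFile] = React.useState(null)

function handleImage(e) {
  const selectedFile = e.target.files[0]
  if (selectedFile) {
    setImageFile(selectedFile) // the File object can be uploaded directly
  }
}

async function uploadImage() {
  // client.assets.upload accepts a File/Blob in the browser
  const imageAsset = await client.assets.upload('image', imageFile, {
    filename: imageFile.name
  })
  await client
    .patch('some-document-id')
    .set({
      theImageField: {
        _type: 'image',
        asset: { _type: 'reference', _ref: imageAsset._id }
      }
    })
    .commit()
}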
I have created my backend with Node.js and Apollo Server. I am trying to upload an image from my React Native client. Here is my backend code:
Mutation: {
async singleUpload(_, { fileInput: { file, fileName } }) {
try {
const { createReadStream, filename, mimetype } = await file;
const stream = createReadStream();
const path = `./upload/${filename}`;
const tempFile = { id: 123, filename, mimetype, path };
await new Promise((resolve, reject) => {
const writeStream = createWriteStream(path);
// When the upload is fully written, resolve the promise.
writeStream.on("finish", resolve);
// If there's an error writing the file, remove the partially written file
// and reject the promise.
writeStream.on("error", (error) => {
unlink(path, () => {
reject(error);
});
});
// In node <= 13, errors are not automatically propagated between piped
// streams. If there is an error receiving the upload, destroy the write
// stream with the corresponding error.
stream.on("error", (error) => writeStream.destroy(error));
// Pipe the upload into the write stream.
stream.pipe(writeStream);
});
...then upload it to Firebase Storage
When I use the Altair plugin for the Firefox browser to upload an image, it works fine.
Now for my React Native client, I am using the apollo-upload-client lib. After a user picks an image using the Image Picker plugin for React Native, I create a file:
const file = new ReactNativeFile({
uri: image.uri,
type: image.type,
name: 'Hi there',
});
ReactNativeFile is from the apollo-upload-client lib. Now when I upload it, I get an error from my backend saying TypeError: createReadStream is not a function, which I understand, since there is no createReadStream property on my ReactNativeFile.
So I decided to change my backend code to something like this:
const { name, type } = await file;
console.log("Mime" + type);
const stream = fs.createReadStream(name);
The fs module is from Node.js.
Now when I try uploading an image from Altair or from React Native, I get an error saying:
[Error: ENOENT: no such file or directory,
Due to my lack of backend knowledge, I am not sure what is causing the issue.
I think that if I tried uploading from ReactJS instead of React Native, my backend code might work.
But the image pickers for the browser and for native mobile are very different, and I have not tried it with ReactJS for now.
I am following this article for my backend code: https://moonhighway.com/how-the-upload-scalar-works
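For reference, the file variable only arrives in the resolver as an object exposing createReadStream when the request is sent as a GraphQL multipart request, which is what createUploadLink from apollo-upload-client produces. A sketch of how the client side is typically wired (the endpoint URL, the FileInput type name, the mutation's selection set, and the 'photo.jpg' filename are assumptions for illustration, since they are not shown in the question):
import { ApolloClient, InMemoryCache, gql } from '@apollo/client';
import { createUploadLink, ReactNativeFile } from 'apollo-upload-client';

// The terminating link must be createUploadLink, otherwise the file is
// serialized as plain JSON and never reaches graphql-upload on the server.
const client = new ApolloClient({
  link: createUploadLink({ uri: 'https://my-backend.example.com/graphql' }),
  cache: new InMemoryCache(),
});

const SINGLE_UPLOAD = gql`
  mutation SingleUpload($fileInput: FileInput!) {
    singleUpload(fileInput: $fileInput) # add a selection set if this returns an object type
  }
`;

async function uploadPickedImage(image) {
  const file = new ReactNativeFile({
    uri: image.uri,
    type: image.type,
    name: 'photo.jpg', // a real filename with an extension, rather than an arbitrary label
  });
  return client.mutate({
    mutation: SINGLE_UPLOAD,
    variables: { fileInput: { file, fileName: 'photo.jpg' } },
  });
}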
I am implementing a web app using the MEAN stack and Angular 6. I want to submit a form with a file upload; '.png' files should be uploaded.
I want to save the file on a separate file server and store the URL to the image. Currently I upload files into a folder in my project and save the image in the DB (I used ng2fileupload and multer for that). It then gets saved like this:
"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAV4AAAFUCAYAAABssFR8AAAK..."
But I want to save the image URL and have the image retrieved by that URL. Can anyone explain a proper method for doing that?
I faced the same problem a month ago and found a solution, though I haven't used multer in the app.
From my frontend, I send an object to the Node API endpoint /event, which looks like this:
let img = {
content: "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...",
filename: 'yourfile.png'
}
At the backend, I'm using Cloudinary to store my images (its free plan allows 10 GB of storage), and it returns secure HTTPS URLs. Install it using npm i cloudinary and require it in your api.js file.
Then add the configuration below:
cloudinary.config({
cloud_name: 'yourapp',
api_key: 'YOUR_KEY',
api_secret: 'YOUR_SECRET_KEY'
});
Last step (not very optimized code):
Let's say I have an Event schema with an images array, where I'll store the URLs returned by Cloudinary.
app.post('/event', (req, res) => {
try {
if (req.body.images.length > 0) {
// Creating new Event instance
const event = new Event({
images: [],
});
// Looping over every image coming in the request object from frontend
req.body.images.forEach((img) => {
const base64Data = img.content.split(',')[1];
// Write the image to the upload folder for the time being.
// writeFileSync is synchronous (it takes no callback), so any error
// surfaces in the surrounding try/catch.
fs.writeFileSync(`./uploads/${img.filename}`, base64Data, 'base64');
/* Now that the image is saved in the upload folder, Cloudinary picks
the image up from the upload folder and stores it in their cloud storage. */
cloudinary.uploader.upload(`./uploads/${img.filename}`, async (result) => {
// Cloudinary returns the id & URL of the image, which are pushed into the event.images array.
event.images.push({
id: result.public_id,
url: result.secure_url
});
// Once image is pushed into the array, I'm removing it from my server's upload folder using unlinkSync function
fs.unlinkSync(`./uploads/${img.filename}`);
// When all the images are uploaded then I'm sending back the response
if (req.body.images.length === event.images.length) {
await event.save();
res.send({
event,
msg: 'Event created successfully'
});
}
});
});
}
} catch (e) {
res.status(400).send(e);
}
});
P.S. Feel free to suggest optimizations for this code.
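Since optimization suggestions are invited, here is one possible variant (a sketch under the assumption of the same Event model and the promise-based cloudinary.v2 API): Cloudinary's uploader also accepts a base64 data URI directly as the file argument, so no temporary file needs to be written to the uploads folder, and Promise.all replaces the manual "are all images uploaded yet?" check.
const cloudinary = require('cloudinary').v2;

app.post('/event', async (req, res) => {
  try {
    const images = req.body.images || [];
    if (images.length === 0) {
      return res.status(400).send({ msg: 'No images provided' });
    }
    // Upload every data URI straight to Cloudinary; no temp files on disk.
    const uploaded = await Promise.all(
      images.map((img) => cloudinary.uploader.upload(img.content))
    );
    const event = new Event({
      images: uploaded.map((result) => ({
        id: result.public_id,
        url: result.secure_url
      }))
    });
    await event.save();
    res.send({ event, msg: 'Event created successfully' });
  } catch (e) {
    res.status(400).send(e);
  }
});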
Hi, I would like to store images in Amazon S3. I am making a React application with Node.js and Express at the back end. I have code which saves the images locally, in an images folder, as desired. I am using the Jimp library to convert the images to black and white. What I want is to store these black-and-white images directly in AWS instead of saving them to the local HDD. I need to do this because in the end the app has to be deployed to Heroku, and Heroku cannot read images from the local HDD.
Here is the code with which I was able to store images in a particular directory as required:
const input = req.body.input;
google.list({
keyword: input,
num: 15,
detail: true,
})
.then(function (res) {
res.map((data,index)=>{
const url = data.url;
const extension = url.split('.')[url.split('.').length-1]
const foldername=input
Jimp.read(url, function (err, image) {
image.resize(250, 250)
.greyscale()
.write(path.join(__dirname,"../../public/images/"+foldername+"/"+foldername+index+"."+extension));
});
});
})
.catch(function(err) {
res.send('There was some error')
})
I need to store the images under the same path, i.e., awsbucketname/foldername/foldername.jpg. I tried converting the image to a buffer, but I still don't understand how to proceed with it. Someone please help me :(
(Disclaimer: I have no practical experience with Jimp!)
It seems like you are on the right track with writing the image to a buffer instead of a local file. Once you have initialized the AWS SDK and instantiated the S3 interface, it should be easy to pass the buffer to the upload function. Something along the lines of:
const s3 = new AWS.S3({ params: { Bucket: 'yourBucketName' } });
// ...
Jimp.read(url, (err, image) => {
  // S3 keys normally don't start with '/', so this yields
  // awsbucketname/foldername/0.jpg, awsbucketname/foldername/1.jpg, etc.
  const bucketPath = `${foldername}/${index}.${extension}`;
  image.resize(250, 250)
    .greyscale()
    .getBufferAsync(Jimp.AUTO) // resolves with the processed image as a Buffer
    .then(buffer => s3.upload({ Key: bucketPath, Body: buffer }).promise())
    .then(() => console.log('yay!'));
});
This is just a sketch of course, missing error handling etc.
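For completeness, "initialized the AWS SDK" would typically look something like the snippet below; the region value is a placeholder, and the access keys are assumed to come from the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables (which is also how Heroku config vars are exposed):
const AWS = require('aws-sdk');

// Credentials are picked up automatically from the environment
// (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY); only the region is set here.
AWS.config.update({ region: 'us-east-1' });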
I am trying to send a picture from my hybrid mobile app (Ionic 3) to my Heroku backend (Node.js), have the backend upload the picture to Firebase Storage, and return the newly uploaded file's download URL to the mobile app.
Keep in mind that I am using the Firebase Admin SDK for Node.js.
So I send the base64-encoded image to Heroku (I checked the encoded string with an online base64 decoder and it is fine), which is handled by the following function:
const uploadPicture = function(base64, postId, uid) {
return new Promise((resolve, reject) => {
if (!base64 || !postId) {
reject("news.provider#uploadPicture - Could not upload picture because at least one param is missing.");
}
let bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(base64, 'base64'));
// Retrieve default storage bucket
let bucket = firebase.storage().bucket();
// Create a reference to the new image file
let file = bucket.file(`/news/${uid}_${postId}.jpg`);
bufferStream.pipe(file.createWriteStream({
metadata: {
contentType: 'image/jpeg'
}
}))
.on('error', error => {
reject(`news.provider#uploadPicture - Error while uploading picture ${JSON.stringify(error)}`);
})
.on('finish', (file) => {
// The file upload is complete.
console.log("news.provider#uploadPicture - Image successfully uploaded: ", JSON.stringify(file));
});
})
};
I have 2 major issues:
The upload succeeds, but when I go to the Firebase Storage console, there is an error when I try to display the preview of the picture, and I cannot open it on my computer after downloading it. I guess it is an encoding thing...?
How can I retrieve the newly uploaded file's download URL? I was expecting an object to be passed to the .on('finish') handler, like in the upload() function, but none is (file is undefined). How can I retrieve this URL to send it back in the server response?
I want to avoid using the upload() function because I don't want to host files on the backend, as it is not a dedicated server.
My problem was that I added data:image/jpeg;base64, at the beginning of the base64 string; I just had to remove it.
For the download URL, I did the following:
const config = {
action: 'read',
expires: '03-01-2500'
};
file.getSignedUrl(config, (error, url) => {
if (error) {
reject(error);
}
console.log('download url ', url);
resolve(url);
});
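Putting the two pieces together, a sketch of how this could slot into the uploadPicture function from the question: the 'finish' event carries no payload, so the file reference from the enclosing scope is used, and the surrounding promise resolves with the signed URL (the expiry date is just the long-lived placeholder from above):
bufferStream.pipe(file.createWriteStream({
  metadata: { contentType: 'image/jpeg' }
}))
.on('error', error => {
  reject(`news.provider#uploadPicture - Error while uploading picture ${JSON.stringify(error)}`);
})
.on('finish', () => {
  // 'finish' passes no arguments; use the outer `file` reference instead.
  file.getSignedUrl({ action: 'read', expires: '03-01-2500' }, (error, url) => {
    if (error) {
      return reject(error);
    }
    resolve(url);
  });
});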