I want to send the file downloaded from Google Cloud directly to the client, instead of first saving it on my server and then serving the saved copy to the client. The current approach is slow because the file is downloaded twice: first from Google Cloud to my server, then from my server to the client.
router.get("/:filename", async(req, res) => {
try {
// Grab filename from request parameter
const fetchURL =req.params.filename;
const file = await File.findOne({fetchURL});
const srcFileName = file.originalname;
// Call GCS with bucketName and check the file method with srcFileName and check again with download method which takes download path as argument
storage
.bucket(bucketName)
.file(srcFileName)
.download({
destination: path.join(process.cwd(), "downloads", srcFileName)
})
.then(() =>
res.download(path.join(process.cwd(), "downloads", srcFileName), err =>
err ? console.log(err) : null
)
)
.catch(err =>res.status(400).json({
message: err.message
}));
} catch (err) {
res.status(res.statusCode).json({
message: `There was an error downloading your file. ${err.message}`
});
}
});
This works for me in a NodeJS + Express server:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ projectId, keyFilename });

router.get('/:id', async function (req, res) {
  let fileName = 'test.jpg'; // for example
  let contentType = 'image/jpg'; // for example
  res.writeHead(200, {
    'Content-Disposition': `attachment;filename=${fileName}`,
    'Content-Type': contentType
  });
  await storage
    .bucket('my-bucket')
    .file(`Images/${req.params.id}/${fileName}`)
    .createReadStream() // stream is created
    .pipe(res);
});
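One caveat with the streaming approach, as a minimal sketch under the same assumptions as above: createReadStream emits errors on the stream itself, so if the object is missing or the read fails you should attach a handler, otherwise the response is left hanging.

const stream = storage
  .bucket('my-bucket')
  .file(`Images/${req.params.id}/${fileName}`)
  .createReadStream();

// If the object is missing or the read fails, end the response;
// the headers were already written, so just close it out.
stream.on('error', err => {
  console.error(err);
  res.end();
});

stream.pipe(res);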
I'm trying to send a blob image, but I'm getting Error: Unexpected end of form using multer with the Serverless Framework.
Judging from the console.log, my understanding is that I have to append the image to FormData before sending it in the body, but I haven't been able to get the backend to accept the file without crashing.
uploadImage(imageData: File) {
console.log('IMAGE DATA', imageData);
let formData = new FormData();
formData.append('file', imageData, 'file.png');
let headers = new HttpHeaders();
headers.append('Content-Type', 'multipart/form-data');
headers.append('Accept', 'application/json');
let options = { headers: headers };
const api = environment.slsLocal + '/add-image';
const req = new HttpRequest('PUT', api, formData, options);
return this.http.request(req);
}
backend
const multerMemoryStorage = multer.memoryStorage();
const multerUploadInMemory = multer({
storage: multerMemoryStorage
});
router.put(
'/add-image',
multerUploadInMemory.single('file'),
async (req, res: Response) => {
try {
if (!req.file || !req.file.buffer) {
throw new Error('File or buffer not found');
}
console.log(`Upload Successful!`);
res.send({
message: 'file uploaded'
});
} catch (e) {
console.error(`ERROR: ${e.message}`);
res.status(500).send({
message: e.message
});
}
}
);
app.ts
import cors from 'cors';
import express from 'express';
import routers from './routes';
const app = express();
import bodyParser from 'body-parser';
app.use(cors({ maxAge: 43200 }));
app.use(
express.json({
verify: (req: any, res: express.Response, buf: Buffer) => {
req.rawBody = buf;
}
})
);
app.use('/appRoutes', routers.appRouter);
app.use(
bodyParser.urlencoded({
extended: true // also tried extended:false
})
);
export default app;
From my understanding, with the Serverless Framework I have to install
npm i serverless-apigw-binary
and add
apigwBinary:
  types: # list of mime-types
    - 'image/png'
to the custom section of the serverless template yaml file.
The end goal is not to save the file to storage like S3, but to send the image to Discord.
What am I missing? I appreciate any help!
I recently encountered something similar in a React Native app. I was trying to send a local file to an API, but it wasn't working. It turns out you need to convert the blob into a base64 string before sending it. My app took a local file path, converted it into a blob, ran it through a blobToBase64 function, and then called the API with that string. That ended up working for me.
I have this code snippet to help you, but it's TSX, so I don't know if it'll work for Angular.
function blobToBase64(blob: Blob) {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onerror = reject;
reader.onload = () => {
resolve(reader.result as string);
};
reader.readAsDataURL(blob);
});
}
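A hypothetical usage, awaiting the promise before handing the string to your API call:

// Hypothetical caller: convert first, then send the base64 string.
async function sendImage(blob) {
  const base64 = await blobToBase64(blob);
  // e.g. include { file: base64 } in your request body
  return base64;
}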
Hope this helps!
You can convert your Blob to a File using
new File([blob], "filename")
and then you should be able to pass that file to your existing uploadImage method.
It looks like you are passing a Blob instead of a File, based on your console.log(). You should convert the Blob to a File before calling the server. You can change your frontend code like this:
uploadImage(imageData: File) {
// Convert Blob to File
const file = new File([imageData], "file_name", { type: imageData.type });
let formData = new FormData();
formData.append('file', file, 'file.png');
const api = environment.slsLocal + '/add-image';
return this.http.put(api, formData);
}
Note: For more info about converting a Blob to a File, you can check this StackOverflow question.
The thing that got it working for me was this article.
There might be something different about using Express through the Serverless Framework, so things like multer and express-fileupload might not work. Or it could be because it's an AWS Lambda function; I don't know for sure. I just know I never got them working, and this article was the only thing that worked for Serverless Framework + Express.
I also had to install version 0.0.3 of busboy, i.e. npm i busboy@0.0.3. The newer version didn't work; it kept saying Busboy is not a constructor.
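(For context, busboy 1.x changed the export from a class to a factory function, which is what that error means. If you'd rather not pin the old version, here is a sketch of the newer calling convention, assuming req is an incoming request:)

const busboy = require('busboy');

// busboy >= 1.x exports a factory function, so no `new` here.
function parseUpload(req) {
  const bb = busboy({ headers: req.headers });
  bb.on('file', (name, stream, info) => {
    // info = { filename, encoding, mimeType } in 1.x
    stream.resume(); // replace with your own handling
  });
  req.pipe(bb);
}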
Since I'm sending the file to Discord rather than S3 like the article does, I had to tweak the parser(event) part of the article's handler.ts:
// handler.ts: Context and ProxyResult come from the aws-lambda typings;
// `parser` is the multipart parser set up in the article.
import { Context, ProxyResult } from 'aws-lambda';

export const uploadImageRoute = async (
  event: any,
  context: Context
): Promise<ProxyResult> => {
  const parsedEvent: any = await parser(event);
  await sendImageToDiscord(parsedEvent.body.file);
  const response = {
    statusCode: 200,
    body: JSON.stringify('file sent successfully')
  };
  return response;
};
The file comes in as a Buffer, which I was able to send as a file like this:
const fs = require('fs-extra');
const cwd = process.cwd();
const { Webhook } = require('discord-webhook-node');
const webhook = new Webhook('<discord-webhook-url>');

export async function sendImageToDiscord(arrayBuffer) {
  const buffer = Buffer.from(arrayBuffer, 'base64');
  const newFileName = 'nodejs.png';
  // Write the buffer to disk, then hand the file path to the webhook
  await fs.writeFile(`./${newFileName}`, buffer).then(() => {
    webhook.sendFile(`${cwd}/${newFileName}`);
  });
}
I hope this helps someone!
So I'm trying to make the HTML form:
<form action="blahblah" encblah="multipart/form-data" whatever>
That's not the problem; I need to make that form send the blob to Express:
app.post('/upload/avatars', async (req, res) => {
const body = req.body;
console.log(req.file);
console.log(body);
res.send(body);
});
So I can access the blob, create a read stream, pipe it to the cloud, and bam: the file is uploaded without ever being saved on the Express server itself.
Is that possible?
If yes, please tell me how.
If no, please tell me other alternatives.
On the client we do a basic multipart form upload. This example is set up for a single image, but you could call uploadFile in sequence for each image.
//client.ts
const uploadFile = (file: File | Blob) => {
const formData = new FormData();
formData.append("image", file);
return fetch("/upload", {
method: "post",
body: formData,
});
};
const handleUpload = (event: any) => {
return event.target.files.length ? uploadFile(event.target.files[0]) : null;
};
On the server we can use multer to read the file without persisting it to disk.
//server.js
const express = require("express");
const app = express();
const multer = require("multer");
const upload = multer();
app.post(
"/upload",
upload.fields([{ name: "image", maxCount: 1 }]),
(req, res, next) => {
console.log("/upload", req.files);
if (req.files.image.length) {
const image = req.files.image[0]; // { buffer, originalname, size, ...}
// Pipe the image.buffer where you want.
res.send({ success: true, name: image.originalname });
} else {
res.send({ success: false, message: "No files sent." });
}
}
);
For larger uploads I recommend socket.io, but this method works for reasonably sized images.
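As a sketch of the "pipe the image.buffer where you want" step above: Node can wrap the in-memory buffer in a readable stream. The destination here is a hypothetical writable stream from whatever storage SDK you use.

const { Readable } = require("stream");

// Sketch: turn multer's in-memory buffer into a readable stream and pipe
// it to any writable destination (cloud upload stream, file, etc.).
function pipeBuffer(image, destination) {
  Readable.from([image.buffer]).pipe(destination);
}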
It is possible, but with a lot of traffic it could overwhelm your Express server (if you're uploading videos or other big files); for small images (profile pictures, etc.) you're fine. Either way, you can use the multer npm package.
I'd recommend client-side uploads directly to e.g. an S3 bucket, which return a link you can then use (see the sketch below).
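For example, with S3 presigned URLs the client-side flow looks roughly like this. A sketch only: the /presign endpoint is a hypothetical backend route that returns { url }.

// Sketch: the browser uploads straight to S3 using a presigned URL.
async function uploadDirect(file) {
  const res = await fetch(`/presign?name=${encodeURIComponent(file.name)}`);
  const { url } = await res.json();
  await fetch(url, { method: 'PUT', body: file });
  return url.split('?')[0]; // object URL without the signature query string
}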
I've tried to upload files to IPFS. I get a hash back after uploading a file, but the file is not accessible via https://ipfs.io/ipfs/ + hash, although I can access it at localhost:8080/ipfs/ + hash.
What am I doing wrong? How do I upload files so they are reachable through https://ipfs.io/ipfs?
here is my app.js:
const express = require("express");
const app = express();
const fs = require("fs"); // needed below for readFileSync and unlink
const ipfsClient = require("ipfs-http-client");
const ipfs = new ipfsClient();
const expFileUpload = require("express-fileupload");
app.use(expFileUpload());
app.post("/upload", (req, res) => {
let fileObj = {};
if (req.files.inputFile) {
const file = req.files.inputFile;
const fileName = file.name;
const filePath = __dirname + "/files/" + fileName;
file.mv(filePath, async (err) => {
if (err) {
console.log("Error: failed to download file.");
return res.status(500).send(err);
}
const fileHash = await addFile(fileName, filePath);
console.log("File Hash received __>", fileHash);
fs.unlink(filePath, (err) => {
if (err) {
console.log("Error: Unable to delete file. ", err);
}
});
fileObj = {
file: file,
name: fileName,
path: filePath,
hash: fileHash
};
res.render("transfer", { fileObj });
});
}
});
const addFile = async (fileName, filePath) => {
const file = fs.readFileSync(filePath);
const filesAdded = await ipfs.add({ path: fileName, content: file }, {
progress: (len) => console.log("Uploading file..." + len)
});
console.log(filesAdded);
const fileHash = filesAdded.cid.string;
return fileHash;
};
app.listen(3000);
Need help. Thank you.
As @deltab said, your local IPFS node must be reachable from the gateway. It's not possible to push or upload files to the IPFS gateway; when you make an HTTP request to the gateway, it looks up the content on the IPFS network for you.
Your local IPFS node is hosting the data you added to it. If the gateway's IPFS nodes can't connect to your local IPFS node, then it won't be able to find the data for the hash you requested. (Unless other nodes are also hosting it... co-hosting FTW \o/)
If your local IPFS node is running, it may be stuck behind a NAT or firewall. If you run ipfs id, you'll see an array of Addresses your node is listening on. If one of them looks like a public IP address, grab the IP and port and check whether the port is open using an online service like https://portchecker.co/
https://docs.ipfs.io has good articles on:
IPFS and NATs https://docs.ipfs.io/how-to/nat-configuration/
IPFS Gateways: https://docs.ipfs.io/concepts/ipfs-gateway/
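As a quick way to test whether the public gateway can see your content from Node (a sketch: Node 18+ for the built-in fetch, and <your-cid> is a placeholder for the hash you got back):

// Sketch: ask the public gateway for your CID and log the HTTP status.
const cid = '<your-cid>';
fetch(`https://ipfs.io/ipfs/${cid}`)
  .then(res => console.log('gateway responded with', res.status))
  .catch(err => console.error('gateway request failed:', err));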
I'm trying to upload files from a MERN application I'm working on; I'm almost done with the NodeJS back-end part.
The application will allow users to upload images (jpg, jpeg, png, gifs, etc.) to an Amazon AWS S3 bucket that I created.
Well, let's put it this way: I created a helper:
const aws = require('aws-sdk');
const fs = require('fs');
// Enter copied or downloaded access ID and secret key here
const ID = process.env.AWS_ACCESS_KEY_ID;
const SECRET = process.env.AWS_SECRET_ACCESS_KEY;
// The name of the bucket that you have created
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;
const s3 = new aws.S3({
accessKeyId: ID,
secretAccessKey: SECRET
});
const uploadFile = async images => {
// Read content from the file
const fileContent = fs.readFileSync(images);
// Setting up S3 upload parameters
const params = {
Bucket: BUCKET_NAME,
// Key: 'cat.jpg', // File name you want to save as in S3
Body: fileContent
};
// Uploading files to the bucket
s3.upload(params, function(err, data) {
if (err) {
throw err;
}
console.log(`File uploaded successfully. ${data.Location}`);
});
};
module.exports = uploadFile;
That helper takes three of my environment variables which are the name of the bucket, the keyId and the secret key.
When adding files from the form(that will eventually be added in the front end) the user will be able to send more than one file.
Right now my current post route looks exactly like this:
req.body.user = req.user.id;
req.body.images = req.body.images.split(',').map(image => image.trim());
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
That works great, but it takes req.body.images as a string with each image separated by a comma. What would be the right approach for uploading the many files selected in the Windows file dialog to AWS S3? I tried this, but it did not work:
// Add user to req,body
req.body.user = req.user.id;
uploadFile(req.body.images);
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
Thanks, and hopefully you guys can help me out with this one. Right now I'm testing it with Postman, but later on the files will be sent via a form.
Well, you could just call uploadFile once for each file:
try {
  const promises = [];
  for (const img of images) {
    promises.push(uploadFile(img));
  }
  await Promise.all(promises);
  // rest of logic
} catch (err) {
  // handle err
}
On a side note, you should wrap s3.upload in a promise:
const AWS = require('aws-sdk')
const s3 = new AWS.S3({
accessKeyId: process.env.AWS_ACCESS_KEY,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
})
module.exports = ({ params }) => {
return new Promise((resolve, reject) => {
s3.upload(params, function (s3Err, data) {
if (s3Err) return reject(s3Err)
console.log(`File uploaded successfully at ${data.Location}`)
return resolve(data)
})
})
}
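Hypothetical usage of that wrapper, assuming it's saved as ./upload.js; note it expects an object with a params property:

const upload = require('./upload'); // the wrapper above, saved as ./upload.js

async function uploadOne(fileContent) {
  // `params` mirrors the S3 upload params from the helper earlier.
  const data = await upload({
    params: {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: 'cat.jpg', // placeholder key
      Body: fileContent
    }
  });
  return data.Location;
}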
Bonus: if you wish to avoid having your backend handle uploads at all, you can use AWS S3 signed URLs and let the client browser upload directly, saving your server resources; a sketch follows.
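A minimal sketch of generating such a URL, reusing the s3 client from above; the route path, key naming, and expiry are placeholders:

// Sketch: presign a PUT so the browser can upload directly to S3.
app.get('/presign', async (req, res) => {
  const url = await s3.getSignedUrlPromise('putObject', {
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: `uploads/${req.query.name}`,
    Expires: 300 // seconds the URL stays valid
  });
  res.json({ url });
});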
One more thing: your Post object should only contain the URLs of the media, not the media itself.
// Setting up S3 upload parameters
const params = {
Bucket: bucket, // bucket name
Key: fileName, // File name you want to save as in S3
Body: Buffer.from(imageStr, 'binary'), // image must be a Buffer
ACL: 'public-read', // allow file to be read by anyone
ContentType: 'image/png', // image header for browser to be able to render image
CacheControl: 'max-age=31536000, public' // caching header for browser
};
// Uploading files to the bucket
try {
const result = await s3.upload(params).promise();
return result.Location;
} catch (err) {
console.log('upload error', err);
throw err;
}
I want to upload the profile picture of a user, sent from the web app and the mobile app in Base64 form.
On the POST request they need to send a JSON body that looks something like this:
{
"name":"profile-pic-123.jpg",
"file":"data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAkGBxQTEhIUEhIUFBUV…K9rk8hCAEkjFMUYiEAI+nHIpsQh0AkisDYRTOiCAbWVtgCtI6IlkHh7LDTQXLH0EIQBj//2Q==" // the base64 image
}
Now on the server side, using Node and Express, I used an npm module called azure-storage, which offers a nice way of uploading files to Azure Blob Storage through its web service API.
But there's something I can't understand about this. Here's part of the code from my controller. I successfully created all the necessary connections and keys and whatnot to get a working blobService:
controllers.upload = function(req, res, next){
// ...
// generated some sastoken up here
// etc.
// ...
var uploadOptions = {
container: 'mycontainer',
blob: req.body.name, // im not sure about this
path: req.body.file // im not sure about this either
}
sharedBlobService.createBlockBlobFromLocalFile(uploadOptions.container, uploadOptions.blob, uploadOptions.path, function(error, result, response) {
if (error) {
res.send(error);
}
console.log("result", result);
console.log("response", response);
});
}
I'm getting this error:
{
"errno": 34,
"code": "ENOENT",
"path": "iVBORw0KGgoAAAANSUhEUgAAABgAAAAYCAIAAAB..."
}
If you use the JavaScript SDK v12, you can use this sample code. It's just that simple. I have this implemented in a function and it works great whenever an HTTP event triggers it.
index.js
const file = await require('./file')();
uploadOptions = {
container: 'mycontainer',
blob: req.body.name,
text: req.body.file
}
const fileUploader = await file(uploadOptions.text, uploadOptions.blob,
uploadOptions.container);
You can keep the logic in a separate module and call it from the index.js above:
file.js
const { BlobServiceClient } = require("@azure/storage-blob");
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const Promise = require('bluebird');
module.exports = Promise.method(async function() {
return async function (data, fileName, container) {
const containerClient = await blobServiceClient.getContainerClient(container);
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
const matches = data.match(/^data:([A-Za-z-+\/]+);base64,(.+)$/);
const buffer = Buffer.from(matches[2], 'base64');
return await blockBlobClient.upload(buffer, buffer.byteLength);
};
});
In this case, you should not use createBlockBlobFromLocalFile. Instead, you should use createBlockBlobFromText, because you are not uploading a local file, but content in the request body.
Here is the code:
var uploadOptions = {
container: 'mycontainer',
blob: req.body.name,
text: req.body.file
}
sharedBlobService.createBlockBlobFromText(uploadOptions.container,
uploadOptions.blob,
uploadOptions.text,
{
contentType: 'image/jpeg',
contentEncoding: 'base64'
},
function(error, result, response) {
if (error) {
res.send(error);
}
console.log("result", result);
console.log("response", response);
});
The blob is just the file name, which is "profile-pic-123.jpg" in this case, and path is the local path to your file. Since you are not storing the file locally on the server side, path is meaningless in this case.
If you need more information about Azure Storage, see the azure-storage SDK documentation.