Photo is cut off when it's large. Node.js, Express, GraphQL - node.js

When I upload a large photo, it gets cut off. Below are the lines where I save the photo on the backend. It only happens when the photo is large (more than 2 MB). I use Node.js, Express, and GraphQL with apollo-upload-client. Everything runs in Docker behind nginx.
const { filename, createReadStream } = await input.photo;
if (isFunction(createReadStream)) {
  const fileStream = createReadStream();
  const trimedFilename = trim(filename);
  fileStream.pipe(fs.createWriteStream(`${eventFolderPath}/${trimedFilename}`));
}
The photo looks like this:
[screenshot: corrupted photo]
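One likely cause with this pattern (an assumption, not confirmed in the post) is that the resolver returns before the write stream has finished, so large files can be truncated while bytes are still being flushed; nginx's client_max_body_size is also worth checking for big uploads. A minimal sketch that waits for the write to complete, using Node's stream/promises pipeline with the same input.photo shape:

const { pipeline } = require('stream/promises');
const fs = require('fs');

const { filename, createReadStream } = await input.photo;
if (isFunction(createReadStream)) {
  const trimedFilename = trim(filename);
  // pipeline() resolves only after the destination emits 'finish',
  // and rejects if either side errors mid-transfer, so the resolver
  // cannot return while bytes are still being written to disk
  await pipeline(
    createReadStream(),
    fs.createWriteStream(`${eventFolderPath}/${trimedFilename}`)
  );
}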

Related

Create a Blob for Video in Node.js 18 and Use it on Client Side

I'm learning Node.js and Blob, and I'm trying to create a Blob from a video file in Node.js and send it in a JSON payload to my client. The data will be fetched inside getStaticProps in Next.js.
Here is what I created on the Node.js server:
const fileBuffer = fs.readFileSync(filePath); // Path is something like example.com/video.mp4
const blob = new Blob([fileBuffer], { type: 'video/mp4' });
blobUrl = URL.createObjectURL(blob); // Return blob:nodedata:c3b1baf2-fba8-404f-8d3c-a1184a3a6db2
It returns this:
blob:nodedata:c3b1baf2-fba8-404f-8d3c-a1184a3a6db2
But how can I use it on the client? <video src="blob:nodedat..." is not working, so I am doing something wrong for sure.
Can you help me understand what is wrong?
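For what it's worth, the core problem is that blob: URLs are scoped to the JavaScript realm that created them, so a URL minted by URL.createObjectURL in Node.js means nothing to a browser. A sketch of the usual alternative, assuming an Express server and a local video.mp4 (both hypothetical), serving the bytes over HTTP so the <video> tag can point at a normal URL:

const express = require('express');
const path = require('path');
const app = express();

// serve the video file over HTTP instead of handing the client
// a server-side blob: URL it cannot resolve
app.get('/video', (req, res) => {
  res.sendFile(path.resolve('video.mp4')); // hypothetical local path
});

app.listen(3000);
// on the client: <video src="http://localhost:3000/video" controls>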

Node.js Express Temporary File Serving

I'm trying to do a reverse image search using googlethis on an image the user uploads. It supports reverse image searching, but only with a Google-reachable image URL. Currently, I upload the image to file.io, which deletes it after it gets downloaded.
This is the current application flow:
User POSTs file -> Server uploads file to file.io -> Google downloads the file -> Server does things with the reverse image search
However, I want to skip the middleman and have Google download files directly from the server:
User POSTs file -> Server serves file at unique URL -> Google downloads the file -> Server deletes the file -> Server does things with the reverse image search
I've looked at Serving Temporary Files with NodeJs but it just shows how to serve a file at a static endpoint. If I added a route to /unique-url, the route would stay there forever (a very slow memory leak! Probably! I'm not really sure!)
The only way I can think of is to save each file with a UUID and add a parameter: /download?id=1234567890, which would probably work, but if possible, I want to do things in memory.
So:
How do I do this using normal files?
How do I do this in-memory?
Currently working (pseudo) code:
app.post('/', (req, res) => {
  const imagePath = saveImageTemporarily(req)
  const tempUrl = uploadToFileIo(imagePath)
  const reverseImageResults = reverseGoogleSearch(tempUrl)
  deleteFile(imagePath)
  doThingsWithResults(reverseImageResults).then((result) => { res.send(result) })
})
The other answer is a good one if you are able to use Redis -- it offers lots of helpful features like setting a time-to-live on entries so they're disposed of automatically. But if you can't use Redis...
The basic idea here is that you want to expose a (temporary) URL like example.com/image/123456 from which Google can download an image. You want to store the image in memory until after Google accesses it. So it sounds like there are two (related) parts to this question:
Store the file in memory temporarily
Rather than saving it to a file, why not create a Buffer holding the image data. Once you're done with it, release your reference to the buffer and the Node garbage collector will dispose of it.
let image = Buffer.from(myImageData);
// do something with the image
image = null; // the garbage collector will dispose of it now
Serve the file when Google asks for it
This is a straightforward route which determines which image to serve based on a route parameter. The query parameter you mention will work, and there's nothing wrong with that. Or you could do it as a route parameter:
app.get('/image/:id', (req, res) => {
  const id = req.params.id;
  res.status(200).send(/* send the image data here */);
});
Putting it all together
It might look something like this:
// store image buffers here
const imageStore = {};

app.post('/image', (req, res) => {
  // get your image data here; there are a number of ways to do this,
  // so I leave it up to you
  const imageData = req.body;
  // and generate the ID however you want
  const imageId = generateUuid();
  // save the image in your store
  imageStore[imageId] = imageData;
  // return the image ID to the client
  res.status(200).send(imageId);
});

app.get('/image/:id', (req, res) => {
  const imageId = req.params.id;
  // I don't know off the top of my head how to correctly send an image
  // like this, so I'll leave it to you to figure out. You'll also need to
  // set the appropriate headers so Google recognizes that it's an image
  res.status(200).send(imageStore[imageId]);
  // done sending? delete it!
  delete imageStore[imageId];
});
I would use Redis for the in-memory DB, and on the server I would transform the image to base64 to store it in Redis.
In Redis you can also set a TTL on the images.
Check my code below:
import { nanoid } from 'nanoid'
import fs from 'node:fs'

function base64_encode(file) {
  // read binary data
  const bitmap = fs.readFileSync(file);
  // convert binary data to a base64-encoded string
  return Buffer.from(bitmap).toString('base64');
}

app.post('/', async (req, res) => {
  const client = redisClient;
  const imagePath = saveImageTemporarily(req)
  //const tempUrl = uploadToFileIo(imagePath)
  const base64str = base64_encode(imagePath);
  const id = nanoid()
  await client.set(id, JSON.stringify({
    id,
    image: base64str
  }));
  const reverseImageResults = reverseGoogleSearch(JSON.parse(await client.get(id)).image)
  await client.del(id);
  doThingsWithResults(reverseImageResults).then((result) => {
    res.send(result)
  })
})
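The answer mentions TTLs but the snippet doesn't set one; with the node-redis v4 client an expiry can be passed directly to SET (a sketch; the 60-second window is an arbitrary choice):

// expire the entry automatically so orphaned images don't
// accumulate if a request dies between set() and del()
await client.set(id, JSON.stringify({ id, image: base64str }), { EX: 60 });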

FastAPI: file upload bytes in client ProgressEvent counted twice before the upload completes

I'm trying to implement a multiple large file uploader within a Vue.js application. The backend is a FastAPI application.
The issue is about a strange behavior for the ProgressEvent associated with an axios POST for uploading a file to the backend.
The ProgressEvent.loaded value is not incremental and resets when the file is almost entirely uploaded into the backend. It starts back from a low number of uploaded bytes and finally completes the upload. It seems like the file is uploaded twice.
I have this simple FastAPI path operation function implementing the file upload endpoint:
@router.post(
    "/upload/",
    status_code=status.HTTP_200_OK,
    summary="Upload job-specific file",
    description=(
        "Accepts file uploads. Files can be uploaded in chunks to allow pausing/resuming uploads"
    ),
    dependencies=[Depends(get_current_user)],
)
async def upload_file_chunk(chunk: UploadFile, custom_header_job_id=Header(...), settings: Settings = Depends(get_settings)):
    filename = compose_upload_filename(custom_header_job_id, chunk.filename)
    # filename = '_'.join([custom_header_job_id, chunk.filename])
    path_to_file = os.path.join(settings.UPLOAD_FOLDER, filename)
    try:
        async with aiofiles.open(path_to_file, "ab") as input_file:
            while content := await chunk.read(1024):
                await input_file.write(content)
    except FileNotFoundError:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="File chunk not found",
        )
    return {"hello": "world"}
The endpoint is not completed yet, since it is supposed to do other things besides receiving the file.
The frontend request starts from a Vue component:
uploadFileChunk({ file, jobId, startByte, onProgressUpdate = undefined }) {
  const chunk = file.slice(startByte);
  const formData = new FormData();
  formData.append("chunk", chunk, file.name);
  //formData.append("jobId", jobId);
  /* return axios.post(`http://localhost:1234/upload`, formData, {
    headers: {
      "Custom-Header-Job-Id": jobId,
      "Content-Disposition": `form-data; name="chunk"; filename="${file.name}"`,
      "Content-Range": `bytes=${startByte}-${startByte + chunk.size}/${file.size}`,
    },
    onUploadProgress: onProgressUpdate,
  }); */
  return instance.post(`/files/upload`, formData, {
    headers: {
      "Custom-Header-Job-Id": jobId,
      "Content-Disposition": `form-data; name="chunk"; filename="${file.name}"`,
      "Content-Range": `bytes=${startByte}-${startByte + chunk.size}/${file.size}`,
    },
    onUploadProgress: onProgressUpdate,
  });
}
const onProgressUpdate = (progress) => {
  console.log("loaded: ", progress.loaded);
  const percentage = Math.round((progress.loaded * 100) / file.size);
  console.log("percentage: ", percentage);
};
The commented-out request points to a different Node.js backend with a file upload endpoint, made exclusively to assess whether the issue I'm facing depends on the client code or the backend. Here is the implementation (I'm not an expert in Node.js and Express):
const express = require("express");
const cors = require("cors");
const multer = require("multer");
const app = express();
app.use(express.json());
app.use(cors());
const upload = multer({ dest: "uploads/" });
app.post("/upload", upload.single("chunk"), (req, res) => {
res.json({ message: "Successfully uploaded files" });
});
app.listen(1234);
console.log("listening on port 1234");
In addition, the client code actually lives inside a much more elaborate pattern involving XState for managing the uploader component. Nevertheless, the attached snippets should be enough to give an idea of the main parts to be discussed here.
Here is a screenshot of the request to the FastAPI endpoint:
[screenshot: FastAPI request]
where we can see the file is almost entirely uploaded, then the loaded value drops back to a lower upload percentage before eventually finalizing the upload (not shown here).
The same experiment repeated against the Node.js endpoint, which splits the upload into far fewer packets, shows no such issue:
[screenshot: Node.js express request]
It seems like the Node.js backend works fine, whereas the FastAPI one doesn't. In my opinion there is some issue with how FastAPI/Starlette manages large files. It could be related to the spooled temporary file that Starlette creates, or to the transition from storing the file in main memory to mass storage. Unfortunately, Starlette's UploadFile class seems very hermetic and not easy to customize or inspect.
DETAILS
FastAPI backend running on a Debian Bullseye docker image
FastAPI version 0.78.0
python-multipart version 0.0.5
client and server running in localhost and tested with Chrome 103.0.5060.53 (Official Build) (x86_64)
system: Mac OSX 11.3.1
Thank you so much for your help!

How to upload video or image file to database

I am working on a project in which I need to create an API using Node.js to upload video or image files to a PostgreSQL database.
How can I do this?
Thanks a lot.
You can convert an image or video to base64, and then upload the base64 encoded string to your database.
const fs = require('fs');

function encode_base64(file) {
  // read the file as binary data
  const bitmap = fs.readFileSync(file);
  // convert the binary data to a base64-encoded string
  return Buffer.from(bitmap).toString('base64');
}

const image = encode_base64('image.jpg');
const video = encode_base64('video.mp4');
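Since the question asks specifically about PostgreSQL, a minimal sketch of storing the encoded string with node-postgres, assuming a hypothetical media table with name and data columns (a BYTEA column holding the raw buffer would avoid the roughly 33% base64 size overhead):

const { Pool } = require('pg');

// hypothetical connection string and table
const pool = new Pool({ connectionString: 'postgres://user:pass@localhost:5432/mydb' });

async function saveMedia(name, base64str) {
  // parameterized query; the driver handles escaping
  await pool.query('INSERT INTO media (name, data) VALUES ($1, $2)', [name, base64str]);
}

saveMedia('image.jpg', image);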

Storing and downloading pdf files from mongoDB using Node.js

I am using Node.js, Mongoose, and EJS.
I want the user to be able to upload their CV (less than 16 MB) as a PDF file, and the admin to be able to download the PDF or view the CV on the website.
I did the upload part, where I stored the CV like this in the database (which might be the wrong way):
cv: Object {
  fileName: "Test Results All in testing-homework.pdf"
  filePath: "cvs\1650985306448-.pdf"
  fileType: "application/pdf"
}
and the pdf file is stored in the "/cvs" folder after upload.
I'm seeking a way to be able to download/view the pdf file from the database.
I suggest you use GridFS from MongoDB.
Refer to: https://www.mongodb.com/docs/drivers/node/current/fundamentals/gridfs/
To download from the DB:
bucket.openDownloadStreamByName('your_file_name')
  .pipe(fs.createWriteStream('./desired_name_of_output_file'));
Some basic upload and download operations using GridFS in MongoDB:
const mongoUrl='mongodb://127.0.0.1:27017/db'
const mongodb = require('mongodb');
const fs = require('node:fs');
const client = new mongodb.MongoClient(mongoUrl)
const db = client.db('db');
Create Bucket with user-defined name
const bucket = new mongodb.GridFSBucket(db,{bucketName:'name_of_bucket' });
Upload to the DB:
fs.createReadStream('./filename.pdf')
  .pipe(bucket.openUploadStream('filename', {
    chunkSizeBytes: 1048576,
    metadata: { field: 'filename', value: 'myValue' }
  }))
Find
const cursor = bucket.find({});
cursor.forEach(doc => console.log(doc));
Delete
cursor.forEach(doc => bucket.delete(doc._id));
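Since the goal is letting the admin download or view the CV from the website rather than write it to local disk, here is a sketch that streams the PDF from GridFS straight into an Express response (the route name and Express app are assumptions, the bucket is the one created above):

app.get('/cv/:filename', (req, res) => {
  // 'inline' lets the browser render the PDF; use 'attachment' to force a download
  res.set('Content-Type', 'application/pdf');
  res.set('Content-Disposition', `inline; filename="${req.params.filename}"`);
  bucket.openDownloadStreamByName(req.params.filename)
    .on('error', () => res.sendStatus(404))
    .pipe(res);
});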
