I have created a Node.js server which does the following:
Uploads media files (videos and images) to the server using multer
If the media is an image, resizes it using sharp
If the media is a video, resizes and compresses it using fluent-ffmpeg
Uploads the files to Firebase Storage for backup
All of this is now working smoothly. The problem is that when an uploaded file is big, processing the request takes a long time, so I want to show some progress on the client side, like this:
State 1. The media is uploading -> n%
State 2. The media is compressing
State 3. The media is uploading to cloud -> n%
State 4. Result -> JSON = {status: "ok", uri: .., cloudURI: .., ..}
The Firebase Storage API has functionality like this when creating an upload task, as shown below:
let uploadTask = imageRef.put(blob, { contentType: mime });
uploadTask.on('state_changed', (snapshot) => {
  if (typeof snapshot.bytesTransferred === "number") {
    let progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
    console.log('Upload is ' + progress + '% done');
  }
});
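Similarly, fluent-ffmpeg reports compression progress on the server side through its progress event, which could drive State 2. A minimal sketch (the paths and output settings are just placeholders):
const ffmpeg = require('fluent-ffmpeg');

ffmpeg('./uploads/input.mp4')              // placeholder input path
  .videoCodec('libx264')
  .size('1280x?')                          // resize, keep aspect ratio
  .on('progress', (progress) => {
    // fluent-ffmpeg's estimate of how far the encode has gotten
    console.log(`Compressing: ${Math.round(progress.percent || 0)}%`);
  })
  .on('end', () => console.log('Compression finished'))
  .on('error', (err) => console.error(err))
  .save('./uploads/output.mp4');           // placeholder output path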
I have found that it is possible to realize this using WebSockets, but I am interested in whether there are other methods to do it.
The problem is also described here: http://www.tugberkugurlu.com/archive/long-running-asynchronous-operations-displaying-their-events-and-progress-on-clients
One of the suggested methods is "Accessing partial response using AJAX or WebSockets?", but I am looking for a more flexible and professional solution.
I have solved this problem using GraphQL Subscriptions. The same approach can be realized using WebSockets. The steps to solve this problem are as below:
Post files to upload server
Generate operation unique ID and send it as response to the client
Ex: response = {op: "A78HNDGS89NSNBDV7826HDJ"}
Create a subscription by opID
Ex: subscription { uploadStatus(op: "A78HNDGS89NSNBDV7826HDJ") { status }}
Every time the status changes, send a request to the GraphQL endpoint, which publishes the data to the pubsub. To send a GraphQL request from the Node.js server you can use https://github.com/prisma-labs/graphql-request
Ex:
const { request } = require('graphql-request');
const GQL_URL = "YOUR_GQL_ENDPOINT";
const query = `query {
  # the argument name must match your schema
  notify(status: "Status text goes here")
}`;
request(GQL_URL, query).then(data =>
  console.log(data)
);
The notify resolver function publishes the data to the pubsub:
context.pubsub.publish('uploadStatus', {
  status: "Status text"
});
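For completeness, the matching subscription side might look roughly like this. It is only a sketch using the graphql-subscriptions package: I've added an op argument to notify so the filter can route updates by operation ID, and the pubsub lives in module scope rather than on the context, just to keep the example self-contained.
const { PubSub, withFilter } = require('graphql-subscriptions');
const pubsub = new PubSub();

const resolvers = {
  Query: {
    notify: (parent, { op, status }) => {
      // Publish the status update; subscribed clients get it immediately
      pubsub.publish('uploadStatus', { uploadStatus: { op, status } });
      return true;
    },
  },
  Subscription: {
    uploadStatus: {
      // Only deliver events that belong to the client's operation ID
      subscribe: withFilter(
        () => pubsub.asyncIterator('uploadStatus'),
        (payload, variables) => payload.uploadStatus.op === variables.op
      ),
    },
  },
};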
If you have a more complicated architecture, you can use message brokers like RabbitMQ, Kafka, etc.
If someone knows other solutions, please let us know )
Okay so I have a Nodejs/Express app that has an endpoint which allows users to receive notifications by opening up a connection to said endpoint:
var practitionerStreams = [] // list of all the streams opened by practitioner users to the backend
async function notificationEventsHandler(req, res) {
  const headers = {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  }
  const practEmail = req.headers.practemail
  console.log("PRACT EMAIL", practEmail)
  const data = await ApptNotificationData.findAll({
    where: {
      practEmail: practEmail
    }
  })
  //console.log("DATA", data)
  res.writeHead(200, headers)
  res.write(`data:${JSON.stringify(data)}\n\n`)
  // create a new stream
  const newPractStream = {
    practEmail: practEmail,
    res
  }
  // add the new stream to list of streams
  practitionerStreams.push(newPractStream)
  req.on('close', () => {
    console.log(`${practEmail} Connection closed`);
    // keep every stream except the one belonging to the practitioner who disconnected
    practitionerStreams = practitionerStreams.filter(stream => stream.practEmail !== practEmail);
  });
  return res
}
async function sendApptNotification(newNotification, practEmail) {
  var updatedPractitionerStream = practitionerStreams.map((stream) => {
    // iterate through the array and find the stream that contains the pract email we want
    // then write the new notification to that stream
    if (stream["practEmail"] == practEmail) {
      console.log("IF")
      stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`)
      return stream
    } else {
      // if it doesn't contain the stream we want, leave it unchanged
      console.log("ELSE")
      return stream
    }
  })
  practitionerStreams = updatedPractitionerStream
}
Basically, when the user connects it takes the response object (which stays open), puts it in an object along with a unique email, and writes to it later in sendApptNotification.
But obviously this is slow for a full app, so how exactly do I replace this with Redis? Would I still have a response object that I write to? Or would that be replaced with a Redis stream that I can subscribe to on the frontend? I also assume I would store all my streams on Redis as well.
edit: from what examples I've seen people are writing events from redis to the response object
Thank you in advance
If you want to use Redis Stream as notification system, you can follow this official guide:
https://redis.com/blog/how-to-create-notification-services-with-redis-websockets-and-vue-js/ .
To get this data in real time you need to create a WebSocket connection. I prefer to point you to the official guide instead of writing it out for you because of the quality of that guide. It makes it easy for anyone to understand how to build it, but of course you will need to adapt it to your situation.
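To give an idea of how the pieces fit together, here is a rough sketch of bridging Redis pub/sub to the open connections from the question (node-redis v4 assumed; the channel name is made up and practitionerStreams is the array from your code):
const { createClient } = require('redis');

async function startNotificationBridge() {
  // A dedicated connection is required for subscribing in node-redis v4
  const subscriber = createClient();
  await subscriber.connect();

  // Every app instance subscribes to the same channel; whichever instance
  // holds the open connection for that practitioner forwards the message.
  await subscriber.subscribe('appt-notifications', (message) => {
    const { practEmail, notification } = JSON.parse(message);
    practitionerStreams
      .filter((stream) => stream.practEmail === practEmail)
      .forEach((stream) =>
        stream.res.write(`data:${JSON.stringify(notification)}\n\n`)
      );
  });
}

// Publisher side, e.g. inside sendApptNotification on any instance:
// await publisher.publish(
//   'appt-notifications',
//   JSON.stringify({ practEmail, notification: newNotification })
// );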
However, as I said in the comments, I believe it's simpler to hit an API endpoint like /api/v1/notifications with setInterval in your frontend code, making a request every 5 seconds for example. If you prefer a real-time notification system, I think you need to understand why you need it, so that you can change your system later if necessary. Basically, it's a trade-off you have to make.
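For instance, a minimal polling sketch on the frontend (the endpoint and the renderNotifications helper are just illustrative):
// Ask the backend for new notifications every 5 seconds
const poll = setInterval(async () => {
  const res = await fetch('/api/v1/notifications');
  const notifications = await res.json();
  renderNotifications(notifications); // placeholder for your UI update
}, 5000);

// Call clearInterval(poll) when the user leaves the page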
For my example imagine two tables in a relational database, one as Users and the second as Notifications.
The tables of this example:
UsersTable
id | name
1  | andrew
2  | mark

NotificationTable
id | message  | userId | isRead
1  | message1 | 1      | true
2  | message2 | 1      | false
3  | message3 | 2      | false
The endpoint in this example returns all cached notifications that haven't been read by the user. If the cache doesn't exist, it returns the data from the database, puts it in the cache, and returns it to the user; on the next API call you'll get the result from the cache. There are some points left to complete in this example: the database query that fetches the notifications, the configuration of the cache expiration time, and, importantly, if you want to keep the cached notifications up to date, you need to create a middleware and trigger it in the parts of your code that notify the user. In that case you only update the database and the cache. But I think you can complete these points.
const redis = require('redis');
const redisClient = redis.createClient();
redisClient.connect(); // node-redis v4: connect before use

app.get('/notifications', async (request, response) => {
  const userId = request.user.id;
  const cacheResult = await redisClient.get(`user:${userId}:notifications`);
  if (cacheResult) return response.send(JSON.parse(cacheResult));
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  response.send(notifications);
})
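For the "update the cache when something changes" point mentioned above, a minimal sketch could look like this (createNotification and getUserNotificationsFromDatabase are placeholders for your own database code; node-redis v4 syntax assumed):
async function notifyUser(userId, message) {
  // 1. Persist the notification in the relational database
  await createNotification({ userId, message, isRead: false });

  // 2. Refresh the cached list so the next GET /notifications is up to date
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(
    `user:${userId}:notifications`,
    JSON.stringify(notifications),
    { EX: 60 } // optional expiration time, in seconds
  );
}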
Besides that, there's another way: you can simply use only Redis, or only the database, to manage these notifications. A relational database with the correct indexes will return the results as fast as you expect; you only need to think about how many notifications you will accumulate.
I have an AngularJS application, and from this application I send data to my AWS API endpoint:
/**
 * Bulk sync with master
 */
async syncDataWithMaster(): Promise<AxiosResponse<any> | void> {
  try {
    // token and compressed are defined elsewhere in the class
    axios.defaults.headers.post.Authorization = token;
    const url = this.endpoint;
    return axios.post(url, compressed, {
      onUploadProgress: progressEvent => {
        console.log('uploading')
      },
      onDownloadProgress: progressEvent => {
        console.log('downloading')
      },
    }).then((response) => {
      if (response.data.status == 'success') {
        return response;
      } else {
        throw new Error('Could not authenticate user');
      }
    });
  } catch (e) {
  }
  return;
}
The API Gateway triggers my Lambda function (Node.js) with the data it received:
exports.handler = async (event) => {
  const localData = JSON.parse(event.body);
  /**
    Here get data from master and compare with local data and send back any new data
  **/
  const response = {
    statusCode: 200,
    body: JSON.stringify(newData),
  };
  return response;
};
The Lambda function will call the database and get the master data for a user (not shown in the example); this data is then compared, using various logic, with the local data to determine whether we need to send any new rows back to the local device to be stored/updated. (Before anyone asks, the nature of the application needs the full data.)
This principle works great for 90% of my users. However, some users have fairly large amounts of data, the current maximum being around 17 MB.
So my question is: is it possible to stream the data to and from the Lambda function? That is, stream the data to the function, process it, and stream it back, so that it is not affected by AWS payload limits?
Or is it possible to somehow begin sending data to the function as a stream, and as data becomes available have it start streaming data back at the same time?
(Data is JSON format)
I am wondering what alternatives there are to this solution (it needs to be fairly quick as well, max 30 seconds).
(One other idea I had was, for data above a certain size, to have the client first save it to S3 using a signed URL, then call the API Gateway endpoint for the Lambda. The Lambda gets the saved file and compares it to the master. Any new data to be returned is saved to S3 if it is over a certain size, and a signed URL is returned to the client, which downloads the new data and processes it.) However, I am not sure if this is cost effective, and it sounds like the execution time may be long.
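Roughly what I mean by the signed-URL idea, sketched with the AWS SDK v3 (bucket, key, region and expiry values are placeholders):
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3 = new S3Client({ region: 'eu-west-1' });

// URL the client can PUT its large payload to directly, bypassing API Gateway limits
async function getUploadUrl(bucket, key) {
  const command = new PutObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(s3, command, { expiresIn: 300 }); // valid for 5 minutes
}

// URL the client can use to download the diff the Lambda wrote back to S3
async function getDownloadUrl(bucket, key) {
  const command = new GetObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(s3, command, { expiresIn: 300 });
}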
Thanks for any help, been trying to figure this out for a while now
I want to make a progress bar that tells the user where my backend is in the process of fetching from the API. But it seems like every time I send a response it ends the request. How can I avoid this, and what should I google to learn more? I didn't find anything online.
React:
const { data, error, isError, isLoading } = useQuery('posts', fetchPosts)
if (isLoading) { return <p>Loading...</p> }
return (<div>{data && <p>{data}</p>}</div>)
Express:
app.get("/api/v1/testData", async (req, res) => {
  try {
    const info = req.query.info
    const sortByThis = req.query.sortBy;
    if (info) {
      let yourMessage = "Getting Data";
      res.status(200).send(yourMessage);
      const valueArray = await fetchData(info);
      yourMessage = "Data retrieved, now sorting";
      res.status(200).send(yourMessage);
      const sortedArray = valueArray.filter((item) => item.value === sortByThis);
      yourMessage = "Sorting done, now creating geojson";
      res.status(200).send(yourMessage);
      const geojson = createGeoJson(sortedArray)
      res.status(200).send(geojson);
    } else {
      res.status(400).send()
    }
  } catch (err) {
    console.log(err)
    res.status(500).send()
  }
})
You can only send one response to a request in HTTP.
In case you want to have status updates using HTTP, the client needs to poll the server i.e. request status updates from the server. Keep in mind though that every request needs to be processed on the server side and will take resources away which are then not available for other (more important) requests from other clients. So don't poll too frequently.
If you want to support long-running operations using HTTP, have a look at the following API design pattern.
Alternatively, you could use a WebSocket connection to push updates from the server to the client. I assume your computation on the backend will not take minutes and you want to update the client in real time, so WebSockets will probably be the best option for you. Once established, a WebSocket connection has considerably less overhead than sending full HTTP requests/responses between client and server.
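As a rough illustration of the WebSocket approach with the ws package (runLongJob stands in for your actual processing and is not a real API):
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8081 });

wss.on('connection', (socket) => {
  // Push status updates to this client while the long-running job progresses
  runLongJob((progress) => {
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ status: 'processing', progress }));
    }
  }).then((result) => {
    socket.send(JSON.stringify({ status: 'done', result }));
    socket.close();
  });
});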
Have a look at this thread, which discusses the abovementioned and other possibilities.
I can't seem to find any up-to-date answers on how to cancel a file upload using MongoDB, Node.js & Angular. I've only come across some tutorials on how to delete a file, but that is NOT what I am looking for. I want to be able to cancel the file uploading process by clicking a button on my front-end.
I am storing my files directly in MongoDB in chunks using the Mongoose, Multer & GridFSBucket packages. I know that I can stop a file's uploading process on the front-end by unsubscribing from the subscribable responsible for the upload, but the upload process keeps going in the back-end when I unsubscribe (yes, I have double and triple checked: all the chunks keep getting uploaded until the file is fully uploaded).
Here is my Angular code:
ngOnInit(): void {
  // Upload the file.
  this.sub = this.mediaService.addFile(this.formData).subscribe((event: HttpEvent<any>) => {
    console.log(event);
    switch (event.type) {
      case HttpEventType.Sent:
        console.log('Request has been made!');
        break;
      case HttpEventType.ResponseHeader:
        console.log('Response header has been received!');
        break;
      case HttpEventType.UploadProgress:
        // Update the upload progress!
        this.progress = Math.round(event.loaded / event.total * 100);
        console.log(`Uploading! ${this.progress}%`);
        break;
      case HttpEventType.Response:
        console.log('File successfully uploaded!', event.body);
        this.body = 'File successfully uploaded!';
    }
  },
  err => {
    this.progress = 0;
    this.body = 'Could not upload the file!';
  });
}
**CANCEL THE UPLOAD**
cancel() {
  // Unsubscribe from the upload method.
  this.sub.unsubscribe();
}
Here is my NodeJS (Express) code:
...
// Configure a strategy for uploading files.
const multerUpload = multer({
  // Set the storage strategy.
  storage: storage,
  // Limit uploads to 120 MB per file.
  limits: { fileSize: 1024 * 1024 * 120 },
  // Set the file filter.
  fileFilter: fileFilter
});
// Add new media to the database.
router.post('/add', [multerUpload.single('file')], async (req, res) => {
  return res.status(200).send();
});
What is the right way to cancel the upload without leaving any chunks in the database?
So I have been trying to get to the bottom of this for 2 days now and I believe I have found a satisfying solution:
First, in order to cancel the file upload and delete any chunks that have already been uploaded to MongoDB, you need to adjust the fileFilter in your multer configuration so that it detects whether the request has been aborted and the upload stream has ended, and then rejects the upload by passing an error to fileFilter's callback:
// Adjust what files can be stored.
const fileFilter = function(req, file, callback){
  console.log('The file being filtered', file)
  req.on('aborted', () => {
    file.stream.on('end', () => {
      console.log('Cancel the upload')
      callback(new Error('Cancel.'), false);
    });
    file.stream.emit('end');
  })
}
NOTE THAT: When canceling a file upload, you must wait for the changes to show up on your database. The chunks that have already been sent to the database will first have to be uploaded before the canceled file gets deleted from the database. This might take a while depending on your internet speed and the bytes that were sent before canceling the upload.
Finally, you might want to set up a route in your backend to delete any chunks from files that have not been fully uploaded to the database (due to some error that might have occurred during the upload). To do that, you'll need to fetch all the file IDs from your .chunks collection (by following the method specified in this link) and separate the IDs of the files whose chunks were only partially uploaded from the IDs of the files that were fully uploaded. Then you'll need to call GridFSBucket's delete() method on those IDs to get rid of the redundant chunks. This step is purely optional and for database maintenance reasons.
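A rough sketch of such a cleanup routine with the official mongodb driver, using the presence of a matching .files document as the marker for a completed upload (the bucket name 'uploads' and the helper name are assumptions; you could equally call bucket.delete() on the IDs you find):
// db is a connected Db instance from the official mongodb driver
async function removeOrphanedChunks(db, bucketName = 'uploads') {
  const filesColl = db.collection(`${bucketName}.files`);
  const chunksColl = db.collection(`${bucketName}.chunks`);

  // All file IDs referenced by chunks vs. the IDs of fully uploaded files
  const chunkFileIds = await chunksColl.distinct('files_id');
  const completeFiles = await filesColl
    .find({ _id: { $in: chunkFileIds } })
    .project({ _id: 1 })
    .toArray();
  const completeIds = new Set(completeFiles.map((f) => f._id.toString()));

  // Chunks whose parent document never made it into .files are orphans
  const orphanIds = chunkFileIds.filter((id) => !completeIds.has(id.toString()));
  if (orphanIds.length) {
    await chunksColl.deleteMany({ files_id: { $in: orphanIds } });
  }
  return orphanIds.length;
}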
Try using a try/catch approach.
There are two ways it can be done:
By calling an API that takes the file currently being uploaded as its parameter, and then on the backend deleting and clearing the chunks that are present on the server.
By handling it as an exception.
Send the file size as a validation, where if the backend API has received the file in its entirety it is kept, or if the size of the received file is smaller (due to cancellation of the upload in between) you do the clearance steps, i.e. take the ID of the file's chunks in the Mongo database and clear them.
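A rough illustration of that size-validation idea (the x-file-size header is made up, req.file.id assumes multer-gridfs-storage, and bucket is your GridFSBucket instance):
router.post('/add', [multerUpload.single('file')], async (req, res) => {
  // Size the client claims it sent vs. what actually landed in GridFS
  const expectedSize = Number(req.headers['x-file-size']);
  const storedFile = await bucket.find({ _id: req.file.id }).next();

  if (!storedFile || storedFile.length !== expectedSize) {
    // Treat a mismatch as a cancelled/partial upload and clear its chunks
    if (storedFile) await bucket.delete(storedFile._id);
    return res.status(400).send('Upload incomplete, file removed.');
  }
  return res.status(200).send();
});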
I am trying to pass an image which the user has uploaded to the Microsoft Cognitive Services Face API. The image is available on the server in the uploads folder.
Microsoft expects the image to be 'application/octet-stream' and passed as binary data.
I am currently unable to find a way to pass the image to the API that it will accept, and I keep receiving "decoding error, image format unsupported". As far as I'm aware the image must be uploaded in blob or file format, but being new to Node.js I'm really unsure how to achieve this.
So far I have this, and have looked at a few options, but none have worked; the other options I tried returned similar errors such as 'file too small or large', but when I've manually tested the same image via Postman it works fine.
image.mv('./uploads/' + req.files.image.name, function(err) {
  if (err)
    return res.status(500).send(err);
});
var encodedImage = new Buffer(req.files.image.data, 'binary').toString('hex');
let addAPersonFace = cognitive.addAPersonFace(personGroupId, personId, encodedImage);
addAPersonFace.then(function(data) {
  res.render('pages/persons/face', { data: data, personGroupId: req.params.persongroupid, personId: req.params.personid });
})
The package it looks like you're using, cognitive-services, does not appear to support file uploads. You might choose to raise an issue on the GitHub page.
Alternative NPM packages do exist, though, if that's an option. With project-oxford, you would do something like the following:
var oxford = require('project-oxford'),
    client = new oxford.Client(YOUR_FACE_API_KEY),
    uuid = require('uuid');

var personGroupId = uuid.v4();
var personGroupName = 'my-person-group-name';
var personName = 'my-person-name';
var facePath = './images/face.jpg';

// Skip the person-group creation if you already have one
console.log(JSON.stringify({personGroupId: personGroupId}));
client.face.personGroup.create(personGroupId, personGroupName, '')
  .then(function(createPersonGroupResponse) {
    // Skip the person creation if you already have one
    client.face.person.create(personGroupId, personName)
      .then(function(createPersonResponse) {
        console.log(JSON.stringify(createPersonResponse))
        personId = createPersonResponse.personId;
        // Associate an image to the person
        client.face.person.addFace(personGroupId, personId, {path: facePath})
          .then(function (addFaceResponse) {
            console.log(JSON.stringify(addFaceResponse));
          })
      })
  });
Please update to version 0.2.0, this should work now.