Node / Express generate calendar URL

I have a database with a bunch of dates and an online overview where you can view them. I know I can copy a URL from my Google Agenda and import it into other calendar clients so I can view the events there.
I want to build an Express endpoint that fetches every event each time it is called and returns them in a format other calendar clients can import. With packages like ical-generator I could generate, read, and return the file whenever a user requests the URL, but it feels redundant to write a file to my storage only to read it, return it, and delete it on every request.
What is the most efficient way to go about this?

Instead of generating the file/calendar data on every request, you could implement a simple caching mechanism: when your Node app starts, you generate the calendar data and put it in your cache with a corresponding time-to-live value. Once the data has expired, or new entries are inserted into your DB, you invalidate the cache, regenerate the data, and cache it again.
Here's a very simple example of an in-memory cache that uses the node-cache library:
const NodeCache = require('node-cache');
const cacheService = new NodeCache();

// ...

const calendarDataCacheKey = 'calendar-data';

// at the start of your app, generate the calendar data and cache it with a TTL of 30 min
cacheCalendarData(generateCalendarData());

function cacheCalendarData (calendarData) {
  cacheService.set(calendarDataCacheKey, calendarData, 1800);
}
// in your express handler first try to get the value from the cache
// if not - generate it and cache it
app.get('/calendar-data', (req, res) => {
  let calendarData = cacheService.get(calendarDataCacheKey);
  if (calendarData === undefined) {
    calendarData = generateCalendarData();
    cacheCalendarData(calendarData);
  }
  res.send(calendarData);
});
If your app is scaled horizontally you should consider using Redis instead of an in-process cache.
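A rough sketch of the same idea backed by Redis instead of node-cache (assuming the node-redis v4 client; the key name and TTL are arbitrary):
const { createClient } = require('redis');

const redisClient = createClient();
redisClient.connect(); // v4 clients must be connected before use

async function getCachedCalendarData() {
  // every horizontally scaled instance shares this cache entry
  const cached = await redisClient.get('calendar-data');
  if (cached) return cached;

  // cache miss: regenerate and store with a 30-minute expiry
  // (assumes generateCalendarData() returns the serialized calendar as a string)
  const calendarData = generateCalendarData();
  await redisClient.set('calendar-data', calendarData, { EX: 1800 });
  return calendarData;
}

app.get('/calendar-data', async (req, res) => {
  res.send(await getCachedCalendarData());
});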

100% untested, but I have code similar to this that exports to a .csv from a db query, and it might get you close:
const { Readable } = require('stream');

async function getCalendar(req, res) {
  const events = await db.getCalendarEvents();
  const filename = 'some_file.ics';
  res.set({
    'Content-Type': 'text/calendar',
    'Content-Disposition': `attachment; filename="${filename}"`,
  });
  // a manually driven readable stream; read() is a no-op because we push chunks ourselves
  const input = new Readable({ read() {} });
  input.on('error', (err) => {
    console.error('SOME ERROR', err);
    res.status(500).end();
  });
  input.pipe(res);
  // transform each event to its iCal text here before pushing (placeholder: naive string conversion)
  events.forEach(e => input.push(`${e}\n`));
  input.push(null);
}
If you were going to use the ical-generator package, you would do your transforms within the forEach callback before pushing to the stream.
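For completeness, ical-generator can also build the whole calendar in memory and hand back a string, so nothing ever has to touch disk. A minimal sketch, assuming a db helper and event fields that match your schema:
const ical = require('ical-generator'); // depending on the version you may need require('ical-generator').default

app.get('/calendar.ics', async (req, res) => {
  const events = await db.getCalendarEvents();
  const calendar = ical({ name: 'My events' });

  // map each DB row onto a calendar event (field names are illustrative)
  events.forEach(e => calendar.createEvent({
    start: e.start,
    end: e.end,
    summary: e.title,
  }));

  res.set('Content-Type', 'text/calendar');
  res.send(calendar.toString());
});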

Related

How should I go about using Redis for creating notifications with express/nodejs?

Okay so I have a Nodejs/Express app that has an endpoint which allows users to receive notifications by opening up a connection to said endpoint:
var practitionerStreams = [] // a list of all the streams opened by practitioner users to the backend
async function notificationEventsHandler(req, res){
  const headers = {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  }
  const practEmail = req.headers.practemail
  console.log("PRACT EMAIL", practEmail)
  const data = await ApptNotificationData.findAll({
    where: {
      practEmail: practEmail
    }
  })
  //console.log("DATA", data)
  res.writeHead(200, headers)
  res.write(`data:${JSON.stringify(data)}\n\n`)
  // create a new stream
  const newPractStream = {
    practEmail: practEmail,
    res
  }
  // add the new stream to list of streams
  practitionerStreams.push(newPractStream)
  req.on('close', () => {
    console.log(`${practEmail} Connection closed`);
    // drop only the stream that belongs to the practitioner who disconnected
    practitionerStreams = practitionerStreams.filter(stream => stream.practEmail !== practEmail);
  });
  return res
}

async function sendApptNotification(newNotification, practEmail){
  var updatedPractitionerStream = practitionerStreams.map((stream) => {
    // iterate through the array and find the stream that contains the pract email we want
    // then write the new notification to that stream
    if (stream["practEmail"] == practEmail){
      console.log("IF")
      stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`)
      return stream
    } else {
      // if it doesn't contain the stream we want, leave it unchanged
      console.log("ELSE")
      return stream
    }
  })
  practitionerStreams = updatedPractitionerStream
}
Basically, when the user connects it takes the response object (which stays open), puts it in an object along with a unique email, and writes to it later in sendApptNotification.
But obviously this won't scale for a full app, so how exactly do I replace this with Redis? Would I still have a Response object that I write to? Or would that be replaced with a Redis stream that I can subscribe to on the frontend? I also assume I would store all my streams in Redis as well.
edit: from the examples I've seen, people are writing events from Redis to the response object
Thank you in advance
If you want to use a Redis Stream as a notification system, you can follow this official guide:
https://redis.com/blog/how-to-create-notification-services-with-redis-websockets-and-vue-js/.
To get this data in real time you need to create a WebSocket connection. I prefer to point you to an official guide rather than write it out for you because of the quality of that guide; it is enough for anyone to understand how to build this, but of course you will need to adapt it to your situation.
However, like I said in the comments, I believe it is simpler to have your frontend poll an API endpoint such as /api/v1/notifications with setInterval, making a request every 5 seconds for example. If you prefer a real-time notification system, I think you first need to understand why you need it, so that you can change your system later if it turns out you do. Basically it's a trade-off you have to make.
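For instance, the polling approach on the frontend could be as simple as this (the endpoint path and 5-second interval are just examples):
// ask the API for new notifications every 5 seconds
setInterval(async () => {
  const response = await fetch('/api/v1/notifications');
  const notifications = await response.json();
  // update your UI state with whatever came back
  console.log(notifications);
}, 5000);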
For my example, imagine two tables in a relational database, one called Users and the other called Notifications.
The tables of this example:
UsersTable
id | name
1  | andrew
2  | mark

NotificationTable
id | message  | userId | isRead
1  | message1 | 1      | true
2  | message2 | 1      | false
3  | message3 | 2      | false
The endpoint of this example returns all cached notifications that haven't been read by the user. If the cache doesn't exist, it fetches the data from the database, puts it in the cache, and returns it to the user; the next API call then gets the result from the cache. There are some points left to complete in this example: the database query that fetches the notifications, the cache expiration configuration, and, importantly, if you want the cached notifications to stay up to date you need to create a middleware and trigger it in the parts of your code that create notifications for a user, so that you update both the database and the cache. But I think you can fill in these points.
const redis = require('redis');
const redisClient = redis.createClient();

app.get('/notifications', async (request, response) => {
  const userId = request.user.id;
  const cacheResult = await redisClient.get(`user:${userId}:notifications`)
  if (cacheResult) return response.send(cacheResult);
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  response.send(notifications);
})
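And a rough sketch of the part the example leaves open, i.e. refreshing the cache whenever a new notification is created (the helper names are illustrative):
// call this wherever your code creates a notification for a user
async function notifyUser(userId, message) {
  const notification = await saveNotificationToDatabase(userId, message); // illustrative helper
  // refresh the cached list so the next GET /notifications sees the new entry
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  return notification;
}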
Besides that, there is another way: you can simply use only Redis, or only the database, to manage these notifications. A relational database with the correct index will return the results as fast as you expect; you only have to think about how many notifications you will accumulate.

How to send a stream of data from a JS application to Lamda running on NodeJS and return concurrently

I have an Angular JS application, and from this application I send data to my AWS API endpoint:
/**
 * Bulk sync with master
 */
async syncDataWithMaster(): Promise<AxiosResponse<any> | void> {
  try {
    axios.defaults.headers.post.Authorization = token;
    const url = this.endpoint;
    return axios.post(url, compressed, {
      onUploadProgress: progressEvent => {
        console.log('uploading')
      },
      onDownloadProgress: progressEvent => {
        console.log('downloading')
      },
    }).then((response) => {
      if (response.data.status == 'success') {
        return response;
      } else {
        throw new Error('Could not authenticate user');
      }
    });
  } catch (e) {
    // handle/log the error
  }
  return;
}
The API Gateway triggers my Lambda function (Node.js) with the data it received:
exports.handler = async (event) => {
  const localData = JSON.parse(event.body);
  /**
   Here get data from master and compare with local data and send back any new data
  **/
  const response = {
    statusCode: 200,
    body: JSON.stringify(newData),
  };
  return response;
};
The Lambda function will call the database and get the master data for a user (not shown in the example), and then this data is compared against the local data using various logic to determine whether we need to send any new rows back to the local device to be stored/updated. (Before anyone asks, the nature of the application needs the full data.)
This principle works great for 90% of my users. However some users have fairly large amounts of data, the current maximum being around 17 MB.
So my question is: is it possible to stream the data to and from the Lambda function? That is, stream the data to the function, process it, and stream it back, so that it is not affected by AWS payload limits?
Or is it possible to somehow begin sending data to the function as a stream, and then, as data becomes available, have it start streaming data back at the same time?
(The data is in JSON format.)
I am wondering what the alternatives to this solution are (it needs to be fairly quick as well, 30 sec max).
(One other idea I had: for data above a certain size, the client first saves it to S3 using a signed URL, then calls the API Gateway for Lambda. Lambda gets the saved file and compares it to master. New data to be returned is saved to S3 if over a certain size, and a signed URL is returned to the client. The client then downloads the new data and processes it.) However I am not sure if this is cost effective, and it sounds like execution time may be long.
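(For illustration, generating such a signed upload URL would look roughly like this with the AWS SDK v3; the bucket, key and region are placeholders:)
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3 = new S3Client({ region: 'eu-west-1' });

// give the client a short-lived URL it can PUT the large JSON payload to
async function getUploadUrl(userId) {
  const command = new PutObjectCommand({
    Bucket: 'my-sync-bucket',        // placeholder bucket
    Key: `uploads/${userId}.json`,   // placeholder key
    ContentType: 'application/json',
  });
  return getSignedUrl(s3, command, { expiresIn: 300 }); // valid for 5 minutes
}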
Thanks for any help, been trying to figure this out for a while now

How to load selected item from database?

So I have a default model set up for viewing my data, and a form for inputting the data. I want to know what the best practice is for retrieving the single selected item of data. It's for a MERN stack.
Currently I am using the window hash, adding the id onto the URL and retrieving it from the database that way. I feel this is janky though, and while trying to add update functionality it seems like it might get confusing.
I've thought about adding a currentID to Redux, but then I can see problems occurring when that is persisted and you go to create a recipe after viewing one and end up editing instead of creating.
retrieving id from url
const recipeId = window.location.hash.substr(1);
const recipe = useSelector((state) =>
  state.recipes.find((r) => r._id === recipeId)
);
I get my recipes from mongo
export const recipeList = async (req, res) => {
  try {
    const recipes = await recipeSheet.find();
    res.status(200).json(recipes);
  } catch (error) {
    res.status(404).json({ message: error.message });
  }
};
and store to redux
export const getRecipes = () => async (dispatch) => {
  try {
    const { data } = await api.fetchRecipes();
    dispatch({ type: "FETCH_ALL_RECIPES", payload: data });
  } catch (error) {
    console.log(error.message);
  }
};
It depends on how large your data is. It would be better to define a new GET path to retrieve a single record, like BASE_URL/api/recipes/123, or you can make the current endpoint accept a query parameter to find a specific id in the DB and return it, like BASE_URL/api/recipes?id=123. The reason, besides optimization (for large data sets), is that the record may change after you store all records in the Redux store, so with the current solution you would show stale data to the user. Best practice is to choose the first way; the second way is usually used for filtering data. Then, simply by navigating to the new URL, the user triggers a new API call to the new endpoint and gets the single record.
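A minimal sketch of such a dedicated route, assuming the same Mongoose model used above:
// GET /api/recipes/:id - return a single recipe by its id
export const recipeById = async (req, res) => {
  try {
    const recipe = await recipeSheet.findById(req.params.id);
    if (!recipe) return res.status(404).json({ message: "Recipe not found" });
    res.status(200).json(recipe);
  } catch (error) {
    res.status(404).json({ message: error.message });
  }
};
On the frontend you would then dispatch an action that fetches /api/recipes/<id> instead of searching the whole Redux store.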

nodejs: create same value in a short period

I have a sample app where users can access some dynamic data via different URLs.
The workflow is like this:
when a user requests get_data?id=1234567
first it checks the DB to see whether there is already data for that id
if not, it generates a random value
then, if other users request the same URL within a short time (say 10 min), it returns the value that was already generated
if one of the users sends a clear request, the value is cleared from the DB.
The bug is: if 2 users request the same URL at the same time, since the DB query takes time, both requests perform steps 1 and 2 simultaneously and end up creating different values for each user.
How do I make sure that, within a short period, the same value is always generated for all users?
Although Node.js is single threaded and does not have the problem of synchronization between multiple threads, its asynchronous event model can still require you to implement some kind of locking mechanism to synchronize concurrent async operations in certain situations (like in your case).
There are a number of libraries that provide this functionality, e.g. async-mutex. Here's a very basic example of what your code could look like:
const express = require('express');
const app = express();
const Mutex = require('async-mutex').Mutex;

const locks = new Map();

app.get('/get_data', async (req, res) => {
  const queryId = req.query.id;
  if (!queryId) {
    // handle empty queryid ...
  }
  if (!locks.has(queryId)) {
    locks.set(queryId, new Mutex());
  }
  const lockRelease = await locks
    .get(queryId)
    .acquire();
  try {
    // do the rest of your logic here
  } catch (error) {
    // handle error
  } finally {
    // always release the lock
    lockRelease();
  }
});

app.listen(4000, function () {
  console.log("Server is running at port 4000");
});
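Tying it back to the question, the critical section could look roughly like this (the db helper names are placeholders):
// inside the try block, while the lock for this queryId is held:
let value = await db.getValue(queryId);   // placeholder: look up an existing value
if (value == null) {
  value = generateRandomValue();          // placeholder: create the new value
  await db.saveValue(queryId, value);     // store it so later requests reuse it
}
res.send({ id: queryId, value });
Because the lookup and the insert happen while the per-id mutex is held, two simultaneous requests for the same id can no longer generate two different values.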

Nodejs deleting uploaded files after specific time

I am building a hosting server with Node and MongoDB. The process looks something like this:
The user opens a page with a form which contains 4 inputs:
sender email
receiver email
message from sender to receiver
files (multiple)
The user fills in all inputs properly and sends a POST request to the server.
The server handles the form with multer and saves the files; then, in the callback, an object with the form field data is prepared and sent to the database on MongoLab.
In the callback of saving the doc in the database, the server sends mails to the sender and receiver with a generated link from which they can download the uploaded files.
Now I would like to add an additional input to the form, where the user can set the date when their files should be deleted from the server.
So there are two things to do: delete the files and delete the doc in the database at the time set by the user.
Do you have some ideas how to implement such a thing?
To delete a file, you can simply use fs.unlink()
const fs = require('fs');

const deleteFile = (file) => {
  fs.unlink("path/to/file/folder/" + file, (err) => {
    if (err) throw err;
  });
};
You want to create a setTimeout(), but first you need to find how much time remains until the date provided by the user; you could do something like this:
const time_remaining = (date_provided) => new Date(date_provided) - new Date();
Then just use setTimeout():
let timeOuts = []; // We create an array of timeouts in case we want to cancel one later

// I assume you use express and body-parser
app.post('/upload', (req, res) => {
  const timer = setTimeout(() => deleteFile(req.body.file), time_remaining(req.body.date));
  timeOuts.push(timer);
});
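One caveat with this approach: setTimeout only lives as long as the Node process, so pending deletions are lost on a restart, and the MongoDB doc still has to be removed. A more robust sketch is to store the expiry date on the document and run a periodic sweep (the Upload model and its fields are assumptions):
// run every few minutes: remove expired uploads and their database docs
setInterval(async () => {
  const expired = await Upload.find({ deleteAt: { $lte: new Date() } });
  for (const doc of expired) {
    doc.files.forEach(deleteFile);             // reuse the unlink helper above
    await Upload.deleteOne({ _id: doc._id });  // drop the corresponding doc
  }
}, 5 * 60 * 1000);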
