nodeJS Imap takes so much time to fetch emails with attachment - node.js

I am trying to create a webmail system with React and Node.js, and I have no clue whether I am doing it correctly.
I have searched on Google and read everything I could, but I still have no idea.
I purchased a domain for email, built a mail server with MailEnable, and I am currently using Node.js libraries such as Imap and Nodemailer for sending and receiving emails. Is this the correct way to build a webmail client, or is there a standard way of creating one?
I'm asking because I'm running into many problems. For example, fetching emails with attachments takes a very long time, especially when the attachment is large.
And since this is not just about listing emails, I need to fetch the whole message in order to display the email.
The code for reading an email is below:
const { simpleParser } = require('mailparser');

imap.openBox(box_name, false, () => {
  // Search for the message with the given UID
  const criteria = [['UID', no]];
  imap.search(criteria, (err, results) => {
    // Fetch the full raw message (headers, body and attachments)
    const f = imap.fetch(results, { bodies: '' });

    f.on('message', msg => {
      msg.once('attributes', attrs => {
        console.log('Flags: ' + attrs.flags);
      });
      msg.on('body', stream => {
        // Parse the raw message, including any attachments
        simpleParser(stream, async (err, parsed) => {
          // const { from, subject, textAsHtml, text } = parsed;
          view_data.push({ parsed });
        });
      });
    });

    f.once('error', ex => {
      return Promise.reject(ex);
    });

    f.once('end', () => {
      console.log('Done fetching!');
    });
  });
});
I get the correct results; it's just too slow when the email has an attachment.
Is there any way to solve this problem? Thanks in advance.
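A common way to keep the inbox list fast is to fetch only the headers and the body structure for the list view, and download the full message (with its attachments) only when a single email is opened. Below is a minimal sketch with node-imap and mailparser, assuming the same imap connection, search results, and view_data array as above:

// List view: fetch only selected header fields plus the MIME structure,
// so attachment bodies are never downloaded here.
const f = imap.fetch(results, {
  bodies: 'HEADER.FIELDS (FROM TO SUBJECT DATE)',
  struct: true // the structure reveals whether attachments exist without downloading them
});

f.on('message', msg => {
  msg.on('body', stream => {
    simpleParser(stream, (err, parsed) => {
      // Only the requested headers are parsed, so this stays small and fast
      view_data.push({ from: parsed.from, subject: parsed.subject, date: parsed.date });
    });
  });
  msg.once('attributes', attrs => {
    // attrs.struct describes the MIME parts; inspect it to flag messages with attachments
  });
});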

Related

node-telegram-bot-api / How to get user message on a question?

So, I'm actually tired of searching for an answer to this...
I need to get the user's message only right after the bot's question and nowhere else, like:
bot: What is your name?
user: Oleg
bot: Hi, Oleg
That's how it should work.
I am also using the require system with module.exports, so I'm really confused about how to deal with this problem.
EXAMPLE CODE
const mw = require('./example_module');

bot.onText(/\/help/, async (data) => {
  try {
    mw.cout.userlog(data);
    await cw.help.main(bot, data);
  } catch (e) {
    mw.cout.err(e.name)
  }
});
You can do that with a database or even just a JSON file by storing a state property per user. For example, when you ask the user for their name, set that user's state in your DB to "setName". When the user replies, look up their last state in the DB; since it is "setName", you know the reply is their name and can handle it accordingly.
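Here is a minimal sketch of that state-based approach using a plain in-memory object in place of the database (the userStates object, the /start command, and the state name are illustrative; a real DB or JSON file would work the same way):

const userStates = {}; // chatId -> the question we last asked (use a DB/JSON file in production)

bot.onText(/\/start/, async (msg) => {
  userStates[msg.chat.id] = 'setName'; // remember that we asked this user for their name
  await bot.sendMessage(msg.chat.id, 'What is your name?');
});

bot.on('message', async (msg) => {
  // Treat the message as the name only if it directly follows our question
  if (userStates[msg.chat.id] === 'setName' && msg.text && !msg.text.startsWith('/')) {
    userStates[msg.chat.id] = null; // clear the state
    await bot.sendMessage(msg.chat.id, `Hi, ${msg.text}`);
  }
});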
Alternatively, you can do this with node-telegram-bot-api alone; the slight difference is that the user has to send their name as a reply to the bot's message.
Here's the code:
bot.onText(/\/help/, async msg => {
  const namePrompt = await bot.sendMessage(msg.chat.id, "Hi, what's your name?", {
    reply_markup: {
      force_reply: true,
    },
  });
  bot.onReplyToMessage(msg.chat.id, namePrompt.message_id, async (nameMsg) => {
    const name = nameMsg.text;
    // save the name in a DB if you want to ...
    await bot.sendMessage(msg.chat.id, `Hello ${name}!`);
  });
});
And that's it.

How to create a share link in Buildfire

I'm attempting to create a share link that users within a plugin can send to friends via email or SMS. If the friend has the app, the goal is to open the app to the plugin with a query string, similar to what navigation.navigateTo does, so that it opens specific content within the plugin. If they don't have the app, the goal is to send them to the app's web page, where they can download it from the appropriate store (Android or Apple). Is this possible, or at least a portion of it?
I've generated a share link using buildfire.deeplink.generateUrl, but I can't find the appropriate next steps in the API documentation.
Yes. The steps are as follows:
Generate the deep link URL:
buildfire.deeplink.generateUrl(options, callback)
Make sure to pass a data property in options; it represents the deep link data the plugin needs once the user opens the share link. For more information on how to read this data from the plugin, see buildfire.deeplink.getData.
buildfire.deeplink.generateUrl(
  {
    data: { videoId: "9Q-4sZF0_CE" },
  },
  (err, result) => {
    if (err) {
      console.error(err);
    } else {
      console.log(result.url);
    }
  }
);
After generating the deep link, use the following function to open the device's share options:
buildfire.device.share({ link: deepLinkUrl }, callback);
Finally, you have to handle the deep link data in your plugin so it can open the desired content based on the data you passed when generating the deep link URL; see buildfire.deeplink.getData.
For more details, check the docs.
Example
// Share function
const share = () => {
  let deeplinkOptions = {};
  deeplinkOptions.title = 'Hello world';
  deeplinkOptions.type = "website";
  deeplinkOptions.description = 'First program';
  deeplinkOptions.imageUrl = '<IMAGE URL>';
  deeplinkOptions.data = {
    "link": vidId // e.g. the video id you want the deep link to open
  };
  buildfire.deeplink.generateUrl(deeplinkOptions, function (err, result) {
    if (err) {
      console.error(err);
    } else {
      let options = {
        link: result.url
      };
      let callback = function (err, result) {
        if (err) {
          console.warn(err);
        }
      };
      // Open the device's share sheet with the generated deep link
      buildfire.device.share(options, callback);
    }
  });
};

// Handle deep link data in your plugin
const handleDeepLinkData = () => {
  buildfire.deeplink.getData(function (data) {
    if (data && data.link) {
      let vidId = data.link;
      // Do what you want
    }
    // Do what you want
  });
};
Yeah, just share the resulting URL.

CSV File downloads from Node Express server

I have an API backend with Node and Express. I am trying to take some filtered data from the frontend, create a CSV file from it, and download that file for the user. I have been using json2csv. I am able to create the data file correctly, but when I use that file in my Express route, the downloaded file just says "undefined". At first I thought it was an asynchronous issue, but after using a setTimeout as a test I still get the undefined data file. Console logging csvData shows the correct data.
Express route to download the file.
app.post('/api/downloads/filtered', (req, res) => {
  let fields = [];
  fields = Object.keys(req.body[0]);
  const filteredData = req.body;
  const json2csvParser = new json2csv({ fields: fields });
  const csvData = json2csvParser.parse(filteredData);
  console.log(csvData);
  fs.writeFile('./report.csv', csvData, (err) => {
    if (err) {
      console.log(err);
    } else {
      console.log('created report.csv');
      res.download('./report.csv');
    }
  });
});
I'm using Vue on the frontend, and I trigger the download by clicking a button; not sure if that is something I should include.
I ended up figuring out my issue. I found that downloading in a POST request didn't seem to be possible; I needed a GET request. Since the data for the file comes in the request body, I kept the POST request to create the file and added a separate GET request to download it. This seems to work fine, but I didn't find it documented anywhere, so I'm not sure whether a better way exists.
app.post('/api/downloads/filtered', (req, res) => {
  console.log(req.body);
  let fields = [];
  fields = Object.keys(req.body[0]);
  const filteredData = req.body;
  const json2csvParser = new json2csv({ fields: fields });
  const csvData = json2csvParser.parse(filteredData);
  console.log(csvData);
  fs.writeFile('./report.csv', csvData, (err) => {
    if (err) {
      console.log(err);
    } else {
      console.log('created report.csv');
    }
  });
});

app.get('/api/downloads/filtered', (req, res) => {
  setTimeout(() => { res.download('./report.csv'); }, 1000);
});

How to update the user object in back4app?

I use Node.js and back4app.com.
I am trying to update the user object. I have read a lot and found this promising documentation:
let progressId = "xyz";
let userId = "12354"; // aka objectId

const User = new Parse.User();
const query = new Parse.Query(User);

// Finds the user by its ID
query.get(userId).then((user) => {
  // Updates the data we want
  user.set('progressId', progressId);
  // Saves the user with the updated data
  user.save()
    .then((response) => {
      console.log('Updated user', response);
    })
    .catch((error) => {
      console.error('Error while updating user', error);
    });
});
But there is also a warning. It states:
The Parse.User class is secured by default, you are not able to invoke save method unless the Parse.User was obtained using an authenticated method, like logIn, signUp or current
What would this look like in code?
My solution
Well, I got it to work. While figuring it out, I found a few small show-stoppers; I list them here for anyone they may concern.
Thanks #RamosCharles, I added the Master Key in Parse._initialize. Only with that does .save(null, {useMasterKey: true}) work. Note that it also won't work without the null argument.
That's my working code:
let progressId = "xyz";
let userId = "12354"; // aka objectId

const User = Parse.Object.extend('User'); // instead of const User = new Parse.User();
const query = new Parse.Query(User);
query.equalTo("objectId", '123xyz');

query.get(userId).then((userObj) => {
  // Updates the data we want
  userObj.set('progressId', progressId);
  // Saves the user with the updated data; the Master Key bypasses the class-level protection
  userObj.save(null, { useMasterKey: true }).then((response) => {
    console.log('Updated user', response);
  }).catch((error) => {
    console.error('Error while updating user', error);
  });
});
Now I'm wondering:
Why is my working code different from the documentation?
How secure is my code, and what can I do to make it more secure?
Yes, their API Reference is very helpful! In that section there's a "try on JSFiddle" button; have you already seen it?
To update a user object, you must use the Master Key. Using it on the frontend is not recommended; it's better to create a Cloud Code function and call it from your frontend. However, for test purposes you can keep using the API Reference, but on JSFiddle you need to make some changes. Here is their sample code with the adjustments:
Parse.serverURL = 'https://parseapi.back4app.com';
Parse._initialize('<your-appID-here>', '<your-JSKey-here>', '<Your-MasterKey-here>');

const MyCustomClass = Parse.Object.extend('User');
const query = new Parse.Query(MyCustomClass);
query.equalTo("objectId", "<object-ID-here>");

query.find({ useMasterKey: true }).then((results) => {
  if (typeof document !== 'undefined') document.write(`ParseObjects found: ${JSON.stringify(results)}`);
  console.log('ParseObjects found:', results);
}, (error) => {
  if (typeof document !== 'undefined') document.write(`Error while fetching ParseObjects: ${JSON.stringify(error)}`);
  console.error('Error while fetching ParseObjects', error);
});
You'll need to insert the "_" before "initialize" in your Parse._initialize call and pass the Master Key in your query, as I did in the query.find.
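As for making it more secure: since the Master Key must never be shipped to the browser, the usual pattern is the Cloud Code function mentioned above. Here is a minimal sketch; the function name setProgressId and its parameter are illustrative, not taken from the Back4App docs:

// main.js (Cloud Code deployed on Back4App)
Parse.Cloud.define("setProgressId", async (request) => {
  // request.user is the authenticated user who called the function
  if (!request.user) {
    throw new Error("User must be logged in");
  }
  request.user.set("progressId", request.params.progressId);
  // The Master Key stays on the server and never reaches the frontend
  return request.user.save(null, { useMasterKey: true });
});

// Frontend: call the function as the logged-in user (inside an async function)
await Parse.Cloud.run("setProgressId", { progressId: "xyz" });

This way each user can only update their own progressId, and the Master Key never leaves the server.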

Yelp API - Too Many Requests Per Second in Node.js

Experts,
It seems that Yelp recently changed their REST API to limit the number of requests you can make per second. I've tried using setTimeout and various sleep functions with no success; I believe the issue is with how I'm using setTimeout. I only get a few responses back and a slew of TOO_MANY_REQUESTS_PER_SECOND errors. Also, I'm using the Node.js Fusion API client. Any help would be appreciated. Thanks in advance.
Here is the code; I'm getting the Yelp URL from my Parse Server, and I want to get the Yelp business name from the response:
'use strict';

var Parse = require('parse/node');
Parse.initialize("ServerName");
Parse.serverURL = 'ParseServerURL';

const yelp = require('yelp-fusion');
const client = yelp.client('Key');

var object;
var Business = Parse.Object.extend("Business");
var query = new Parse.Query(Business);
query.notEqualTo("YelpURL", "Bus");

query.find({
  success: function (results) {
    for (var i = 0; i < results.length; i++) {
      object = results[i];
      // I believe a setTimeout block needs to come somewhere in here.
      // Tried many places but with no success.
      client.business(object.get('YelpURL')).then(response => {
        console.log(response.jsonBody.name);
      }).catch(e => {
        console.log(e);
      });
    }
  },
  error: function (error) {
    alert("Error" + error.code + " " + error.message);
  }
});
Use query.each, which will iterate over each object and perform the requests in sequence rather than all more or less at once:
query.each(
  function (object) {
    return client.business(object.get('YelpURL')).then(response => {
      console.log(response.jsonBody.name);
    });
  }
).catch(e => {
  res.json('error');
});
One cool thing about this is that it automatically propagates any error from the client.business() call to the catch block at the bottom. It iterates over the objects one at a time, and since we return the result of the client.business() call, it won't move on to the next object until the response has come back. query.each() also iterates over every object in the collection that meets your query criteria, so you don't have to worry about limits.
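If the sequential requests still occasionally trip Yelp's per-second limit, a small delay can be inserted between calls. This is only a sketch; the sleep helper and the 500 ms value are illustrative, not part of the Parse or Yelp SDKs:

// Helper: resolve after ms milliseconds
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

query.each(function (object) {
  return client.business(object.get('YelpURL'))
    .then(response => {
      console.log(response.jsonBody.name);
    })
    .then(() => sleep(500)); // wait ~500 ms before query.each moves to the next object
}).catch(e => {
  console.log(e);
});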
I'm not quite sure if this is what you're looking for, but you can retrieve up to 50 records per request. The example below returns 20 business names within a given zip code; you can also tweak it a little to return all the data for those businesses. Does this help?
app.get('/:id', (req, res) => {
  let zipcode = req.params.id;
  let names = [];
  let searchRequest = {
    term: 'Business', // or e.g. 'food'
    limit: 20,        // set the number of responses you want, up to 50
    radius: 20000,    // radius in meters (20,000 m is roughly 12 miles)
    location: zipcode
  };
  client.search(searchRequest)
    .then(response => {
      response.jsonBody.businesses.map(elem => {
        names.push(elem.name);
      });
      res.json(names); // business names only
      // or
      // res.json(response.jsonBody.businesses) // all details, including the business name
    }).catch(e => {
      res.json('error');
    });
});
