How to limit filesize in NodeMailer? - node.js

I'm sending numerous attached PDF files with nodemailer (I can provide the code if needed).
Now I need to limit the total size of the attachments in each email to 10 MB; if the attachments exceed 10 MB, the remaining files should go into a follow-up email, continuing where the previous one stopped.
How can I do this?
Thanks.

I will give you the idea; you write the code. Ask if you need some help later with the code you've already written. (A rough sketch of this outline follows below.)
// List all Attachments that must be sent
// For each Attachment, get the size of it using fs.statSync()
// Knowing the number of attachments and the size of each, divide them in groups where the sum of the file size of each group is less than 10MB
// Send one e-mail per group
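
A rough sketch of that outline, assuming the attachments are local PDF paths, that a nodemailer transporter already exists, and that the addresses are placeholders. Keep in mind base64 encoding adds roughly 33% on top of the raw file sizes, so you may want a lower threshold:

// Rough sketch of the outline above. Assumes `transporter` is an existing
// nodemailer transport and `files` is an array of local PDF paths.
const fs = require('fs');

const MAX_GROUP_BYTES = 10 * 1024 * 1024; // 10 MB of raw file size per email

function groupBySize(files, maxBytes) {
  const groups = [];
  let current = [];
  let currentBytes = 0;

  for (const file of files) {
    const size = fs.statSync(file).size;
    // Start a new group when adding this file would exceed the limit.
    // (A single file larger than the limit still gets a group of its own.)
    if (current.length > 0 && currentBytes + size > maxBytes) {
      groups.push(current);
      current = [];
      currentBytes = 0;
    }
    current.push(file);
    currentBytes += size;
  }
  if (current.length > 0) groups.push(current);
  return groups;
}

async function sendInBatches(transporter, files) {
  for (const group of groupBySize(files, MAX_GROUP_BYTES)) {
    // One e-mail per group; nodemailer reads each attachment from `path`.
    await transporter.sendMail({
      from: 'me@example.com',
      to: 'you@example.com',
      subject: 'PDF files',
      attachments: group.map((p) => ({ path: p })),
    });
  }
}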

Related

I need to measure the response time for a file export using the TruClient protocol in LoadRunner

I need to measure the response time for a file export using the TruClient protocol in LoadRunner. After I click the export button the file gets downloaded, but I am not able to measure the download time accurately.
Pull that data from the HTTP request log, which will show the download request and, if the W3C time-taken value is included in the log, the time required to fulfill the download.
You can process the log at the end of the test for the response time data. If you need to, you can import a set of data points into Analysis for representation with the rest of your data. You might also want to consider a normalized value for your download instead of a raw response time. I imagine the files are of different sizes, so naturally they will have different download times; however, if you divide the downloaded bytes by the time (in seconds), you get a normalized bytes-per-second measurement that lets you compare one download to the next for consistent operation.
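For illustration (staying with node.js, as in the rest of this thread), that end-of-test log processing could look roughly like the sketch below. The field names (cs-uri-stem, sc-bytes, time-taken, with time-taken in milliseconds as on IIS) are assumptions that depend on the server's #Fields configuration:

// Illustrative sketch: compute normalized bytes-per-second figures for a
// given download URI from a W3C extended log. Field names depend on #Fields.
const fs = require('fs');
const readline = require('readline');

async function bytesPerSecond(logPath, exportPath) {
  const rl = readline.createInterface({ input: fs.createReadStream(logPath) });
  let fields = [];
  const results = [];

  for await (const line of rl) {
    if (line.startsWith('#Fields:')) {
      fields = line.replace('#Fields:', '').trim().split(/\s+/);
      continue;
    }
    if (line.startsWith('#') || line.trim() === '') continue;

    const cols = line.split(/\s+/);
    const row = Object.fromEntries(fields.map((f, i) => [f, cols[i]]));
    if (row['cs-uri-stem'] !== exportPath) continue;

    const bytes = Number(row['sc-bytes']);
    const seconds = Number(row['time-taken']) / 1000; // time-taken is in ms
    if (seconds > 0) results.push(bytes / seconds);
  }
  return results;
}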
Also, keep in mind that since you are downloading a file, writing to a local disk, for (presumably) multiple users on a host, you will face the risk of turning your local file system into a bottleneck. You can see this same effect if you turn up logging on all users to the highest level and run your test. The wait for lock and wait for write, plus the actual writing of data, becomes a drag anchor to the performance of your virtual user. This is why the recommended log level is "log on error" or send the error to the output window of the controller via lr_output_message() or lr_vuser_status_message(). Consider a control load generator of the same hardware definition as the others with only a single virtual user of this type on it. If the control group and global group degrade together then you have an app issue. If your control user does not degrade, but your other users do, then you have a test bed induced influence on your results.
These are all issues independent of the tool you are using for the test.

How should I handle HEAD requests for large files in node.js?

Using my own node.js server I want to get the size of a large file (> 4 GB) before making byte-range requests on it. If, upon receiving a HEAD request, I use fs.readFile, I get "RangeError: File size is greater than possible Buffer" errors; if I use fs.createReadStream I don't get that error, but then I don't know how to respond to the request. For one thing, I don't see how to get the file size from the stream; for another, I don't know how to fill out the response header even if I knew the file size. Any help would be greatly appreciated. Thanks.
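The usual approach is to take the size from the file's metadata rather than from its contents: fs.stat() reports stats.size without reading the file (and the value is exact well past 4 GB), and a HEAD response carries headers only, no body. A minimal sketch, assuming a plain http server and a placeholder file path:

// Minimal sketch: answer HEAD with the file size from fs.stat, and serve
// byte ranges on GET with fs.createReadStream. Path and port are placeholders.
const http = require('http');
const fs = require('fs');

const FILE = '/path/to/large.file'; // placeholder

http.createServer((req, res) => {
  fs.stat(FILE, (err, stats) => {
    if (err) { res.writeHead(500); return res.end(); }

    if (req.method === 'HEAD') {
      res.writeHead(200, {
        'Content-Length': stats.size,
        'Accept-Ranges': 'bytes',
      });
      return res.end(); // no body for HEAD
    }

    // Very simplified GET/range handling for illustration.
    const range = /^bytes=(\d+)-(\d*)$/.exec(req.headers.range || '');
    if (range) {
      const start = Number(range[1]);
      const end = range[2] ? Number(range[2]) : stats.size - 1;
      res.writeHead(206, {
        'Content-Range': `bytes ${start}-${end}/${stats.size}`,
        'Content-Length': end - start + 1,
      });
      fs.createReadStream(FILE, { start, end }).pipe(res);
    } else {
      res.writeHead(200, { 'Content-Length': stats.size });
      fs.createReadStream(FILE).pipe(res);
    }
  });
}).listen(8080);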

Kafka-Node in Node fetchMaxBytes parameter

Does anyone know what the two numbers in the fetchMaxBytes value represent?
If it's written as 1024*1024, does that mean the consumer will fetch 1024 messages of 1 KB each, or will it just fetch 1 MB of messages?
I was not able to find any relevant information from the documentation except this: "The maximum bytes to include in the message set for this partition. This helps bound the size of the response."
I need this parameter to get messages one by one rather than getting a couple of messages in a single shot.
I am not familiar with node.js, but I assume fetchMaxBytes corresponds to replica.fetch.max.bytes. In that case the value is the maximum buffer size (in bytes, i.e. 1024*1024 = 1 MB) for fetching messages. A buffer can contain multiple messages of arbitrary size, so it is not a message count; it basically means a fetch waits no longer than it takes for the buffer to fill up.
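For reference, here is roughly where that option is set in kafka-node (the host and topic are placeholders). It is a byte budget per partition per fetch, not a message count, so shrinking it is not a reliable way to force one-message-at-a-time delivery; the consumer does, however, emit each message individually through the 'message' event even when a single fetch returns several at once:

// Sketch of kafka-node consumer options; kafkaHost and topic are placeholders.
// fetchMaxBytes is a byte budget per partition per fetch, not a message count.
const kafka = require('kafka-node');

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const consumer = new kafka.Consumer(
  client,
  [{ topic: 'my-topic', partition: 0 }],
  {
    fetchMaxBytes: 1024 * 1024, // up to 1 MB of messages per partition per fetch
    fetchMinBytes: 1,
    fetchMaxWaitMs: 100,
  }
);

// Even if a fetch returns several messages, they are still delivered one by
// one through this event, which is usually enough for per-message handling.
consumer.on('message', (message) => {
  console.log(message.offset, message.value);
});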

Message size limit with CDO?

I have an application in which I'm creating an email which I want the SMTP server (IIS) on the same box to deliver (OS is 2003 Server 32 bit). I send this using the "cdSendUsingPickup" method.
Using my IMessage interface, I copy the message to the server's pickup directory. All works great as long as my message is below ~150 MB (the size is accounted for by the attachments to the mail). But if I include attachments over this limit, IMessage::GetStream() fails with 0x8007000e - not enough storage space is available to complete this operation. The server has plenty of HD space, so I'm running into some kind of space limitation; I suspect it's a memory limitation rather than an HD space issue, but I'm finding no clues as to what's going on. Pseudo code below - the call to GetStream() fails with a message bigger than 150 MB or so, and works fine with smaller messages.
DlvrMsg(IMessage piMsg)
{
_StreamPtr pStream = NULL;
HRESULT hr = piMsg->GetStream(&pStream);
pStream->put_type(adTypeBinary);
//.. then use pStream->Read() to read the bytes of the message
// and copy to an .eml file in the pickup directory.
...
}
Yes, apparently there is a limit, though MS won't give hard-and-fast rules for what that limit is. They only say the call to GetStream() fails in a call to realloc: more and more memory is reallocated until it hits some artificial limit.
This occurs on 2003 Server as well as 2008, both 32- and 64-bit. The only workaround is to use something other than CDO to send your mail.

Limitation of Attachment size when using SMTP

I wrote a C++ program to send a mail using SMTP. But when I attach any file, I notice that a single file's size is always limited to 808 bytes. For example, if I send a text file of 10 KB, when I download the attachment it contains only 808 bytes of text. If the large file is a zip file, it gets corrupted on unzipping, obviously due to a CRC failure. I used a MAPI library to send larger files without a problem. Is this a network limitation of SMTP? Can someone please explain why this is happening?
Thank you!
How are you attaching and encoding the files? Are you using MIME? Is the data 8-bit clean?
SMTP itself has no built-in size limit, but it does have specific requirements for how data is transferred (formatting, encoding, etc.). In general, most mail systems reject mails with more than about 5-10 MB of data.
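For context on the MIME question above: attachments normally travel as a base64-encoded MIME part, which keeps arbitrary binary 7-bit safe for SMTP. A rough sketch of what such a part looks like (shown in node.js for consistency with the rest of this thread; the boundary and filename are placeholders):

// Rough sketch of a base64-encoded MIME attachment part. Base64 with
// 76-character lines keeps binary data 7-bit safe for SMTP transport.
const fs = require('fs');
const path = require('path');

function attachmentPart(filePath, boundary) {
  const b64 = fs.readFileSync(filePath).toString('base64');
  const wrapped = (b64.match(/.{1,76}/g) || []).join('\r\n');

  return [
    `--${boundary}`,
    'Content-Type: application/octet-stream',
    `Content-Disposition: attachment; filename="${path.basename(filePath)}"`,
    'Content-Transfer-Encoding: base64',
    '',
    wrapped,
    '',
  ].join('\r\n');
}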
