How to download 600k emails from a POP3 server - Linux

How can I download all emails (632,000 of them) from a POP3 server? Currently macOS Mail limits me to 200,000 emails. Is there a client capable of doing the job without that limitation? I do not have access to the server configuration; I am just a user.

Wow, that's quite a collection! Assuming there's no way of using IMAP: although I haven't tried this myself, Thunderbird could quite possibly do it, as I don't believe it imposes a limit as long as you don't run out of disk space or RAM, and attachments will be compressed.
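If no client will cooperate, another option is to script the download. Below is a minimal sketch using Python's standard poplib; the host, credentials, and output directory are placeholders, and with 632,000 messages you would probably want to reconnect periodically and track progress (for example via UIDL), since many servers limit session length.

import os
import poplib

# Placeholders: substitute your own server, credentials and output directory.
HOST = "pop.example.com"
USER = "me@example.com"
PASSWORD = "secret"
OUTDIR = "mailbox"

os.makedirs(OUTDIR, exist_ok=True)

# POP3 over SSL on the standard port 995; use poplib.POP3(HOST, 110) for plain POP3.
conn = poplib.POP3_SSL(HOST, 995)
conn.user(USER)
conn.pass_(PASSWORD)

count, total_size = conn.stat()
print(f"{count} messages, {total_size} bytes on the server")

for i in range(1, count + 1):
    # retr() returns (response, list_of_lines, octets); join the lines back into raw message text.
    _, lines, _ = conn.retr(i)
    with open(os.path.join(OUTDIR, f"{i:07d}.eml"), "wb") as f:
        f.write(b"\r\n".join(lines) + b"\r\n")
    if i % 1000 == 0:
        print(f"downloaded {i}/{count}")

conn.quit()

Each message is saved as a separate .eml file, which most mail clients can open or import afterwards.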

Related

Can we parallelize the email download using IMAP JavaMail API?

Can I use multiple threads to download the emails using the store?
Is there any sample available anywhere?
Each open folder uses a separate connection to the server and operations on that folder are serialized by that connection so there's a limit to how much parallelism you can get with a single folder. Some servers will allow you to open the same mailbox multiple times (using separate Folder objects), resulting in multiple connections.
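JavaMail specifics aside, the same pattern (one connection per open folder, with any parallelism coming from opening the mailbox several times) can be sketched in any IMAP library. Here is a rough illustration in Python with imaplib and a thread pool rather than JavaMail; the host, credentials, and worker count are placeholders, and whether it helps at all depends on the server allowing several concurrent sessions on the same mailbox.

import imaplib
from concurrent.futures import ThreadPoolExecutor

HOST = "imap.example.com"       # placeholder
USER = "me@example.com"         # placeholder
PASSWORD = "secret"             # placeholder
WORKERS = 4                     # one IMAP connection (i.e. one open "folder") per worker

def fetch_range(msg_numbers):
    """Open a dedicated connection, select the mailbox, and fetch a slice of messages."""
    conn = imaplib.IMAP4_SSL(HOST)
    conn.login(USER, PASSWORD)
    conn.select("INBOX", readonly=True)
    results = []
    for num in msg_numbers:
        _, data = conn.fetch(str(num), "(RFC822)")
        results.append(data[0][1])          # raw message bytes
    conn.logout()
    return results

def parallel_download():
    # Find out how many messages there are using a throwaway connection.
    probe = imaplib.IMAP4_SSL(HOST)
    probe.login(USER, PASSWORD)
    _, data = probe.select("INBOX", readonly=True)
    total = int(data[0])
    probe.logout()

    # Split message numbers into WORKERS interleaved slices; each slice gets its own connection.
    slices = [range(i + 1, total + 1, WORKERS) for i in range(WORKERS)]
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        for batch in pool.map(fetch_range, slices):
            for raw in batch:
                pass  # store or process each raw message here

if __name__ == "__main__":
    parallel_download()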

Moving files from multiple Linux servers to a central Windows storage server

I have multiple Linux servers with limited storage space that create very big daily logs. I need to keep these logs, but I can't afford to keep them on my servers for very long before they fill up. The plan is to move them to a central Windows server that is mirrored.
I'm looking for suggestions on the best way to do this. What I've considered so far are rsync and writing a script in Python or something similar.
The ideal method of backup that I want is for the files to be copied from the Linux servers to the Windows server, then verified for size/integrity, and subsequently deleted from the Linux servers. Can rsync do that? If not, can anyone suggest a superior method?
You may want to look into using rsyslog on the Linux servers to send logs elsewhere. I don't believe you can configure it to delete logged lines with a verification step - I'm not sure you'd want to either. Instead, you might be best off with an aggressive logrotate schedule + rsyslog.
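That said, if you do keep the copy-verify-delete workflow from the question, rsync alone gets close: it verifies each file's checksum as part of the transfer, and --remove-source-files deletes only the files that transferred successfully. A rough Python wrapper along those lines, with the paths, destination, and bandwidth cap as placeholders (it also assumes something rsync-reachable on or in front of the Windows box, for example an rsync daemon running under Cygwin):

import subprocess

# Placeholders: adjust the log path, destination, and bandwidth limit to taste.
LOG_DIR = "/var/log/myapp/"
DEST = "backupuser@winstorage.example.com:/logs/web01/"   # rsync-reachable endpoint

def ship_logs():
    """Copy logs to the central server, then delete local copies that transferred cleanly."""
    cmd = [
        "rsync",
        "-av",                    # archive mode, verbose
        "--remove-source-files",  # delete each source file only after a successful transfer
        "--bwlimit=5000",         # throttle to roughly 5 MB/s so the link stays usable
        LOG_DIR,
        DEST,
    ]
    subprocess.run(cmd, check=True)  # raises CalledProcessError if rsync reports a failure

if __name__ == "__main__":
    ship_logs()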

Sending files to remote server in the quickest manner

I have a separate server that processes the media uploaded to my main, web-facing server. For now I upload files to it using FTP, but the problem with this is that to ensure the files are done uploading I have a timeout running, which adds a delay to the overall processing time. I can't seem to get it to wait less than 5 seconds and still reliably pick up the media, and this delay is no longer acceptable. So:
Is there a better way to implement this cleanly? I've considered sticking with FTP and sending another file after the initial upload that indicates it's done, but then there are two uploads for every upload = expensive. Another option I've considered is implementing a custom server that will just get a content-length header, do some authentication, and then receive the file and kick off the processing as soon as it's ready. Socket programming doesn't seem too intimidating, but I have some worries about sending binary files and different formats; is this a valid concern? Also, are there any other protocols out there I could implement to do this rather than reinvent the wheel? Something like FTP but with a little verification.
I'd be glad for any pointers or tips you can share, thanks!
I suggest you use rsync. It runs over ssh, will move entire directories / hierarchies of files, and will do incremental copies; in short, everything you could possibly want.
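As a sketch of what that can look like from the uploading side (host names and paths are placeholders, and the staging-directory trick is just one common way to avoid the "is it finished yet?" timeout): upload into a staging directory with rsync, and only move the file into the watched directory once rsync has exited cleanly, since a rename within the same filesystem is atomic.

import subprocess

REMOTE = "media@processor.example.com"        # placeholder host
INCOMING = "/srv/media/incoming/"             # watched by the processing job (assumption)
STAGING = "/srv/media/staging/"               # temporary landing area (assumption)

def upload(local_path, name):
    """Upload into a staging dir, then atomically move into the watched dir when complete."""
    # rsync only returns 0 once the file has been fully and correctly transferred.
    subprocess.run(
        ["rsync", "-a", "--partial", local_path, f"{REMOTE}:{STAGING}{name}"],
        check=True,
    )
    # mv within the same filesystem is atomic, so the watcher never sees a half-written file.
    subprocess.run(
        ["ssh", REMOTE, "mv", f"{STAGING}{name}", f"{INCOMING}{name}"],
        check=True,
    )

if __name__ == "__main__":
    upload("/tmp/clip.mp4", "clip.mp4")

With this arrangement the processing job can simply watch the incoming directory; anything it sees there is already complete.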

Using torrents to back up VHDs

Hi, this may be a redundant question, but I have a hunch there is a tool for this (or there should be, and if there isn't I might just make it), or maybe I am barking up the wrong tree, in which case correct my thinking:
My problem is this: I am looking for some way to migrate large virtual disk drives off a server once a week over an internet connection of only moderate speed, with a solution that can be bandwidth-throttled because the internet connection is always in use.
I thought about it and the problem is familiar: moving large files, with throttling, in a way that easily survives disconnection/reconnection, etc. The only solution I know of that just does all of this perfectly is torrents.
Is there a way to automatically and strategically make torrents and automatically "send" them to a torrent client's download list remotely? I am working on a Windows Hyper-V host, but I use only Linux for the guests and I could easily cook up a guest to do the copying, so consider it a Windows or Linux problem.
PS: the VHDs are "offline" copies of guest servers by the time I am moving them - consider them merely 20-30 GB dumb files.
PPS: I'd rather avoid spending money
Bittorrent is an excellent choice, as it handles both incremental updates and automatic resume after connection loss very well.
To create a .torrent file automatically, use the btmakemetainfo script found in the original bittorrent package, or one from the numerous rewrites (bittornado, ...) -- all that matters is that it's scriptable. You should take care to set the "disable DHT" flag in the .torrent file.
You will need to find a tracker that allows you to track files with arbitrary hashes (because you do not know these in advance); you can either use an existing open tracker, or set up your own, but you should take care to limit the client IP ranges appropriately.
This reduces the problem to transferring the .torrent files -- I usually use rsync via ssh from a cronjob for that.
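If you would rather not depend on btmakemetainfo, the metainfo format is simple enough to generate yourself. A rough single-file sketch in Python (the tracker URL, paths, and 4 MiB piece size are placeholders; a real version would want error handling and multi-file support):

import hashlib
import os

PIECE_LEN = 4 * 1024 * 1024   # 4 MiB pieces are a reasonable size for 20-30 GB images

def bencode(value):
    """Minimal bencoder covering the types a .torrent file needs."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        items = sorted((k.encode() if isinstance(k, str) else k, v) for k, v in value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(f"cannot bencode {type(value)}")

def make_torrent(path, announce_url, out_path):
    """Build a single-file .torrent with the 'private' flag set (no DHT)."""
    pieces = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(PIECE_LEN)
            if not chunk:
                break
            pieces.append(hashlib.sha1(chunk).digest())
    meta = {
        "announce": announce_url,
        "info": {
            "name": os.path.basename(path),
            "length": os.path.getsize(path),
            "piece length": PIECE_LEN,
            "pieces": b"".join(pieces),
            "private": 1,   # corresponds to the "disable DHT" advice above
        },
    }
    with open(out_path, "wb") as f:
        f.write(bencode(meta))

if __name__ == "__main__":
    # Placeholders: your own tracker URL and VHD path.
    make_torrent("/backups/guest1.vhd", "http://tracker.example.com:6969/announce", "guest1.torrent")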
For point to point transfers, torrent is an expensive use of bandwidth. For 1:n transfers it is great as the distribution of load allows the client's upload bandwidth to be shared by other clients, so the bandwidth cost is amortised and everyone gains...
It sounds like you have only one client in which case I would look at a different solution...
wget allows for throttling and can resume transfers where it left off, provided the FTP/HTTP server supports resuming transfers... That is what I would use.
You can use rsync for that (http://linux.die.net/man/1/rsync). Search for the --partial option in the man page and that should do the trick. When a transfer is interrupted, the unfinished result (file or directory) is kept. I am not 100% sure if it works with rsh/ssh transport when you send from local to a remote location (never checked that), but it should work with an rsync daemon on the remote side.
You can also use it to sync two local storage locations.
rsync --partial [-r for directories] source destination
Edit: I have since confirmed that it does work over ssh as well.

Vista's IIS Instance doesn't have SMTP (Solutions?)

Presently, I am working on a project using classic ASP. My development machine is Vista Enterprise. Although Vista does allow you to have multiple Web Sites (something XP only allowed with a workaround), it has removed the SMTP service from IIS.
Is there a standard workaround for this issue?
As more web developers at my company receive new machines I am concerned that this issue will become a greater irritant. (Currently I am the only Web Dev using Vista)
I found a better suggestion over on serverfault.
This thread details it
http://smtp4dev.codeplex.com/ Nice tool.
You have two workarounds. You can direct all mail to your company's SMTP server. This often means that your development machines use a different config (remote SMTP vs local), so I find this less desirable.
You could also install another SMTP server on your dev machine. One option is the free Mercury Mail Transport System by the makers of the venerable Pegasus Mail.
This is very similar to "What’s a good mail server for development use?"
I have tried 3 things:
sendmail from the SUA community warehouse, built with SASL (AUTH) and OpenSSL (SSL/TLS) for Interix/SFU/SUA. This works well but is quite slow to start a session for some reason. And of course it is sendmail, so about as opaque to configure as humanly possible. (Services for UNIX 3.5 and the Subsystem for UNIX-based Applications also come with an old-ish build of sendmail that does not have AUTH and SSL.)
Mercury Mail Server. The setup and management feel obtuse and dated to me.
hMailServer. Very slick. Quick setup and intuitive to configure. I like it.
Take a look at Papercut. It works well for my development environment on Windows 7.
Description of Papercut from the CodePlex site:
Papercut is a simplified SMTP server designed to only receive messages (not to send them on), with a GUI on top of it allowing you to see the messages it receives. It doesn't enforce any restrictions on addresses; it just takes the message and allows you to see it. It is only active while it is running, and if you want it in the background, just minimize it to the system tray. When it receives a new message, a balloon message will show up to let you know.
I use the built-in settings for SMTP mail to dump emails to a directory as shown in this post:
How can I use a local SMTP server when developing on Windows 7?
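For reference, the built-in setting referred to there is the pickup-directory delivery method in web.config/app.config; something along these lines, with the directory path as a placeholder:

<system.net>
  <mailSettings>
    <smtp deliveryMethod="SpecifiedPickupDirectory" from="dev@localhost">
      <specifiedPickupDirectory pickupDirectoryLocation="C:\inetpub\mailroot\dev" />
    </smtp>
  </mailSettings>
</system.net>

Every message the application "sends" then lands in that folder as a .eml file you can open in a mail client.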
