I’m planning to migrate my email to a different provider, but they don’t offer much storage, so I’m wondering what people would recommend for this kind of setup. Basically, I’d like to use the new provider as something like a relay: they’d only ever store an email or two at a time, and a self-hosted box would grab the emails from the provider, store them locally, and delete them from the provider, so it never holds my entire email history. It should also keep my sent emails somewhere so I have a copy of them. Ideally I’d be able to set this up with a mail client like Nextcloud’s.
Good old fetchmail is probably what you’re looking for. Run your local / self-hosted email server and use fetchmail, as described here, to fetch the email from the provider and deliver it into the local accounts. There’s also getmail (does the same, but written in Python), guide here, or go-getmail …
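If you go the fetchmail route, a minimal `~/.fetchmailrc` along these lines would give you the “download, deliver locally, delete from the provider” behaviour. This is only a sketch: the hostname, credentials and polling interval are placeholders you’d replace with your own.

```
# ~/.fetchmailrc -- minimal sketch; hostname and credentials are placeholders
set daemon 300                    # poll every 5 minutes

poll mail.provider.example.com protocol pop3
    user "you@example.com" password "app-password"
    ssl
    fetchall                      # grab everything, including mail already marked as read
    no keep                       # delete messages from the provider after download
    smtphost localhost            # hand off to the local MTA (e.g. Postfix) for delivery
```

fetchmail refuses to start if the rcfile is readable by others, so `chmod 600 ~/.fetchmailrc` first.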
Alternatively, and probably way better:
Postfix has a feature called the ETRN service, documented here. It lets the provider keep incoming email queued and deliver it to another server when a connection is available:
> The SMTP ETRN command was designed for sites that have intermittent Internet connectivity. With ETRN, a site can tell the mail server of its provider to “Please deliver all my mail now”. The SMTP server searches the queue for mail to the customer, and delivers that mail by connecting to the customer’s SMTP server.
From what I know about it you might be able to:
- Configure just an SMTP/Postfix server on the cloud provider;
- Configure a full IMAP/SMTP server on the self-hosted / local machine;
- Configure the “cloud” Postfix to deliver all incoming email into your local / self-hosted Postfix using `relay_domains`, as described here and here (rough main.cf sketches for both sides further down in this comment);
- Setup ETRN in the “cloud” provider to deal with your local server being offline / unavailable;
- On the local machine, create a simple bash script plus a systemd timer or cron job like this:
```
nc -c 'echo "ehlo selfhosted.example.org";sleep 1;echo "etrn your-domain.example.org";sleep 1;echo "quit"' remote-cloud-server.example.org 25
```
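If you prefer a systemd timer over cron, a pair of units along these lines should do it. The unit names and the script path are just examples I made up; the script merely wraps the nc one-liner above.

```
#!/bin/sh
# /usr/local/bin/etrn-kick.sh -- wraps the nc one-liner above
nc -c 'echo "ehlo selfhosted.example.org";sleep 1;echo "etrn your-domain.example.org";sleep 1;echo "quit"' remote-cloud-server.example.org 25

# /etc/systemd/system/etrn-kick.service
[Unit]
Description=Ask the cloud Postfix to flush queued mail for our domain
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/local/bin/etrn-kick.sh

# /etc/systemd/system/etrn-kick.timer
[Unit]
Description=Periodically flush queued mail from the cloud relay

[Timer]
OnBootSec=2min
OnUnitActiveSec=15min

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now etrn-kick.timer`.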
The nc one-liner connects to the cloud server and asks it to deliver all queued email to the self-hosted instance. It can be set up to run every x minutes or, if you want to get fancy, whenever the network comes up, using the `network-online.target` target as described here. Note that the script isn’t strictly necessary; it just guarantees that if the connection between the servers goes down, you’ll get all the queued email delivered right away once it comes back. The following links may also be of interest so your local / self-hosted email server can send email:
- http://www.postfix.org/STANDARD_CONFIGURATION_README.html#dialup
- https://linuxhint.com/configuring_postfix_relayhost/
- https://www.cyberciti.biz/faq/how-to-configure-postfix-relayhost-smarthost-to-send-email-using-an-external-smptd/
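To make the `relay_domains` / `relayhost` parts more concrete, here is roughly what the two main.cf files could contain. This is only a sketch: the hostnames, domains and the provider’s submission port are placeholders, so cross-check every parameter against the Postfix documentation linked above.

```
# "Cloud" relay -- /etc/postfix/main.cf
myhostname     = remote-cloud-server.example.org
# accept mail for the domain, but don't host any mailboxes here
relay_domains  = your-domain.example.org
# forward everything for that domain to the self-hosted box
transport_maps = hash:/etc/postfix/transport
# keep mail queued for a while if the self-hosted box is unreachable
maximal_queue_lifetime = 7d

# /etc/postfix/transport on the cloud relay (run "postmap /etc/postfix/transport" afterwards):
#   your-domain.example.org    smtp:[selfhosted.example.org]:25

# Self-hosted box -- /etc/postfix/main.cf
myhostname    = selfhosted.example.org
mydestination = your-domain.example.org, localhost
# send outgoing mail through the provider (smarthost), as in the links above
relayhost     = [smtp.provider.example.com]:587
smtp_sasl_auth_enable      = yes
smtp_sasl_password_maps    = hash:/etc/postfix/sasl_passwd
smtp_sasl_security_options = noanonymous
smtp_tls_security_level    = encrypt
```

Note that Postfix only honours ETRN for domains it treats as relay domains (fast_flush_domains defaults to $relay_domains), which is why this lines up with the ETRN trick above.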
Now, a note about Nextcloud: their webmail is the worst possible solution; I wrote a very detailed description of the issues here. Do yourself a favor and use Roundcube.
I’ve been using them with neomutt for years, and I’m happy with it.
Agree about the Nextcloud client, but it’s easy enough to replace it with the SnappyMail plug-in, which works a treat.
Or simply run RoundCube without NC.
That sounds like POP3.
Unlike IMAP, where your inbox lives on the mail server, POP stores the messages only until you download them.
So you should be able to look for a provider that allows you to connect with POP3 and set your client up to fetch them periodically.
The older POP3 protocol downloads emails to your local program and deletes them from the mailbox, so if you can get Nextcloud’s client to use that as a mail source, it will work the way you want.
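If the provider offers POP3, a getmail config along these lines (getmail was mentioned further up) gives exactly that download-and-delete behaviour. Again just a sketch: server, credentials and the Maildir path are placeholders, and the Maildir has to exist already.

```
# ~/.getmail/getmailrc -- rough sketch; server, credentials and paths are placeholders
[retriever]
type = SimplePOP3SSLRetriever
server = mail.provider.example.com
username = you@example.com
password = app-password

[destination]
type = Maildir
path = ~/Mail/            # must already exist with cur/ new/ tmp/

[options]
delete = true             # remove messages from the provider once downloaded
read_all = true           # fetch everything, not just unseen mail
```

Run getmail from cron or a systemd timer, and point your local IMAP server (and therefore the Nextcloud or Roundcube client) at that Maildir.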