Re: Pulling mail for local access (was Re: [Rant] The Endless Search for a Mail Client That Doesn't Suck)
On Monday 29 August 2016 07:29:16 The Wanderer wrote:
> On 2016-08-29 at 07:00, tomas@tuxteam.de wrote:
> > -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1
> >
> > On Mon, Aug 29, 2016 at 06:15:23AM -0400, brian wrote:
> >
> > [Gnus]
> >
> >> But will it download from multiple newsfeeds *simultaneously*, and
> >> combine the feeds if you subscribe to the same group from more
> >> than one source? [...]
> >
> > TBH I never tried that, because I separate mail handling from my
> > MUA. Fetching, sorting and classifying is left to fetchmail, exim
> > and procmail, the "MUA" sees the result locally.
> >
> > Much better system behaviour when working offline.
> >
> > Didn't yet integrate news into that, since the newsgroups I am
> > interested in are disjoint from my mailboxes.
>
> I've been interested in trying to set up a system like that for some
> years, but never had occasion to make a Project out of it, and never
> found an obvious place to get started - especially for when migrating
> away from a workflow which is already based on having the mail client
> configured to contact the remote server directly. (And even more
> especially when dealing with an IMAP account, and wanting to be able
> to seamlessly affect mail on the IMAP server from the UI, which is
> provided by the mail client.)
>
> Could you go into more detail on how you have / got this set up,
> and/or point to resources which explain the process (well enough for
> someone technically savvy to be able to pick it up)?
It's not THAT hard; I've been doing it for quite a few years. I run
fetchmail as a background daemon, waking up every three minutes to go
tap the pop3 port of my ISP's mail server and pull anything new. When a
message has been pulled, it's handed off to procmail, which runs it
thru clamscan, spamassassin etc., and what survives that gets dumped into
a mailfile in /var/spool/mail. Then a bash script I wrote has a
subdaemon called inotifywait, trained to watch for file closings, and
when that mailfile has been written to and closed by the writer, it
sends kmail a message over dbus (or dcop, depending on the system)
telling kmail to go get the mail from that local mailbox, sort it thru
kmail's filters, and stash it in the appropriate folder.
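In case it helps as a starting point, here's a minimal sketch of that inotifywait watcher as a bash function. The mailfile path, the D-Bus service name, and the checkMail method are assumptions on my part; KMail's D-Bus interface has changed between versions, so check yours with `qdbus | grep -i kmail` first.

```shell
#!/bin/bash
# Hypothetical sketch of the watcher; paths and D-Bus names are assumptions.
MAILFILE="/var/spool/mail/$USER"

watch_mail() {
    # inotifywait blocks until the mailfile is closed after a write,
    # i.e. procmail has finished delivering a message.
    while inotifywait -e close_write "$MAILFILE"; do
        # Poke kmail over D-Bus to fetch from the local mailbox.
        # (On old KDE3 systems this was a dcop call instead.)
        qdbus org.kde.kmail /KMail checkMail
    done
}

# Run it in the background, e.g. from a login script:
# watch_mail >/dev/null 2>&1 &
```

The close_write event is the important part: watching for plain modify events would fire while procmail is still mid-delivery.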
The net result is that the pause in everything else that kmail does is
only a fraction of a second, most of which is used by its making a noise
to tell me new mail has arrived. If kmail were doing its own fetching,
the keyboard would be dead (the keystrokes entered are saved and
displayed when it comes back to you, but that dead time could be a
minute or more when someone sends you a big picture, or an openoffice
presentation with lengthy video snippets).
With that much background automation, all I have to do is hit the plus
key to read the next unread email, reply to it if I can, hit ctrl+return
to send it, and + to read the next one. That's it; everything else is
done for me. Fetchmail can tap as many pop3 servers as you have access
rights to, and I have had 3 at one time, but am down to my ISP's server
now. It polls them serially, of course.
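For what it's worth, polling multiple servers is just a matter of stacking poll sections in ~/.fetchmailrc; fetchmail works thru them serially on each wakeup. The hostnames, usernames, and procmail path below are placeholders, not my real setup:

```
# ~/.fetchmailrc -- hypothetical example, all names are placeholders
set daemon 180          # wake every three minutes

poll pop.example-isp.net proto pop3
    user "gene" pass "secret"
    mda "/usr/bin/procmail -d %T"

poll pop.other-host.org proto pop3
    user "gene2" pass "secret2"
    mda "/usr/bin/procmail -d %T"
```

The mda line is what hands each fetched message off to procmail instead of a local MTA; drop it if you route thru exim like tomas does.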
Late at night, I stop fetchmail long enough to run sa-learn against
several folders to train spamassassin; that takes 10+ minutes, and
fetchmail is restarted when that's been done. Basically I am lazy, and
bash scripts handle a lot of stuff in the background here. Repetitive
stuff soon gets turned into a script in my crontab.
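The nightly training run can be as dumb as a cron-driven script along these lines; the folder paths and the three-minute restart interval are assumptions, adjust to taste:

```shell
#!/bin/bash
# Hypothetical nightly sa-learn run; folder paths are placeholders.

train_spamassassin() {
    # Stop the fetchmail daemon so nothing new lands mid-training.
    fetchmail --quit
    # Feed hand-sorted folders to the bayes learner.
    sa-learn --spam ~/Mail/spam-training
    sa-learn --ham  ~/Mail/ham-training
    # Restart fetchmail in daemon mode, waking every three minutes.
    fetchmail --daemon 180
}

# Crontab entry to run it in the wee hours, e.g.:
# 30 3 * * *  $HOME/bin/train-sa.sh
```

Training on ham as well as spam matters; sa-learn's bayes scoring only kicks in once it has seen enough of both.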
Biggest PITA? The pop3 server is also an imap server, and because some
people mix-n-match, the ISP has disabled fetchmail's ability to delete a
mail it has fetched. So I have to log into the ^%# webmail with a
browser and clean house, usually daily.
Cheers, Gene Heskett
--
"There are four boxes to be used in defense of liberty:
soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)
Genes Web page <http://geneslinuxbox.net:6309/gene>