
Re: Tin+Suck



> Gith writes:
> > I've installed tin and suck as a way to read newsgroups locally.
> > The problem is, I haven't figured out how to use tin to read the
> > newly downloaded articles.  If someone is using the suck+tin
> > combination for news, could you post here or e-mail me directly
> > on how to do it..
> I might be wrong on this, but isn't this a feature in tin that has yet
> to be implemented?  From the man page:
> 
>      -R         read news saved by the -S option (not yet
>                 implemented).
> 
> Where is the advantage of using suck over using tin -S?

I don't know what tin -S does.  Whatever it does, it is for a single user.  It
in no way makes those articles available to other users on the same machine,
let alone to news readers on other machines.

The advantage of using suck + innd is that you get a news server on your
machine: you, anyone else on your machine, and anyone outside your machine
whom you allow to connect can read any news that is on your machine.  You
literally set up your machine to be a news server; suck just downloads
articles and uses either rnews or innxmit (your choice) to post them to the
news server.
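
In its simplest form the flow looks like this (the hostname is a
placeholder; the paths assume a stock INN install, same as in my script
below):

   #!/bin/sh
   # Fetch new articles from the upstream server into an rnews batch,
   # then feed the batch to the local innd.
   cd /var/spool/news/in.coming
   suck news.example.com -a -br spool
   [ -s spool ] && /usr/lib/news/rnews -S localhost ./spool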

For me and my friends who share a house, it is invaluable.  We have five
comp sci types who like to read news, so I have set up my machine to be a
news server.  I follow all the linux.* newsgroups, and am planning on turning
all of my mailing list subscriptions into newsgroups as well.  It is a lot
faster reading news, or grabbing binaries (my bandwidth waster is
alt.binaries.sounds.midi), from a local machine than over a modem.  And if you
set it up right, you can even post articles back to the mailing
lists/newsgroups, provided you have the ability to do so...  For instance, at
UMR, like other colleges/universities, there is a set of 'local' newsgroups
that only machines with access to UMR can reach.  Some of these are also
part of what I 'spool' on my news server, because suck is just like tin when
it connects to a news server: it reads articles.  However, suck takes the
articles and hands them to either rnews or innxmit to transmit to a news
server of your choice (usually your local machine), and thus you get your own
news server carrying those articles.
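
The innxmit route looks much the same; I am going from the suck man page
here, so double-check the -bi flag before trusting it:

   #!/bin/sh
   # Same fetch, but build an innxmit-style batch (-bi) and offer the
   # articles to the local innd with innxmit instead of rnews.
   suck news.example.com -a -bi batch
   [ -s batch ] && innxmit localhost batch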

Personally, I have found suck's auto-recovery features lacking.  That, and
I have more than one news server I wish to contact.  So, every 1/2 hour as a
cron job, I run the following script, which grabs news from two different
news servers and, in my experience, guarantees that all articles are
eventually retrieved, even if only on a later 1/2 hour run of the script.

   #!/bin/sh

   # Per-server working directories live under $tmphome.
   tmphome=/var/spool/news/in.coming/tmp
   servers="usenet.umr.edu news.fuller.edu"
   suck=/usr/opt/bin/suck
   lf=$tmphome/lock

   cd $tmphome
   # Crude lock so overlapping cron runs do not step on each other.
   [ -e $lf ] && exit 1
   touch $lf

   count=0
   # $re is reported below even when rnews never runs; give it a default.
   re=0

   grabit () {
   se=2
   # Retry suck until it exits with status 0 or 1; anything higher is
   # treated as an incomplete run worth another pass.
   while [ 0$se -gt 1 ]
   do
     # Group names listed in 'newgrps' are created on the local server
     # and seeded into sucknewsrc before fetching.
     if [ -s newgrps ]
     then
       for i in `cat newgrps`
       do
         /usr/lib/news/bin/ctlinnd newgroup $i y $0
         echo $i 1 >> sucknewsrc
       done
     fi
     rm -f newgrps
     count=$((count+1))
     # Fetch new articles into an rnews batch and feed it to innd.
     $suck $1 -a -br spool ; se=$?
     [ -s spool ] && ls -l spool
     [ -s spool ] && /usr/lib/news/rnews -S localhost ./spool ; re=$?
     rm -rf Msgs suck.restart spool suck.sorted
     # Unless suck failed outright (exit 255), install the updated
     # newsrc it produced, keeping eight generations as backups.
     if [ -s suck.newrc -a 0$se -ne 255 ]
     then
       mv sucknewsrc.7 sucknewsrc.8
       mv sucknewsrc.6 sucknewsrc.7
       mv sucknewsrc.5 sucknewsrc.6
       mv sucknewsrc.4 sucknewsrc.5
       mv sucknewsrc.3 sucknewsrc.4
       mv sucknewsrc.2 sucknewsrc.3
       mv sucknewsrc.1 sucknewsrc.2
       mv sucknewsrc sucknewsrc.1
       sort suck.newrc | uniq > sucknewsrc
       rm suck.newrc
       touch spool
     fi
     echo Suck Error: $se, Rnews Error: $re count: $count
     # Safety valve: drop the lock and bail out after 20 passes.
     if [ $count -gt 20 ]
     then
       rm $lf
       echo Suck Error: $se, Rnews Error: $re count: $count EXITING
       exit 1
     fi
   done
   }

   # Each server has its own subdirectory under $tmphome holding its
   # own sucknewsrc; skip any server whose directory is missing.
   for i in $servers
   do
     cd $i || continue
     echo $i
     grabit $i
     cd ..
   done
   rm $lf
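
The crontab entry is just the usual every-half-hour line (the script path
here is made up; use wherever you saved yours):

   0,30 * * * *   /path/to/sucknews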

The idea is that for each server you have created a subdirectory in the
$tmphome directory.  You can add newsgroups to the local news server by
running ctlinnd manually and then adding an entry to sucknewsrc, or you can
simply give the name of each group on a separate line in a file 'newgrps' in
the server's subdirectory; if this file exists, the script creates each
newsgroup on the local server and adds an entry to the 'sucknewsrc' file
for you.
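
For example, to start carrying a new group by hand (the group name and
server directory are just examples):

   # Manual route: create the group on the local innd, then seed
   # sucknewsrc so suck fetches it starting from article 1.
   /usr/lib/news/bin/ctlinnd newgroup linux.dev.kernel y manual
   cd /var/spool/news/in.coming/tmp/usenet.umr.edu
   echo linux.dev.kernel 1 >> sucknewsrc

   # Or let the script do both steps on its next run:
   echo linux.dev.kernel >> newgrps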

I'm also toying with a hack I've made to suck that, when I get it cleaned up,
I plan to send to the authors (or perhaps sooner, considering I've not worked
on it in a while).  The 'hack' lets me, given an account on a remote machine,
telnet to that machine and from there telnet on to the news server.  The
reason is that I have an account on the *.win.org machines in my hometown,
but am unable to read the newsgroups there unless I am logged in.  Obviously
I would rather have them 'spooled' locally to read than telnet through
several dozen hops to their over-loaded machines and use lynx/tin remotely
to access the newsgroups.  Other applications for this 'feature' I am sure
you will recognize.
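
Absent that hack, you can fake the same thing if the remote account lets
you leave a relay running.  This is only a sketch: it assumes socat is
installed on both ends, and all hostnames are placeholders.

   # On the remote shell account: listen on an unprivileged port and
   # forward each connection to the news server only it can reach.
   socat TCP-LISTEN:1119,fork TCP:news.win.org:119 &

   # On the local machine (as root): make the relay answer on the
   # standard NNTP port of a loopback alias, so suck needs no special
   # port handling.
   socat TCP-LISTEN:119,bind=127.0.0.2,fork TCP:shell.win.org:1119 &
   suck 127.0.0.2 -a -br spool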

Hope this helps.
--
Todd Fries .. todd@miango.com


