
Re: Idea for structure of Apt-Get



I like your idea, Max.  With a subscription-based system, people
wouldn't really be anonymous anymore.  I think the bandwidth issue is
probably the biggest problem right now; I'm still sitting on a dialup
connection when I'm at home.  I wonder, though: with widespread use,
would the average upload amount per user (or node) go down?  With
some central tracker system, sending out a relatively small patch to
thousands of users would be fairly easy and quick, no?  The .deb
mirrors that are currently up would basically become "super hosts",
able to start the distribution of a patch very quickly and then, as
it picks up speed, move on to other patches that need to get out.
The .deb mirrors would send the patches to the people with the
fastest connections first.  What if people never needed to run
apt-get update or apt-get upgrade?  What if it was always on?  If a
patch is released, you get it quickly.  Maybe I'm thinking of
something that can't really be accomplished right now, but South
Korea has pumped a large amount of money into its internet
infrastructure, and look where it has taken them.  Maybe we just need
to lobby for better internet access.  Just a thought.  If this is the
wrong place for this conversation, I'm sorry; maybe it should be
moved to off-topic or something.  Anyway, thanks for listening. :)
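
PS: just to make the "fastest connections first" bit concrete, here
is a rough Python sketch of how a super host might order its push
queue.  The node names and speeds are made up, and a real system
would need actual measurements:

    # Hypothetical sketch: a "super host" seeds the subscribed nodes with
    # the best measured upload speed first, so the patch spreads fastest.
    nodes = [
        {"host": "node-a.example.org", "upload_kbps": 640},
        {"host": "node-b.example.org", "upload_kbps": 56},
        {"host": "node-c.example.org", "upload_kbps": 2048},
    ]

    for node in sorted(nodes, key=lambda n: n["upload_kbps"], reverse=True):
        print("send patch to", node["host"])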


On Sun, 20 Mar 2005 00:10:14 +0000, Max Dyckhoff <max@subnova.com> wrote:
> I'm going to start off by admitting that I don't know exactly how the
> mirror system works; from my understanding, it literally relies on a user
> changing their sources.list file to point to a different download site?
> 
> Here's an idea that I've just whipped up as a hybrid of the mirror
> system and true p2p systems, taking into consideration points mentioned
> in this thread.
> 
> Rather than have each user become a peer on the download network, make
> it a subscription-based thing, very similar to the mirror system. A user
> would sign up saying that they were willing to share their .deb files,
> and this would be registered with a central tracker. Other users would
> then point their sources.list to the tracker, and when requesting a .deb
> the tracker would provide them with a link to one of the subscribed
> sharing users.
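
(Interjecting inline: here's a very rough sketch of what that tracker
redirect could look like.  The hosts and port are made up, and a real
tracker would need registration, health checks, and so on.)

    # Hypothetical tracker: answer a request for a .deb with a redirect to
    # one of the subscribed sharing hosts (chosen here by simple rotation).
    import itertools
    from http.server import BaseHTTPRequestHandler, HTTPServer

    SHARERS = itertools.cycle([
        "http://sharer1.example.org/debian",
        "http://sharer2.example.org/debian",
    ])

    class Tracker(BaseHTTPRequestHandler):
        def do_GET(self):
            # e.g. GET /pool/main/h/hello/hello_2.1.1-4_amd64.deb
            self.send_response(302)
            self.send_header("Location", next(SHARERS) + self.path)
            self.end_headers()

    HTTPServer(("", 8080), Tracker).serve_forever()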
> 
> Given that (correct me if I'm wrong) the size of .deb files is
> relatively small, I don't think it would make much sense to split them
> into different chunks and farm them out to different servers. Rather, the
> tracker would just choose a server to point the user at, based either on
> historical traffic or a quick query of the server's speed. This would be
> the "difficult" part of the system to develop, and while I have some
> ideas I suspect someone with a network background would be better (mine
> is AI ;-) )
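
(And for the "quick query of the server's speed" part, the simplest
thing I can picture is a connect-time probe; rough sketch below, with
made-up hosts.)

    # Hypothetical: point the user at whichever sharing host answers a
    # TCP connect fastest right now.
    import socket
    import time

    def connect_time(host, port=80, timeout=2.0):
        start = time.time()
        try:
            socket.create_connection((host, port), timeout).close()
            return time.time() - start
        except OSError:
            return float("inf")  # unreachable hosts sort last

    candidates = ["sharer1.example.org", "sharer2.example.org"]
    print("redirect client to", min(candidates, key=connect_time))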
> 
> This would make it somewhat more user-friendly (as they would just have
> to add a single line to their sources.list; the address of the "tracker"
> for a set of packages, rather than choosing from a mirror), and it would
> have the benefits of distributed network use that you get from a p2p
> network.
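
(In other words, the only user-visible change would be a single
hypothetical line like

    deb http://tracker.example.org/debian stable main contrib

in /etc/apt/sources.list, instead of picking a specific mirror.)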
> 
> Security would be dealt with easily (MD5 sums or some such solution), but
> there would potentially be some latency while waiting for the tracker to
> provide a download location.
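
(The checksum part really is that simple; roughly the following, with
the filename and sum made up, and the expected value coming from a
trusted Packages index.)

    # Hypothetical check: refuse a .deb whose MD5 sum doesn't match the
    # one published in the (trusted) Packages index.
    import hashlib

    expected_md5 = "0123456789abcdef0123456789abcdef"  # from Packages file

    with open("hello_2.1.1-4_amd64.deb", "rb") as f:
        actual_md5 = hashlib.md5(f.read()).hexdigest()

    if actual_md5 != expected_md5:
        raise SystemExit("checksum mismatch - refusing to install")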
> 
> I don't personally have any problems with the existing system, and any
> change would presumably be prompted by problems with the hosts of
> mirrors if bandwidth usage gets too great for them. I reckon it would be
> relatively easy to implement, and depending on whether current mirrors
> are feeling at all pressed for bandwidth, it might not be a bad idea for
> someone to at least prototype it.
> 
> Then again, this is probably completely the wrong list for discussing
> this, given it's an AMD64 place... I'm not subscribed anywhere else,
> though, and it's always fun to chat ;)
> 
> Max
> 
> 
> Nat Tuck wrote:
> > The security issues in this plan are solved pretty well. If you used the
> > actual bittorrent protocol then it would be as secure as the mirrors are now
> > - if not slightly more secure.
> >
> > The biggest issues here are:
> > A.) unexpected bandwidth usage
> > B.) horrible latency
> >
> > The first issue is mostly a real issue from a bad press perspective. People
> > will see not using upstream bandwidth as a feature and try to avoid/cheat
> > the system. I actually wish bittorrent-style update mechanisms were more
> > common - people might stop paying for connections with horrible upload
> > speeds.
> >
> > The second issue is most likely an engineering problem. The existing
> > bittorrent protocol has a bit of a delay finding peers and convincing them
> > to share - until you have a chunk or two of the file, you'll be stuck at a
> > super-low download rate (typically 1kb/sec). Since a bittorrent "chunk" is a
> > good percentage of the size of the average Debian package, some sort of
> > custom bittorrent-like protocol would need to be developed.
> >
> > I guess the real question is as follows:
> > - Is there a big enough shortage in donated mirror bandwidth to put the effort
> > into developing a peer to peer package distribution system and convincing a
> > large percentage of users to share their bandwidth?
> 


