[Freedombox-discuss] co-existence with the havenots (was: Introductions)
I really liked the points you made about how we shouldn't shy away from
using businesses to the benefit of our cause, and I agree with you on that;
I just don't entirely agree with the last one.
On Sun, Sep 5, 2010 at 3:59 PM, Matthew Johnson <mjj29 at debian.org> wrote:
> In fact, for some data we can go even further. I have data I want to share
> without restriction. Photos in my gallery, my website. I also want that to be
> publicly available and not slow to access because it's on my ADSL upload
> bandwidth. For this I have decided I want the world to access it. I don't care
> if some company can read it all - it's on my public website, of course they can
> read it all - it's indexed in Google. If I want to pay someone to arrange that
> they deal with the uptime and transfer rates for that, that's fine. What we
> need to arrange is that no one is tied to a single provider for it and that there
> is control of what's published and what's not.
That hosting company still has the server logs about you and about your
website's visitors. Also, Google has the logs of the people who get to your
website via its search results.
I think the key question (with HTTP hosting, but definitely also SMTP) is
how we can achieve gradual adoption. In both cases I think we need a two-way
connector between the haves and the havenots. So:
WEB, OUT: I (an FB user, one of the 'haves') publish my blog. I want havenots
to be able to read it too. They should access a gateway URL to look into the
free world, and through that looking hole they will be able to see all public
content. Obviously, the organization hosting this gateway will still be able
to collect logs about the havenots who visit my blog, but that's inevitable
during the transition, until these people also come inside the free world.
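The gateway idea above can be sketched as a simple URL-wrapping scheme: the gateway exposes one public address, and the free-world page to view rides along inside it. This is only a sketch under assumptions; the names `gateway.example` and `mybox.free` are hypothetical, not anything the project has defined.

```python
# Sketch of the "looking hole": wrap a free-world URL so a havenot's
# browser can reach it through a public gateway, and unwrap it on the
# gateway side to know which page to proxy. All hostnames are hypothetical.
from urllib.parse import quote, unquote

GATEWAY_BASE = "https://gateway.example/view/"

def to_gateway_url(box_url: str) -> str:
    """Wrap a free-world URL for access from the outside world."""
    return GATEWAY_BASE + quote(box_url, safe="")

def from_gateway_url(gateway_url: str) -> str:
    """Recover the original free-world URL the gateway should fetch."""
    return unquote(gateway_url[len(GATEWAY_BASE):])

public = to_gateway_url("http://mybox.free/blog/post-1")
original = from_gateway_url(public)
```

Note that this is exactly where the logging concern lives: whoever runs `gateway.example` sees every unwrap request.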
SMTP, OUT: I send an email to a havenot. Same case: there should be an
exit point that delivers my emails to havenots.
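The routing decision the sender's box would make here can be sketched in a few lines: deliver directly if the recipient lives inside the free network, otherwise hand the message to an exit point that speaks plain SMTP to the havenot's provider. The domain list and the `exit.example` relay name are assumptions for illustration.

```python
# Sketch of outbound mail routing on a FreedomBox: recipients inside the
# free network are delivered directly; everyone else goes via a hypothetical
# exit point. FREE_DOMAINS and "exit.example" are illustrative assumptions.
FREE_DOMAINS = {"mybox.free", "friendbox.free"}

def next_hop(recipient: str) -> str:
    """Return where this message should be handed off."""
    domain = recipient.rsplit("@", 1)[-1].lower()
    if domain in FREE_DOMAINS:
        return "direct"       # stays inside the free world end to end
    return "exit.example"     # exit point delivers to the outside world

hop = next_hop("bob@bigmail.com")
```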
SMTP, IN: A havenot wants to email me. There should be an entry point (which
I set as my MX record) where emails from the rainy outside world are
received, encrypted, and let inside, so that I can store and read them
inside the free world.
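The entry point's flow - accept plaintext from outside, encrypt, then store - can be sketched as below. The `encrypt_for()` stub is NOT real cryptography; it only stands in for something like OpenPGP encryption with the recipient's public key, to show where in the flow encryption would happen.

```python
# Sketch of the inbound entry point: a plaintext message arrives from the
# outside world, the body is encrypted for the recipient, and only the
# encrypted form is stored inside. encrypt_for() is a placeholder stub,
# not real crypto - a real entry point would use e.g. OpenPGP.
from email.message import EmailMessage

def encrypt_for(recipient: str, text: str) -> str:
    # Placeholder transform only; stands in for public-key encryption.
    return text[::-1]

def receive(sender: str, recipient: str, body: str) -> EmailMessage:
    """Build the message as it would be stored inside the free world."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(encrypt_for(recipient, body))
    return msg

stored = receive("bob@bigmail.com", "me@mybox.free", "hello from outside")
```

The point of the sketch is that the plaintext body never reaches storage; only the headers needed for delivery stay readable at the entry point.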
WEB, IN: I want to browse a havenot's blog. I think most aspects of this
case are covered by Tor, right? And at some point the havenot will move his
blog inside, and that's it.
PLATFORM-WEB, IN/OUT: I want to browse "trapped content" - that is, content
generated by (regrettably) either a have or a havenot, but trapped in one of
the big monopoly 'platforms' (Google search, YouTube, Facebook, Twitter,
Google Docs). This is the tricky one.
Small websites can be seen as just havenot bloggers who need to purchase
their FreedomBoxes one by one. And probably even, say, the website of a
newspaper can at some point decide to set up its presence in the free world.
Big ("platform") websites need special attention, though. So we probably need
to include one application to specifically attack each 'platform website'.
Each of them will also have to provide a transition strategy that seamlessly
integrates 'already freed' content with content from the specific website
that the app is attacking (as Diaspora does, for instance, with the Facebook
'aggregator' thing). That same app can then maybe also allow me to publish to
the specific platform that it covers, so that I can push content out to those
platforms (e.g. to my Twitter account, which I will still keep if I want my
public to include havenots).
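The aggregator part of that strategy is essentially a timeline merge: posts already inside the free world interleaved with posts pulled from the platform the app covers. A minimal sketch, assuming posts are simple (timestamp, source, text) tuples - the data shape and sources are hypothetical:

```python
# Sketch of the aggregator merge: combine 'already freed' posts with posts
# fetched from a covered platform into one newest-first timeline. The
# (timestamp, source, text) tuples are an illustrative assumption.
import heapq

def merged_timeline(freed, platform):
    """Merge two newest-first post lists into one newest-first timeline."""
    return list(heapq.merge(freed, platform, reverse=True))

freed = [(1700000300, "freedombox", "post C"),
         (1700000100, "freedombox", "post A")]
platform = [(1700000200, "twitter", "post B")]
timeline = merged_timeline(freed, platform)
```

Publishing outward (the Twitter-account case) would be the reverse edge of the same app, pushing freed posts through the platform's own API.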
Anyway, this is my view of it; it may be entirely flawed (I'm quite new to
all of this), but hopefully it can contribute to your discussion in some
way. ;)