
Re: block websites



On Sun, Dec 01, 2002 at 06:17:38PM +0100, Willem-Jan Meijer wrote:
| Hello again,
| 
| I posted this mail earlier today and I have had some reactions but they
| don't work or I don't understand it (I'm a dummy)
| 
| I want to block http://www.grolle.nl for all computers in my network.
| 
| Internal Ip addresses of the computers are:
| 192.168.0.1 	for the server
| 192.168.0.11	for ordinary use
| 192.168.0.12	too for ordinary use
| 
| The site has to be blocked for 192.168.0.11 and 192.168.0.12
| 
| How do I do that step by step, I'm a dummy, remember?

First you need to decide how you want to block stuff.  There are a few
different ways.

One possibility is to use firewall rules to REJECT all traffic
destined for that address.  With iptables the rule would look like
this on the server :
    iptables -A FORWARD -d 80.84.234.195 -j REJECT
If you don't have a firewall yet, set one up.  If you don't know how,
read the HOWTOs (look on tldp.org or netfilter.samba.org).
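
Since you only want to block the two workstations (and not the server
itself), you can restrict the rules by source address.  A minimal
sketch, assuming the site still resolves to 80.84.234.195 (check with
"host www.grolle.nl" first, since addresses can change):

    iptables -A FORWARD -s 192.168.0.11 -d 80.84.234.195 -j REJECT
    iptables -A FORWARD -s 192.168.0.12 -d 80.84.234.195 -j REJECT

The server (192.168.0.1) matches neither rule, so it is unaffected.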


The other possibility is to set up Squid (an HTTP proxy) as a
transparent proxy.  First you would need to install squid.  Then you
need to tweak its configuration to support transparent proxying (I
don't remember exactly how I did it, but if you email me the default
config I can diff it against mine).  Then you need an iptables rule to
redirect all outbound port-80 traffic to squid.  Then set up a
redirector in squid to send forbidden URLs to a warning page.  This is
where the "squidguard" and/or "chastity" packages come in.  This is
how I set up my system, in large part to prevent ads from wasting my
resources.
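
For what it's worth, the transparent-proxying bits of a squid 2.4-era
squid.conf look roughly like this (a sketch, not gospel; directive
names vary between squid versions):

    httpd_accel_host virtual
    httpd_accel_port 80
    httpd_accel_with_proxy on
    httpd_accel_uses_host_header on

The iptables rule to push the workstations' web traffic through squid
would then be something like this (assuming squid listens on its
default port 3128 and the LAN comes in on eth0; adjust the interface
to taste):

    iptables -t nat -A PREROUTING -i eth0 -s 192.168.0.11 \
        -p tcp --dport 80 -j REDIRECT --to-ports 3128
    iptables -t nat -A PREROUTING -i eth0 -s 192.168.0.12 \
        -p tcp --dport 80 -j REDIRECT --to-ports 3128

Hooking in squidguard is then a matter of pointing squid at it in
squid.conf:

    redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

and giving squidGuard a config that sends the forbidden domain to a
warning page.  The paths and warning URL below are only examples --
check where the Debian package actually puts things:

    dbhome /var/lib/squidguard/db
    logdir /var/log/squid

    dest forbidden {
        domainlist forbidden/domains
    }

    acl {
        default {
            pass !forbidden all
            redirect http://192.168.0.1/blocked.html
        }
    }

where the file forbidden/domains contains the single line "grolle.nl".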


You will need to read some manuals and HOWTOs to set up either of
those.


Note, however, that neither of these solutions provides complete
protection from offensive or forbidden material.  Google likely has a
cache of that site.  If you want to try to block that, you would need
to use the squid redirector.  You would need a way of determining,
from the URL only, whether or not the site is acceptable.  (I don't
know whether Google's cache URLs are amenable to this or not.)  Then
you still have the potential problem of your users getting outside
help on the 'net -- someone could mirror that site on their own
server.  You would need to discover that, and then block that site as
well.  The admin of that site could decide to open up other ports
(besides the standard port 80) so that your users could bypass the
firewall or proxy restrictions.  You fight a losing battle there,
because you may want to allow services like FTP or SSH, and those
can't be universally redirected through squid the way HTTP can.  It
comes down first to how responsible the end-users are about following
the rules on their own, and second to how much enforcement is "good
enough".

HTH,
-D

-- 
Dishonest money dwindles away,
but he who gathers money little by little makes it grow.
        Proverbs 13:11
 
http://dman.ddts.net/~dman/
