6

I have been using my hosts file (located at /private/etc/hosts) for several months to block distracting websites during the work day. This worked well until now; today it suddenly stopped working.

Some sample lines from the hosts file:

127.0.0.1 facebook.com
127.0.0.1 www.facebook.com

I placed that text in the hosts file using the following steps:

sudo nano /etc/hosts
wrote the lines above, then pressed ^O to write the file, Enter to confirm the filename, and ^X to exit the editor.
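
For reference, one way to double-check the whitespace and line endings is to dump the file with control characters made visible (cat's -e and -t flags are standard on OS X and show line ends as $ and tabs as ^I):

cat -et /private/etc/hosts
# each entry should look like: 127.0.0.1^Ifacebook.com$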

Between the localhost IP and the domain name there is a tab. The line endings are Unix-style (LF). The weird part is that when I use the ping command, the hosts file seems to be doing its job properly:

ping facebook.com
PING facebook.com (127.0.0.1): 56 data bytes
64 bytes from 127.0.0.1: icmp_seq=0 ttl=64 time=0.137 ms
64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.122 ms
64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.118 ms
64 bytes from 127.0.0.1: icmp_seq=3 ttl=64 time=0.110 ms
^C
--- facebook.com ping statistics ---
4 packets transmitted, 4 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 0.110/0.122/0.137/0.010 ms

But when I try to access facebook.com in Safari or Firefox, I am still able to get to the website. This is also the case for other websites that I have blocked in the same way. I have emptied the cache in both browsers, but this didn't solve the problem.

What can I do to solve this problem?

Update 1: I'm now checking all the websites I've blocked this way and have found that the behaviour is not consistent across domains. These are the "time-wasters" I'm blocking in /private/etc/hosts:

#Block time-killers
127.0.0.1 9gag.com
127.0.0.1 flabber.nl
127.0.0.1 geenstijl.nl
127.0.0.1 dumpert.nl
127.0.0.1 facebook.com
127.0.0.1 www.9gag.com
127.0.0.1 www.flabber.nl
127.0.0.1 www.geenstijl.nl
127.0.0.1 www.dumpert.nl
127.0.0.1 www.facebook.com
##

All sites from this list ping to 127.0.0.1; however, 9gag.com and flabber.nl are unreachable from any browser, while geenstijl.nl, dumpert.nl and facebook.com are still reachable.

I have tried restarting, but this did not solve the problem. I had not changed the system configuration or installed any updates before the problem appeared.

Update 2: Three hours ago I could access facebook.com through Safari and Firefox; now I can't anymore. geenstijl.nl and dumpert.nl are still accessible, though. I haven't changed anything in the past three hours, just used Word and browsed the web with Safari.

Update 3: Now, four hours after the second update, the hosts file works as normal again. In the process of fumbling with the hosts file I removed the non-working entries and re-added them one by one, testing each one after it was added. I have no idea what was happening, and I can't run Wireshark on the traffic anymore as there is no faulty behaviour left to observe.

Update 4: And the problem is back again. The same sites as in update 1 show the erroneous behaviour.

Update 5: Everything works as it should again. I'll keep the solutions posted here in mind for when I encounter the error again.

  • Are Safari and Firefox using a proxy? Does your OS X have an /etc/hosts.conf file, and if so, what's in it?
    – ott--
    Commented Feb 18, 2013 at 12:28
  • No proxy. I don't have a hosts.conf file, but I do have a hostconfig file with the following text in it: "# This file is going away AFPSERVER=-NO- AUTHSERVER=-NO- TIMESYNC=-NO- QTSSERVER=-NO-". I also have a hosts.equiv file, which is empty. Commented Feb 18, 2013 at 12:40
  • Check superuser.com/questions/313128/lion-name-resolution-order too.
    – ott--
    Commented Feb 18, 2013 at 12:47
  • Thank you @ott--, but the solution presented in that answer did not solve the problem for me. I already use single hostnames per line. Commented Feb 18, 2013 at 13:08
  • I wonder if ltrace is available for OSX (cannot check before this evening), but with that tool you could check if Firefox is using DNS before looking at /etc/hosts.
    – ott--
    Commented Feb 18, 2013 at 15:39

5 Answers

4

DNS resolution in OS X went haywire in the update from Snow Leopard to Lion. After a clean install everything should work properly, but if you went the upgrade route, things may still be off.

Option 1: IPv6 addressing

Many websites and ISPs support IPv6, so the browser can fall back to an IPv6 lookup when the IPv4 address is unreachable. Put the definitions at the beginning of your /etc/hosts like this:

# Block Facebook IPv4
127.0.0.1   www.facebook.com
127.0.0.1   facebook.com

# Block Facebook IPv6
fe80::1%lo0 www.facebook.com
fe80::1%lo0 facebook.com

Option 2: Use DNSMasq

If the previous advice fails, you can install DNSMasq.
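
For example, a minimal dnsmasq configuration that answers these domains (and all their subdomains) with 127.0.0.1 could look like the sketch below; the file path assumes a Homebrew install and 8.8.8.8 is just an example upstream resolver, so adjust both for your setup, and point the system's DNS server at 127.0.0.1 afterwards:

# /usr/local/etc/dnsmasq.conf (path assumes Homebrew)
listen-address=127.0.0.1
address=/facebook.com/127.0.0.1
address=/9gag.com/127.0.0.1
address=/flabber.nl/127.0.0.1
# forward everything else to a real resolver
server=8.8.8.8

Unlike /etc/hosts, the address= form also covers every subdomain, so www.facebook.com doesn't need its own line.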

2

After any change to /etc/hosts run dscacheutil -flushcache at the command line to clear the local DNS cache. This works for me every time, with one exception: Firefox has its own DNS cache, so you'll have to restart it.
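
For reference, the exact incantation varies by OS X release; a combination that covers the common cases (on some versions the mDNSResponder daemon also needs a kick) is:

dscacheutil -flushcache
sudo killall -HUP mDNSResponder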

1

The OSX system does not use /etc/hosts for most of its network operations. For the most part, terminal/command-line commands (Unixy stuff) use /etc/hosts, while anything Mac-like uses internal plist-type tables held elsewhere.

The usage is inconsistent, which is problematic because it makes the OSX "unix" non-deterministic, as you've discovered.

I no longer have a Mac on which to find out exactly where Mac OSX stores its emulation of the hosts file, but hopefully this information will point you in the right direction.

I know it'll be in the /Library directory (and/or ~/Library), and the plist files are often stored in a binary format, so you can't just grep for things. The plutil command can convert and display the contents of .plist files. Perhaps start with:

find ~/Library /Library -iname "*host*" -ls

to see what's hiding in that morass of Windows-like complexity.
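
If the find turns up a candidate .plist, plutil can print it in readable form even when it is stored as a binary plist; for example (the path here is purely hypothetical):

plutil -p ~/Library/Preferences/com.example.hostoverrides.plist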

It's not exactly unix (netbsd)... but it's not exactly ... whatever else you might call it (GUI?). Even Windows is consistent. Perhaps wrong... but consistent.

  • This is incorrect: "The OSX system does not use /etc/hosts for most of its network operations". Every browser and CLI command I've ever used does a standard system DNS lookup, which puts /etc/hosts ahead of remote lookups. I've thoroughly tested this, as I use it for local web development.
    – Matt S
    Commented Aug 13, 2013 at 17:52
  • MattS: Then can you explain why, when I edit the hosts file, flush my cache, and run the lookup again, it still fails? Even using tabs, and making sure my entry is at the top of the file? So far my experience says that the OP's comment is entirely correct. Commented Feb 20, 2014 at 22:28
1

Just adding another bit of voodoo here. My hosts file entries on 10.8.2 were completely ignored by the system until:

  1. I moved my entries to the top of the file

  2. I used a single tab to separate the IP address and the host name

  3. I threw in a dscacheutil -flushcache just to be safe

I haven't dug deeply into why this happens; I'm just passing on the ritual that solved it for me (a combined sketch follows).
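
Putting the three steps together, and reusing the example domain from the question, the top of /etc/hosts and the follow-up command would look roughly like this (a single tab between the address and the name):

# top of /etc/hosts
127.0.0.1	facebook.com
127.0.0.1	www.facebook.com

dscacheutil -flushcache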

1

I found that on OS X 10.9, Safari and Firefox continued to access the blocked domains until I had also added IPv6 blocks to the /etc/hosts file. Only Chrome was affected by the IPv4 blocks.
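
For reference, a minimal sketch of pairing IPv4 and IPv6 loopback entries for one of the domains (::1 is the IPv6 loopback; the first answer uses fe80::1%lo0 instead, and either form is commonly suggested):

127.0.0.1   facebook.com
::1         facebook.com
127.0.0.1   www.facebook.com
::1         www.facebook.com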

