One of the odd things about running a web site is that even with a small site like this I end up paying a lot of money for services to people I’ve never met in person and who live very far from me. My previous web host was based in Tennessee, while the current one is located in Connecticut. Occasionally I also paid a very helpful programmer from England to help me install and configure some CGI scripts on the old site.
As a small customer in this kind of environment, you really need friendly customer service on the other end of the arrangement. I’m pretty good at breaking things, and it’s important to me that when I think I’ve found a bug or a problem, somebody takes a second to ask “is he on to something?”
Which is why Dave Winer’s behavior continues to mystify me. The software this site uses runs on top of one of Winer’s products, Frontier, but the folks at Macrobyte deal with buying licenses, installation, etc., so I have no business relationship with Winer. And thank goodness for that.
Andrea and other users of EditThisPage.Com noticed that Google was no longer indexing their pages, and started posting rather mild complaints on their site. Userland (Winer’s company) didn’t know what was going on, but Seth Dillingham wrote a script that tried to access Andrea’s site posing as Google’s indexer, and lo and behold he got back a message saying,
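The technique Seth used, as described, amounts to sending an ordinary HTTP request while identifying as a search engine’s crawler, which any server can distinguish only by the User-Agent header. A minimal sketch of that idea in Python (a modern convenience for illustration; the URL and User-Agent string here are stand-ins, not the values Seth actually used):

```python
# Sketch: request a page while claiming to be a search-engine crawler,
# so the server responds as it would to the real indexer.
import urllib.request

def fetch_as(url, user_agent):
    """Fetch `url` while sending `user_agent`; return status and body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.status, resp.read()

# Hypothetical crawler identity string:
crawler_ua = "Googlebot/2.X (+http://www.googlebot.com/bot.html)"

# Build the request without sending it, to show what the server sees:
req = urllib.request.Request("http://example.com/",
                             headers={"User-Agent": crawler_ua})
print(req.get_header("User-agent"))  # the header the server would inspect
```

Comparing the response to such a request against the response to a normal browser request is enough to reveal a server that treats crawlers differently, which is exactly what Seth found.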
Inktomi, your crawler is repeatedly hitting our servers getting the same WAP files over and over. Please stop pounding us, it’s hurting the service we provide to our customers. Thanks. email@example.com.
Now if you read Scripting.Com more than casually, you know that Winer has complained about the behavior of indexing robots before, and for good reason. Some of the indexers do things that can quickly degrade web server performance. For example, on a few of my sites I have event calendars. If I forget to add the event calendar to my robots.txt file, a number of indexers will attempt to request the calendar page for every day from 1900 through nearly the end of this century. And they will request those files very rapidly, so the server goes from maybe 500 to 600 requests an hour to 5,000 to 6,000 requests within 10 minutes. Now imagine a robot doing that at several hundred sites at EditThisPage.Com.
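Fencing off a calendar like the ones above takes only a one-line exclusion in robots.txt. A minimal sketch (the /calendar path is illustrative, not the actual path on my sites):

```
# Tell all well-behaved crawlers to skip the event calendar pages
User-agent: *
Disallow: /calendar
```

Of course, this only works for indexers that actually honor robots.txt; the badly behaved ones are precisely the problem.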
So it was perfectly reasonable for Seth to point out that Userland was blocking Google,
In fact, Userland had intentionally turned off Google indexing a few weeks ago.
Userland didn’t like the load that Google’s indexer was putting on their servers, so they prevented Google from indexing any of their sites.
At first, Winer said that there was no way that Userland was blocking Google because one of Winer’s programmers said they weren’t (a common technical support answer — “what you’re describing is impossible”). Later, Winer conceded that Userland was blocking Google but put it down to a mistake or a bug or “whatever.”
But my point here about customer service is that rather than concentrate on what seems to me to have been the primary issue, whether Seth’s report of being blocked while masquerading as the Google indexer could be reproduced, Winer latches onto Seth’s conjecture that Userland is blocking Google because of the strain it puts on their servers and runs with that.
This is typical of how Winer repeatedly gets himself in trouble. He always ends up focusing on perceived or real personal slights while ignoring the real substance of the problem being raised.
The amusing thing is that Winer was complaining,
That Doc’s site has been de-indexed by Google completely punctures Seth’s theory, which unfortunately he didn’t state as a theory, and so now it’s become part of the folklore that UserLand is nasty or whatever, I’m already getting flames thanks to Seth’s post here.
Seth was completely right, but, thanks to Winer’s comments, Doc Searls went ahead and blasted Google on his site for their “censorship” and said Google’s actions prove that the web needs to move more toward Winer’s vision (I can’t wait to read the WinerLog take on that!!!)
And while I’m at it, two quick comments on Doc’s advocacy of Dave’s approach.
First, Winer seems to be moving to a model of having people use Radio Userland to serve web sites from the desktop. Maybe people who are comfortable with network security issues won’t have any problems with this, but the last thing I would want to do would be to add a web server to my desktop.
Second, Searls and Winer both conclude from this incident that, as Winer puts it, “the Internet doesn’t work,” with Searls saying the Internet needs a genuine directory infrastructure rather than a Google-style search engine. I disagree. In fact the Internet, and especially Google, works a lot better than I ever imagined it would. Remember when there were all those stories about how the sheer size of the web would soon break popular search engines such as Alta Vista? Well, technically, of course, it did, but then along came Google, which decided to invest its cash in developing excellent search engine technology rather than throw away millions on buying a single domain name.
I continue to be amazed at how quickly I can find even the most obscure of information on Google. Which isn’t to say that there isn’t room for improvement or even new and better standards for searching and/or creating directories. But, still, Google does an excellent job of making Internet searching work well.