errors, on fixing

I read with interest this blog post over on Freedom to Tinker about the Google Book Search folks talking about finding and fixing errors in their giant catalog, especially metadata errors. The conversation seems to have largely started at this post on LanguageLog and gotten more interesting with follow-up comments from folks at Google. One of the things we have all learned in libraryland is that being able to trawl through our data with computers means we can find errors that might otherwise have stayed buried for years, or perhaps forever. Of course computers also help us create these errors in the first place.
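To make that "trawling" concrete, here is a minimal sketch, not anything Google or any library actually runs, of the kind of automated pass that turns up buried metadata errors: a scan over catalog records (the field names and records are made up for illustration) that flags obviously implausible publication years and missing fields.

```python
from datetime import date

# Hypothetical catalog records; real MARC or Google Books metadata is far richer.
records = [
    {"id": "b001", "title": "Middlemarch", "author": "George Eliot", "pub_year": 1871},
    {"id": "b002", "title": "Internet Routing Architectures", "author": "", "pub_year": 1899},
    {"id": "b003", "title": "A Novel", "author": "Anonymous", "pub_year": 2150},
]

def find_suspect_records(records, earliest=1450):
    """Flag records whose metadata is obviously implausible."""
    this_year = date.today().year
    problems = []
    for rec in records:
        year = rec.get("pub_year")
        if year is None or year < earliest or year > this_year:
            problems.append((rec["id"], f"implausible publication year: {year}"))
        if not rec.get("author"):
            problems.append((rec["id"], "missing author"))
    return problems

for rec_id, issue in find_suspect_records(records):
    print(rec_id, issue)
```

Nothing fancy, but even simple sanity checks like these surface errors at a scale no human cataloger flipping through cards ever could.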

What’s most interesting to me is a seeming difference in mindset between critics like Nunberg on the one hand and Google on the other. Nunberg thinks of Google’s metadata catalog as a fixed product that has some (unfortunately large) number of errors, whereas Google sees the catalog as a work in progress, subject to continual improvement. Even calling Google’s metadata a “catalog” seems to connote a level of completion and immutability that Google might not assert. An electronic “card catalog” can change every day — a good thing if the changes are strict improvements such as error fixes — in a way that a traditional card catalog wouldn’t.

Note: thanks to people who let me know that one link was wrong, and that I managed to typo both “computers” and “interesting” in this post.