Sunday, March 11, 2007

Digitizing Efforts Continue

There was a press release from Simon & Schuster on Wednesday. The publisher, a subsidiary of CBS, has hired Innodata Isogen to digitize its backlist: "Innodata Isogen will provide a full suite of digitization and conversion services for the publisher's backlist titles."

This means S&S joins Random House and HarperMorrow in digitizing its stock.

The UK's Guardian ran an article on Saturday with an update on Google's initiative to scan every book ever published. The company predicts it can accomplish this massive task within ten years.

There's little doubt about Google's goal, though. "We are talking about a universal digital library," Dan Clancy, the former NASA scientist behind Google's book-scanning technology, told the New Yorker. "I hope this world evolves so there exists a time where somebody sitting at a terminal can access all the world's information."

The publishing industry remains suspicious of Google's motives even as it races to digitize its own inventory. It seems likely that a deal will eventually be struck letting publishers take advantage of Google's search capabilities while choosing which excerpts of their books appear in search results.

The article included some interesting statistics:

In 1450, new titles were published at a rate of 100 per year. By 1950, that figure had grown to 250,000. By the millennium, the annual total exceeded a million.

It's estimated that of all the books ever published, more than 95% are out of print.

The Library of Congress in Washington, DC, is the largest library in the world, with about 29m books among its 130m items; the British Library has about 13m catalogued books.
