A small item in the Ziff-Davis magazine E-Week says that book publishers are
going all out to find encryption techniques to protect e-books. I wish them
good luck, but they are probably wasting their time.
The bottom line is that anything that needs to be decrypted regularly by consumers
is simply not protectable; it is only a matter of time before a bootleg version
appears. The best that encryption can do is slow down the pirates, which probably
deters piracy somewhat but also breaks the product for legitimate users.
This is the status quo today with computer games. I love playing The
Sims, for example, but for copy-protection reasons it requires that
the game CD be in the drive whenever the game is being played. Every time I want
to play The Sims I have to go digging through stacks of CDs. Of course, it only
took 10 minutes online to find sites with complete instructions on how to bypass
the protection and run the game entirely from the hard drive. The only thing
Maxis really accomplished was to inconvenience me, a paying customer.
The same thing goes for books. I own several thousand books, and the best of
both worlds would be to package an electronic form of the book along with the
physical form (put a CD, for example, in the back of every book). A CD with a
PDF, HTML, or similar version of the book would dramatically increase its
usefulness. The primary defects of printed books are that most indexes are
poor and that there is no easy way to quickly find related passages across
multiple books, a problem computers are uniquely suited to solve.
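To see why computers are so well suited to this, consider that once books exist as plain text, searching an entire shelf for a phrase takes only a few lines of code. A minimal sketch in Python (the directory layout, file naming, and function name here are my own illustration, not anything the publishers offer):

```python
from pathlib import Path

def find_passages(library_dir, phrase):
    """Scan every plain-text book in library_dir and return
    (filename, line number, matching line) for each hit."""
    hits = []
    for book in sorted(Path(library_dir).glob("*.txt")):
        text = book.read_text(encoding="utf-8")
        for lineno, line in enumerate(text.splitlines(), 1):
            # Case-insensitive substring match keeps the example simple;
            # a real tool would add stemming, ranking, and an index.
            if phrase.lower() in line.lower():
                hits.append((book.name, lineno, line.strip()))
    return hits
```

Something this crude already beats flipping through the back-of-book indexes of a dozen physical volumes, which is exactly the value publishers leave on the table by locking up the electronic form.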
Instead, book publishers want me to spend far too much money for an electronic
version of the book that is deliberately difficult to integrate with other
e-books, all in the name of copy-protection schemes that will be broken and
featured on Slashdot a couple of months after they are released.
This would be akin to publishers deciding to print all their books on some
weird purple-colored paper to prevent photocopying. (Game publisher TSR actually
used to print materials in blue text specifically to deter kids from photocopying
them, back when copiers had trouble reproducing blue.) In fact, with the speed
of today's computers and the accuracy of newer OCR software, if I really want
an electronic copy I can get one fairly quickly just by scanning the book into,
say, a PDF file and telling Adobe Acrobat to perform its text-recognition magic.
The long-term problem for publishers is that the marginal value of their product
is about to decline dramatically. There will always be demand for the superstars
in every area of writing, but for a lot of books people care more about the
topic or genre than about any specific writer. For example, before the Internet,
if I wanted a wide range of statistics on hand for immediate reference, I would
have had to pay a couple hundred bucks for some heavy books. Today I can get
the same information for free from any number of web sites, and the same shift
is happening to almost all information.
Not that information wants to be free in the economic sense. There are plenty
of viable economic models for information in the Internet age, but they are more
diversified and individualized, and they are starting to shift leverage away from
large publishers toward small groups and individuals. The publishers are right to
fear the free flow of their products, but they are trying to stop a train that
left the station about 10 years ago.