[opendtv] HD Video -- Bad for consumers, Bad for Hollywood?

  • From: "Manfredi, Albert E" <albert.e.manfredi@xxxxxxxxxx>
  • To: <opendtv@xxxxxxxxxxxxx>
  • Date: Wed, 27 Sep 2006 17:34:12 -0400

Strange mix of ideas, IMO. I don't disagree that DRM is an unholy mess,
but to go from there to saying that HD video is a bad idea seems, uh,
ludicrous. It sounds like someone trying to say that we should have
stuck with 78 RPM records and 5 kHz, 30 dB dynamic range AM radio.

Contrary to what he says, the Brits and other Europeans are introducing
or have introduced HDTV already. In some cases, it's available only over
DBS, but it's there.

As to the spectrum grab stuff, that's the same misleading argument we've
heard for a decade or more. Broadcasters are going to be making do with
less spectrum, not more. I wonder why he doesn't make that clear. I
guess it sounds more dramatic to say that the broadcasters grabbed more
spectrum, and leave what should be obvious unsaid.

I find his big screen argument, towards the end, also to be iffy, if not
bogus. Big screens may be used as a large desktop, sure. As they are in
a computer. But big screens are also used as the alternative to movie
theaters, which people have largely abandoned. And for guys to impress
their friends with live sports. Not JUST for multiple different windows
at all.

Bert

--------------------------------------------
http://www.digitaltvdesignline.com/howto/showArticle.jhtml?articleId=193006131&pgno=1

September 26, 2006

HD Video -- Bad for consumers, Bad for Hollywood?

By Cory Doctorow

The high-definition screen has become a kind of Christmas tree,
overladen with ornaments hung by regulators, greedy entertainment execs,
would-be monopolists from the tech sector, broadcasters desperate to
hold onto their spectrum, and even video-game companies nostalgic for
the yesteryear of impervious boxes. The tree is toppling -- and it might
just take out a few industries when it crashes.

High def kicked off in the '80s, when Detroit was losing the car wars to
Japan and Motorola was figuring out that radio spectrum was pure gold,
if applied to mobile phones. Moto pointed out that the National
Association of Broadcasters' members were squatting on lots of spectrum
they'd been allocated, but hadn't lit up with TV signals. (Broadcasters
get their spectrum for free, and in exchange, we're supposed to get some
programming over those airwaves.) Motorola proposed to buy the idle
spectrum from the Federal Communications Commission, and use it to run a
phone business.

The NAB panicked -- there's nothing a corporate welfare bum hates more
than an end to its government handouts. So the broadcasters cast about
for an excuse, any excuse, to continue to hold onto our valuable radio
spectrum while doing nothing much with it. They found that excuse in
Japan, where high-definition sets were being met with popular and
critical acclaim. Japan -- having destroyed the American auto industry
-- was about to destroy American broadcasting with its devious high-def
sets, creating a high-def gap that America would struggle in vain to
bridge!

The nervy broadcasters asked the commission to leave all that fallow
spectrum intact, and furthermore, to allocate them even more spectrum,
so that they could broadcast HD signals alongside the analog ones.
Once enough Americans had bought high-def receivers, the FCC could
switch off the analog towers, return the spectrum to the American public
and then, then it could be sold to the likes of Moto for mobile
applications.

Incredibly, the commission swallowed this, and gave the broadcasters
even more spectrum. The broadcasters approach spectrum like a dragon
approaches gold: it is something to be hoarded and jealously guarded,
but they're not much interested in using it. So they took all that
high-def spectrum and built a nest of it, rested their ponderous, scaly
bellies on it, and never lit it up.

By the 2000s, Congress and the FCC were desperate to get that spectrum
in use. Representative Billy Tauzin (now a shill for the pharmaceutical
industry) offered to give Hollywood any law it wanted in order to entice
the studios to open their movies to broadcasters, which might, in turn, entice
broadcasters to light up those towers, which might entice Americans to
throw out their standard TVs. No, really! This is the kind of Rube
Goldberg strategy that they're chasing! In the U.K., by contrast, they
simply created a standard for "FreeView," a box that tunes in 30 free,
standard-definition digital TV channels and plays them on your old set,
giving Brits an unbeatable enticement to switch to digital: one box gets
you free cable for life and you don't have to throw out your TV.

If the studios had their druthers, they'd just encrypt high-def signals.
An encrypted signal needs a key to decrypt, and you can set up all kinds
of rules about when, how, and who can decrypt a show by building it into
the contract that comes with the key. But you can't encrypt over-the-air
TV: The broadcasters get the spectrum for free, and in exchange they
have to serve us. It wouldn't do to let them lock us out of the programs
aired on our airwaves.
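
In sketch form, the leverage that encryption buys works something like
this (the names and the toy cipher below are purely illustrative, not any
real conditional-access system): the programme is useless without the
key, and the key only ships to manufacturers who sign the contract.

    from itertools import cycle

    def xor_cipher(data, key):
        # Toy cipher for illustration only; real systems use far stronger ciphers.
        return bytes(b ^ k for b, k in zip(data, cycle(key)))

    PROGRAM = xor_cipher(b"tonight's HD broadcast", b"secret")  # the "encrypted" show

    def play(device_is_licensed):
        # The key arrives only with the license contract.
        key = b"secret" if device_is_licensed else None
        return xor_cipher(PROGRAM, key) if key else "no key, no picture"

    print(play(True))    # b"tonight's HD broadcast"
    print(play(False))   # no key, no picture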

The Broadcast Flag is the law the studios came up with to square this
circle. They proposed a Soviet-style planned economy (Fox president Andy
Setos, who wrote the Broadcast Flag draft, referred to it as a
"well-mannered marketplace") where all TV receivers would have to be
built to honor the rules set down by the entertainment industry. The
studios would get a veto over any feature that threatened their existing
business model, and anyone who wanted to interface with a TV receiver
would have to agree to play by Hollywood's rules. Even video cards, hard
drives, and motherboards would fall under this rule.

The Broadcast Flag was adopted by the FCC, and then was struck down by a
D.C. court that told the commission its jurisdiction stopped at the
broadcasting tower, and didn't extend to your living room. But the
studios and the broadcasters continue to advance their plans for a
high-def universe, and they continue to use HD as a Trojan horse for
smuggling in mandates over the design of commodity electronics.

The first line of this is high-def media players, particularly games and
the competing DVD specifications (to call them "standards" is an insult
to honest standards), Blu-ray and HD-DVD. These systems output a
high-definition picture only on their digital outputs, and those outputs are
encrypted. To decrypt them on your TV, you need to get permission from
the entertainment industry, and to get permission, you have to make a
bunch of wicked promises.

For example, you have to promise to honor region codes, those nuisances
that try to restrict what country you can watch your lawfully purchased
movies in. That's not about copyright: Copyright doesn't let an author
tell you what country you can take his books to, nor a director where
you can watch his movies. It's just an arbitrary business model that the
studios can impose with the force of law, just by scrambling their
movies and making permission to descramble contingent on a
manufacturer's treating their business model as though it were law.
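
A hypothetical sketch of what that license term boils down to inside a
player (the region numbers here are invented for illustration):

    REGION_FREE = 0

    def may_play(disc_region, player_region):
        # The refusal below is a license condition, not anything copyright requires.
        return disc_region in (REGION_FREE, player_region)

    print(may_play(disc_region=1, player_region=2))  # False: a lawfully bought disc, refused anyway
    print(may_play(disc_region=0, player_region=2))  # True: region-free discs still play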

The new HD technologies include anti-user nasties like "renewability" --
the ability to remotely disable some or all of the device's features
without your permission. If someone, somewhere, figures out how to use
your DVD burner to make copies of Hollywood movies, they can switch off
*everyone's* burner, punishing a limitless number of innocents to get at
a few guilty parties.
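
In rough, hypothetical form (not any vendor's actual scheme),
renewability amounts to a blacklist that ships with new content and that
the player checks its own model key against:

    revocation_list = {"burner-model-4711"}  # assumed to arrive with a new disc or broadcast

    def burner_allowed(model_key_id):
        # One leaked key and every unit built with it goes dark, guilty or not.
        return model_key_id not in revocation_list

    print(burner_allowed("burner-model-4711"))  # False for every owner of that model
    print(burner_allowed("burner-model-9000"))  # True, until its key leaks too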

The HD DRM systems also include gems like "selectable output control" --
wherein some programs will refuse to be played on some devices. As you
flip up and down the dial, parts of your home theater will go dark.
Creepier still is "authorized domain" -- the ability to flag content so
that it can only be played within a "household," where the studios get
to define what is and isn't a valid living arrangement.
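
A minimal, illustrative sketch of selectable output control (the output
names here are invented): each programme carries a list of approved
outputs, and the receiver blanks everything else.

    def route(programme_allowed_outputs, my_output):
        # Flip to a restricted programme and any unapproved output simply goes dark.
        return "picture" if my_output in programme_allowed_outputs else "blank screen"

    print(route({"hdcp_protected_hdmi"}, "analog_component"))     # blank screen on older gear
    print(route({"hdcp_protected_hdmi"}, "hdcp_protected_hdmi"))  # picture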

On top of these restrictions are the punishing "robustness" regimes that
come with HD DRM systems. These are the rules manufacturers have to
follow to ensure that the anti-user stuff in their devices isn't
compromised. It's a requirement to add expensive armor to products,
armor that stops a device's owner from opening up her device to see
what's inside and making changes. That's bad news for open source, of course, since open
source is all about being able to look at, modify, and republish the
code that runs a device.

But even if you don't care about open source, the cash and utility cost
of compliance is a hardship. Sony's HD version of the PlayStation costs
a whopping $100 more than the non-HD version, and Sony's first
generation of Blu-ray DVD drives *won't play Blu-ray movies* because
they can't get sufficient anti-owner countermeasures into the box.
Microsoft's 32-bit version of Vista won't do HD, either.

Most extraordinary is the relationship of HD DRM to the world's largest
supply of HD screens: LCD computer monitors. The vast majority of
HD-ready, 1080i-capable screens in the world are cheapo computer LCDs.
Chances are you've got a couple at home right now.

But unless these screens are built with crippleware HDMI or DVI
interfaces, they won't be able to receive high-def signals. DRM
standards call these "legacy" screens and treat them as second-class
citizens.

All this may be enough to scuttle HD's future. Let's hope so, for
Hollywood's sake.

Because, you see, HD is also poison for the entertainment industry's own
products. The higher the resolution, the harder it is to make the
picture look good. Standard-def programs on high-def screens look like
over-compressed YouTube videos, and when you get a high-def program shot
by traditional directors, it looks even worse, every flaw thrown into
gaudy relief. Have a look at the HD-native episodes of Friends some day
-- it's all gaping pores, running pancake makeup, caked-on hairspray,
and freakishly thin bodies with giant, tottering heads.

It's even worse when it comes to computer-generated imagery, that staple
of big-budget blockbusters. Computer graphics have a painfully short
adolescence, a period of months during which an animation sequence looks
impressive. From there, it's a fast, one-way slide into liver-spotted
senescence, in which the artifice of the computer becomes a jumble of
last year's polygons. When this year's Coke commercials have slicker
graphics than last year's $200 million extruded sci-fi product, the last
thing you want to do is show it on a giant, high-res screen.

The natural life cycle of computer-aided movies in an era of Moore's Law
is a migration to a series of successively smaller, lower-resolution screens. As
Geek Entertainment TV's Irina Slutsky says, "An iPod screen is my
personal airbrush." Some movies are already dated by the time they hit
the big screen -- take Polar Express, which looked so creepy that I
almost mistook it for a horror film when I saw it on a megascreen on
opening weekend. The next Christmas, I caught it on an old 12" TV at a
friend's cottage. It looked terrific -- I nearly forgot that I was
seeing pixels, not people.

There are some directors who get HD, to be sure. Mark Cuban's HDNet
features a pretty good selection of nice-looking big-screen pictures.
Cuban's one of the few entrepreneurs making content intended for a long
life in HD, and not coincidentally, he's a staunch opponent of HD DRM
systems. You can also get a nice HD experience by picking up a classic
film from a master filmmaker -- DigitalLifeTV's Patrick Norton is a fan
of Goodfellas in HD.

But for every Mark Cuban and Martin Scorsese, there are a thousand
people making programs that look better at standard-def or even smaller
-- shows that play well in your pocket but whose props and actors look
like cardboard at 100 inches.

That shouldn't surprise us, really: computer users have had giant
displays for a decade, and we don't use them to show one gigantic
window! Give a computer user a 30" flat-panel and she'll load it up with
25 windows -- some video, some text, some interactive. Blowing all that
screen real estate on a single picture is a waste.

Hollywood has fallen into the "horseless carriage" trap: A big screen is
like a small screen, but bigger. A personal computer is like a
mainframe, but on your desk. In reality, the big living room screen is a
different thing altogether. For years, the electronic hearth has been
losing currency and relevance to households that would no sooner all
watch the same screen than wear the same underwear.

The big screen is not a big TV -- big screens are multiwindow
workspaces. There's an opportunity there, a chance to bring the family
back together in front of the same set. Who's to say that all 100 inches
of the living room set should show one football game? Why not give both
kids their own spaces to play their own consoles, in corners of the
screen, give Mom and Dad their own areas to watch, throw up a browser
and some RSS-fed image and text-crawls?

A big screen for big pictures might have sounded good in the '80s, when
the FCC was signing over the nation's priceless spectrum to the NAB. But
lots of things sounded like a good idea in the eighties: multimedia
CD-ROMs, ISDNs, and RISC architecture. It's 2006: We know better now.

Cory Doctorow is co-editor of the Boing Boing blog, as well as a
journalist, Internet activist, and science fiction writer.

All material on this site Copyright 2006 CMP Media LLC. All rights
reserved
 
 
----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at 
FreeLists.org 

- By sending a message to: opendtv-request@xxxxxxxxxxxxx with the word 
unsubscribe in the subject line.
