Category Archives: Consultants

23 degrees…half the light. 3D What?

Silver Screen Light Failure Point. 3D Luminance Issues: Photopic, barely. Mesopic, often. Scotopic? Who knows…?

We don’t mean to pick on the good people at Stewart Film Screens by making an example of their Silver Screen light rolloff curve. They just happen to grace us with the most usable graphic description of what is happening to our light. Looking at the Harkness Screens data sheet for their Spectral 240 3D screens is no better, and may be worse.

We know the problems of getting light to the eyes with any of the available 3D systems. The initial filter eats up to 50% of the light from the projector, each eye is then turned off 50% of the time, and the darkness of the glasses steals still more. If the projectors could produce enough light to overcome all these transmission losses…which they generally can’t…it would just mean more burnt, expensive bulbs and higher electricity costs.
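To put those losses in perspective, here is a minimal back-of-the-envelope sketch in Python. The individual transmission figures are illustrative assumptions pulled from the rough percentages above, not measured values for any particular system:

# Rough 3D light-budget sketch: each stage passes along only a fraction
# of the light that reaches it. Percentages are illustrative assumptions.
stages = [
    ("3D filter at the projector",   0.50),  # "eats up to 50%"
    ("time-sequential eye blanking", 0.50),  # each eye is dark half the time
    ("glasses transmission",         0.70),  # assumed; varies by system
]

light = 1.0
for name, transmission in stages:
    light *= transmission
    print(f"after {name:30s}: {light * 100:5.1f}% of the projector's light remains")

Run with those assumed numbers, well under 20% of the projector’s light ever reaches the eye.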

But even if the exhibitor cranks the light as high as possible, and tweaks the room to get the best RGB balance at the best seats of the house, an auditorium using a ‘silver’ screen to maintain the polarization of the RealD or MasterImage system leaves the patron sitting 23 degrees off the center axis with half the light available. Put another way, as you can see from the full picture at the Stewart site, 3 seats away from center is a totally different picture…as is the 4th and 5th, etc.; the situation just gets worse.

If the cinema had achieved 5 foot-lamberts (17 candela/m2) behind the glasses (most don’t get 3 ftL – 10 cd/m2), then 3 seats off center will be about 2.5 ftL (8.5 cd/m2). At this point, bright reds have all turned to brick red or darker, and blues become relatively dominant – it isn’t that there are fewer yellows or greens in the picture – it is that the eye becomes better able to discern the blue in the mix. (Another way to describe what is known as the Purkinje shift is that an object which appears greenish-yellow in brighter light will appear greenish-blue as the intensity of the light falls below about 10 candela/m2.) Combine that with stray light from a few EXIT signs, which not only messes with the contrast but also puts non-symmetrical data into the normally “practically” symmetrical 3D mix, add some reflections off the back of the eyeglasses, and patrons should not wonder why they don’t universally have an enjoyable experience.
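For readers who want to check the arithmetic, here is a small Python sketch. The 3.426 cd/m2 per foot-lambert conversion is standard; the falloff curve that halves brightness every 23 degrees off-axis is a hypothetical fit to the single data point described above, not the manufacturer’s published gain curve:

# Foot-lamberts to candela per square meter, plus a crude off-axis estimate.
FTL_TO_CDM2 = 3.426  # 1 ftL = 3.426 cd/m2

def ftl_to_cdm2(ftl):
    return ftl * FTL_TO_CDM2

def off_axis_ftl(on_axis_ftl, degrees, half_angle=23.0):
    # Hypothetical falloff: brightness halves every half_angle degrees,
    # fitted to the "half the light at 23 degrees" observation above.
    return on_axis_ftl * 0.5 ** (degrees / half_angle)

for on_axis in (5.0, 3.0):
    off = off_axis_ftl(on_axis, 23)
    print(f"{on_axis} ftL ({ftl_to_cdm2(on_axis):.1f} cd/m2) on-axis -> "
          f"{off:.2f} ftL ({ftl_to_cdm2(off):.1f} cd/m2) at 23 degrees off-axis")

Either way, the off-axis numbers fall below the roughly 10 cd/m2 point where the Purkinje shift starts to matter.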

We won’t beat this into a pulp since most real-life scenarios just get worse.

What will make it better?

Consumer education, to begin with, which is the real excuse for this article. Patrons must know what to insist upon.

Projectors can’t generate enough light to get 3D up to the 14 ftL (48 candela/m2) that 2D movies are shown at. But the new Series II projectors can do ‘more’, and industry tests show that ‘more’ is better, especially if the original was ‘mastered’ to be shown at ‘more’. James Cameron was prepared to ship theaters a ‘print’ of Avatar mastered at hotter levels for cinemas that asked for it…up to 10 ftL! Patrons must insist that if they are paying more for the experience, they should get better…perhaps 10 ftL is not going to be the standard this year, but 7 or 8? Grass-roots effort, anyone? The studios set the intention in the DCI spec at 14, so one would think that they will come to the plate with ‘more’ if asked. [DCI Specification 1.2; page 48…and tell them that you want an order of Uniformity and some of that ±4 Delta E while they’re at it.]

The future also holds at least two potential ways to give a better picture. Brian Claypool at Christie points out that one of the features of the Series II projector is “more native support for faster frame rates.” Many people in the creative community believe that higher frame rates will do more for image quality than more pixel resolution. Again, Brian Claypool: “Do you remember how rich every frame was in Avatar, that your eye just kept wanting to look around? Well, imagine having 2 times as many frames for your eyes to follow… it will feel like looking out a window on another reality.”

{youtube}WgrdKmgniGI{/youtube}

The other, also long-term, change is replacing bulbs with lasers in the projectors. Good news on that front was announced by one player, Laser Light Engines. We deconstructed their newest announcement and some of their potential at: Laser Light Engines gets IMAX funding—Putting Light on the Subject

Some mark this as digital cinema’s 11th year, but it wasn’t until 6 years ago that 2K was delivered, an example of the evolution of this industry. 

Links: Luminance Conversion Table

Scotopic Issues with 3D, and Silver Screens

Knotting Laser Light

Dager’s Reinventing Cinema: DCinema’s First Decade

To be sure, there were serious efforts prior to 1999. JVC, with their D-ILA technology, can make a legitimate claim to the first digital cinema demonstration. On March 19, 1998, they collaborated on a digital presentation at a cinema in London. Another early effort was the movie The Last Broadcast, which may have made cinematic history on October 23, 1998 when it became the first feature to be theatrically released digitally, via satellite download, to theatres across the United States. Wavelength Releasing, Texas Instruments, Digital Projection and Loral Space headed that effort. In 1999, it was repeated across Europe using QuVIS technology and The Last Broadcast became the first feature to be screened digitally at the Cannes Film Festival. In 2000, Disney, Texas Instruments and Technicolor worked with several U.S. and international exhibitors to deploy prototype digital cinema systems in commercial theatres. Technicolor assembled and installed the systems using the TI Mark V prototype projector, a special Christie lamp house and QuVIS’s QuBit server with custom-designed automation interfaces.

But the Phantom Menace digital screenings generated widespread visibility and publicity, and developments began to occur on a more regular basis. The Society of Motion Picture and Television Engineers began work on standards for digital cinema in 2001. Digital Cinema Initiatives (DCI) was formed in March 2002 as a joint effort by Disney, Fox, MGM, Paramount, Sony, Universal and Warner Bros. The serious technical groundwork was being laid. The rest, as the cliché goes, is history.

The challenge? To literally rethink, retool and reinvent, from the ground up, a global industry that had worked successfully for a century. Read that sentence again to get a sense of how overwhelming – and some would, and did, say unnecessary – that task would be and you may gain a greater appreciation for how much was actually accomplished in a decade.

The State of Digital Cinema – April 2010 – Part Zero

What they came up with is called the tri-stimulus system since the primary idea is that there are nerve endings in the eye which act as receptors, some of which primarily deal with green light, some with red and some with blue. These color receptors are called the cones (which don’t work at all in low light), while the receptors that can deal with low levels of light are called the rods.

Now, for the first of our amazing set of numbers, there are as many as 125 million receptors in the eye, of which only 6 or 7 million deal with color. When (predominantly) only one type of these receptors gets triggered, it will send a signal to the brain and the brain will designate the appropriate color. If two or more of these receptors are triggered, then the brain will do the work of combining them much the same way that a painter mixes water colors. (We’ll pretend it is that simple.)

OK; so how do you create a representation of all that color and detail on the TV or movie screen?

Let’s start with film. We think of it as one piece of plastic, but in reality it is several layers, each of which carries a different dye with a different sensitivity. Each dye reacts in a different and predictable manner when exposed to light through the camera lens. In the lab, each layer goes through a different chemical process to ‘develop’ a representation of what it captured when exposed by the camera system. There are a lot of steps in between, but eventually the film is exposed to light again, this time pushing light in the opposite direction, through the film and then through the lens. That light gets colored by the film and shows up on the screen.

One of the qualities of film is that its chemical and gel nature makes the range of colors in the image appear to be seamless. And not just ‘appear’ in the sense of “gives the impression of.” In fact, there is a great deal of resolution in modern film.

Then TV came along. We see a smooth piece of glass, but if we could touch the other side of a 1995-era TV set we would feel a dust that reacts to a strong beam of electricity. If we look really closely we will see that there are actually different colored dots, again green, red, and blue. Engineers figured out how to control that electric beam with magnets, which could trigger the different dots of color to make them light up separately or together to combine into a range of colors, and eventually combine those colors into pictures.
That was great, except people wanted better. Technology evolved to give them that. Instead of lighting up magic dust with a strong beam of electricity, a couple of methods were discovered that allowed small colored capsules of gas to be lit up, and even small pieces of colored plastic to light up. These segments and pieces could be packed tightly against each other so that they could make the pictures. Instead of only hundreds of lines being lit up by the electron gun in the old TV set, now over a thousand lines can be lit up, at higher speeds, using a lot less electricity.

Then a couple of engineers figured out how to make and control a very tiny mirror to reflect light, then quickly move it so it doesn’t reflect light. That mirror is less than 25% of the width of a typical human hair.

Millions of these mirrors can be placed next to each other on a chip less than 2 centimeters square. Each mirror is able to flip precisely on or off at a rate of 144 times a second, which is 6 times the rate at which a motion picture film frame is exposed to light.

This chip is the heart of what is called DLP, Digital Light Processing, because a computer can tell each mirror when to turn on and off, so that when a strong light is reflected off an individual mirror or set of mirrors, it creates part of a picture. If you put a computer in charge of 3 chips, one for green, one for red and one for blue, the reflected light can be focused through a lens and a very detailed picture will appear on the screen. Sony has refined a different but similar technology for its professional cinema projectors, which uses liquid crystals that change their state.

Now for the 2nd in our amazing set of numbers. There are 1,080 rows made up of 2,048 individual mirrors each, for over 2.2 million mirrors per chip. If you multiply that by 3 chips’ worth of mirrors, you get the same “about 6 or 7 million” mirrors as there are cones in each eye.
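The arithmetic behind those numbers, as a quick sketch:

# Mirror-count arithmetic for a 2K chip.
rows, mirrors_per_row = 1080, 2048
per_chip = rows * mirrors_per_row
print(f"per chip: {per_chip:,} mirrors")            # 2,211,840
print(f"three chips (R, G, B): {3 * per_chip:,}")   # 6,635,520 -- "about 6 or 7 million"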

Without going into details (to keep this simple), we keep getting closer to being able to duplicate the range and intensity of colors that you see in the sky. This is one of the artists’ goals, in the same way that the engineers want to make a lighter, flatter, environmentally better television and movie playing system. It isn’t perfect, but picture quality has reached the point that incremental changes will be more subtle than substantive, or better only in larger rooms or specialist applications.

For example, a movie that uses the 2K standard will typically be in the 300 gigabyte range. A movie made in 4K, which technically has 4 times the resolution, will typically be less than 15% larger. This movie will be stored on a computer with many redundant drives, redundant power supplies, and graphics cards that are expressly made to work securely with special “digital cinema only” projectors.

Hopefully you now have a feeling for the basic technology. It is not just being pushed onto people because it is the newest thing. The TV and movie businesses are going digital for a number of good reasons. To begin with, it wasn’t really possible to advance the quality of the older technology without increasing the cost by a significant amount…and even then it would be incredibly cumbersome and remain an environmental nightmare. The new technology also offers flexibility that the old couldn’t…or couldn’t at a reasonable price, or at the quality of the new.

The technology of presenting a 3D image is one of those flexibility points. 3D was certainly one of the thrills of Avatar. The director worked for a decade learning how to handle the artistic and technical sides of the art. With closely aligned partners he developed many different pieces of equipment, and ways of using existing equipment, to do things that hadn’t been done before. And finally he spent hours on details that other budgets and people would give only minutes. In the end James Cameron developed a technique and technology set that won’t be seen as normal for a long time to come…and an outstanding movie.

Could Avatar have been made on film? Well, almost no major motion picture has been made exclusively on film for a long time. They all use a technique named CGI (computer-generated imagery), which covers a grand set of techniques. But if you tried to generate the characters in Avatar exclusively on a computer with CGI, they never would have come out as detailed and inspiring as they did. Likewise, if he had tried to create the characters with masks and other live-action techniques, you wouldn’t get the texture and feeling that the actors gave to their parts.

Could Avatar have been displayed with film, in 2D? Yes, it could have been, and it was.

3D is dealt with in more detail in Part II of this series, but here are some basics:

To begin, 3D is a misnomer. True 3 dimensions presume the ability to walk around a subject and see a full surround view, like the hologram of Princess Leia.

In real life a person who is partly hidden in one view will be even more hidden, or perhaps exposed, from another view. On the screen of today’s 3D movie, when a character appears to be partly hidden by a wall as seen by a person on the left side of the theater, they will appear hidden by the same amount to someone on the right side of the theater.

In fact, what we see with our eyes and what we see in the new theaters is correctly termed “stereoscopic”. We are taught some of this in school: how to make two lines join somewhere out in space (parallax) and draw all the boxes on those lines to make them appear to recede into the distance…even though they are on one piece of paper. There are several more clues in addition to parallax that we use to discern whether something is closer or farther, and whether something is just a drawing on a sheet of paper or a fully rounded person or sharp-edged box…even in a 2D picture.

And we have been doing this for years. We know that Bogie and Bergman are in front of the plane that apparently sits in the distance…our eyes/brain/mind makes up a story for us, 3 dimensions and probably more, even though it is a black and white set of pictures shown at 24 frames per second on a flat screen.

Digital 3D is an imperfect feature as of now. It has improved enough that companies are investing a lot of money to make and show the movies. The technology will be improved as the artists learn the technology and what the audiences appreciate.

Although we are in a phase that seems like “All 3D, All The Time”, 3D isn’t the most important part of the digital cinema transition. At first blush the most important consideration is the savings from all the parts of movie distribution, including lower print costs and transportation costs. But actually, because prints no longer cost over a thousand euros, and because it will be simple to distribute a digital file, lesser known artists will have the opportunity to get their work in front of more people, and more people will find it easier to enjoy entertainment from other cultures and other parts of the world.

This Series now includes:
The State of Digital Cinema – April 2010 – Part 0
The State of Digital Cinema – April 2010 – Part I
The State of Digital Cinema – April 2010 – Part II
Ebert FUDs 3D and Digital Cinema

The State of Digital Cinema – April 2010 Part Two

SMPTE refined the work that the studios sponsored and summed up in a series of compliance documents (See: DCI Movies), done in the spirit of, “This is the minimum that we require if you want to play our movies.” As the saying goes, “Standards are great! That’s why there are so many of them.” And as an executive stated, “We can compete at the box office, but if we cooperate on standards, it benefits everyone.”

In fact, the cinema standard that is known as 2K is beyond good enough, especially now that the artists in the post-production chain have become more familiar with how to handle the technology at different stages. Most people in the world don’t get to see a first-run print anyway, and a digital print (which doesn’t degrade) compares more than favorably with any film print after a few days. Plastic that is constantly brought to its melting point becomes an electrostatic dust trap, stretches and gets scratched, and the dyes desaturate.

To date, most digital projectors are based upon a Texas Instruments (TI) chip set. Sony’s projector is based upon a different technology, and has always been 4K (4 times the resolution of 2K), but not many movies have been shipped to that standard yet. The TI OEMs will be shipping 4K equipment by the end of the year (or early next year). Except in the largest of cinemas, most people won’t be able to tell the difference between 2K and 4K, but the standard was built wide enough to accommodate both.

Confusing the consumer, 2K in pixels (2,048 picture elements in each line) seems near enough to the 1920×1080 standard of TV known as 1080p. But there are other differences in the specification besides pixel count, such as the color sample rate, that are more important. In addition, many steps of the broadcast chain degrade the potential signal quality, so that hi-def broadcast is subject to the whims of how many channels are being simultaneously broadcast, and what is happening on those channels. (For example, if a movie is playing at the same time as 15 cooking channels, it will have no problem dynamically grabbing the extra bandwidth needed to show an explosion with a lot of motion. But if several movies all dynamically require more bandwidth simultaneously, the transmission equipment is going to have to shortchange some of them in preference to others, or diminish them all.) Blu-ray will solve some of that, depending on how much other material is put on the disc with the movie. Consumers like the “other stuff”, plus multiple audio versions. Studios figure that only a relative handful of aficionados optimize their delivery chain enough to be able to tell the difference. So they end up balancing away from the finest possible quality for the home, while the finest quality is maintained for the cinema by virtue of the standards.
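A quick sketch of the numbers being compared; the bit-depth and chroma-sampling figures are typical values rather than numbers taken from any one broadcast chain:

# Pixel-count and precision comparison: cinema 2K vs. broadcast 1080p.
cinema_2k = 2048 * 1080     # 2,211,840 pixels
hdtv_1080p = 1920 * 1080    # 2,073,600 pixels
print(f"2K has {cinema_2k / hdtv_1080p - 1:.1%} more pixels than 1080p")

# The bigger differences are elsewhere: digital cinema carries 12 bits per
# color component with no chroma subsampling, while broadcast HD is
# typically 8 bits with 4:2:0 subsampling (assumed typical values).
print(f"code values per component: cinema {2**12:,} vs. typical broadcast {2**8}")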

With all the 3D movie releases announced, people question whether they should expect 3D in the home. It is quite possible. The restrictions or compromises are many though. First, special glasses are required, and there seems to be a reaction against the glasses. Many companies are attempting to develop technologies that allow screens to do all the work (no glasses), but when the largest company, which spent the most money over the last few years, pulls out of the market, it isn’t a good sign. (Philips pulls out of 3D research | Broadband TV News) The reality is that one person can see the 3D image if they keep their head locked in one position, and perhaps another person in another exact position, but it isn’t a marketable item.

Fortunately, there were three companies at ShoWest offering much cooler glasses for watching 3D, including clip-ons. Since there are 3 different types of 3D technology in the theaters, it is a complicated task for the consumer. At best, the cinema will hype that they have 3D, but they rarely give the detail of which type or equipment they are using.

There are several clues that humans use to establish depth data and the locations of items in a natural scene. Technically, these items in the 3rd dimension are placed on what is called the ‘z axis’ (height and width being the x and y axes). Matt Cowan details a few of these clues in this presentation, and there are others. Filmmakers have understood how to use these in 2D presentations for ages.

But the challenge for decades has been synchronizing the projection and display of two slightly different images, taken by cameras 6.4 cm apart (the same as the ‘average’ eye distance), in a manner that shuts out the right eye’s picture from the left eye, and a moment later shuts out the left eye’s picture from the right eye, fast enough that the eyes get information to the brain in such a way that the mind says, “Ah! Depth.” Digital projectors make this easier. The technique has evolved even in the last 2 years, and that evolution will continue.

There are four companies (Dolby, RealD, MasterImage and XpanD) that produce 3 different technologies for digital 3D systems in the cinema theater. Each coordinates with the projector in a slightly different manner. The projector assists by speeding up the presentation rate, showing each frame three times, with a technique called “triple flashing”.

For comparison, 2D film projector technology presents the image two times every 1/24th of a second. This means that the film is pulled in front of the lens every 24th of a second, allowed to settle, then a clever gate opens to project light through the film to the screen, which then closes and opens and closes again. Then the film is unlocked and pulled to the next frame. With digital 2D, motion pictures are handled the same, presenting the same picture to the screen twice per 24th of a second, then the next picture and so on. Triple flashing a 3D movie increases the rate from 48 exposures per second to 72 per second…for each eye! Every 1/24th of a second the left eye gets 3 exposures of its image, and the right eye gets 3 exposures of its slightly different image; L, R, L, R, L, R, then change the image.
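The flash-rate arithmetic, as a sketch:

# Triple-flash arithmetic for digital 3D.
SOURCE_FPS = 24          # frames per second per eye in the master
FLASHES_PER_FRAME = 3    # "triple flashing"

per_eye = SOURCE_FPS * FLASHES_PER_FRAME   # 72 flashes per second, per eye
total = per_eye * 2                        # 144 flashes per second at the projector
print(f"per eye: {per_eye} flashes/s, projector total: {total} flashes/s")

# Presentation order within one 1/24 s frame period:
print("one frame period:", " ".join(["L", "R"] * FLASHES_PER_FRAME))

That 144 per second is the same mirror rate mentioned earlier in this series.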

Since it would be difficult to get everyone to blink one eye and then the other in the right sequence for an hour or two, the different 3D systems filter out the picture of one eye and then the other. The Dolby system does this (simply stated) by making one lens of the glasses an elaborate color filter for one eye, with the complementary twin for the other eye. The projector has a spinning color wheel with matching color filters which, in effect, presents one image that one eye can’t see (but the other can), then presents the opposite. RealD does this with a circular polarizing filter in front of the projector lens that switches clockwise then counter-clockwise, and glasses which have a pair of clockwise/counter-clockwise lenses. The XpanD system does this with an infra-red system that shutters the opposing lenses at the appropriate time. There is a 4th system, named MasterImage, which uses the same polarizing glasses as RealD, but with a spinning filter wheel instead of a very clever and elaborate (read, “expensive”) LCD technology.
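A compact summary of the separation methods just described (the shorthand is this author’s, not the vendors’):

# How each digital 3D system keeps the left and right images apart.
systems = {
    "Dolby":       ("spectral: paired color-comb filters, wheel at the projector", "passive filter glasses"),
    "RealD":       ("circular polarization, switched at the projector lens",       "passive glasses, silver screen"),
    "MasterImage": ("circular polarization via spinning filter wheel",             "passive glasses, silver screen"),
    "XpanD":       ("time-sequential shuttering, infra-red synchronized",          "active shutter glasses, battery"),
}

for name, (separation, eyewear) in systems.items():
    print(f"{name:12s} | {separation:58s} | {eyewear}")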

Suffice it to say that there are advantages and disadvantages to each system. Dolby’s glasses are made with spherically curved glass so that the eye’s cornea is always equidistant from the filter. They are also more expensive, though they have had two price drops as quantities have gone up, from an original $50 a pair, to last year’s $25, and now $17 each. They need to be washed between uses for sanitary reasons, which provides jobs of course, but also adds to logistics and cost. XpanD glasses also need washing between uses and have a battery that needs changing at some point. (Without going into the detail, the XpanD IR glasses are thus far the technology of choice for the home market, though no company should be counted out at this stage.)

RealD was first to market and originally marketed with the studios, who provided single-use glasses for each movie. Dolby sold against this by taking up the ecology banner, announcing that they had developed their glasses with a coating that can be washed at least 500 times. RealD found that their glasses could be recycled to some minor extent and have now put green recycling boxes into theater lobbies for patrons to drop them into, for return to the factory, washing, QC and repackaging (of course, in more plastic). There are no statistics as to how many get returned and how many get re-packaged.

A few cinemas are selling the glasses for a dollar or a euro, and seeing a lot of people take care of, and return with, their glasses. Eventually this model will be more widespread, with custom and prescription glasses, but the movie industry was wary of putting up a barrier while 3D was in its infancy, and glasses makers weren’t interested when the numbers were low.

Since the three systems are different, and there is no way to make a universal pair of glasses, patrons are going to have to know what type of system is used at their cinema of choice, or buy multiple pairs. In any case, the glasses are not going to be ultra-slim and sexy. In addition to being the filter for the projected light, they must also filter extraneous light. If they allow too much light from Exit signs or aisle lighting or your iPhone, the brain-trickery technology will not work. There are enough problems with 3D in general, and today’s version of it in particular, without allowing any more variables.

The most grievous is the amount of light getting filtered by all the lenses, coupled with the fact that half the light is being filtered from each eye by blanking it 72 times per second. Less than 20% of the original light reaches the eye with some systems. Up till now there hasn’t been a way to crank up the light level to compensate, and if projectionists tried, the cost of electricity would go up and the life of the system would go down. This is one major reason that manufacturers of new projectors are hyping their light levels.

The other technical compromise with the polarizing lens systems is that they require what is called a “silver” screen to help maintain the polarization (and secondarily, to help maintain light levels). But there is no free lunch with physics. Silver screens can be optimized, but the worst of them will have ‘hot spots’ in the room, so that the side seats or upper seats see a different (darker) image while other seats see a brighter one, and hopefully some see the ‘correct’ amount of light. The major screen manufacturers have done a lot of work to mitigate this effect, and will tell you this problem is now virtually solved, but there are a lot of older screens out there, and incorrectly installed screens, and a lot of people who have walked around and still see the effect. Sit in the center of the cinema and you will have the best odds, somewhat toward the front (the projector is higher than you are, and presuming that the screen is flat, the theoretically correct angle to your eyes is downward). On the other hand, audio mixers mix from about three-quarters of the way back. YMMV.

Parts 3 and 4 deal with acquisition, with and without 3D, more considerations of digital and 3D’s evolution, how to make your own master, where in the world these digital boxes are, and whether there will be 50% saturation by the end of 2011.

Cross posted to: DCinemaTools

The State of Digital Cinema – April 2010 | Part One

Two years ago, the evolution and rush to all things digital in the cinema world reached a classic chasm point, especially for digital cinema presentation to the theater screen. (See bottom question/answer.) It seemed that the technology was worked out, it seemed that the politics were worked out, it seemed that the financing models were worked out…and yet, the number of installations and new sales sat flat…or worse.

Huge companies like Texas Instruments (TI) and Sony had spent millions getting the technology ready for a secure and marketable implementation. Their OEM partners were ready to throw the handle to ‘Plaid’ to fill the needs of 125,000 screens in a world that needed to go from film-based to digital, server-based systems. The changeover requires a 60–80 thousand euro projector and a 20,000 euro server to replace a 30,000€ film chain, a mature technology that typically lasted multiple decades with minor maintenance. But to the rescue, the studios offered plans that would pay back the initial investment by a mechanism known as a Virtual Print Fee (VPF). These were developed to compensate certain cinemas, over time, for playing inexpensive digital copies (distributed via hard disk and eventually satellite and fiber) instead of expensive film prints (distributed by trucks and airplanes).

So, with all the ducks so apparently in a row, why weren’t the 7,000 ‘innovators’ and early adopters of 2007 joined by tens of thousands more screens by early 2010, when the number was merely double that (even after the initial 3D explosion)?

The reality was that the technical, political and financial realities weren’t really ready. Notwithstanding the world financial collapse that hindered access to the billions needed for the transition, there were nuances that made financing not so simple. In addition, the standards were still in transition, both on paper and in the labs and factories.

Financially, the major Hollywood studios are prepared to finance the transition up to the amount that they save in print costs and distribution. The nuance is that they only send prints to the first-run cinemas, leaving the 2nd- and 3rd-level cinemas with no funding. (The background nuance is that once the digital transition is complete, the studios save billions per year forever, but are only helping to fund the initial roll-out. The exhibitors save the cost of a few low-wage employees, and benefit from better quality and the ability to present features other than movies.)

World-wide, the Hollywood studios that developed the VPF mechanisms also didn’t find it fair that they should have to finance cinemas which made their income from movies other than Hollywood movies. Nor did they want to overpay for equipment if a cinema made money from operas, concerts, sports or other alternative content that digital projection allows. This caused many national groups, in particular those in the UK, France, Italy and Germany, to search for ways to fund the smallest to mid-sized facilities so that they would have digital equipment when enough critical mass was reached for film prints to become ancient history.

The UK funded several hundred screens with lottery money in one partially successful experiment, but it exposed a few holes in the plans. Simply stated, a movie’s life starts on one screen for a week or two, then moves to a smaller screen while the next movie in line attempts to take the larger audience in the larger room. But if there is only one set of digital gear, and that in the larger room, then the cinema still needs a film print to complete the movie’s run. One of the points of a Hollywood VPF is an agreement to get 50% of screens digital in one year and 100% in three years (with at least one capable of 3D).

When the slow wheels of national finance plans got past the proposal stage, the largest cinemas in France and Germany complained that the ‘tax’ they paid per ticket was funding their competitors. Both plans were recently (in the last few months) thrown out as unfair by the countries’ legal systems. (Norway figured it out on its own and is on its way to digitizing the entire country’s cinemas.)

Meanwhile, the standards committees within the Society of Motion Picture and Television Engineers (SMPTE) completed the last of the standards documents in 2009, submitting them to the ISO in the process. To what should have been no one’s surprise, some of the equipment, in particular the installed projectors that utilize the Texas Instruments chipset (the vast majority), didn’t meet those standards. In fact, the first projectors to meet those standards (dubbed ‘Series II’) were released in March 2010, at the industry’s ShoWest convention. Unlike the WiFi industry, which was able to ship equipment for over a year before the standards validated its presumed compliance, several pieces of older digital projection gear will need expensive updating, with some equipment updatable and technically passing compliance requirements, but not able to include some important ‘modern’ features.

In addition to finally getting compliant projectors, those who waited for the new Series II equipment will also be getting equipment that can run with lower-power bulbs and, of course, give more light to the all-important 3D image.

The invasion of 3D movies has been a boon to cinemas. The studios have all embraced it by announcing an ever increasing 3D release schedule, first with animated releases, but now (famously with the Avatar release) with CGI enhanced live action. The exhibitors not only are able to attract larger audiences with this nascent technology, but they are able to charge more per ticket in the process. This helped give the industry its first 10 billion dollar year in 2009, and keep actual ticket sales on an upward trend. In the alternative content area, live opera is still the most prevalent and successful, but live pop concerts have been successful, and more are slated. Sporting events have been experimented with, some in 3D, and will probably become more successful in the near future.

Coincidentally, a few major installation groups have gotten financing in the last few months. It appears that the three largest US chains have the financing to cover 10, 12 or 14 thousand of their 17,000 screens. The disparity between PR and reality is not a trifle, but public information is hard to come by. The announcement that they were working with JPMorgan for money in 2007 mentioned numbers that were twice (Celluloid Junkie-More Rumblings About DCIP’s Financing) what they announced recently. And the recent announcements don’t mention how they will finance 3D equipment, which costs up to $30,000 per screen…and is not covered by VPF agreements.

Notwithstanding those hidden nuances, there is finally movement across the chasm from innovators to more conservative early adopters. In addition, several integrators in Europe, India, China, Japan and Korea have recently announced hundred- and multi-hundred-piece installation deals in their areas. See DCinemaToday for up-to-the-minute market news on the exhibition side of digital cinema.

With the release of the Series II equipment, other features that were built into the standards are driving manufacturers to build matching equipment. Most welcome is equipment for the deaf/hard-of-hearing and visually impaired communities (HI/VI). There was a special exhibition at ShoWest of these companies’ works-in-progress: devices that use special glasses to create closed captions which float the text over the screen (so that one doesn’t have to constantly look up and down to see both), another system that will use WiFi to put captions on one’s iPhone (among other devices), as well as new ways to put dialog-enhanced audio into earphones.

The best news for the HI/VI field is that the SMPTE and ISO standards are in place, have been recently ‘plug-fest’ tested for interoperability, and, contrary to the previous film-centric systems, are based upon open, not proprietary (read: patented, licensable, expensive, frustrating) technology. (For a brief discussion on HI/VI captioning and the ‘enthusiasm’ of differing viewpoints, see: Smashing Down The Door – Digital Cinema and Captions For the Deaf and Hard of Hearing)

The arguments still persist around the excellent qualities of film, much like the arguments in the audio world about the qualities of tape recording and vinyl. While some of the arguments are interesting and some of those even true (the ability/inability to wash a screen with the indescribable transitions of Lawrence of Arabia‘s desert sunset comes to mind), the arguments against film are too many. Film is an ecological nightmare, the prints are expensive to ship around, re-gather and store, and whatever qualities they exhibit at first run are grossly diminished after a week of getting banged around within the film projection process. And unlike the audio business, where specialty houses can still afford to make tape for those who want to record on it, as fewer companies use film for shooting and exhibition, the cost of material and processing will become too expensive for the budgets of even the Spielbergs of the art.

Fortunately, the evolution of quality in digital production and post-production equipment has substantially gone beyond the requirements of ‘film’ makers. As with all recent digital technology, quality points are also being hit at the low end, so that artists can make motion pictures which can fill the big screen for less money and take advantage of the substantial distribution benefits of the digital infrastructure. At the high end, artists can do more, perhaps more quickly and certainly with more flexibility and features. For the consumer, this means that quality is possible from a wider range of storytellers and the possibility to see material from other regions around the world becomes more easily accomplished.  

Part II of this series goes into more detail on specifications, some current realities of 3D technology, what “substantially gone beyond the requirements” really means, and a brief excursion on how it relates to the home market.

References:
DCinemaToday
MKPE’s Digital Cinema Technology FAQ

This Series now includes:
The State of Digital Cinema – April 2010 – Part 0
The State of Digital Cinema – April 2010 – Part I
The State of Digital Cinema – April 2010 – Part II
Ebert FUDs 3D and Digital Cinema

Question 0: What is the exact definition of DCinema?

[The question is being answered by David Reisner of D-Cinema Consulting. David is a board member of several organizations such as the ASC and ISDCF, co-author of several books on many fields of the cinema process and specializes in design and implementation of digital cinema infrastructure projects.]


For nearly 100 years, motion pictures have been delivered to theaters on 35mm film and have been shown with film projectors.

Digital Cinema, officially called D-Cinema in the technical community, delivers movies to theaters as digital files – most often on harddisk, sometimes via satellite, probably in future also by network/internet.  The movies are then shown using digital cinema servers (special purpose computer systems) and theater-grade digital projectors.  D-Cinema also includes/requires a number of digital and physical security mechanisms, to keep content (movies) safe.  The key documents are the DCI “Specification” (actually a requirements document) and a number of SMPTE standards.

D-Cinema requires support for 2048 x 1080 or 4096 x 2160 images and 14 foot-lambert brightness (similar to film standard brightness, although theaters sometimes use lower light levels for cost).  Movies are distributed in 12-bit X’Y’Z’ color – much more color detail than HDTV’s Rec. 709.  X’Y’Z’ can represent all the colors that a human can see, but the real limitation is the projector (and, to be fair, the camera and post-production process).  All D-Cinema projectors show at least a minimum color gamut which is a significantly wider range of color than Rec. 709 – similar to the range supported by film.
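[Editor’s note: a quick sketch of the encoding arithmetic behind those numbers. The 1/2.6 exponent and 52.37 normalizing constant are quoted from memory of the DCI/SMPTE documents; check the specification before relying on them.]

# 12-bit X'Y'Z' encoding sketch: a tristimulus value (in cd/m2-like units)
# to a 12-bit code value. Constants recalled from the DCI/SMPTE documents;
# verify against the specification.
def to_code_value(value, normalizer=52.37, gamma=2.6, max_cv=4095):
    return round(max_cv * (value / normalizer) ** (1.0 / gamma))

print(to_code_value(48.0))    # 14 ftL peak white -> code value 3960
print(to_code_value(0.024))   # a deep shadow still gets its own code value (213)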

For some markets or purposes (e.g. pre-show, advertising, maybe small markets), some people use things informally called electronic cinema, e-cinema.  There is no formal standard for e-cinema although there is some informal agreement in certain areas.  E-cinema will have lower resolution, narrower color, less brightness, and little or no security.

Major studio content will only be distributed to D-Cinema systems that meet the SMPTE and DCI specifications and requirements, and have passed the DCI Compliance Test.

David Reisner
D-Cinema Consulting
image quality, color, workflow, hybrid imaging
[email protected]
www.d-cinema.us

Gmail flaw shows value of strong passwords

[Editor’s Point] Yet again, this is not important to us as a community just because some of us might have a Gmail account. This is important because security moves in a diminishing cycle. Of the 10 items in the article, who among us is vigilant on more than 3? On even 3?

The other nice thing about this article is that it is written in a way that it can be given to anyone: a great training tool.

Read the entire article at: Gmail flaw shows value of strong passwords | By Becky Waring

According to Aguilera’s new security alert, Google allows anyone with a Gmail account to guess another Gmail user’s password 100 times every two hours, or 1,200 times per day. …

To its credit, Gmail requires fairly long passwords of 8 characters or more. However, as Aguilera points out, Gmail allows users to create extremely weak passwords such as aaaaaaaa.

A quick survey of my friends and relatives revealed that not one of them uses strong passwords. Most people have no idea how to create them. Yet everyone I asked expressed guilt at using easy-to-crack passwords: pet names, birthdays, and common dictionary words.

Most people’s passwords could be guessed in far fewer than 10,000 attempts. And, despite using weak passwords, the people I interviewed say they rarely change their sign-in strings. (One-third of the people surveyed use the same password for every Web site they sign in to, and the infamous Conficker worm needed to try only 200 common passwords to break into many systems, according to an analysis by the Sophos security firm.)

Here’s the topper: many respondents to my informal survey admitted to keeping an unencrypted file on their systems that lists every password they use!

The article continues to tell why that is just wrong, and what can be done – simply – to fix the problem…as well as challenge us all with 10 things that we who know better probably don’t do…well, maybe a few…
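As a rough illustration of why 1,200 guesses a day is plenty against weak choices, here is a sketch; the candidate-pool sizes are illustrative assumptions, not figures from the article:

# How long an online attacker needs at the quoted rate of 1,200 guesses per day.
GUESSES_PER_DAY = 1200

pools = [
    ("200 common passwords (Conficker-style list)", 200),
    ("10,000-entry dictionary",                     10_000),
    ("8 random lowercase letters",                  26 ** 8),
]

for label, size in pools:
    days = size / GUESSES_PER_DAY
    print(f"{label:45s}: up to {days:,.1f} days to exhaust")

The first two fall in hours or days; only the genuinely random choice pushes the attack out past any practical horizon.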

3D Event Consultants Dream…and Nightmare

It all came together. That’s the good news. And the data gathered told some stories.

The first is that there are some oxen that might get gored…evidenced by how some people’s opinions stood markedly aligned with the equipment capabilities of their respective (and respected) companies.   

The second story is that the future doesn’t have to frighten those early projector-buying pioneers (who have enough arrows in their backs). In the case of mastering, this author dares to say he heard almost unanimously that higher luminance is better – and that up to ‘some’ level (this is where the variance of opinion took place) there was benefit to the playback regardless of the capabilities of the projector.

But those pieces of data are not what this piece is about. This piece means to tell the story of a half-dozen people who pulled together a stunt that couldn’t have been done by many others, which attracted hundreds of people, and which was done for free.

The topic was brought up at a general meeting in January. A few comments were made that it sounded like a good idea, and the chairman took that as enough backing to assemble a team of volunteers…meaning, really, 3 or 4 people…and a date…meaning, really, 8–10 weeks. Then made shorter still to fit it in before ShoWest.

As the Chairman, Jerry Pierce, put it, “Dolor dapibus Phasellus id Aenean rhoncus Maecenas Nunc pellentesque In convallis. Porttitor fringilla sed natoque Aliquam wisi Sed tempus pretium pretium Pellentesque. Sed congue at magna nunc sociis gravida Donec elit accumsan nonummy. Lacinia mauris nunc malesuada sed laoreet elit ipsum malesuada Aenean nunc. Commodo mus nibh ac congue Aliquam orci Donec semper ipsum elit. Mattis lacus.”

Mattis lacus, indeed. 

“Pellentesque pede tincidunt tellus lorem ultrices enim”, quipped an English engineer who has done these types of things before for the BBC and others. “Tristique tincidunt sem pretium. Mollis euismod lacinia et Curabitur orci pellentesque eget Vestibulum Duis penatibus.”

It is possible that Kevin Wines knew what he was getting into, even if unpaid. “Vestibulum Donec id congue fames auctor interdum mauris auctor tellus cursus. Odio interdum Aenean interdum egestas vitae pellentesque dictum diam nec feugiat, et Modern Video est risus dui consequat commodo nibh Donec at risus condimentum Quisque.”

That’s pretty consistent with Modern Video’s reputation. They did a lot of favors and did them well. One wonders if there will be an article written about volunteers like Marvin Hall and Mark Smirnoff from MVF.  

David Reisner, another of the unpaid and the keeper of the documentation and meeting notes, laughed: “Rutrum tincidunt tincidunt porttitor ridiculus Vestibulum et semper vitae Vestibulum urna.”

A lot of manufacturers chipped in as well. Often they have as much to lose as to gain from an effort like this. But this is a note about the unpaid, people like Harry Mathias and Walt Orway who stood ready to act given any circumstance…and like all the others above, paying for travel and hotels and dinners away, and phone calls and hours of meetings…they made it happen and happen well. And it wasn’t the first time for any of them.