Tag Archives: cinema

Cinemark and Regal Go Captioned

Here are some links to the news. Cinemark has decided upon the Doremi CaptiView system, while Regal hasn’t specified the technology. [Sony Entertainment Access Glasses with Audio are announced as the choice at CinemaCon 2012 – Ed.] Cinemark had ongoing lawsuits with the Association of Late-Deafened Adults (ALDA), which, according to ALDA’s announcement, will be dropped.

Regal will also be working with CaptionFish, the website development group behind the captioned-movie search engine, to help promote the services to its customers.

Hearing Loss Law : Washington Hearing Loss Lawyer & Attorney : John Waldo Law Firm : Hard of Hearing, Disability, Hearing Impairment : Seattle, Bainbridge, Washington, Pacific Northwest – Dead Link

Cinemark and ALDA (Association of Late-Deafened Adults) announce movie theatre accessibility for customers with hearing disabilities Global Alliance on Accessible Technologies and Environments

Cinemark Agrees to Provide Closed-Caption Option – DCInsider

Cinemark Installing Closed Captioning in all Theatres | Digital Cinema Report.

Regal theaters to become more accessible | Denver Business Journal

NJ theaters to use caption devices for deaf

Closed Captioning at the Movie Theatres. 

Regal Makes Seattle America’s Most Accessible Movie City

“Unfortunately, not every theater chain is following the lead of Regal and Cinemark. AMC theaters, America’s second-largest chain, continues to take the position that it will equip some but not all of its theaters to show captions. We are currently in the process of addressing that question in our Seattle lawsuit, and would hope for a favorable ruling, a change in AMC’s corporate position, or perhaps both.”

3Questions: OpenDCP – Now with GUI

Open Source tools are described throughout the DCI specifications, and the nuance of using them is detailed in the myriad SMPTE (and ISO) documents of Digital Cinema. The Digital Cinema Package (DCP) is a complex joining of various video and audio standards, coupled with several security protocols, that allows the transport, local storage and playout of entertainment on any combination of the available ‘compliant’ media players and projectors.

Since official compliance is a new part of the dcinema world, this hasn’t been an easy task. It is made more complicated by the several transitions that the equipment is going through: Series One and Series Two projectors, external to internal media blocks (IMBs), and InterOp to SMPTE-compliant systems are a few of the major examples.

For the last 10 years packages have been made by the classic companies, Technicolor and Deluxe, and more recently by some of the integrators such as Cinedigm, ArtsAlliance and XDC. Dolby has long had a separate group making packages.

There are several manufacturers who make package creation systems. The two most popular are from Doremi (CineAsset) and Qube (QubeMaster Pro and Xpress). Fraunhofer makes a package named EasyDCP. All of these systems cost in excess of $5,000. All are using somewhat user-cuddly front ends to steer the user through the many details and choices available. It is well known in the field that any product that pops out the other side needs to be tested on each variation of cinema player and projector to make certain that it will play when needed.

OpenDCP is no different[2], but until now its interface was by command line (CLI), which added a layer of complexity to the learning curve. This month a new release was posted on the open source code site http://code.google.com/p/opendcp/.

The project roadmap lists some of the features that still hold it back from being the perfect tool for all users. One item not listed is that the GUI version will only create single-reel packages (though the CLI will create multi-reel packages). And like all DCP creation packages, the user needs to test the package on the target system.

This brings up the point of “Why”, which becomes easily understood if one searches the net for requests by film-makers and directors who want their product played at film festivals and local cinemas that use digital projection systems. These artists commonly have eaten their relatively small budgets getting the entertainment shot and edited, a stage where there is already enough format and standards confusion. Often the festival site doesn’t know the answers either, since this is yet another technical area in flux, manned by volunteers who only get fragments of data to pass on to their constituents. The topic of using DVDs or Blu-ray discs comes up. There is a commonality of panic as each question brings up further confusion. The nuance of multi-track audio and going from TV-centric HD standards to truly HD cinema standards (wider color space, 4:4:4 color sampling instead of 4:2:0 and different white points, for example) brings up more decision points that can’t be universally answered.
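
As one example of what that conversion involves, here is a minimal sketch of a single-pixel Rec.709-to-DCI-X'Y'Z' conversion. It is illustrative only, not any particular tool’s actual code; the matrix values are the commonly published Rec.709/D65 ones, and the source gamma of 2.4 is an assumption.

    # A minimal, illustrative Rec.709 -> DCI X'Y'Z' pixel conversion.
    # Not any particular tool's code; the matrix is the published Rec.709/D65
    # one and the 2.4 source gamma is an assumption.
    def rec709_to_xyz12(r8, g8, b8, source_gamma=2.4):
        # 1. Normalize the 8-bit code values and undo the display gamma.
        r = (r8 / 255.0) ** source_gamma
        g = (g8 / 255.0) ** source_gamma
        b = (b8 / 255.0) ** source_gamma

        # 2. Linear RGB (Rec.709 primaries, D65 white) to CIE XYZ.
        x = 0.4124 * r + 0.3576 * g + 0.1805 * b
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        z = 0.0193 * r + 0.1192 * g + 0.9505 * b

        # 3. DCI encoding: scale toward the 48 cd/m2 reference white within the
        #    52.37 normalization, apply the 1/2.6 gamma, quantize to 12 bits.
        def encode(c):
            c = max(0.0, min(1.0, c * 48.0 / 52.37))
            return round(4095 * c ** (1 / 2.6))

        return encode(x), encode(y), encode(z)

    print(rec709_to_xyz12(255, 255, 255))  # reference white
    print(rec709_to_xyz12(16, 16, 16))     # near black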

Thus, one more complication on the road to cinema salvation by Alternative Content. While there are many good arguments that these details are best handled by pros who have experience with permanently set-up and maintained professional tools, the reality is that many of these artists just don’t have the money (or rather, they have time that they are forced by circumstances to value at less per hour). One recent local film festival worked with a patron who charged a flat 200€ fee for the transfers, while the Venice Film Festival transfers materials gratis (in exchange for publicity, which Qube and D2 have taken advantage of for the last two years).

There is also a need at cinemas to create and package local commercials or theater policy trailers for insertion into the pre-show of the movies, sports and concerts that they show through their digital projection systems. This might be easily handled in larger cities where there are companies who can make economies of scale work in their favor. But spending thousands getting a DCP made will eat all the profits from a quickly shot local pizza parlor ad. New tools such as the RED Scarlet, the Canon 5D MkII, GoPro or Drift cameras and easy-to-use editing software make this a nice adjunct to a clever facility…only held up by the expense and difficulty of creating the DCP.


With this background, we spoke to Terrence, the lead programmer for the OpenDCP project. He owns a 7-screen cinema facility which was one of the first independent complexes in the US to go completely digital. He has had extensive experience in the computer field as well, and it was just this need for making local commercials that got him on the project. After listing some of the features of this new DCP creation system with the Graphical User Interface, we’ll ask our Three Questions.

Features

  • JPEG2000 encoding from 8/12/16-bit TIFF images
  • Supports all major frame rates (24,25,30,48,50,60)
  • Cinema 2K and 4K
  • MPEG2 MXF
  • XYZ color space conversion
  • MXF file creation
  • SMPTE and MXF Interop
  • Full 3D support
  • DCP XML file creation
  • SMPTE subtitles
  • Linux/OSX/Windows
  • Multithreaded for encoding performance
  • XML Digital signatures
  • GUI
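
For readers new to the format, the features above map onto a fairly fixed sequence of steps that any DCP-creation tool has to walk through. The outline below is a rough, illustrative sketch of that order, not OpenDCP’s actual command line or API.

    # Rough outline of the stages a DCP-creation tool works through.
    # Purely illustrative; these are not OpenDCP commands or functions.
    STAGES = [
        "1. Convert source frames (e.g. TIFF) to the XYZ color space",
        "2. Encode each frame to JPEG 2000 (2K or 4K), within the DCI bit-rate ceiling",
        "3. Wrap the picture frames into a picture MXF track file",
        "4. Wrap 24-bit 48/96 kHz PCM WAV audio into a sound MXF track file",
        "5. Optionally wrap subtitles into a timed-text track (SMPTE)",
        "6. Write the CPL (composition playlist) tying the tracks into reels",
        "7. Write the PKL (packing list) plus ASSETMAP and VOLINDEX",
        "8. Optionally sign the XML files with a digital signature",
    ]

    for stage in STAGES:
        print(stage)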

One last point – Open Source does not necessarily imply free. There is a lot of nuance in just this point, but for example, the EasyDCP system of Fraunhofer also uses tools that follow Open Source standards within its structure, yet it is a highly priced (and highly valued) package. More detail can be found at: GNU, Free Software, and Open Source Software – Linux 101

DCTools: Hello Terrence. For all the great and required features of the OpenDCP software, what in reality should a user expect as they dive into its use? Without knocking any other package, what advantages and disadvantages will one see when using OpenDCP?

OpenDCP: Let’s continue the conversation about Open Source tools to illustrate some points. In the current version of the OpenDCP package we use an open source encoder named “openjpeg” that does the work of encoding the TIFF images to JPEG2000. The commercial products can afford to license much faster encoders. Their high-end tools might create packages at 15 frames per second (fps) while the OpenDCP packages are converted at 3 fps. On long-form projects this can make a significant difference in time. Not a difference in quality, of course, and for a short commercial or an under-20-minute project this is an acceptable compromise.
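
To put that speed difference into hours, here is a back-of-the-envelope calculation using the 3 fps and 15 fps figures above; the running times are just example lengths.

    # Rough encode-time comparison using the 3 fps and 15 fps figures above.
    # The running times are example lengths (a 20-minute short, a 2-hour feature).
    FRAME_RATE = 24  # frames per second of the finished package

    def encode_hours(runtime_minutes, encode_fps):
        total_frames = runtime_minutes * 60 * FRAME_RATE
        return total_frames / encode_fps / 3600

    for title, minutes in [("20-minute short", 20), ("2-hour feature", 120)]:
        slow = encode_hours(minutes, 3)    # open source encoder
        fast = encode_hours(minutes, 15)   # licensed commercial encoder
        print(f"{title}: {slow:.1f} h at 3 fps vs {fast:.1f} h at 15 fps")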

Another advantage that open source projects seem to make better use of is communication with their users. Where commercial entities have to beware of odd statements that live forever on the internet, as well as hackers and spammers and the like, our control issues are not as great, so the OpenDCP user forum can be more open and vibrant. It fits our spirit of cooperation to point to the work of an independent expert in the digital signatures field like Wolfgang Woehl of Filmmuseum Munich, whose github digital_cinema_tools social coding site is filled with practical and historical information. He, as a support board monitor, and others of his skill are able to help guide the product and test it in ways that build on the fundamentals of Open Source. People can look through the code and make certain that the standards are kept, and that we don’t do things that commercial entities are often tempted to do.

It isn’t out of the question that we could license a faster JPEG 2000 encoder. We’ve discussed ways to do this on the site – there is a yearly cost of $10,000 to meet. Maybe we could do this with a Pro version, spreading the cost over a number of users. Or maybe we can help spur the OpenJPEG programmers along…anyone out there who is a math genius that wants to help?

DCTools: That’s out of our league, but hopefully there’s someone out there who can apply their genius to the task. How did you decide to take on this OpenDCP task?

OpenDCP: The origins of OpenDCP started in Oct 2010. I had wanted to create a policy trailer for my movie theater. Unfortunately, the cost to have one converted was around $2000 and the cost of the commercial DCP software was in the $5000 range. After some research I came across some people that were attempting to create DCPs using various open source tools. They had success, but the process was a bit involved. It required a half dozen tools, some knowledge of the DCI specifications, and compiling the tools yourself. I had some programming experience, so I decided I could take what I had learned and create a tool everyone could use. The first version had a command line interface and its feature set grew over a few months. It simplified the process a lot, but I really wanted to add a GUI, and last month I released the first GUI version of the tool.

There is certainly a lot of interest from film festivals. A couple have floated the idea of an OpenDCP Film Festival. Unfortunately, I have neither the time nor the knowledge to plan that sort of thing.

DCTools: There is a great deal of interest in the inclusion of the hard-of-hearing and the visually impaired audiences in the great culture known as “Going To The Movies”. Indie producers who I’ve spoken to point out that there are thousands of professional movies shot but only hundreds get finished. Of those, only a small percentage get distribution. So added features like closed captions, narrative tracks and even subtitles for other markets get put on the “If List”.

On the other hand, the US Department of Justice will be handing down its directives or rulings soon on how many open- and closed-caption movies should be played in commercial cinemas, and the EU is walking down that path, with the UN human rights documents recently being used as the basis for including people with handicaps in the marketplace.

How does OpenDCP handle these things, and what else is on your road map?

OpenDCP: Right now, we handle one narrative track per DCP. [DCTools: Many HI/VI equipment manufacturers can switch up to 4 narrative tracks per DCP.] Thus far the typical user hasn’t been doing anything too complex in that regard. OpenDCP will create SMPTE subtitle tracks. But we’ll get there with more options. For example, the GUI currently limits you to one reel per DCP. The command line allows multiple reels and the GUI will as well; it just didn’t get done for the first release.

Subtitles are probably the biggest thing people want support for. OpenDCP can handle SMPTE subtitle tracks, but it doesn’t do anything with MXF Interop/CineCanvas. For my own personal needs, I don’t use subtitles; they are pretty rare in the U.S. However, it seems almost everyone outside the U.S. really needs that support. The problem is that the majority want CineCanvas, because they point out that SMPTE-compliant playback systems are still not in the field. Most cinemas seem to think they aren’t going to upgrade their software until InterOp stops working, which is another challenge for SMPTE in general. My issue is that I don’t really want to spend my limited development time implementing features that will be deprecated.

With so many different systems in the field, the DCPs that OpenDCP generated wouldn’t always play on every set of equipment. Some media players seemed finicky while others would accept anything. It took several weeks of trying, but it finally worked everywhere. That was good, because it helped find some slight differences between the MXF Interop and SMPTE packages and flushed out some bugs in my code.

I actually wasn’t even all that aware of how closed caption support in DCPs was handled until a month or so ago. Most of the information I used building OpenDCP came from the DCI 1.2 specification and sort of reverse engineering countless DCPs I had collected from my theater. Then, when somebody was having trouble getting a DCP working on the player they were using, they donated a set of SMPTE documents to the project. Reading through the various documents really helped, and that’s when I learned about the CC stuff.

We hope to have material at the next ISDCF Plugfest. That will hopefully give us more feedback from the professional users.

I’ve gotten feedback from people of all different skill sets who have been able to use OpenDCP to create DCPs. Some have been using it for preshow/commercials, a few are using it for archiving, and independent film makers are quite happy with the results. The current version takes a TIFF image sequence and does the JPEG2000 and XYZ color conversion for the picture track. The audio track is created from 24-bit 48/96kHz PCM wav files. It supports pretty much the entire DCI specification – 3D, 2K/4K, 24, 25, 30, 50, 60fps, digital signatures, etc.

Future features include converting more image types, reading directly from video files, image resizing, and simplifying the process even more.

Developing OpenDCP has been a great process, first just trying to meet the needs I had as a cinema owner, then really putting my EE degree and programming skills to use. One of the neatest things has been meeting and discussing digital cinema with all kinds of people. I’ve been lucky enough to see some really excellent independent short films and learn so much along the way.

[1] GNU GPL v3

[2] The OpenDCP author wants to be clear that the project is still considered beta, and that the user should expect some issues depending on different factors. For example, while reading the forum this article’s author noticed that one user had difficulties with an older computer with a slow processor – changing the number of threads in the set-up let the build complete successfully. Thus, the recommendation is to start the DCP process with a small 5-10 second clip. Get a successful workflow and then do a full conversion.

KODAK Advances Lasers’ March on DCinema

The industry group is named the Laser Illuminated Projection Association, or LIPA. It was co-founded by IMAX and the New England-based company IMAX has contracted to supply laser light engines for its projectors, Laser Light Engines, plus Sony and, according to the press release, “other cinema-industry players”.

Kodak made a statement in October 2010 that said they supported LIPA’s goals, but had already made an application to the FDA for a waiver on their projection design, which they expected soon. Soon has arrived. Following is the press release from Kodak.

Kodak has also said that they are laser system agnostic in their design, and though their demo unit uses Necsel devices (from California), they could also use a system from other companies, including Laser Light Engines. The two companies are a 400 mile (650 kilometer) drive apart.

So, let me guess? What does the public want to know? Ah! Time. This press release states “within two years.” Earlier releases have said, “12-18 months.” 


For a concise look at the KODAK system at the time of its first demonstrations in October 2010, see:
Large Display Report: KODAK Demonstrates Laser Projector


This magazine is editorially in favor of switching over ASAP. The advantage of an even wider gamut will be a great device for differentiating home entertainment from the cinema experience, and 3D will never look right until it is able to get out of the mud of <10 candelas. Lasers help because they can not only push more light through the system economically, they can also put a coherent ‘spin’ on the photons. Typically, lasers put out linearly polarized light, which isn’t quite right for 3D…think about not being able to move your head for 2 hours, to keep the linear glasses aligned properly with the screen. But circular polarization is possible. It is just one more thing on the research plate, no doubt.

Cost? If a Xenon bulb costs $5,000 and a typical cinema spends that 3 times per year per projector, and if a laser system will last 10 years, that gives us a simple comparison to measure against: $150,000. Lenses for Xenon systems cost on the order of $15,000, while similarly spec’d lenses of higher f-number will be significantly less. Add savings for personnel costs (and the danger of handling Xenon bulbs) plus the advantages of 10 years of significantly lower air conditioning needs…against…against…hmmm…no one is talking figures for cost just yet.
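
Plugging the article’s own bulb figures into a quick sketch makes the Xenon side of the comparison concrete; the laser side is left open since, as noted, no one is quoting prices yet.

    # Ten-year Xenon consumable cost, using the figures quoted above.
    BULB_COST = 5_000      # dollars per Xenon bulb
    BULBS_PER_YEAR = 3     # replacements per projector per year
    YEARS = 10             # assumed laser light-source lifetime

    xenon_bulbs_10yr = BULB_COST * BULBS_PER_YEAR * YEARS
    print(f"Xenon bulbs over {YEARS} years: ${xenon_bulbs_10yr:,}")  # $150,000

    XENON_LENS = 15_000    # rough cost of a Xenon-system lens
    print(f"Plus roughly ${XENON_LENS:,} for the lens,")
    print("against an as-yet-unquoted price for a laser light source.")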


FDA Greenlights KODAK Laser Projection Technology

 ROCHESTER, N.Y., February 24, 2011 – The FDA (Food and Drug Administration) has approved a variance that allows for the sale of KODAK Laser Projector Systems using KODAK Laser Projection Technology to cinema exhibitors without the need for individual site or show operator variances. This is an important step forward in delivering brighter 2D and 3D images that provide higher dynamic range and a wider color gamut to theaters.

“The FDA approval brings KODAK Laser Projection Technology significantly closer to the marketplace and validates the work we’ve done to ensure that this technology is safe and dependable,” says Les Moore, Kodak’s chief operating officer for Digital Cinema. “In addition to allowing the sale of KODAK Laser Projector Systems using KODAK Laser Projection Technology, the FDA variance serves as a template to be followed by manufacturers that we license to incorporate this new laser technology.”

Typically, digital projection systems using high power lasers fall under the definition of a “demonstration laser” and must follow existing regulations for conventional laser projectors, such as those used in laser light show displays. Kodak has been working in conjunction with laser safety consultants and the FDA to address potential safety issues. The unique optical design of KODAK Laser Projection Technology manages the projector output so that it can be considered to be similar to conventional Xenon projection systems. The FDA variance allows the sale of KODAK Laser Projector Systems with KODAK Laser Projection Technology and theater/show configurations incorporating them.

KODAK Laser Projection Technology promises to bring vastly improved image quality to theater screens, including significantly brighter 3D viewing, and to dramatically reduce costs to digital projection in cinemas through the innovative use of long-life lasers, lower-cost optics and more efficient energy usage. Kodak introduced its laser technology in September 2010. The technology has been received enthusiastically by exhibitors, manufacturers, studios and viewers who have seen the demonstrations.

Moore notes that KODAK Laser Projection Technology is a key ingredient to potential improvements in digital cinema picture quality for both filmmakers and movie-goers. “This laser technology is a significant breakthrough that promises to have a positive ripple effect throughout the cinema world,” adds Moore. “We at Kodak have always endeavored to provide filmmakers with the best possible tools with which to tell their stories. That philosophy has served us well for more than a century, and we will continue nurturing that partnership long into the future.”

Kodak is in discussions to license this advanced technology, with an eye toward marketplace implementation within the next two years.

For more information, visit http://www.kodak.com/go/laserprojection.

 


 

About Entertainment Imaging

Kodak’s Entertainment Imaging Division is the world-class leader in providing film, digital and hybrid motion imaging products, services, and technology for the professional motion picture and exhibition industries. For more information, visit: 

www.kodak.com/go/motion.

Twitter at @Kodak_ShootFilm.

 

Media Contacts:

Sally Christgau/Lisa Muldowney

760-438-5250

[email protected] / [email protected]

DCPC – Digital Cinema Package Creator

Functions:

– SMPTE / MXF Interop DCPs
– 2D + 3D DCPs
– 2k BW/Scope, 4k BW/Scope and HD resolutions
– 6 Channel Sound, 24bit/48kHz
– Film and still image creation
– MPEG2 DCPs for E-Cinema Server
– DCP “re”-wrapping of MXF files
– Source image formats: bmp, tif, dpx, MPEG2 ES (MPEG2 DCP)
– Source Sound format: PCM 24bit
– Framerates: 24, 25

Utility archive: This archive contains the required ImageMagick, and a helpful program to split e.g. AVI video files into individual images in order to create a DCP.

3D channel separation test DCP: The DCP contains two test images so you can judge the quality of the channel separation. The left image contains the test pattern; the right image is black.

Deaf Sue Cinemark Chain

 

The suit is brought by The Association of Late-Deafened Adults (“ALDA”) on behalf of its members with hearing loss, and two individual plaintiffs.  The plaintiffs are represented by Disability Rights Advocates (“DRA”), a non-profit disability rights firm headquartered in Berkeley, California that specializes in high-impact cases on behalf of people with disabilities and John Waldo, a lawyer whose practice focuses on the unique legal needs of the Hard-of-Hearing and Deaf. He works on access and advocacy issues through the Washington Communication Access Project (Wash-CAP), www.hearinglosslaw.com

So begins the press release from Disability Rights Advocates which is available for download here with the complaint that was filed this week in a California Superior Court in Alameda County, California.  

Ongoing 3D Tools Article

This article will highlight 3D tools that sail past the author’s eyes. They may get fuller articles in the appropriate sections if someone writes them, or may not.

Acquisition:  

3D movie Calculator | Stereographers calculator for iphone

Not only an essential tool, but the front page of the website has a concise set of details about 3D principles that should be known by rote.

 

Request for Comments: DoJ: Movie Captioning, Video Description

Just above the questions that the Department of Justice requests answers to is this paragraph:

Finally, the Department is considering proposing that 50% of movie screens would offer captioning and video description 5 years after the effective date of the regulation. The Department originally requested guidance on any such figure in its 2008 NPRM. Individuals with disabilities, advocacy groups who represented individuals with disabilities, and eleven State Attorneys General advocated that the Department should require captioning and video description 100% of the time. Representatives from the movie industry did not want any regulation regarding captioning or video description. A representative of a non-profit organization recommended that the Department adopt a requirement that 50% of movies being exhibited be available with captioning and video description. The Department seeks further comment on this issue and is asking several questions regarding how such a requirement should be framed.

Finally, to temper the conversation, we submit the comment that Suzanne Robitaille of ablebodied.com made in her article on finding a captioned version of Avatar: “Ironic, as Avatar is about a man with a disability.”

An RTF document of the questions is also attached. This author makes no claims on whether the two attachments have mistakes, but nothing was purposefully screwed with.

Related Items:
New Accessibility Law Passes | TV, Internet and ???
Presentation: Hearing and Vision Impaired Audiences and DCinema
Implementing Closed Caption and HI / VI in the evolving DCinema World

RealD and Polaroid — Possible Promise PR

All stereoscopic technology, popularly (though not properly) called 3D, depends upon each eye receiving a slightly different picture, just as the spacing of the eyes gives each eye a slightly different picture in nature. 3D animation and camera systems try to duplicate this natural system, as do post-production systems. During exhibition, the projector then sends 6 images every 1/24th of a second, 3 identical left alternating with 3 identical right. Most systems block one eye while the other eye is receiving its picture. Then, combined with other 3D clues that we use[1], the brain ‘fuses’ these nearly identical ‘parallax’ images together to give us a hopefully more realistic motion picture.
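
That “6 images every 1/24th of a second” is the familiar triple-flash scheme, and the arithmetic is easy to check:

    # "Triple flash": each eye's frame is shown three times per 1/24th of a second.
    FILM_FRAME_RATE = 24    # source frames per second, per eye
    EYES = 2                # left and right images
    FLASHES_PER_FRAME = 3   # each image repeated three times

    print("Images per 1/24th of a second:", EYES * FLASHES_PER_FRAME)                   # 6
    print("Images on screen per second:", FILM_FRAME_RATE * EYES * FLASHES_PER_FRAME)   # 144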

RealD and MasterImage systems use a “circular” polarizing technique to give each eye a different picture. After the projector sends the light of each picture, the light is given a “spin”. One lens blocks light coming at the eye with a clockwise spin, while the other lens lets that clockwise light come through. The next picture is given a counter-clockwise spin, and the corresponding lenses block or allow light. To maintain that polarized spin, the screen must be coated with a special paint, which screen manufacturers sell as Silver Screens.

Dolby uses a different technique, giving each eye different frequencies of light, which alternate before the projector lens. XpanD uses a 3rd technique, making its glasses’ lenses actively turn on and off in sync with the left and right image being transmitted from the projector. [This is the technology that most types of consumer TVs are using, for several reasons.]

In nature, light comes at us from all directions, bouncing off of many objects with different properties, one of the properties being the absorption and reflection of different frequencies, giving us different colors. Another property is that the particles of light, the photons, come at us with different spins. Dr. Land, the inventor of the Polaroid process, discovered that “glare” comes at us with a particular aligned spin, which could be blocked with a particularly aligned filter. The alignment in most cases is linear, that is, in a horizontal line, so this technique uses a linear filter. [The other technique for creating home 3D images uses a linear filter over the TV screen, with linear lenses in the glasses. This is harder for manufacturers to do perfectly, and there are other technical compromises with this type. So even though the glasses are cheaper, it doesn’t seem to be the trend in home 3DTV.]

Polaroid has just announced that they are licensed to carry the RealD brand name, and endorsement, on a line of 3D glasses. Polaroid isn’t the company that they used to be, but they are a force in the market. Polaroid shipped 7.5 million pairs of glasses last year, according to the website of their Swiss parent company Stylemark (out of a total 50.5 million across Stylemark’s brands). They were developed in Scotland, and shipped predominantly throughout Europe, east through Russia and south through Asia, India and Australia. One guesses that none of them were circularly polarized.

One also guesses that they have a lot of style, something that has been missing in theater 3D glasses. There are a couple of reasons for this. Glasses from Dolby and XpanD, which are reusable many hundreds of times, must stand up to the abuse of wearing, collection, washing and distribution. But the real style-breaker, the thing that all the complainers whine about, is that the ear pieces are bulky, not elegant little stems. Here is a full-sized picture of the Polaroid 3D glasses, while we discuss the temple arms, the stems that go from the lenses to the back of the ears.

Polaroid 3D Glasses, Large Photo

One of the problems of tricking the brain, making it believe that there is a 3D image being presented on a 2D surface, arises when one eye is given a lot of information that is different from what the other eye is getting. This doesn’t typically happen in nature. But it does typically happen in a cinema theater, because the eyes can get extra information from EXIT signs, reflections from a neighbor’s 3D glasses or popcorn bucket, and especially from reflections off the rear of our own glasses. The reasons that people get headaches from 3D movies are not fully examined, and may be from multiple and varied sources, but one reason seems to be this problem of non-symmetrical images. Blocking much of this extra light is possible with substantive temple arms, regardless of how they look. (No one talks about your ears for example…as far as you know…)

Also, if the glasses fit better, then the reflection from the rear (including re-reflected light that comes from the skin below the eyes) would be less of a problem. But “free glasses” have to be substantial enough to be mis-handled and “one size fits all”, even though people’s faces are different shapes and  sizes, and more importantly, so is the distance between people’s eyes (actually, people’s pupils, but I didn’t want to sound silly or get technical – the Inter-pupillary distance, the PD, is important for another 3D conversation.) One of the cool things about the Dolby glasses is that they are made from spherical glass, so that the distance from the lens to the pupil is the same, making it easier for the eye and eliminating edge distortion which is inherent with shaped lenses. But since the distance between people’s eyes can range from the low 4+ centimeters to the low 8+ centimeters, this is a problem that needs to be addressed, which the Polaroid press release says they have: 

And prescription lens wearers are not forgotten, with a range of premium 3D cover styles that fit comfortably over any optical frame. There is even a junior style for the younger audience to enjoy. 

But emphasizing the style issue is just plain wrong. They should be educating the public on why they need to block top and side light, which is not a ‘style-compatible’ issue. The ear stems must be bulky enough to block light entering from all directions.

Another benefit that Polaroid will hopefully bring is some consistency. One engineer reported that he recently measured 10 pair of 3D glasses, and none of the 20 lenses were close to being the same in terms of passing light and color. 

What the press release doesn’t say is when and how much. 

References: Schubin’s Cafe has an article which explains many details of pupillary distance. He also describes several important 3D concepts, both in terms of cinema, and in terms of how it is not so simple to transfer digital “prints” and technology to 3DTV: The Other Three Dimensions of 3DTV

[1] Matt Cowan from RealD has made several presentations describing the several 3D clues that we have all used while watching 2D movies without stereoscopy, to understand where in space an object or person is relatively located.
3D; How It Works 

Glasses also are relevant to darkness in the room, so these two articles might come in handy:
Scotopic Issues with 3D, and Silver Screens
23 degrees…half the light. 3D What?

 

Adjustable frames for US Army 3D lenses.
Shades with leather side pieces for blocking sun.

All 3D Avatar™, All The 3D Time™ [Updated]

News Corp. chairman and CEO Rupert Murdoch previously said …(excerpted)

FirstShowing.Net —James Cameron Delivers Updates on Avatar 2 and Re-Release

Yep, James Cameron and Avatar are back in the news again, but … First, he confirmed that he is producing Guillermo del Toro’s At the Mountains of Madness (announced a few weeks ago) and that they’ll shoot it in native 3D using next generation 3D cameras. [Surprise?]

We don’t exactly know what Cameron will be directing next, … he’s been getting inspiration for Avatar 2 by traveling down to South America and meeting with native tribes. “I have an overall narrative arc for [Avatar] 2 and 3, and there are some modifications to that based on my experiences in the last few months from having gone down to the Amazon and actually hung out with various indigenous groups who are actually living this type of story for real… but it’s not changing the overall pattern,” he said.

Finally, Cameron talked about converting Titanic to 3D and also complained about how terrible the Clash of the Titans 3D conversion was (as we all know). …

Marketsaw.blogspot — EXCLUSIVE: James Cameron Interview! Talks AVATAR Re-release, Sequels, 3D Conversions & Working With Del Toro!

[Listen to the audio interview on this page]

 

0:40 – Cameron confirms he is producing Guillermo del Toro’s AT THE MOUNTAINS OF MADNESS. The movie will be shot in native 3D using next generation FUSION 3D cameras from Pace. …

2:30 – Cameron talks about 3D conversions. TITANIC’s conversion is taking 8 months to a year to complete, not a fast turnaround like CLASH OF THE TITANS. Cameron: “(TITANS) showed a fundamental lack of knowledge about stereo space, …

5:00 – Cameron on how they are technically converting TITANIC. “You just can’t cut out edges, you’re going to get flat people moving around.” He will be using all his knowledge to put things on their right depth planes. They had tests for TITANIC from seven different conversion vendors on the exact same shots and they got back seven different answers as to where they thought things were spatially. “Some of them were not bad guesses and some of them were ridiculous.”

6:50 – The whole argument about conversion will go away for high end, first run 3D. Two years from now, when there are thousands of 3D cameras out there shooting live feeds to 3D broadcast networks, how can a producer go to a studio and say…

9:05 – Cameron on talking with Steven Spielberg about converting his classic movies to 3D. …

11:20 – Cameron talks about AVATAR 2’s current status. …

12:04 – He is focusing his writing right now on the AVATAR novel (corresponds to the first film)…

12:45 – The AVATAR re-release will have 9 extra minutes, not 8, and it will all be CG. No extra footage of live action characters drinking coffee. Rainforest; some at night; a hunt sequence – …

15:45 – Cameron does not have the release timing of the 3D Blu-ray as …

Scotopic Issues with 3D, and Silver Screens

Here’s an interesting tid-bit to throw into the mix.

[Chart: mesopic to photopic, in candela]

To use rough numbers, according to this clever Luminance Conversion chart, 3 ftL (foot-lamberts) is 10 cd/m2 (candela per square meter). On the log chart, that is somewhat below the arbitrary line between photopic and mesopic, the line where the eyes shift from a high degree of cone activity to predominantly rod vision. As the website which details this data points out (Visual Expert–Night Vision), among other things, this approach to darkness brings a shift that diminishes sensitivity to long wavelength colors (red).
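
For anyone who wants to redo the rough numbers, the conversion between foot-lamberts and candela per square meter is a fixed factor (1 ftL is about 3.426 cd/m2):

    # Foot-lamberts <-> candela per square meter (1 ftL is ~3.426 cd/m2).
    FTL_TO_CDM2 = 3.426

    def ftl_to_cdm2(ftl):
        return ftl * FTL_TO_CDM2

    def cdm2_to_ftl(cdm2):
        return cdm2 / FTL_TO_CDM2

    print(f"3 ftL    = {ftl_to_cdm2(3):.1f} cd/m2")   # ~10.3, the 'roughly 10' above
    print(f"48 cd/m2 = {cdm2_to_ftl(48):.1f} ftL")    # ~14, the 2D reference level
    print(f"14 ftL   = {ftl_to_cdm2(14):.1f} cd/m2")  # ~48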

One thing we are pretty certain of, from recent discussions, is that:

  • some cinemas are pushing to get to 3 ftL behind the glasses, that
  • few would know how to measure that, and that
  • few would even dare to measure in the seats outside of the sweet spot of a silver screen.

To quote further from the Night Vision site:

“As illumination declines, the visual system starts conserving light in three ways. First, inhibitory responses weaken, and eventually stop. Second, inhibition is replaced by convergence, where the receptor outputs sum together to increase sensitivity but further reduce resolution. Third, there is more available photopigment as light declines. When light strikes a molecule in a photoreceptor, it “bleaches” the molecule, causing electrical activation that leads to a visual sensation. While in the bleached state, it is unresponsive to light. The more photopigment in a bleached state, the less available to respond to light and the lower the sensitivity. In dim light, very little of the photopigment is bleached, so the eye has greater light sensitivity. All of this occurs before and continues after the switch from cones to rods.

“One effect of switching to rods, however, is the “Purkinje shift.” During photopic cones vision, viewers are most sensitive to light that appears greenish-yellow. In scotopic vision, they are most sensitive to light which would appear greenish-blue during the day.”

End of Part 1; Scotopic Issues with 3D, and Silver Screens

Part 2: 23 degrees…half the light. 3D What?

Part 3:

Released en francais: DCinema Technical Best Practices [Updated]

Now in English, translated from the French by the EDCF – European Digital Cinema Forum. This excellent guide from the Fédération Nationale des Cinémas Français (FNCF) and the Commission Supérieure Technique de l’Image et du Son (CST): TECHNICAL GUIDE FOR THE PROJECTION BOOTH IN DIGITAL CINEMA – Click the attachment link below.

End Update   — — 

La luminance de toutes les images, dans tous les formats de projection, doit être calibrée à 48 cd/m2. Le projecteur doit permettre la création de cette luminance.

The French cinema federation and the higher technical commission for image and sound (la fédération des cinémas et la commission supérieure technique) have released a comprehensive document called The Technical Guide for the Digital Cinema Projection Booth (le Guide technique de la cabine cinéma numérique). The quote above, as an example, says that:

“The luminance of all images, in all the formats of projection, must be calibrated at 48 candelas per square meter. The projector must permit the creation of that luminance.”

And which professional digital cinema projector doesn’t create that level of light? One that is projecting a 3D movie would fit into that category. Please ask your local cinema manager if they are showing the latest movie at the required 14 foot-Lamberts (the 48 candela/m2 equivalent that the US and England use) like they are supposed to.

If you are signed in, you can download the PDF version of le Guide technique de la cabine cinéma numérique here.

Red’s EPIC/Scarlet Problems {Update}

July 6 Update: A new Jannard post says the EPIC bug has been found and demented (and insinuates that it was the same bug that was holding back the Scarlet), and insists that they are back on the road of building the most best great and ultimate. There is a hint that a manufacturer has been found, in the statement that it will be built in the US, though that is not explicitly stated. The delivery dates are not hinted at, though some versions will definitely be in 2011, since the 28K sensor won’t be available until then.


 

Jim Jannard continued his excellent client experiment by filling everyone in on further bug and manufacturing delays in a 14 June reduser.com post:

 

… we have a bug. It has held us up now for two months. We have working cameras, as you know. But we aren’t going to release anything until the cameras are done and bug free. And we have stumbled on an issue that has caused us considerable grief. It is unexpected and it has us baffled.

The fix could be tomorrow. Or not.

We have been a “lucky” company up to this point. The moon and stars lined up for us for the RED ONE (since we didn’t have a clue what we were doing in the beginning) and the RED ONE did all we asked. The M-X sensor is incredible… as you know. Our new ASICs for the EPIC and Scarlet are complicate times a million. And they work. Another miracle. Everything was late but on track. Then we hit a snag.

We have an army working on this. 24/7. Trust me when I tell you that we have been humbled. I have questioned our aggressive goals every day.

So what does this mean? Obviously another delay. To compound matters, the company that was to make Scarlet has made an incredible announcement recently and has significant issues. You can probably figure out who this is. This will force us to find a new manufacturing partner for that product. When we 1st got wind of this, we decided to make EPIC in the US, hoping that the company would find a solution in time for Scarlet production. That now seems unlikely so we are now scrambling for a new partner.

The manufacturing problem that is mentioned is presumed to be tied to Foxconn in China, which is undergoing some major restructuring. It has to have several manufacturers scrambling. For example, Apple has long made iPhones and other products with this group.

The EPIC and Scarlet cameras are meant to bracket the original RED ONE: the Scarlet with 3K resolution, and 5K or better for the EPIC. As recently as April, the EPIC was slated for shipping in July, the Scarlet in August.

Richard Lackey’s http://dcinema.wordpress.com/ has a great synopsis.

 

More World Cup 3D Woes—German exhibs question quality

Variety is following this issue: Read their full report at:

German exhibs nix 3D World Cup
Operators gives thumbs-down to technology
By ED MEZA—Posted: Thurs., May 27, 2010, 4:00am PT

See also: Collapse in 3D World Cup Broadcast -Variety 
THURSDAY, 13 MAY 2010 13:43


This isn’t the first broadcast coming into cinemas. Opera and live concerts have been successful. But, in 2D.

As far as production values, the Sony operation in South Africa is noted for being highly qualified. 

So, where is the problem? 

One can understand low resolution and interlace effects. When compared to 2K digital cinema, everything (including Blu-ray) is going to suffer in comparison. DCinema 3D is often chastised for its current limits.

Andreas Cruesemann, Cineplex’s head of sales and marketing, is quoted as saying, “We can’t take money for an experiment. That’s why we said no. We are not saying no to soccer or 3D screenings. We will be very happy when it’s working. We were really disappointed last week because we expected more. If the picture was good of course we’d pay for it.”

What the exhibitors don’t have to pay for is a 2D feed, but the other side is that they can’t charge for it. Then again, exhibitors get more of their funds from the concessions anyway. If they can pull patrons in on nights that might otherwise be empty because of a popular match, would they care if it is 2D? There is no track record for this, 2D or 3D.

The market is obvious and should be sorted out by the Olympics in 2012. The 30 games that were to be broadcast in 3D would have been a solid test, but as is seen from the dismissal of Aruna as distributor just a few weeks before the event, and now a major country’s largest exhibitor groups dissing the quality…it makes for an interesting set of questions.

The Variety article goes into the politics, but without technical details. Since the technical details are in flux, we’ll be careful not to demean anyone’s attempts. As one industry insider is known to say “That’s the great thing about standards…there’s so many of them.”

In this case, we are talking about the insertion of an evolving video ‘standard’ into an evolving cinema ‘standard’. According to the Variety article, people saw a test that was expected to show problems. Perhaps it shouldn’t have had an audience from the non-technical public.

There are certain challenges that need to be addressed in presenting a TV signal with equipment that was designed for the highest-quality digital cinema signal.


Richard LaBerge, executive VP for tech provider Sensio, told Daily Variety the German exhibs most likely saw a May 5 test of the worldwide network that was never meant for their eyes.

“I don’t know who invited the exhibitors,” said LaBerge. “I would assume it was Aruna.”

Aruna is the Swiss company that obtained the 3D out-of-home rights, only to have FIFA pull them. (Daily Variety, May 20). Whether or not that was a result of the unenthusiastic response to the 3D presentation is not clear.

LaBerge said it was “risky” to show that test, as it was expected to reveal problems with the video. “We did not recommend they show that to exhibitors.”

Exhib chains Cinemaxx and Cinestar have not ruled out carrying the games in 3D but say production values have to be much improved before they sign on.

LaBerge said there will be more screenings for exhibs once the video problems are resolved. The entire project is being done at breakneck speed to make the June 11 deadline.

FIFA, which could not be reached for comment, is partnering with Sensio and using the tech provider’s 3D format to deliver live telecasts via satellite.

Exhibs also have the option to present World Cup matches in 2D as free public screenings.

(David S. Cohen contributed to this report.)

Contact the Variety newsroom at [email protected].

The State of Digital Cinema – April 2010 – Part Zero

What they came up with is called the tri-stimulus system since the primary idea is that there are nerve endings in the eye which act as receptors, some of which primarily deal with green light, some with red and some with blue. These color receptors are called the cones (which don’t work at all in low light), while the receptors that can deal with low levels of light are called the rods.

Now, for the first of our amazing set of numbers, there are as many as 125 million receptors in the eye, of which only 6 or 7 million deal with color. When (predominantly) only one type of these receptors gets triggered, it will send a signal to the brain and the brain will designate the appropriate color. If two or more of these receptors are triggered, then the brain will do the work of combining them much the same way that a painter mixes water colors. (We’ll pretend it is that simple.)

OK; so how do you create a representation of all that color and detail on the TV or movie screen?

Let’s start with film. We think of it as one piece of plastic, but in reality it is several layers that each have a different dye of different sensitivity on it. Each dye reacts in a different and predictable manner when exposed to light through the camera lens. In the lab, each layer goes through a different chemical process to ‘develop’ a representation of what it captured when exposed by the camera system. There are a lot of steps in between, but eventually the film is exposed to light again, this time pushing light in the opposite manner, through the film and then through the lens. That light gets colored by the film and shows up on the screen.

One of the qualities of film is that the chemical and gel nature makes the range of colors in the image appear to be seamless. And not just ‘appears’ with the definition of “gives the impression of.” In fact, there is a great deal of resolution in modern film.

Then TV came along. We see a smooth piece of glass, but if we could touch the other side of a 1995 era TV set we would feel a dust that reacts to a strong beam of electricity. If we look real close we will see that there are actually different color dots, again green, red, and blue. Engineers figured out how to control that electric beam with magnets, which could trigger the different dots of color to make them light up separately or together to combine into a range of colors, and eventually combine those colors into pictures.

That was great, except people wanted better. Technology evolved to give them that. Instead of lighting up magic dust with a strong beam of electricity, a couple methods were discovered that allowed small colored capsules of gas to be lit up and even small pieces of colored plastic to light up. These segments and pieces were able to be packed tightly against each other so that they could make the pictures. Instead of only hundreds of lines being lit up by the electron gun in the old TV set, now over a thousand lines can be lit up, at higher speeds, using a lot less electricity.

Then a couple engineers figured out how to make and control a very tiny mirror to reflect light, then quickly move to not reflect light. That mirror is less than 25% of the size of a typical human hair.

Millions of these mirrors can be placed next to each other on a chip less than 2 centimeters square. Each mirror is able to precisely move on or off at a rate of 144 times a second, which is 6 times the speed that a motion picture film frame is exposed to light for a picture.

This chip is called a DLP chip, for Digital Light Processing, because a computer can tell each mirror when to turn on and off, so that when a strong light is reflected off an individual mirror or set of mirrors, it will create part of a picture. If you put a computer in charge of 3 chips, one for green, one for red and one for blue, the reflected light can be focused through a lens and a very detailed picture will appear on the screen. There is a different but similar technology that Sony has refined for their professional cinema projectors, which uses crystals that change their state.

Now for the 2nd in our amazing set of numbers. There are 1,080 rows made up of 2,048 individual mirrors each, for over 2.2 million mirrors per chip. If you multiply that by 3 chips’ worth of mirrors, you get the same “about 6 or 7 million” mirrors as there are cones in each eye.
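
The arithmetic behind those numbers, for anyone who wants to check it:

    # Mirror count for a 2K DLP chip, per the figures above.
    ROWS = 1080
    MIRRORS_PER_ROW = 2048
    CHIPS = 3                # one each for red, green and blue

    per_chip = ROWS * MIRRORS_PER_ROW
    total = per_chip * CHIPS
    print(f"Mirrors per chip: {per_chip:,}")       # 2,211,840
    print(f"Mirrors across 3 chips: {total:,}")    # 6,635,520 -- about 6-7 million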

Without going into details (to keep this simple), we keep getting closer to being able to duplicate the range and intensity of colors that you see in the sky. This is one of the artists’ goals, in the same way as the engineers want to make a lighter, flatter, environmentally better television and movie playing system. It isn’t perfect, but picture quality has reached the point that incremental changes will be more subtle than substantive, or better only in larger rooms or specialist applications.

For example, a movie that uses the 2K standard will typically be in the 300 gigabyte range. A movie made in 4K, which technically has 4 times the resolution, will typically be less than 15% larger. This movie will be stored on a computer with many redundant drives, redundant power supplies, and graphics cards that are expressly made to work securely with special “digital cinema only” projectors.
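
The reason a 4K package isn’t four times the size of a 2K one is that both are bounded by the same maximum picture bit rate (250 Mbit/s in the DCI specification), so package size tracks running time far more than resolution. A rough sketch:

    # DCP picture size is bounded by bit rate x running time, not by resolution.
    MAX_PICTURE_BITRATE = 250e6   # bits per second, the DCI JPEG 2000 ceiling

    def max_picture_size_gb(runtime_minutes, bitrate=MAX_PICTURE_BITRATE):
        return bitrate * runtime_minutes * 60 / 8 / 1e9

    for minutes in (90, 120, 150):
        print(f"{minutes} min at the 250 Mbit/s ceiling: "
              f"{max_picture_size_gb(minutes):.0f} GB of picture essence")

    # Audio, subtitles and wrapping add to this, but a 4K encode sitting near
    # the same ceiling comes out only modestly larger than a 2K one.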

Hopefully you have a feeling for the basic technology. It is not just being pushed onto people because it is the newest thing. The TV and movie businesses are going digital for a number of good reasons. To begin with, it wasn’t really possible to advance the quality of the older technology without increasing the cost by a significant amount…and even then it would be incredibly cumbersome and remain an environmental nightmare. There are also flexibility advantages, things the new technology can do that the old couldn’t…or couldn’t at a reasonable price or at the quality of the new.

The technology of presenting a 3D image is one of those flexibility points. 3D was certainly one of the thrills of Avatar. The director worked for a decade learning how to handle the artistic and the technical sides of the art. He developed with closely aligned partners many different pieces of equipment and manners of using existing equipment to do things that haven’t been done before. And finally he spent hours on details that other budgets and people would only spend minutes. In the end James Cameron developed a technique and technology set that won’t be seen as normal for a long time from now…and an outstanding movie.

Could Avatar have been made on film? Well, almost no major motion picture has been made exclusively on film for a long time. They all use a technique named CGI (computer-generated imagery), which covers a grand set of techniques. But if you tried to generate the characters in Avatar exclusively on a computer with CGI, they never would have come out as detailed and inspiring as they did. Likewise, if he tried to create the characters with masks and other techniques with live action, you wouldn’t get the texture and feeling that the actors gave to their parts.

Could Avatar have been displayed with film, in 2D? Yes, it could have, and it was.

3D is dealt with in more detail in Part II of this series, but here are some basics:

To begin, 3D is a misnomer. True 3 dimension presumes the ability to walk around a subject and see a full surround view, like the hologram of Princess Leia.

In real life, a person who is partly hidden in one view will be even more hidden, or perhaps exposed, from another view. On the screen of today’s 3D movie, when a character appears to be partly hidden by a wall as seen by a person on the left side of the theater, they will also appear the same amount hidden to someone on the right side of the theater.

In fact, what we see with our eyes and what we see in the new theaters is correctly termed “stereoscopic”. We are taught some of this in school, how to make two lines join somewhere out in space (parallax) and draw all the boxes on those lines to make them appear to recede in the distance…even though they are on one piece of paper. There are several more clues in addition to parallax that we use to discern whether something is closer or farther, and whether something is just a drawing on a sheet of paper or a full rounded person or sharp-edged box…even in a 2D picture.

And we have been doing this for years. We know that Bogie and Bergman are in front of the plane that apparently sits in the distance…our eyes/brain/mind makes up a story for us, 3 dimensions and probably more, even though it is a black and white set of pictures shown at 24 frames per second on a flat screen.

Digital 3D is an imperfect feature as of now. It has improved enough that companies are investing a lot of money to make and show the movies. The technology will be improved as the artists learn the technology and what the audiences appreciate.

Although we are in a phase that seems like “All 3D, All The Time”, 3D isn’t the most important part of the digital cinema transition. At first blush the most important consideration is the savings from all the parts of movie distribution, including lower print costs and transportation costs. But actually, because prints no longer cost over a thousand euros, and because it will be simple to distribute a digital file, lesser known artists will have the opportunity to get their work in front of more people, and more people will find it easier to enjoy entertainment from other cultures and other parts of the world.

This Series now includes:
The State of Digital Cinema – April 2010 – Part 0
The State of Digital Cinema – April 2010 – Part I
The State of Digital Cinema – April 2010 – Part II
Ebert FUDs 3D and Digital Cinema

Docs: HI / VI – Part One; DoJ

and perform periodic reviews of any rule judged to have a significant economic impact on a substantial number of small entities, and a regulatory assessment of the costs and benefits of any significant regulatory action as required by the Regulatory Flexibility Act, as amended by the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA).

[Editor] Following are some legal precedents which need sorting for relevance:

UNITED STATES OF AMERICA AGAINST HOYTS CINEMAS CORPORATION, REGAL …
an amended complaint under the Americans with Disabilities Act (“ADA”) alleging that the Regal Entertainment Group, Regal Cinemas, Inc. and Hoyts Cinemas Corporation…
http://www.usdoj.gov/crt/ada/regal.htm

UNITED STATES OF AMERICA V. CINEMARK USA, INC.
12 American Fork The Meadows 715 West 180 North American Fork, UT 84003 Holiday Village 4 1776 Park Avenue #4 Box 770-309 Park City, UT 84060 Virginia Cinemark Norfolk…
http://www.usdoj.gov/crt/ada/cinemark/cinemark4main.htm

–Accessibility Realities Correspondence–
letter responds to your letter regarding accessibility in multiscreen cinemas under the Americans With Disabilities Act (ADA). Specifically, your letter asks the…
http://www.usdoj.gov/crt/foia/tal551.txt

In the United States District Court for Western District of Tennessee …
this action to enforce provisions of the Americans with Disabilities Act (ADA) against Defendants American Multi-Cinema, Inc. and AMC Entertainment Inc. (collectively…
http://www.usdoj.gov/crt/ada/amcnonlos.htm

U.S. v. Hoyts Cinemas Corp.: Opposition of the US to Defendant …
citizens. A decade after the Americans with Disabilities Act (“ADA”), 42 U.S.C. § 12101 et seq., was signed into law, Hoyts Cinemas Corp. (“Hoyts”) is designing and…
http://www.usdoj.gov/crt/ada/briefs/hoytopbr.pdf

Fiedler v. American Multi-Cinema, Inc.
and operated by the defendant, American Multi-Cinema, Inc. (“AMC”), are in violation of title III of the Americans with Disabilities Act (“ADA” or “the Act”), 42…
http://www.usdoj.gov/crt/ada/briefs/fiedlerbr.pdf

SETTLEMENT AGREEMENT BETWEEN THE UNITED STATES OF AMERICA AND …
DISABILITIES ACT IN DEPARTMENT OF JUSTICE COMPLAINT NUMBER 202-21-17. BACKGROUND: This matter was initiated by a complaint filed under title III…
http://www.usdoj.gov/crt/ada/wallace.htm

JUSTICE DEPARTMENT SUES MAJOR MOVIE THEATER CHAIN FOR FAILING …
DEPARTMENT SUES MAJOR MOVIE THEATER CHAIN FOR FAILING TO COMPLY WITH ADA WASHINGTON, D.C. – American Multi-Cinema, Inc. and AMC Entertainment, operators of one of…
http://www.ada.gov/archive/amcpress.htm

Disability Rights online Newsletter: Issue Eight
of Massachusetts filed suit against Hoyts Cinemas Corporation, a theater chain subse- quently acquired by Regal in March of 2003. The initial suit alleged that Hoyts…
http://www.ada.gov/newsltr0805.pdf

–No Title–
convenience basis shall make available, upon request, a TDD for the use of an individual who has impaired hearing or a communication disorder. (2) This part does…
http://www.usdoj.gov/crt/foia/tal063.txt