Tag Archives: DCI

Beyond DCI – The Need for New D-Cinema Standards

Those of us who create and manufacture digital cinema projection equipment face the challenge of knowing what standards to aim for as frame rates jump from 24 to 48, 60 and beyond. At 24 fps, the DCI-specified peak bit rate of 250 Mbps for the picture is satisfactory. It seems logical then that, by doubling the frame rate to 48, we also need to double the bit rate to 500. Indeed, the general industry direction for exhibiting 48 fps 3D material is leaning toward a bit rate of 450 Mbps, leaving some room for peaking to 500.
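The doubling logic above can be sketched in a few lines. This is only an illustration of the linear scaling the paragraph describes (the 250 Mbps base figure is from the DCI spec; scaling it linearly with frame rate is an industry rule of thumb, not a standardized formula):

```python
# Sketch of scaling the DCI picture bit-rate budget with frame rate.
# Assumption: bit rate scales linearly with frames per second.

DCI_BASE_RATE_MBPS = 250   # DCI peak picture bit rate at 24 fps
BASE_FPS = 24

def scaled_bit_rate(fps: float) -> float:
    """Linearly scale the 24 fps budget to another frame rate."""
    return DCI_BASE_RATE_MBPS * fps / BASE_FPS

for fps in (24, 48, 60):
    print(f"{fps} fps -> {scaled_bit_rate(fps):.0f} Mbps")
# 24 fps -> 250 Mbps
# 48 fps -> 500 Mbps
# 60 fps -> 625 Mbps
```

At 60 fps the same logic already pushes past the 500 Mbps figure being discussed for 48 fps, which is why "aim high" matters for the next revision of the specs.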

As we move toward HFR and its necessary twin, high bit rate (HBR), the whole production and exhibition chain must move in unison. Cameras, servers, IMBs and projectors all have to be modified and advanced to keep up with the necessary speeds. Nowhere is this more evident than in the giant screen venues, which require multiple synchronized projectors and servers that can handle high bit rates and high frame rates. As an industry facing change, we need to come to agreement on what is necessary and update the DCI and SMPTE specs for the D-cinema industry. Before we can, we need to address some technical issues, namely the need for HFR content to test with, how the bit rates of equipment are measured, and what to do with mixed content within the same show. An additional concern is the special needs of Giant Screen exhibition, namely servers capable of streaming 4K 3D data to dual synchronized projectors.

First, there is the problem that testing new equipment at high frame rates and high bit rates requires content. This won’t be an issue if filmmakers begin filming at 48 or 60 fps. We’re looking forward to The Hobbit presentations later this year to see the full potential of the media.

Another technical issue is the varied ways in which bit rate is measured. We saw at NAB and CinemaCon this year that most manufacturers of D-cinema projection equipment now stream 48 fps data at aggregate speeds of 500 Mbps. However, this does not necessarily mean that all the internal independent components within the JPEG 2000 codestream, each of which may have limitations, can run at bit rates of 500 Mbps. DCP providers need to be aware that these limitations exist when making decisions about mastering. We will need to ensure that manufacturers report both aggregate and component bit rates.
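The aggregate-versus-component distinction can be made concrete with a small sketch. The device limits below are hypothetical, chosen only to show why "streams 500 Mbps aggregate" does not guarantee that every internal part of the decode path sustains 500 Mbps:

```python
# Illustrative only: names and limits are hypothetical, not from any
# real device's spec sheet.

def mastering_rate_ok(target_mbps: float,
                      aggregate_limit_mbps: float,
                      component_limits_mbps: list[float]) -> bool:
    """A mastering target is safe only if it fits the aggregate limit
    AND the slowest internal component in the decode path."""
    return (target_mbps <= aggregate_limit_mbps and
            target_mbps <= min(component_limits_mbps))

# A hypothetical device rated 500 Mbps aggregate, with one internal
# component that tops out at 400 Mbps:
print(mastering_rate_ok(450, 500, [500, 400, 500]))  # False: 400 Mbps bottleneck
print(mastering_rate_ok(400, 500, [500, 400, 500]))  # True
```

This is why the article argues manufacturers should report both numbers: a DCP mastered to the aggregate figure alone can still exceed a component limit.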

An interesting dilemma that has not yet been solved is what to do with content of different speeds played in the same show. Servers and projectors will behave differently when switching between content with different frame rates and this can lead to viewing problems. What if a 24 fps trailer is played before a 48 fps presentation of The Hobbit? We will need to hear from exhibitors and content owners about what they want to provide in terms of an acceptable user experience. The Inter-Society Digital Cinema Forum (ISDCF) is aware of this problem and has been conducting tests with various manufacturers and content owners involved.

One major limitation to implementing HFR stereo 3D that we have solved is that of moving data at sufficient speeds from server to projector. This has been accomplished by swapping the HD-SDI cable for an Ethernet connection, as well as embedding IMBs in projectors. IMBs have now become industry-standard equipment shipped with all servers, and a necessary part of any new spec. Coupled with improvements in Series 2 projectors, including image brightness, the IMB’s increased speeds will certainly enhance image quality and alleviate some of the viewer discomfort during stereo 3D projection.

How Qube handles these challenges

The Qube XP-I server is capable of a bit rate of 1 Gbps, while each Xi IMB can handle up to 500 Mbps, with no component bit rate limitation. This is in keeping with current storage throughput and image decoder specs. Qube servers have the same component and aggregate bit rates.

The Qube XP-I server and Xi IMB are capable of frame rates up to 120 fps per eye. This gives a frame rate of up to 240 fps for dual projection driven by a single server streaming a single DCP for stereo 3D.

Qube has also shown that exhibition of 4K 3D content on Giant Screens is possible from a single XP-I server, streaming data at 1 Gbps and 30 fps to dual synchronized ultra-bright projectors. This greatly enhances the 3D viewing experience at Giant Screen venues.

When updating digital cinema specifications, we should aim high with respect to HFR in anticipation of where filmmakers might go. In this way we will be future proofing the next set of standards.

Rajesh Ramachandran is the CTO of Qube Cinema.

Black Screen Alert~! InterOp Losing Life Support

Long Live InterOp

It was the best of times, it was the worst of times. The engineers contributing to SMPTE, and the studios who contributed to DCI, came up with enough elements to create a secure and beautiful D-Cinema environment. The same studios financed the equipment qualification standards and partially financed equipment purchases for many exhibitors. These exhibitors agreed to buy this qualified equipment and use it in a way that somewhat assured that copyrights and quality-better-than-film would be typical on screens world-wide.

Fortunately, there were written and unwritten agreements which allowed the simple DCinema origins of MPEG and a fairly loose mechanism of security keys to transition to the full on (and just recently completed) versions of standards, specifications and practices known as SMPTE Compliant Digital Cinema, with SMPTE Compliant DCPs and Security and screen fulls of other ingredients. These transitional agreements are known as InterOp.

Unfortunately, InterOp worked well enough to be added to…and added to…and added to…

For example, the simplest multimedia tools use metadata to describe computer-needed info and human-interface info within the songs or movies that we get to and from iTunes and Hulu and Netflix. Workers who had to get equipment and people working together in the InterOp world had to come up with an interim…maybe one year or so to live…Naming Convention. It wasn’t useful for computers at all, was cumbersome for humans at best, and kept getting added to without increasing the number of characters, since some old equipment only had so many display characters…kinda like computers in the ’60s. There were (and are, since years later it is still in use) dozens of ways for it to go wrong, beginning with the fact that some studios chose to ignore it when it got in the way (according to the logic at their end of the string), while projectionists might miss some nuance that is needed for logic at their end of the string.

What happened to adding metadata like modern sciences do, and which everyone knows eventually will be needed? There are other panics with higher priority. It sits partly formed, probably until it becomes a keystone item needed for some other important development.

There are other examples of InterOp and loose de facto ‘standards’ living beyond their time, the most garish being what is hopelessly called 3D.

Instead of using valuable engineering time to progress the computer to computer interface and give exhibitors a fighting chance at perfection, engineers have had to shoehorn one feature after another into the InterOp structure. It is done with the best intentions, of course. It begins with, “My customers were asking for this now, not at some point in the SMPTE-Compliant future.” It ends with, “I have to do this because my competitor is bragging about how they can do this at no extra cost even though it violates the spirit and the essence of every standard.”

There are too many examples to mention ranging from forensics and audio mapping. Specifics aren’t as important as the fact that the entire industry has floated out far enough from land that some see letters in the water, and some seem to think that they spell H – E – R – E    B – E    D – R – A – G – O – N – S

DCinema Dragons don’t breathe fire. They are light suckers. They cause Dark Screens. Coming to theaters and drive-ins near you.


Why?

Many reasons, partly centered around the effects of software upgrades. The upgrade from InterOp to SMPTE-Compliant software is not a simple ‘add a feature or two’ software upgrade. At the best of times, you just never know what you will be causing when you hit that ‘Upgrade’ button. Did the software writer anticipate every combination of hardware and software in your situation?

There are just some odds that you come out of the hospital feeling worse than when you went in (look up HAI). Anyone with a computer has had software upgrades that worked for thousands of others but did not work for them (look up: damn, not again). There is probably some inverse-square proportionality involved as well: getting closer to a deadline quadruples the odds of failure.

So, don’t change~! Jeez. That is sooo obvious. Which is what many do. Don’t get the first generation of anything, including upgrades. Especially during summer when all the big movies are playing.

But a horizon event approaches. Some InterOp juggling just won’t work for some combinations of equipment and software. There is an amalgam of changes coming, though, prompted by the teams of Jackson and Cameron. It might be easy to ignore the 60 frames per second requirement of a Cameron release (famous for pushing deadlines forward as he is), but The Hobbit will probably not be delayed. 48 frames per second, stereoscopic 3D. Will it work in the InterOp world? And what other changes will be made?

Why 48fps? Phil Oatley, the post group head of technology from Park Road Post (Mr. Jackson’s facility in New Zealand), who spoke at the SMPTE/NAB DCinema Days last April, said that they chose 48 because they didn’t know if equipment and exhibitors could change to 60fps in time and in significant numbers. As it turns out, all server and projector manufacturers have announced 48 and 60 fps capability. Sony even put a price on it…$3,000…which they can more easily do for their 13,000 users, as they have always used an internal media block in their system.

In this case, Sony has something like the Apple advantage: They control the server, the media block and the projector so the odds are higher of getting a smooth transition. And, they have gotten DCI Compliance (at one moment of software version time…does HFR cause enough of a technology disruption that they need to re-certify?)

A TI-based projector with an HD-SDI interface will be a lot more complicated. An IMB (internal media block) needs purchasing and inserting, which isn’t a cheap investment. It is dependent upon TI code and code from the projector manufacturer, as well as code from the server, all working together. How different is the server, which will have had its graphics-serving guts ripped out? …will that need a new cert? Check the DCI site for Compliance-passed equipment.

But we have gotten off point. Back a few years ago you could sign a VPF deal and promise that you would use DCI-Compliant equipment and run with the latest SMPTE specs and recommended practices. At the time there wasn’t one piece of gear through the compliance procedures. And since you know that there is no SMPTE Police checking your screen for the required 48 candela/square meter luminance standard, you didn’t feel bad breaking the luminance number when showing 3D, a number that approached moonlight-equivalence at the sides of the theater and barely reached 10 cd/m2 in the center. (For info on the light fall-off from silver screens, see: 23 degrees…half the light. 3D What?)
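The luminance figures above are easier to compare when converted between the two units the industry uses. A quick conversion sketch (the 3.4262591 factor is the standard cd/m² per foot-lambert conversion):

```python
# Unit-conversion sketch for the luminance figures in the paragraph above.
# 1 foot-lambert (fL) = 3.4262591 cd/m^2.

CD_M2_PER_FL = 3.4262591

def cdm2_to_fl(cd_m2: float) -> float:
    """Convert candela per square meter to foot-lamberts."""
    return cd_m2 / CD_M2_PER_FL

print(f"{cdm2_to_fl(48):.1f} fL")  # the 48 cd/m2 standard -> 14.0 fL
print(f"{cdm2_to_fl(10):.1f} fL")  # the 3D center-screen figure -> 2.9 fL
```

Seen this way, the 3D number cited above lands at under 3 fL against a 14 fL standard, which is the gap the paragraph is complaining about.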

But the history of the studios has been to look the other way until there is a technology that fulfills the DCI requirement. When Doremi proved they could do JPEG 2000 as the standard required, MPEG suppliers were given notice. When laser light engines can provide 3D at 48 cd/m2 (14 foot-lamberts), will the studios insist that passive 3D systems with their horrid high-gain silver screens are no longer allowed, as was done in France recently? (See: The Death of Silver Screens~! Vive la France)

We’ll see, but this doesn’t have anything to do with HFR. HFR is outside the DCI specs. It falls into the ‘no less than’ zone, similar to the color primaries. Laser suppliers can pick primaries outside the capabilities of xenon if that is financially and politically worthwhile, just as long as they don’t choose primaries inside the DCI/SMPTE limits.

So what do HFR and SMPTE compliance have to do with each other? Only that they are two locomotives running on two separate but not parallel lines. There is no firm deadline for SMPTE compliant DCPs, and no one is saying that InterOp compliant DCPs have a limited life. In fact, the studios expect that DCI equipment will play future SMPTE-compliant DCPs as well as what will become ‘legacy’ InterOp DCPs.

But something, at some time, is going to bulge the balloon of InterOp to the point that going SMPTE-Compliant is the logical move. Engineers at the manufacturers are just going to say, “I can’t play this game anymore. We were promised SMPTE would be the container that fit everything, I did the work, I will InterOp no more.”

There is rumor that this will happen soon. There is a particular setup that is rubbing against the InterOp balloon. Exhibitors are saying, “We don’t want to change until the summer season is over.” Will everything play nice together if only one condition is changed in a system? Possibly. How can you increase your odds?

Go to the ISDCF site that lists all the latest software/firmware versions for the equipment in the field. See to it that you have the latest. That will increase the odds. ISDCF Current Versions

Another thing you can do is prepare a database listing all of your equipment at each projection position, all of the software and firmware versions and all the serial numbers, and leave a field where you can download your .pem file from each piece of gear. Save this and get ready for a note from your distribution center asking for this info.
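The inventory the paragraph suggests can be kept in something as simple as a small structured record per screen. A minimal sketch follows; the field names and the example device are illustrative, not an industry schema:

```python
# Minimal sketch of a per-screen equipment inventory record.
# Field names and the example device are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DeviceRecord:
    make_model: str
    serial_number: str
    software_version: str
    firmware_version: str
    pem_certificate: str = ""   # paste the device's exported .pem here

@dataclass
class ScreenInventory:
    screen_name: str
    devices: list = field(default_factory=list)

booth = ScreenInventory("Screen 1")
booth.devices.append(DeviceRecord(
    make_model="ExampleCo Server X",   # hypothetical device
    serial_number="SN-0001",
    software_version="2.1.0",
    firmware_version="1.4",
))
print(len(booth.devices))  # 1
```

However it is stored, the point is the same: when the distribution center asks, you can answer with versions, serial numbers, and certificates per projection position without a booth crawl.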

 

It was the best of times, it was the worst of times,
it was the age of wisdom, it was the age of foolishness,
it was the epoch of belief, it was the epoch of incredulity,
it was the season of Light, it was the season of Darkness,
it was the spring of hope, it was the winter of despair,
we had everything before us, we had nothing before us,
we were all going direct to heaven, we were all going direct the other way
– in short, the period was so far like the present period,
that some of its noisiest authorities insisted on its being received, for good or for evil,
in the superlative degree of comparison only.

Charles Dickens – Tale of Two Cities

500Mbps Good Enough Tests

Many eyes and many tests later, the specifications for quality digital cinema playback were decided upon by the community. Then, in the spirit of ‘good enough’, Stereoscopic 3D quality problems were ignored. And more recently, it appears that High Frame Rate (HFR), and in particular HFR S3D, is moving like an unexamined juggernaut into the future.

At the SMPTE event last month held in conjunction with NAB, Dr. Marty Banks tossed some landmines into the knowledge base. Then one of the people who did the tests behind the 48 frames per second decision for The Hobbit gave a historical view. His bombshell was that 48 was chosen because it wasn’t known whether (enough…any?) hardware manufacturers could come to the plate with working equipment by the time of the release in late 2012.

One part of the DCI and SMPTE and ISO Specification for D-Cinema is a 250Mbps interface between the projector and the media player. In the early days this meant the link from the server, but since Series II TI systems capable of running 4K material (and all systems from Sony), this means an internal media block.

[Update: Qube announced at CinemaCon that their IMB supplies a 1Gig stream to the projector. A quick scan of the interwebz and the memory of other visits at CinemaCon puts the rest of the manufacturers at 500Mbps.]

24 frames per second times 2…OK, let’s double the Mbps into the projector…500Mbps is the bar that seems to be accepted wisdom for ‘good enough’ 48 frames per second stereoscopic 3D material, such as The Hobbit. Anyone got a problem with that? Answer: Who could? No one really has varied sources of material or even firm software to test it with.

Enter the new company Image Matters. They have assembled equipment that will be able to show material at speeds above and below 1 Gbps. They will show this in 6 cities around the world over the next 6 months. People will look and talk.

Here is the link for the press announcement:

High Frame Rate & High Bit Rate Test Equipment and Test Series

April 14, 2012, NAB Show, Las Vegas, for immediate release.

Image Matters, intoPIX, MikroM and Virident collaborate beyond the state of the art. The target is a series of tests on June 7 and 8 2012 in Burbank CA, coordinated by Michael Karagosian of MKPE Consulting, and cinematographers Kommer Kleijn SBC and David Stump ASC, as co-chairs of the SMPTE 21DC Study Group for Higher Frame Rates.

These tests will be conducted in collaboration with studios and the creative community. They will measure the minimum JPEG 2000 codestream bit rate required by high frame rate content to reach the visually lossless quality demanded by digital cinema applications.

The experimental equipment set will enable playback of JPEG 2000 codestream bit rate higher than 1 Gbps (i.e. more than 4 times the current DCI specification). The decoded 2K images will be transmitted to a single projector at a frame rate of up to 120 fps (i.e. 60 fpe for Stereoscopic 3D content).

In order to speed up the test process and to allow the easy production of multiple encoding flavours, the equipment set will also be capable of encoding high frame rate content from uncompressed files in near real-time.

The assembled equipment will consist of one server incorporating 4 Virident FlashMAX boards and one intoPIX JPEG 2000 PRISTINE-P4 board. The PRISTINE will playback the decoded codestream on four 3G SDI links to the MikroM IMB inserted into the projector. The MikroM’s IMB will receive the four 3G-SDI links and pass the uncompressed image data directly to the projector backplane. Image Matters will insure project coordination and integration.

The integration has enough headroom to allow, on request, multiple equipments to be combined to achieve higher bit rates and/or higher frame rates.

Storage

  • Four 1.4 TB Virident FlashMAX MLC cards: 
    • total capacity of 5.6TB
    • total read bandwidth of 5.2 GB/s
    • total write bandwidth of 2.2GB/s on XFS file system.
  • Each Virident card has: 
    • a half height and half length form factor
    • a PCIe x8 Gen1 bus • power consumption of 25 W
    • a sustainable random read of 1,3 GB/s

JPEG 2000 Encoding/decoding

  • One intoPIX PRISTINE P4 board
    • 2K & 4K JPEG2000 decoder FPGA IP-cores
    • high frame rates capacity: up to 120 Fps
    • high bitrate capacity: up to 1 Gbps
    • four 3G-SDI outputs
    • one Genlock input
    • One MikroM Integrated MediaBlock MVC 201
      • four 3G-SDI input
      • Formatting and pass through of uncompressed image data
      • Up to 120 2K fps

      Information

      Please contact Jean-François Nivart
      [email protected]
      +32 495 23 00 08

      About Image Matters

      Image Matters offers innovative hardware and software modules for professional image and sound handling. This new venture helps OEMs, integrators and end-users to develop advanced imaging systems and applications easily and quickly.

      More information on www.image.matters.pro

      About intoPIX

      intoPIX is a leading supplier of image compression technology to audiovisual equipment manufacturers. We are passionate about offering people a higher quality image experience and have developed FPGA IP cores that enable leading edge JPEG 2000 image compression, security and hardware enforcement. Achieving a major breakthrough in digital cinema, intoPIX has achieved a leading position in the professional image compression industry based on the JPEG 2000 standard. More information on our company, customers and products can be found on www.intopix.com

      Interested in HFR technology? Contact Gael Rouvroy, intoPIX C.T.O. – [email protected] – +32479774944

      About MikroM

      MikroM is a leading design house and provider of state-of-the-art audio/video technologies for selected professional markets. The portfolio covers silicon-proven IPs, ASICs, PCBs and Systems in combination with professional design services. With a variety of products and services MikroM focus on application-specific and reliable solutions for system integrators and OEMs in quality-driven markets as HD Broadcast, Digital Cinema and Advertisement/Presentation.

      About Virident

      Virident Systems’ professional Storage Class Memory (SCM) solutions deliver unconditional consistent performance that supports the most data-intensive content and applications. Virident Systems is backed by strategic investors, Intel®, Cisco® Systems and a leading storage hardware and software solutions provider as well as venture investors Globespan CapitalPartners, Sequoia Capital, and Artiman Ventures. For more information visit www.virident.com.

      References:

      High Frame Rates – The New Black, Getting to Speed

      HFR-S3D Post SMPTE/CinemaCon Hobbit

      Combine 3, Drop 2, 120 becomes 24

500Mbps Good Enough Tests

Many eyes and many tests later, the specifications for quality digital cinema playback were decided upon by the community. Then, in the spirit of ‘good enough’, Stereoscopic 3D quality problems were ignored. And more recently, it appears that High Frame Rate (HFR), and in particular HFR S3D, is moving like an unexamined juggernaut into the future.

At the SMPTE event held last month in conjunction with NAB, Dr. Marty Banks tossed some landmines into the knowledge base. Then one of the people whose tests led to the 48 frames per second decision for The Hobbit gave a historical view. His bombshell was that 48 was chosen because it wasn’t known whether enough (any?) hardware manufacturers could come to the plate with working equipment by the time of the release in late 2012.

One part of the DCI, SMPTE and ISO specifications for D-Cinema is a 250 Mbps interface between the projector and the media player. In the early days this meant the link from the server, but since the Series II TI systems capable of running 4K material (and all systems from Sony), it means an internal media block.

[Update: Qube announced at CinemaCon that their IMB supplies a 1 Gbps stream to the projector. A quick scan of the interwebz and the memory of other visits at CinemaCon puts the rest of the manufacturers at 500 Mbps.]

24 frames per second times 2…OK, let’s double the Mbps into the projector. 500 Mbps is the bar that seems to be the accepted wisdom for ‘good enough’ 48 frames per second stereoscopic 3D material such as The Hobbit. Anyone got a problem with that? Answer: who could? No one yet has varied source material, or even firm software, to test with.
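The arithmetic behind that bar is simple enough to put in a few lines. A minimal sketch, assuming (as the paragraph does) that picture bit rate scales linearly with frame rate from the 24 fps ceiling:

```python
# Back-of-envelope: scaling the DCI picture bit rate with frame rate.
# Assumes bit rate scales linearly with frames per second.
DCI_BASE_RATE_MBPS = 250   # DCI spec ceiling for the picture at 24 fps
BASE_FPS = 24

def scaled_rate(fps):
    """Linearly scale the 24 fps bit rate ceiling to another frame rate."""
    return DCI_BASE_RATE_MBPS * fps / BASE_FPS

print(scaled_rate(48))   # 500.0 Mbps -- the 'accepted wisdom' bar for 48 fps S3D
print(scaled_rate(60))   # 625.0 Mbps
print(scaled_rate(120))  # 1250.0 Mbps -- already past a 1 Gbps codestream
```

The same scaling shows why 120 fps testing pushes past 1 Gbps, four times the current specification.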

Enter the new company Image Matters. They have assembled equipment that can show material at speeds above and below 1 Gbps, and will demonstrate it in 6 cities around the world over the next 6 months. People will look and talk.

Here is the link for the press announcement:

High Frame Rate & High Bit Rate Test Equipment and Test Series

April 14, 2012, NAB Show, Las Vegas, for immediate release.

Image Matters, intoPIX, MikroM and Virident collaborate beyond the state of the art. The target is a series of tests on June 7 and 8, 2012 in Burbank, CA, coordinated by Michael Karagosian of MKPE Consulting and by cinematographers Kommer Kleijn SBC and David Stump ASC, co-chairs of the SMPTE 21DC Study Group for Higher Frame Rates.

These tests will be conducted in collaboration with studios and the creative community. They will measure the minimum JPEG 2000 codestream bit rate required by high frame rate content to reach the visually lossless quality demanded by digital cinema applications.

The experimental equipment set will enable playback of JPEG 2000 codestreams at bit rates higher than 1 Gbps (i.e. more than 4 times the current DCI specification). The decoded 2K images will be transmitted to a single projector at frame rates of up to 120 fps (i.e. 60 fps per eye for Stereoscopic 3D content).

In order to speed up the test process and to allow the easy production of multiple encoding flavours, the equipment set will also be capable of encoding high frame rate content from uncompressed files in near real-time.

The assembled equipment will consist of one server incorporating 4 Virident FlashMAX boards and one intoPIX JPEG 2000 PRISTINE-P4 board. The PRISTINE will decode the codestream and play it out over four 3G-SDI links to the MikroM IMB inserted into the projector. The MikroM IMB will receive the four 3G-SDI links and pass the uncompressed image data directly to the projector backplane. Image Matters will ensure project coordination and integration.
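As a sanity check on that data path, the uncompressed payload can be estimated from the frame geometry. A rough calculation, assuming 12-bit, 3-component 2K frames and the nominal 3G-SDI line rate (blanking and protocol overheads ignored):

```python
# Can four 3G-SDI links carry uncompressed 2K at 120 fps?
WIDTH, HEIGHT = 2048, 1080     # DCI 2K container
BITS_PER_PIXEL = 36            # 12 bits x 3 color components
FPS = 120

payload_gbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9
link_gbps = 4 * 2.97           # nominal 3G-SDI line rate per link, times four

print(round(payload_gbps, 2))    # 9.56 Gbps of raw image data
print(payload_gbps < link_gbps)  # True: fits within ~11.88 Gbps aggregate
```

So the four-link arrangement has headroom for the full 120 fps uncompressed stream, which is presumably why that topology was chosen.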

The integration has enough headroom to allow, on request, multiple equipment sets to be combined to achieve higher bit rates and/or higher frame rates.

Storage

  • Four 1.4 TB Virident FlashMAX MLC cards:
    • total capacity of 5.6 TB
    • total read bandwidth of 5.2 GB/s
    • total write bandwidth of 2.2 GB/s on an XFS file system
  • Each Virident card has:
    • a half-height, half-length form factor
    • a PCIe x8 Gen1 bus
    • power consumption of 25 W
    • a sustainable random read of 1.3 GB/s
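A quick back-of-envelope check shows what those storage figures buy at the target codestream rate (mind the units: storage bandwidth is quoted in gigabytes, the stream in gigabits):

```python
# Relating the quoted storage numbers to a 1 Gbps JPEG 2000 codestream.
CAPACITY_TB = 5.6        # total flash capacity across the four cards
READ_GBYTES_PER_S = 5.2  # aggregate read bandwidth, GB/s (bytes)
STREAM_GBITS_PER_S = 1.0 # target codestream rate, Gbps (bits)

hours_of_content = CAPACITY_TB * 1e12 * 8 / (STREAM_GBITS_PER_S * 1e9) / 3600
read_headroom = READ_GBYTES_PER_S * 8 / STREAM_GBITS_PER_S

print(round(hours_of_content, 1))  # 12.4 hours of 1 Gbps material fits on 5.6 TB
print(round(read_headroom, 1))     # 41.6x read bandwidth over the stream rate
```

In other words, the flash array is nowhere near the bottleneck; the interesting limits are downstream in the decoder and the SDI links.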

JPEG 2000 Encoding/decoding

  • One intoPIX PRISTINE P4 board
    • 2K & 4K JPEG 2000 decoder FPGA IP-cores
    • high frame rate capacity: up to 120 fps
    • high bit rate capacity: up to 1 Gbps
    • four 3G-SDI outputs
    • one Genlock input
  • One MikroM Integrated MediaBlock MVC 201
    • four 3G-SDI inputs
    • formatting and pass-through of uncompressed image data
    • up to 120 fps at 2K

Information

Please contact Jean-François Nivart
[email protected]
+32 495 23 00 08

About Image Matters

Image Matters offers innovative hardware and software modules for professional image and sound handling. This new venture helps OEMs, integrators and end-users to develop advanced imaging systems and applications easily and quickly.

More information on www.image.matters.pro

About intoPIX

intoPIX is a leading supplier of image compression technology to audiovisual equipment manufacturers. We are passionate about offering people a higher quality image experience and have developed FPGA IP cores that enable leading edge JPEG 2000 image compression, security and hardware enforcement. Achieving a major breakthrough in digital cinema, intoPIX has attained a leading position in the professional image compression industry based on the JPEG 2000 standard. More information on our company, customers and products can be found on www.intopix.com

Interested in HFR technology? Contact Gael Rouvroy, intoPIX C.T.O. – [email protected] – +32 479 77 49 44

About MikroM

MikroM is a leading design house and provider of state-of-the-art audio/video technologies for selected professional markets. The portfolio covers silicon-proven IPs, ASICs, PCBs and systems in combination with professional design services. With a variety of products and services, MikroM focuses on application-specific and reliable solutions for system integrators and OEMs in quality-driven markets such as HD Broadcast, Digital Cinema and Advertisement/Presentation.

About Virident

Virident Systems’ professional Storage Class Memory (SCM) solutions deliver unconditionally consistent performance that supports the most data-intensive content and applications. Virident Systems is backed by strategic investors Intel®, Cisco® Systems and a leading storage hardware and software solutions provider, as well as venture investors Globespan Capital Partners, Sequoia Capital, and Artiman Ventures. For more information visit www.virident.com.

References:

High Frame Rates – The New Black, Getting to Speed

HFR-S3D Post SMPTE/CinemaCon Hobbit

Combine 3, Drop 2, 120 becomes 24

CST 6th Day of Techniques…DCinema

The presumption is that a projector will be delivered, set up and fitted to the screen. But as one panelist pointed out, at more and more facilities the projector is dropped off, the picture is aligned to the screen and everything else is declared good to go…with no colorimetry calibration.

She mentioned that many maintenance contracts lacked this initial colorimetry calibration. The odd part is that many of the maintenance agreements preclude engaging a 3rd party for this calibration.

The installation groups on the panel did point out that they include a yearly calibration.

[Fill in your own comments about DCI and SMPTE specs and how often light obeyed annual rules. How many bulbs are changed in that period of time? Did any of the bulbs get put in off-kilter? How often are higher rated bulbs swapped in to support 3D? Digital Cinema Initiatives (DCI) – DIGITAL CINEMA SYSTEM SPECIFICATION, VERSION 1.2]

Your author has been hearing this story for at least 5 years. The first time he heard it, it struck him as odd, since all the systems that he was involved with setting up in the 2002–2006 era were set up with an expensive spectroradiometer and a skilled operator. The digital world brings a lot of advantages, but in this area many things are not objective.

Perhaps everyone is using the SMPTE DProVe system? DProVe | Digital Projector Verifier

This article is a work-in-progress since it is simple to go to a thousand tangential problem areas from these few facts. There is even talk of a breakthrough on the CNC silver screen problem.

There may be a lot of overtime for the SMPTE Police.


Certificate Authorities and DCinema

Another CA has been found to have introduced a man-in-the-middle attack vector, meaning that once a legitimate user opened the door by giving the correct credentials, someone slipped in and assumed the identity of that user with all their rights (usually kicking them off the system – something that should arouse suspicion, but which happens so often it seems normal).

Last week the Big Kahuna of CAs, VeriSign, had to admit that they too were hacked and that data was stolen from their systems. Coming so long after the break-in, and after people got used to the news that smaller sites were hacked (relatively smaller…still significant to the system, though), this isn’t getting a lot of play. When Belgian CA GlobalSign was broken into, the hue and cry approached Chicken Little levels. This week, articles on VeriSign don’t get any clicks.

Is it that all the tech geniuses at all the dcinema installers and distribution sites double- and triple-checked their firewalls and decided they were nuke-free and nuke-proof? Or perhaps we are complacent, feeling that the industry is not like the banking industry – no immediate link to buckets of spendable cash, and no one really focusing on the industry. Or, perhaps more logically, the dcinema industry is just hoping that the entire unbuilt fortress of SMPTE compliance will come together before the jewels that the studios need to protect get too exposed, because – “Hey, we’re pedaling as fast as we can, and see, you wanted all these updates put into legacy equipment with constant patching of the legacy InterOp format…”

For better or worse, there is no universal trusted device list in the industry, most likely due to potential liability issues. This has led to every company and their brother keeping a separate list – though there is enough interplay between them that, if one list were polluted with a rogue ‘signed’ device, the pollution would presumably be disseminated throughout the lists. So, the best and the worst of all possible worlds.

Into this comes an RFI from a company (last week) suggesting that they can build such a system…

This article is a work in progress. Here are some of the industry articles that provoked the issue:

Who to trust after the VeriSign hack? | IT PRO

VeriSign admits 2010 hack | IT PRO

Trustwave issued a man-in-the-middle certificate – The H Security: News and Features

Break-ins at domain registrar VeriSign in 2010 – The H Security: News and Features

Backdoor in TRENDnet IP cameras – The H Security: News and Features

Certificate fraud: Protection against future “DigiNotars” – The H Security: News and Features

OpenPGP in browsers – The H Security: News and Features

Google researchers propose way out of the SSL dilemma – The H Security: News and Features

Google wants to do away with online certificate checks – The H Security: News and Features

Is the end nigh for Certificate Authorities? | IT PRO

Certificate issuing stopped at KPN after server break-in discovered – The H Security: News and Features


Lasers…somebody knows…Barco? RED???

The basic exception was Laser Light Engines (LLE), who have a deal with IMAX to put lasers into the big-room cinemas. If ever there were a nice niche to start this adventure with, this is it: specialized, contained to dozens and hundreds of installations instead of tens of thousands, able to absorb any exceptional pricing, able to evolve. Delivery was scheduled to begin in Spring 2012.

Then the film maker turned digital imaging specialist Kodak showed a system that they clearly are not productizing. But they are playing in the game. They helped set up the organization which is working (throughout the world?) to take projection booth laser systems out of the regulatory field of laser entertainment systems, which require a special technology variance for every set-up. Kodak was able to get one by themselves, but the Laser Illuminated Projection Association – LIPA – includes Sony and IMAX, plus LLE and Kodak, in this effort. In the US, the overriding entity is the Food and Drug Administration’s Center for Devices and Radiological Health, which is in charge of ensuring laser equipment safety.

This spring, LLE showed up in Hollywood at that chapter’s SMPTE meeting, with Sony and Barco giving PowerPoint presentations. Sony had made a couple of public remarks previously, but one had to be combing through their online tech papers to notice. And until this point Barco had been quiet…except that the week before they did a demo at the RED Studios Hollywood lot. Nice splash.

Then nothing. No remarks from anyone at CineExpo or CineEurope. The idea has gelled that digital laser projection is 2 years away, or more.

Then this week. The RED user group message board lit up after two previewer comments were placed at the head of a thread by RED owner Jim Jannard: Mark L. Pederson of OffHollywood and Stephen Pizzo, co-founder of Element Technica and now partner at 3ality Technica, made remarks about having watched a demo of RED’s laser projector. “Vibrant”, “clean”, “never seen projection so…”, etc. Then a few non-answers to poorly phrased guesses (for example, that 4K is a benchmark, and passive 3D did leak out, but both could mean several things) and that was that…25 pages of wasted time thereafter. [Can anyone vouch for the merits of Misters Pederson and Pizzo as to their ability to discern whether the technology they viewed is comparably better than what has been seen otherwise?]

Barco, on the other hand (and yet similarly), have made an announcement that 9 and 10 January will be their big days. – D3D Cinema to Present Giant Screen 4K 3D Laser Projection Demo at 2nd Annual Moody Digital Cinema Symposium – Well, actually, no. Barco only said, “We’re fully committed to providing the highest quality solutions for giant screen theaters” and some similarly non-relevant info about how wonderful their partner is. Basically though, their name is on a press release announcing that they will butterfly laser-driven digital cinema light against 15-perf 70mm and 4 other “revolutions”:

  • The FIRST demonstration of Barco’s revolutionary laser light engine on a giant screen
  • The FIRST demonstration of true DLP 4K resolution 3D on a giant screen
  • The FIRST 4K 3D comparison of ‘ultra-reality’ 48 frame/sec & 60 frame/sec content
  • The FIRST giant screen 3D 500 Mbps comparison, nearly double the current cinema bit rate standard

Notwithstanding the lack of filtering for marketing bits, and regardless of how some of the terms have been ill-defined in the past (4K 3D, for example), this is still a pretty good line-up.

Predictions 1 and 2: 2012 will be the year that several studios tell their exhibition partners a final date for film distribution (in 2013), and 2012 will have more than one commercial laser system in the field.

Prediction 3: there may not be more than one DCI-compliant system in the field, though. RED might find that, if they thought bringing a small camera to market was a difficult trick, supporting projectors is a whole different matter…even if only to post-houses and their owners.

Regardless, this is mostly good news. That RED is using passive doesn’t necessarily mean silver screen passive. Perhaps Dolby passive, which would certainly be good news. If it is silver screen passive, that is bad news: since silver screens don’t comply with SMPTE standards, they may end up on the scrap heap of history. But that is a different story for another article.

3Questions: OpenDCP – Now with GUI

Open Source tools are described throughout the DCI specifications, and the nuance of using them is detailed in the myriad SMPTE (and ISO) documents of Digital Cinema. The Digital Cinema Package (DCP) is a complex joining of various video and audio standards, coupled with several security protocols, that makes the transport, local storage and playout of entertainment work on any combination of the available ‘compliant’ media players and projectors.

Since official compliance is a new part of the dcinema world, this hasn’t been an easy task. It is made more complicated by the several transitions that the equipment is going through: Series One to Series Two projectors, external to internal media blocks (IMBs), and InterOp to SMPTE-compliant systems are a few of the major examples.

For the last 10 years, packages have been made by the classic companies, Technicolor and Deluxe, and more recently by some of the integrators, such as Cinedigm, Arts Alliance and XDC. Dolby has long had a separate group making packages.

There are several manufacturers who make package creation systems. The two most popular are from Doremi (CineAsset) and Qube (QubeMaster Pro and Xpress). Fraunhofer makes a package named easyDCP. All of these systems cost in excess of $5,000. All use somewhat user-cuddly front ends to steer the user through the many details and choices available. It is well known in the field that any package that pops out the other side needs to be tested on each variation of cinema player and projector to make certain that it will play when needed.

OpenDCP is no different, but until now its interface was a command line (CLI), which added a layer of complexity to the learning curve. This month a new release was posted on the open source code site http://code.google.com/p/opendcp/.

The package roadmap tells of some of the features that hold it back from being the perfect tool for all users. One item not listed is that the GUI version will only create single-reel packages (though the CLI will create multi-reel packages). And like all DCP creation packages, the output needs to be tested on the target system.
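For those comfortable without the GUI, the CLI side follows the usual three-step DCP flow: encode, wrap, package. A rough sketch of a 2D, 24 fps job is below; the tool names come from the OpenDCP project, but the exact flags vary between releases, so treat the options as illustrative and check each tool’s help output before relying on them:

```shell
# 1. Encode a directory of TIFF frames to JPEG 2000 codestreams
opendcp_j2k -i frames_tiff/ -o frames_j2c/ -r 24

# 2. Wrap the codestreams and the audio into MXF track files
opendcp_mxf -i frames_j2c/ -o video.mxf -r 24
opendcp_mxf -i audio_wav/ -o audio.mxf -r 24

# 3. Generate the DCP XML (CPL, PKL, ASSETMAP) around the track files
opendcp_xml --reel video.mxf audio.mxf
```

The directory and file names here are placeholders; the multi-reel case mentioned above is handled by repeating the --reel step per reel.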

This brings up the question of “Why?”, which becomes easily understood if one searches the net for requests by film-makers and directors who want their product played at film festivals and local cinemas that use digital projection systems. These artists have commonly eaten their relatively small budgets getting the entertainment shot and edited, where there is already enough format and standards confusion. Often the festival site doesn’t know the answers either, since this is yet another technical area in flux, manned by volunteers who only get fragments of data to pass on to their constituents. The topic of using DVDs or Blu-ray discs comes up. There is a commonality of panic as each question brings up further confusion. The nuances of multi-track audio, and of going from TV-centric HD standards to true cinema standards (wider color space, 4:4:4 color sampling instead of 4:2:0, and different white points, for example), bring up more decision points that can’t be universally answered.
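To make the color-space point concrete, here is a toy sketch of what that conversion involves for a single pixel. This is not a mastering tool – just the three steps (linearize, matrix, re-encode), using the textbook Rec.709/sRGB-primaries matrix, a plain gamma-2.4 approximation of the Rec.709 transfer curve, and a simple gamma-2.6 12-bit encode in place of the full DCDM luminance handling:

```python
# Illustrative conversion of one Rec.709 video pixel (values in 0..1)
# to 12-bit X'Y'Z' code values of the kind a DCP carries.
def rec709_to_xyz12(r, g, b):
    # 1. Undo the (approximate) Rec.709 transfer curve -> linear light
    lin = [c ** 2.4 for c in (r, g, b)]
    # 2. Linear RGB -> CIE XYZ (Rec.709/sRGB primaries, D65 white)
    m = [(0.4124, 0.3576, 0.1805),
         (0.2126, 0.7152, 0.0722),
         (0.0193, 0.1192, 0.9505)]
    xyz = [sum(mc * c for mc, c in zip(row, lin)) for row in m]
    # 3. Clamp and gamma-2.6 encode into 12-bit code values
    return [round(4095 * min(max(v, 0.0), 1.0) ** (1 / 2.6)) for v in xyz]

print(rec709_to_xyz12(1.0, 1.0, 1.0))  # [4016, 4095, 4095]
print(rec709_to_xyz12(0.0, 0.0, 0.0))  # [0, 0, 0]
```

Even this toy version shows why the answers aren’t universal: change the white point, the luminance scaling or the transfer curve and the code values move.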

Thus, one more complication on the road to cinema salvation by Alternative Content. While there are many good arguments that these details are best handled by pros who have experience with permanently set-up and maintained professional tools, the reality is that many of these artists just don’t have the money (or rather, they have time that they are forced by circumstances to value at less per hour). One recent local film festival worked with a patron who charged a flat 200€ fee for the transfers, while the Venice Film Festival transfers materials gratis (in exchange for publicity, which Qube and D2 have taken advantage of for the last two years).

There is also a need at cinemas to create and package local commercials or theater policy trailers for insertion into the pre-show of the movies, sports and concerts that they show through their digital projection systems. This might be easily handled in larger cities, where there are companies who can make economies of scale work in their favor. But spending thousands getting a DCP made will eat all the profits from a quickly shot local pizza parlor ad. New tools such as the RED Scarlet, the Canon 5D MkII, GoPro or Drift cameras, and easy-to-use editing software make this a nice adjunct for a clever facility…held up only by the expense and difficulty of creating the DCP.


With this background, we spoke to Terrence, the lead programmer for the OpenDCP project. He is the owner of a 7-screen cinema facility, one of the first independent complexes in the US to go completely digital. He has had extensive experience in the computer field as well, and it was just this need for making local commercials that got him onto the project. After listing some of the features of this new DCP creation system with a Graphical User Interface, we’ll ask our Three Questions.

Features

  • JPEG 2000 encoding from 8/12/16-bit TIFF images
  • Supports all major frame rates (24, 25, 30, 48, 50, 60)
  • Cinema 2K and 4K
  • MPEG2 MXF
  • XYZ color space conversion
  • MXF file creation
  • SMPTE and MXF Interop
  • Full 3D support
  • DCP XML file creation
  • SMPTE subtitles
  • Linux/OSX/Windows
  • Multithreaded for encoding performance
  • XML Digital signatures
  • GUI

One last point – Open Source does not necessarily imply free. There is a lot of nuance in just this point; for example, the easyDCP system from Fraunhofer also uses tools that follow Open Source standards within its structure, yet it is a highly priced (and highly valued) package. More detail can be found at: GNU, Free Software, and Open Source Software – Linux 101

Hello Terrence. For all the great and required features of the OpenDCP software, what in reality should a user expect as they dive into its use? Without knocking any other package, what advantages and disadvantages will one see when using OpenDCP?

OpenDCP: Let’s continue the conversation about Open Source tools to illustrate some points. The current version of the OpenDCP package uses an open source encoder named “openjpeg” that encodes the TIFF images into the JPEG 2000 files that go into the package. The commercial products can afford to license much faster encoders: their high-end tools might create packages at 15 frames per second (fps), while OpenDCP converts at around 3 fps. On long-form projects this can make a significant difference in time (not in quality, of course), and for a short commercial or a project under 20 minutes it is an acceptable compromise.
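To put those encoder speeds in perspective, here is a quick back-of-the-envelope calculation, using the 3 fps and 15 fps figures quoted above:

```python
def encode_hours(duration_min, shoot_fps=24, encode_fps=3.0):
    """Hours needed to JPEG 2000-encode a film at a given encoder speed."""
    frames = duration_min * 60 * shoot_fps
    return frames / encode_fps / 3600

# A 90-minute feature at 24 fps is 129,600 frames:
print(encode_hours(90, encode_fps=3.0))   # 12.0 hours with openjpeg
print(encode_hours(90, encode_fps=15.0))  # 2.4 hours with a commercial encoder
```

For a 30-second pizza parlor ad, the same arithmetic gives a few minutes either way, which is why the speed gap only really matters on long-form work.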

Another thing open source projects seem to take better advantage of is communication with their users. Where commercial entities have to beware of odd statements that live forever on the internet, as well as hackers and spammers and the like, our control issues are not as great, so the OpenDCP user forum can be more open and vibrant. It fits our spirit of cooperation to point to the work of an independent expert in the digital signatures field like Wolfgang Woehl of Filmmuseum Munich, whose github digital_cinema_tools social coding site is filled with practical and historical information. He, as a support board monitor, and others of his skill are able to help guide the product and test it in ways that build on the fundamentals of Open Source. People can look through the code and make certain that the standards are kept, and that we don’t do things that commercial entities are often tempted to do.

It isn’t out of the question that we could license a faster JPEG 2000 encoder. We’ve discussed ways to do this on the site – there is a yearly cost of $10,000 to meet. Maybe we could do this with a Pro version, spreading the cost over a number of users. Or maybe we can help spur the OpenJPEG programmers along…anyone out there who is a math genius that wants to help?

DCTools: That’s out of our league, but hopefully there’s someone out there who can apply their genius to the task. How did you decide to take on this OpenDCP task?

OpenDCP: The origins of OpenDCP go back to October 2010. I had wanted to create a policy trailer for my movie theater. Unfortunately, the cost to have one converted was around $2,000, and commercial DCP software was in the $5,000 range. After some research I came across some people who were attempting to create DCPs using various open source tools. They had success, but the process was involved: it required a half dozen tools, some knowledge of the DCI specifications, and compiling code from source. I had some programming experience, so I decided I could take what I had learned and create a tool everyone could use. The first version had a command line interface, and its feature set grew over a few months. It simplified the process a lot, but I really wanted to add a GUI, and last month I released the first GUI version of the tool.

There is certainly a lot of interest from film festivals. A couple have floated the idea of an OpenDCP Film Festival. Unfortunately, I have neither the time nor the knowledge to plan that sort of thing.

DCTools: There is a great deal of interest in including the hard of hearing and the visually impaired audience in the great culture known as “Going To The Movies”. Indie producers I’ve spoken to point out that thousands of professional movies are shot but only hundreds get finished. Of those, only a small percentage get distribution. So added features like closed captions, narrative tracks and even subtitles for other markets get put on the “If List”.

On the other hand, the US Department of Justice will soon hand down its directives or rulings on how many open and closed caption movies should be played in commercial cinemas, and the EU is walking down that path as well, with the UN human rights documents recently being used as the basis for including people with disabilities in the marketplace.

How does OpenDCP handle these things, and what else is on your road map?

OpenDCP: Right now, we handle one narrative track per DCP. [DCTools: Many HI/VI equipment manufacturers can switch up to 4 narrative tracks per DCP.] Thus far the typical user hasn’t been doing anything too complex in that regard. OpenDCP will create SMPTE subtitle tracks. But we’ll get there with more options. For example, the GUI currently limits you to one reel per DCP. The command line allows multiple reels, and the GUI will as well; it just didn’t get done for the first release.

Subtitles are probably the biggest thing people want support for. OpenDCP can handle SMPTE subtitle tracks, but it doesn’t do anything with MXF Interop/CineCanvas. For my own personal needs, I don’t use subtitles; they are pretty rare in the U.S. However, it seems almost everyone outside the U.S. really needs that support. The problem is that the majority want CineCanvas, pointing out that SMPTE-compliant playback systems are still not in the field. Most cinemas figure they aren’t going to upgrade their software until InterOp stops working, which is another challenge for SMPTE in general. My issue is that I don’t really want to spend my limited development time implementing features that will be deprecated.

As different playback systems came into use in the field, it turned out that the DCPs OpenDCP generated wouldn’t always play across different sets of equipment. Some media players seemed finicky while others would accept anything. It took several weeks of trying, but it finally worked everywhere. The effort was worthwhile, because it helped find some slight differences between the MXF Interop and SMPTE packages and flushed out some bugs in my code.

I actually wasn’t even all that aware of how closed caption support in DCPs was handled until a month or so ago. Most of the information I used in building OpenDCP came from the DCI 1.2 specification and from reverse engineering countless DCPs I had collected at my theater. Then, when somebody was having trouble getting a DCP working on their player, they donated a set of SMPTE documents to the project. Reading through the various documents really helped, and that’s when I learned about the CC stuff.

We hope to have material at the next ISDCF Plugfest. That will hopefully give us more feedback from the professional users.

I’ve gotten feedback from people of all different skill sets who have been able to use OpenDCP to create DCPs. Some have been using it for preshow/commercials, a few are using it for archiving, and independent filmmakers are quite happy with the results. The current version takes a TIFF image sequence and does the JPEG 2000 encoding and XYZ color conversion for the picture track. The audio track is created from 24-bit 48/96 kHz PCM wav files. It supports pretty much the entire DCI specification: 3D, 2K/4K, 24, 25, 30, 50, 60 fps, digital signatures, etc.
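On the audio side, a quick sanity check with Python’s standard wave module can confirm that a source file matches that expected 24-bit 48/96 kHz PCM input before committing to a long conversion run. This helper is hypothetical, not part of OpenDCP:

```python
import wave

def check_dcp_audio(path):
    """Return (ok, bit_depth, sample_rate) for a WAV file; ok is True when
    the file is 24-bit PCM at 48 kHz or 96 kHz, as OpenDCP expects."""
    with wave.open(path, 'rb') as w:
        bits = w.getsampwidth() * 8   # bytes per sample -> bits
        rate = w.getframerate()
    return (bits == 24 and rate in (48000, 96000), bits, rate)
```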

Future features include converting more image types, reading directly from video files, image resizing, and simplifying the process even more.

Developing OpenDCP has been a great process, first just trying to meet the needs I had as a cinema owner, then really putting my EE degree and programming skills to use. One of the neatest things has been meeting and discussing digital cinema with all kinds of people. I’ve been lucky enough to see some really excellent independent short films and learn so much along the way.

1 GNU GPL v3

2 The OpenDCP author wants to be clear that the project is still considered beta, and that users should expect some issues depending on various factors. For example, while reading the forum this article’s author noticed that one user had difficulties on an older computer with a slow processor; changing the number of threads in the set-up let the build complete successfully. Thus, the recommendation is to start with a small 5-10 second clip, get a successful workflow, and then do the full conversion.

Iosono Surround Sound – a perfect companion to 3D releases?

Having been programmed with the actual dimensions of the replay space and the number of available playback sources, the Iosono decoder outputs audio signals tailored for the installation-specific loudspeaker channels. The result has been described as an ‘acoustic hologram.’

‘While it is possible to create a fairly realistic sense of acoustic space with conventional surround-sound technology,’ concedes Brian Slack, Iosono’s SVP of Studio Technologies, ‘there is one major drawback – the mix requires that the listener sits in the so-called ‘sweet spot.’ Outside of that, any sound will be perceived to originate only from a very general direction.’
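Iosono’s renderer is built on wave field synthesis: each loudspeaker receives a delayed, attenuated copy of a source signal so the wavefronts recombine as if radiating from the virtual source position, which is why the result is not tied to a single sweet spot. A much-simplified sketch of the per-speaker geometry (illustrative only; a real renderer is far more involved):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def speaker_feed(source_xy, speaker_xy):
    """Delay (seconds) and amplitude for one loudspeaker reproducing a
    virtual point source, using a simple 1/r spherical-spreading model."""
    dx = speaker_xy[0] - source_xy[0]
    dy = speaker_xy[1] - source_xy[1]
    r = math.hypot(dx, dy)
    return r / SPEED_OF_SOUND, 1.0 / max(r, 1.0)
```

A speaker 3.43 m from the virtual source would be fed the signal about 10 ms late; computing these offsets for every installed channel is what ‘programming the decoder with the dimensions of the replay space’ amounts to.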

 


Read the rest of this fine Mel Lambert/ProAudio Asia article at:
The art of mixing motion pictures – Pro Audio Central 

 

In fact, this is part 2 of the article:
As 3D conquers popular cinema, how are audio playback formats being developed to match?


 

DCI Compliance – Then There Were Three [Updated]

The good news is that after 10 years of TI doing the yeoman work of making the digital cinema industry happen, they have finally gotten two of their OEMs past the goal.

They also announced that there are now 300,000 3D capable projectors in the field. But that was a different group making noise for a different industry.

Congrats to TI. Next up, a server company…bets anyone?

[Update: Christie PR was able to help parse the noise…]:

Yes, there is a difference in our announcement. Barco’s announcement says only that they’ve passed the “procedural” portion of the CTP. Christie is announcing they’ve passed everything, which includes the procedural AND design aspects, so we’re much closer to receiving complete DCI compliance certification.
Here’s Barco’s announcement:
Kortrijk, Belgium, 17 March 2010 — Barco, a global leader in digital cinema, announced today that its ‘Series 2’ digital cinema projector has successfully passed the procedural test for DCI compliance administered by CineCert, the leading 3rd party authorizing test facility.
Hope this helps.

 So there. We now know better what to watch for.