Tag Archives: legacy

Black Screen Alert~! InterOp Losing Life Support

Long Live InterOp

It was the best of times, it was the worst of times. The engineers contributing to SMPTE, and the studios who contributed to DCI, came up with enough elements to create a secure and beautiful D-Cinema environment. The same studios financed the equipment qualification standards and partially financed equipment purchases for many exhibitors. In return, exhibitors agreed to buy this qualified equipment and use it in a way that reasonably assured that copyrights would be protected and better-than-film quality would be typical on screens world-wide.

Fortunately, there were written and unwritten agreements which allowed the simple DCinema origins of MPEG and a fairly loose mechanism of security keys to transition to the full-on (and just recently completed) versions of standards, specifications and practices known as SMPTE Compliant Digital Cinema, with SMPTE Compliant DCPs, SMPTE Compliant security, and screenfuls of other ingredients. These transitional agreements are known as InterOp.

Unfortunately, InterOp worked well enough to be added to…and added to…and added to…

For example, even the simplest multimedia tools use metadata to describe the computer-readable info and the human-readable info inside the songs and movies that we get to and from iTunes and Hulu and Netflix. The people who had to get equipment and people working together in the InterOp world had to come up with an interim Naming Convention, one meant to live maybe a year or so. It wasn’t useful for computers at all, was cumbersome for humans at best, and kept getting added to without increasing the number of characters, since some old equipment only had so many display characters…kinda like computers in the 60’s. There were (and are, since years later it is still in use) dozens of ways for it to go wrong, beginning with the fact that some studios choose to ignore it when it gets in the way (according to the logic at their end of the string), while projectionists might miss some nuance that is needed for the logic at their end of the string.
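
For a sense of what that naming convention carries, here is a minimal sketch that splits an ISDCF-style content title into its underscore-delimited fields. The field list and the example title are illustrative assumptions, not a complete or authoritative parser of the real convention.

    # Minimal sketch: splitting an ISDCF-style Digital Cinema Naming Convention
    # content title into its underscore-delimited fields. The field names and the
    # example title below are illustrative assumptions, not the official scheme.

    FIELDS = ["film_title", "content_type", "aspect_ratio", "language",
              "territory_rating", "audio", "resolution", "studio",
              "date", "facility", "standard", "package_type"]

    def parse_dcp_title(title):
        parts = title.split("_")
        # Older gear truncates long titles, so tolerate missing trailing fields.
        return dict(zip(FIELDS, parts))

    example = "MovieTitle_FTR_F_EN-XX_US-GB_51_2K_ST_20120101_FAC_IOP_OV"
    for field, value in parse_dcp_title(example).items():
        print(f"{field:16} {value}")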

What happened to adding real metadata, the way other modern systems do, which everyone knows will eventually be needed? There are other panics with higher priority. It sits partly formed, probably until it becomes a keystone item needed for some other important development.

There are other examples of InterOp and loose de facto ‘standards’ living beyond their time, the most garish being what is hopelessly called 3D.

Instead of using valuable engineering time to progress the computer-to-computer interface and give exhibitors a fighting chance at perfection, engineers have had to shoehorn one feature after another into the InterOp structure. It is done with the best intentions, of course. It begins with, “My customers were asking for this now, not at some point in the SMPTE-Compliant future.” It ends with, “I have to do this because my competitor is bragging about how they can do this at no extra cost, even though it violates the spirit and the essence of every standard.”

There are too many examples to mention, ranging from forensic marking to audio mapping. Specifics aren’t as important as the fact that the entire industry has floated out far enough from land that some see letters in the water, and some seem to think that they spell H – E – R – E    B – E    D – R – A – G – O – N – S

DCinema Dragons don’t breathe fire. They are light suckers. They cause Dark Screens. Coming to theaters and drive-ins near you.


Why?

Many reasons, partly centered on the effects of software upgrades. The upgrade from InterOp to SMPTE-Compliant software is not a simple ‘add a feature or two’ software upgrade. At the best of times, you just never know what you will be causing when you hit that ‘Upgrade’ button. Did the software writer anticipate every combination of hardware and software that exists in your situation?

There are always some odds that you come out of the hospital feeling worse than when you went in (look up HAI, hospital-acquired infection). Anyone with a computer has had a software upgrade that worked for thousands of others but did not work for them (look up: damn, not again). There is probably some inverse-square proportionality involved as well: getting closer to a deadline quadruples the odds of failure.

So, don’t change~! Jeez. That is sooo obvious. Which is what many do: don’t get the first generation of anything, including upgrades, especially during summer when all the big movies are playing.

But a horizon event approaches. Some InterOp juggling just won’t work for some combinations. There is an amalgam of changes coming, though, prompted by the teams of Jackson and Cameron. It might be easy to ignore the 60 frames per second requirement of a Cameron release (famous for pushing deadlines forward as he is), but The Hobbit will probably not be delayed: 48 frames per second, stereoscopic 3D. Will it work in the InterOp world? And what other changes will be made?

Why 48fps? Phil Oatley, head of technology for the post group at Park Road Post (Mr. Jackson’s facility in New Zealand), who spoke at the SMPTE/NAB DCinema Days last April, said that they chose 48 because they didn’t know if equipment and exhibitors could change to 60fps in time and in significant numbers. As it turns out, all server and projector manufacturers have announced 48 and 60 fps capability. Sony even put a price on it…$3,000…which they can more easily do for their 13,000 users, as they have always used an internal media block in their system.

In this case, Sony has something like the Apple advantage: they control the server, the media block and the projector, so the odds are higher of getting a smooth transition. And they have gotten DCI Compliance (at one moment of software version time…does HFR cause enough of a technology disruption that they need to re-certify?)

A TI-based projector with an HD-SDI interface will be a lot more complicated. An IMB (internal media block) needs purchasing and inserting, which isn’t a cheap investment. It depends upon TI code, code from the projector manufacturer, and code from the server all working together. How different is the server, which will have had its graphics-serving guts ripped out? …will that need a new cert? Check the DCI site for equipment that has passed Compliance.

But we have gotten off point. A few years back you could sign a VPF deal and promise that you would use DCI-Compliant equipment and run with the latest SMPTE specs and recommended practices. At the time there wasn’t one piece of gear through the compliance procedures. And since you know that there is no SMPTE Police checking your screen for the required 48 candela per square meter luminance standard, you didn’t feel bad breaking the luminance number when showing 3D, with light that approached moonlight-equivalence at the sides of the theater and barely reached 10 cd/m2 in the center. (For info on the light fall-off from silver screens, see: 23 degrees…half the light. 3D What?)
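
As a quick sanity check on those numbers, here is the arithmetic tying the candela-per-square-meter figures to the foot-lambert figure quoted below, using 1 fL ≈ 3.426 cd/m². The snippet is just that conversion, nothing more.

    # Quick arithmetic check of the luminance figures in this article.
    # 1 foot-lambert (fL) is approximately 3.426 cd/m^2.
    CD_PER_FL = 3.426

    for cd in (48, 10):  # the 2D spec value vs. the ~10 cd/m^2 seen with 3D
        print(f"{cd} cd/m^2 is about {cd / CD_PER_FL:.1f} fL")
    # 48 cd/m^2 is about 14.0 fL; 10 cd/m^2 is about 2.9 fL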

But the history of the studios has been to look the other way until there is a technology that fulfills the DCI requirement. When Doremi proved they could do JPEG 2000 as the standard required, MPEG suppliers were given notice. When laser light engines can provide 3D at 48 cd/m2 (14 foot-lamberts), will the studios insist that passive 3D systems with their horrid high-gain silver screens are no longer allowed (as was done in France recently; see: The Death of Silver Screens~! Vive la France)?

We’ll see, but this doesn’t have anything to do with HFR. HFR is outside the DCI specs. It falls into the ‘no less than’ zone, similar to the color primaries: laser suppliers can pick primaries outside the capabilities of xenon if that is financially and politically worthwhile, just as long as they don’t choose primaries inside the DCI/SMPTE limits.
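
To make the ‘no less than’ idea concrete, here is an illustrative sketch that checks whether a candidate set of primaries (in CIE xy) forms a triangle that still contains the DCI P3 reference primaries. The candidate laser values are hypothetical, Rec. 2020-like numbers chosen only for the example.

    # Illustrative sketch of the 'no less than' rule for color primaries:
    # a candidate gamut passes if its triangle in CIE xy contains all of the
    # DCI P3 reference primaries. The 'laser' values below are hypothetical.

    P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # DCI P3 R, G, B (x, y)

    def _sign(p, a, b):
        return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

    def inside(p, tri):
        # Standard same-side (point-in-triangle) test.
        a, b, c = tri
        s1, s2, s3 = _sign(p, a, b), _sign(p, b, c), _sign(p, c, a)
        return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

    laser = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # hypothetical primaries
    print("covers the DCI P3 gamut:", all(inside(p, laser) for p in P3))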

So what do HFR and SMPTE compliance have to do with each other? Only that they are two locomotives running on two separate, but not parallel, lines. There is no firm deadline for SMPTE compliant DCPs, and no one is saying that InterOp compliant DCPs have a limited life. In fact, the studios expect that DCI equipment will play future SMPTE-compliant DCPs as well as what will become ‘legacy’ InterOp DCPs.

But something, at some time, is going to bulge the balloon of InterOp to the point that going SMPTE-Compliant is the logical move. Engineers at the manufacturers are just going to say, “I can’t play this game anymore. We were promised SMPTE would be the container that fit everything, I did the work, I will InterOp no more.”

There is a rumor that this will happen soon. There is a particular setup that is rubbing against the InterOp balloon. Exhibitors are saying, “We don’t want to change until the summer season is over.” Will everything play nice together if only one condition is changed in a system? Possibly. How can you increase your odds?

Go to the ISDCF site that lists the latest software/firmware versions for the equipment in the field, and see to it that you have the latest. That will increase the odds: ISDCF Current Versions

Another thing you can do is prepare a database listing all of your equipment at each projection position, all of the software and firmware versions, and all of the serial numbers, and leave a field where you can attach the .pem certificate file downloaded from each piece of gear. Save this and get ready for a note from your distribution center asking for this info.
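
A minimal sketch of what such an inventory might look like, kept as simple records and written out to CSV; the field names and the sample entry are illustrative assumptions, not a required format.

    # Minimal sketch of a per-screen equipment inventory written to CSV.
    # The field names and the sample record are illustrative, not a mandated format.
    import csv

    FIELDS = ["screen", "device_type", "make_model", "serial_number",
              "software_version", "firmware_version", "cert_pem_path"]

    inventory = [
        {"screen": "Screen 1", "device_type": "server", "make_model": "ExampleServer X",
         "serial_number": "SN123456", "software_version": "2.8.15",
         "firmware_version": "1.4", "cert_pem_path": "certs/screen1-server.pem"},
    ]

    with open("projection_inventory.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(inventory)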

 

It was the best of times, it was the worst of times,
it was the age of wisdom, it was the age of foolishness,
it was the epoch of belief, it was the epoch of incredulity,
it was the season of Light, it was the season of Darkness,
it was the spring of hope, it was the winter of despair,
we had everything before us, we had nothing before us,
we were all going direct to heaven, we were all going direct the other way
– in short, the period was so far like the present period,
that some of its noisiest authorities insisted on its being received, for good or for evil,
in the superlative degree of comparison only.

Charles Dickens – A Tale of Two Cities

Certificate Authorities and DCinema

Another Certificate Authority has been found to have introduced a man-in-the-middle attack vector, meaning that once a legitimate user opened the door by giving the correct credentials, someone slipped in and assumed the identity of that user with all their rights (usually kicking them off the system – something that should arouse suspicion, but which happens so often that it seems normal).

Last week the Big Kahuna of CAs, Verisign, had to admit that they, too, were hacked and that data was stolen from their systems. Coming so long after the break-in, and after people got used to the news that smaller sites were hacked (relatively smaller sites…still significant to the system, though), this isn’t getting a lot of play. When Belgian CA GlobalSign was broken into, the hue and cry approached Chicken Little-ish levels. This week I see articles on Verisign that don’t get any clicks.

Is it that all the tech geniuses at all the dcinema installers and installation and distribution sites double- and triple-checked their firewalls and decided they were nuke-free and nuke-proof? Or perhaps we are complacent, feeling that the industry is not like the banking industry, with no immediate link to buckets of spendable cash, and no one really focusing on the industry. Or, perhaps more logically, the dcinema industry is just hoping that the entire unbuilt fortress of SMPTE compliance will come together before the jewels that the studios need to protect get too exposed, because – “Hey, we’re pedaling as fast as we can, and see, you wanted all these updates put into legacy equipment with constant patching to the legacy InterOp format…”

For better or worse, there is no universal trusted device list in the industry, most likely due to potential liability issues. This has led to every company and their brother having a separate list – though there is enough interplay that these are presumed to have enough intercourse that if one list is polluted with a rogue ‘signed’ device, it would be disseminated throughout the lists. So, the best and the worst of all possible worlds.
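
As a rough illustration of what checking against any one of those lists involves, here is a sketch that compares a device certificate’s SHA-256 thumbprint with a set of trusted thumbprints. The file name and the thumbprint value are placeholders, and real trusted device lists typically hash the DER-encoded certificate rather than the PEM text hashed here.

    # Sketch: checking a device certificate against one (of many) trusted-device
    # lists by comparing SHA-256 thumbprints. File name and thumbprints are
    # placeholders; real lists typically hash the DER-encoded certificate,
    # not the PEM text as done here.
    import hashlib

    def thumbprint(pem_path):
        with open(pem_path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    trusted_thumbprints = {
        "placeholder-thumbprint-published-by-a-deployment-entity",
    }

    device = thumbprint("certs/screen1-server.pem")
    print("trusted" if device in trusted_thumbprints else "NOT on this list")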

Into this comes an RFI from a company (last week) suggesting that they can build a system…

This article is a work in progress. Here are some of the industry articles that provoked the issue:

Who to trust after the VeriSign hack? | IT PRO

VeriSign admits 2010 hack | IT PRO

Trustwave issued a man-in-the-middle certificate – The H Security: News and Features

Break-ins at domain registrar VeriSign in 2010 – The H Security: News and Features

Backdoor in TRENDnet IP cameras – The H Security: News and Features

Certificate fraud: Protection against future “DigiNotars” – The H Security: News and Features

OpenPGP in browsers – The H Security: News and Features

Google researchers propose way out of the SSL dilemma – The H Security: News and Features

Google wants to do away with online certificate checks – The H Security: News and Features

Is the end nigh for Certificate Authorities? | IT PRO

Certificate issuing stopped at KPN after server break-in discovered – The H Security: News and Features

[Update] Deluxe/Technicolor Agree–Death Rattles of Film

Both companies have gotten into digital post and distribution services in a big way: Deluxe has purchased several companies recently and made agreements with companies such as EchoStar for satellite distribution direct to cinemas, while Technicolor has been growing into these services more organically. But as 50% penetration of digital media players and projectors approaches, and the tipping point of more digital ‘prints’ than film prints is reached, film becomes legacy.

No news about who gets the volume discount remuneration from Kodak at the end of the year or how studios will mark up the interstitial services.

Original source: Deluxe Entertainment Services Group Inc. – 18 July 2011

Technicolor source: Digital Cinema Buyers Guide – Latest News

Technicolor signs subcontracting agreements with Deluxe for film services in North America, Thailand and the UK

Technicolor announces the launch of phase II of its photochemical film activities optimization. This follows the completion of the first phase of rationalization launched in October 2010, and will enable the Group to optimize worldwide 35mm print manufacturing capacities as well as leveraging its North American theatrical distribution infrastructure.

This phase II is structured around subcontracting agreements with Deluxe, covering:

35mm release print manufacturing

· Subcontracting agreement from Technicolor to Deluxe in North America

· Subcontracting agreement from Deluxe to Technicolor in Thailand

· Subcontracting agreement from Deluxe to Technicolor for negative development in the UK

Theatrical distribution

· Subcontracting agreement from Deluxe to Technicolor for the distribution of photochemical film prints in the US

Technicolor will continue to service its clients, and Technicolor and Deluxe remain competitors in all markets where they operate. Technicolor maintains its front-end activities in North America and remains the key provider of 65/70mm film printing worldwide.

Following the rapid shift to digital cinema since 2010, the Group launched phase I of its photochemical film optimization in the fourth quarter of 2010, with the closure of its North Hollywood facility and rationalization across European operations. Phase II subcontracting agreements lead the Group to cease its release printing manufacturing operations in Mirabel (Canada), employing 178 people, with immediate effect.

This enables the Group to have a more flexible cost structure, with the share of variable costs moving from 60% to 85% in North America. In addition, the cash restructuring costs linked to the implementation of this phase II are expected to be offset by savings on photochemical maintenance capex and by the favorable impact of incremental distribution volumes.

Phase II will ensure that the Group focuses its investments in digital services, where it already benefits from market-leading positions, while continuing to serve its customers through the tail of film processing.