
Black Screen Alert~! InterOp Losing Life Support

Long Live InterOp

It was the best of times, it was the worst of times. The engineers contributing to SMPTE, and the studios who contributed to DCI, came up with enough elements to create a secure and beautiful D-Cinema environment. The same studios financed the equipment qualification standards and partially financed equipment purchases for many exhibitors. These exhibitors agreed to buy this qualified equipment and use it in a way that somewhat assured that copyrights and quality-better-than-film would be typical on screens world-wide.

Fortunately, there were written and unwritten agreements which allowed the simple DCinema origins of MPEG and a fairly loose mechanism of security keys to transition to the full-on (and just recently completed) versions of standards, specifications and practices known as SMPTE Compliant Digital Cinema, with SMPTE Compliant DCPs and Security and screenfuls of other ingredients. These transitional agreements are known as InterOp.

Unfortunately, InterOp worked well enough to be added to…and added to…and added to…

For example, even the simplest multimedia tools use metadata to describe both the computer-needed info and the human-interface info within the songs or movies that we get to and from iTunes and Hulu and Netflix. Workers who had to get equipment and people working together in the InterOp world had to come up with an interim Naming Convention…one with maybe a year or so to live. It wasn’t useful for computers at all, it was cumbersome for humans at best, and it kept getting added to without increasing the number of characters, since some old equipment only had so many display characters…kinda like computers in the ’60s. There were (and are, since years later it is still in use) dozens of ways for it to go wrong, beginning with the fact that some studios choose to ignore it when it gets in the way (according to the logic at their end of the string), while projectionists might miss some nuance that is needed for the logic at their end of the string.
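To see why a convention like this is cumbersome for humans and useless for computers, here is a minimal sketch of unpacking one of those underscore-delimited composition titles. The field order below is an illustrative assumption based on common usage, not a definitive reading of the convention, and the example title is invented:

```python
# Hypothetical field order for an InterOp-era naming-convention title.
# Real titles vary; this list is an assumption for illustration only.
FIELDS = ["title", "content_type", "aspect_ratio", "language",
          "territory_rating", "audio", "resolution", "studio",
          "date", "facility", "standard", "package"]

def parse_cpl_title(title: str) -> dict:
    """Split an underscore-delimited composition title into named fields."""
    parts = title.split("_")
    # zip() stops at the shorter list, so missing trailing fields silently
    # drop out -- itself one of the dozens of ways the convention goes wrong.
    return dict(zip(FIELDS, parts))

example = "MovieTitle_FTR-1_F_EN-XX_US-GB_51_2K_ST_20120101_FAC_IOP_OV"
print(parse_cpl_title(example)["resolution"])  # -> 2K
```

Note that nothing in the string itself tells the parser which field is which; everything depends on position and on both ends agreeing about the order, which is exactly the fragility described above.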

What happened to adding metadata the way modern sciences do, metadata which everyone knows will eventually be needed? There are other panics with higher priority. It sits partly formed, probably until it becomes a keystone item needed for some other important development.

There are other examples of InterOp and loose de facto ‘standards’ living beyond their time, the most garish being what is hopelessly called 3D.

Instead of using valuable engineering time to progress the computer to computer interface and give exhibitors a fighting chance at perfection, engineers have had to shoehorn one feature after another into the InterOp structure. It is done with the best intentions, of course. It begins with, “My customers were asking for this now, not at some point in the SMPTE-Compliant future.” It ends with, “I have to do this because my competitor is bragging about how they can do this at no extra cost even though it violates the spirit and the essence of every standard.”

There are too many examples to mention, ranging from forensics to audio mapping. Specifics aren’t as important as the fact that the entire industry has floated out far enough from land that some see letters in the water, and some seem to think that they spell H – E – R – E    B – E    D – R – A – G – O – N – S

DCinema Dragons don’t breathe fire. They are light suckers. They cause Dark Screens. Coming to theaters and drive-ins near you.


Why?

Many reasons, partly centered around the effects of software upgrades, because the upgrade from InterOp to SMPTE-Compliant software is not a simple ‘add a feature or two’ software upgrade. At the best of times, you just never know what you will be causing when you hit that ‘Upgrade’ button. Did the software writer anticipate every single combination of hardware and software parameters in your situation?

There are just some odds that you come out of the hospital feeling worse than when you went in (look up HAI). Anyone with a computer has had software upgrades that worked for thousands of others but did not work for them (look up: damn, not again). There is probably some inverse-square proportionality involved as well: getting closer to a deadline quadruples the odds of failure.

So, don’t change~! Jeez. That is sooo obvious. Which is what many do: don’t get the first generation of anything, including upgrades, especially during summer when all the big movies are playing.

But a horizon event approaches. Some InterOp juggling just won’t work for some combinations of equipment and software. There is an amalgam of changes coming, though, prompted by the teams of Jackson and Cameron. It might be easy to ignore the 60 frames per second requirement of a Cameron release (famous as he is for pushing deadlines forward), but The Hobbit will probably not be delayed. 48 frames per second, stereoscopic 3D. Will it work in the InterOp world? And what other changes will be made?

Why 48fps? Phil Oatley, the post group head of technology from Park Road Post (Mr. Jackson’s facility in New Zealand), who spoke at the SMPTE/NAB DCinema Days last April, said that they chose 48 because they didn’t know if equipment and exhibitors could change to 60fps in time and in significant numbers. As it turns out, all server and projector manufacturers have announced 48 and 60 fps capability. Sony even put a price on it…$3,000…which they can more easily do for their 13,000 users, as they have always used an internal media block in their system.

In this case, Sony has something like the Apple advantage: They control the server, the media block and the projector so the odds are higher of getting a smooth transition. And, they have gotten DCI Compliance (at one moment of software version time…does HFR cause enough of a technology disruption that they need to re-certify?)

A TI-based projector with an HD-SDI interface will be a lot more complicated. An IMB (internal media block) needs purchasing and inserting, which isn’t a cheap investment. It is dependent upon TI code and code from the projector manufacturer, as well as code from the server, all working together. How different is the server, which will have had its graphics-serving guts ripped out? Will that need a new cert? Check the DCI site for equipment that has passed Compliance.

But we have gotten off point. Back a few years ago you could sign a VPF deal and promise that you would use DCI-Compliant equipment and run with the latest SMPTE specs and recommended practices. At the time there wasn’t one piece of gear through the compliance procedures. And since you know that there is no SMPTE Police checking your screen for the required 48 candela/square meter luminance standard, you didn’t feel bad breaking the luminance number when showing 3D, where light levels approached moonlight-equivalence at the sides of the theater and barely reached 10 cd/m2 in the center. (For info on the light fall-off from silver screens, see: 23 degrees…half the light. 3D What?)

But the history of the studios has been to look the other way until there is a technology that fulfills the DCI requirement. When Doremi proved they could do JPEG as the standard required, MPEG suppliers were given notice. When laser light engines can provide 3D at 48 cd/m2 (14 foot-lamberts), will the studios insist that passive 3D systems with their horrid high-gain silver screens are no longer allowed, as was done in France recently? (See: The Death of Silver Screens~! Vive la France)
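The 48 cd/m² and 14 foot-lambert figures above are the same number in two units; a quick conversion (1 foot-lambert = 3.426 cd/m², the standard photometric factor) also shows just how far the 10 cd/m² 3D reality falls below spec:

```python
# Sanity check on the luminance figures: cd/m^2 to foot-lamberts.
FL_TO_CDM2 = 3.426  # 1 foot-lambert in candela per square meter

def to_foot_lamberts(cdm2: float) -> float:
    """Convert a luminance in cd/m^2 to foot-lamberts."""
    return cdm2 / FL_TO_CDM2

print(round(to_foot_lamberts(48), 1))  # the 2D spec figure -> 14.0 fL
print(round(to_foot_lamberts(10), 1))  # the 3D center-screen reality -> 2.9 fL
```

So a 3D presentation hitting only 10 cd/m² is running at barely a fifth of the luminance the standard calls for.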

We’ll see, but this doesn’t have anything to do with HFR. HFR is outside the DCI specs. It falls into the ‘no less than’ zone, similar to the color primaries. Laser suppliers can pick primaries outside the capabilities of xenon if that is financially and politically worthwhile, just as long as they don’t choose primaries inside the DCI/SMPTE limits.

So what do HFR and SMPTE compliance have to do with each other? Only that they are two locomotives running on two separate but not parallel lines. There is no firm deadline for SMPTE-compliant DCPs, and no one is saying that InterOp-compliant DCPs have a limited life. In fact, the studios expect that DCI equipment will play future SMPTE-compliant DCPs as well as what will become ‘legacy’ InterOp DCPs.

But something, at some time, is going to bulge the balloon of InterOp to the point that going SMPTE-Compliant is the logical move. Engineers at the manufacturers are just going to say, “I can’t play this game anymore. We were promised SMPTE would be the container that fit everything, I did the work, I will InterOp no more.”

There is rumor that this will happen soon. There is a particular setup that is rubbing against the InterOp balloon. Exhibitors are saying, “We don’t want to change until the summer season is over.” Will everything play nice together if only one condition is changed in a system? Possibly. How can you increase your odds?

Go to the ISDCF site that lists all the latest software/firmware versions for the equipment in the field. See to it that you have the latest. That will increase the odds. ISDCF Current Versions

Another thing you can do is prepare a database listing all of your equipment at each projection position, all of the software and firmware versions and all the serial numbers, with a field to hold the .pem certificate file downloaded from each piece of gear. Save this and get ready for a note from your distribution center asking for this info.
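That database doesn’t need to be anything fancy; a minimal sketch using sqlite3 from Python’s standard library would do. The table and column names here are illustrative assumptions, as is the sample row:

```python
# A minimal sketch of the suggested equipment inventory, one row per device.
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a real inventory
conn.execute("""
    CREATE TABLE equipment (
        screen          TEXT,  -- auditorium / projection position
        device          TEXT,  -- projector, server, media block, ...
        serial_number   TEXT,
        sw_version      TEXT,  -- software/firmware version in the field
        pem_certificate TEXT   -- contents of the .pem file from the device
    )
""")
conn.execute(
    "INSERT INTO equipment VALUES (?, ?, ?, ?, ?)",
    ("Screen 1", "server", "SN-0001", "2.2.1",
     "-----BEGIN CERTIFICATE-----..."),
)
row = conn.execute(
    "SELECT device, sw_version FROM equipment WHERE screen = 'Screen 1'"
).fetchone()
print(row)  # -> ('server', '2.2.1')
```

When the distribution center’s note arrives, answering it becomes a query instead of a scavenger hunt through the booth.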

 

It was the best of times, it was the worst of times,
it was the age of wisdom, it was the age of foolishness,
it was the epoch of belief, it was the epoch of incredulity,
it was the season of Light, it was the season of Darkness,
it was the spring of hope, it was the winter of despair,
we had everything before us, we had nothing before us,
we were all going direct to heaven, we were all going direct the other way
– in short, the period was so far like the present period,
that some of its noisiest authorities insisted on its being received, for good or for evil,
in the superlative degree of comparison only.

Charles Dickens – Tale of Two Cities


HFR-S3D Post SMPTE/CinemaCon Hobbit

Your analogy, Michael, of going from standard definition to high definition, or from VHS to DVD, is a good one, but it doesn’t explain how it is similar. It is not that there are more pixels; it is that more pixels are able to be a discernible part of the picture…or can be, if the director chooses. Another analogy would be to say that there is more depth of field, but instead of talking about the amount of available focus behind the point of focus, we get more ability to focus in front of the point of focus. Normally all that area in front is not only out of focus, but during any kind of motion in the scene or the camera there is a smearing that contributes to destroying contrast in the picture. It is also very tiring for the eyes.

Cameron’s demonstration a year ago at CinemaCon made use of sword fighting and sweeping the camera around a fairly large room. The most vivid shot that allowed the technology to prove itself was a long, medium-speed pan of several actors and actresses sitting at a long table with food and candles in front of them. There was also another set of actors whose backs were to us, so we’d see the backs of their heads, with the view of the candles going in and out. At 24 frames per second (fps) the scene was typical: in-focus faces and quasi-focused candle flames, with smeared blurs of the actors’ backs in front. At 48 frames per second, the smearing left. It wasn’t so important that the backs of these heads and shoulders were in focus, but that with the smearing gone the discernible luminance of the scene increased – loosely, more contrast means more colors, and more colors means a more natural feeling. The candle flames were brighter without being any more in focus.

Now I will bring 3D into the conversation and tell you that you are wrong, Michael. 3D is not just a gimmick, and not just another tool. In fact, each picture that we see has dozens of clues of dimensionality without the parallax clues that stereoscopy brings. Everything from colors fading as we see them in the distance, to a fuzziness at the intersection of two objects (notice the shoulders compared to their background), to comparative sizes, to not seeing a person’s legs when a table is in front of them, all tell the human visual system of eyes and brain and mind that there is a third dimension in the scene we are looking at. On the other hand, my guess is that most technical people in the business generally dislike the current implementations of Stereoscopic 3D, but for reasons that don’t have to do with the ugliness of the glasses or the upcharge or whether a well-written scene could have served just as well. Most dislike it because even setting aside the inherent horrors of the combination of high gain and silver screens (each with its own set of insurmountable problems), there isn’t enough light to do the process justice. And again, less light means more in-the-mud colors and fewer colors overall, especially whites and the light, subtle colors that we normally use to discern subtle things.

I also was not a fan of S-3D until I saw the ‘dimensionalization’ of the final scene of Casablanca. I thought it was marvelous. It was on the other side of compelling. It was as if there needed to be an excuse to leave out the parallax. That doesn’t make badly shot or poorly dimensionalized S-3D OK, but it does make any S-3D ‘Not Ready for Prime Time’ when it isn’t then presented correctly – and that mostly has to do with the amount of light from the screen to the eyes. Which brings us back to High Frame Rates.

There was a two-day set of SMPTE seminars dealing with digital cinema before NAB, which was the week before CinemaCon. Several thousand engineers got the full geek treatment: an hour of ‘why lasers in the projector’, then six minutes of demonstrations of Sony projectors retrofitted with laser systems from Laser Light Engines, Inc., then 40 minutes on the various problems that high frame rates bring to the post-production workflow, then 20 minutes of presentation from a technical representative from Peter Jackson’s team, who explained some technical considerations of HFR.

There is a commonly held misconception that 24 frames per second was chosen because testing determined that this speed had something to do with the natural flutter rate of the eyes. In fact, 24 fps was chosen because it brought a movie’s sound to the point where it was not horrible. Similarly, there is some magic above 50-something fps, and as we also learned (while Sperling was at Coachella, missing the SMPTE event), there are potential problems to be wary of at 48 frames per second, demonstrated by Dr. Marty Banks of Cal Berkeley. So…

To answer one of your questions, Michael: 48 frames was chosen at the time because they weren’t certain whether equipment manufacturers would be able to have a working high frame rate system available by the time The Hobbit was to be released. But anyone who reads the trades most certainly knows that frame rates up to 60 have been in the specifications and doable since Series II projectors became available from Texas Instruments. Ah! but not in S-3D. As Sperling pointed out, this requires “in the projector” electronics to be fitted (or retrofitted) and a whole new way of thinking about servers for the projector. An example: Sony announced to their 13,000 customers – give us $3,000 and we will retrofit your software to do S3D-HFR.
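A back-of-the-envelope calculation shows why S-3D is the sticking point: the light engine has to show frame rate × two eyes × flash count images every second, so stereo at high frame rates multiplies quickly. (The single-flash assumption for the HFR rows is mine; flash strategies vary by system.)

```python
# Images per second the projector's light engine must display.
def images_per_second(fps: int, eyes: int = 2, flashes: int = 1) -> int:
    """fps per eye, times eyes, times how often each frame is flashed."""
    return fps * eyes * flashes

print(images_per_second(24, flashes=3))  # 24fps S3D, triple-flash -> 144
print(images_per_second(48))             # 48fps S3D, single-flash -> 96
print(images_per_second(60))             # 60fps S3D, single-flash -> 120
```

The electronics that comfortably triple-flashed 24fps stereo were never asked to move this much unique image data per second, which is why new media blocks and new server thinking come into the picture.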

Going back, though, to the demonstrations that Cameron did a year ago: in addition to 48fps S3D – which got rid of the front-of-focus blur – there were also identical shots taken at 60fps S3D. They were less WOW! but still importantly beneficial. Because of time constraints, and the fact that we were sitting among Cameron’s several thousand other friends in the auditorium, there wasn’t a lot of time to look at these shots, but they reminded me of the arguments that George Massenburg made in a famous three-part article entitled Lace and Filigree, written during audio’s transition to digital in the mid-’80s. There is something special as the speed improves, akin to the benefits of increasing signal-to-noise in its various forms. Perhaps it all serves to put technology into the sphere of philosophy, where it belongs. It certainly reminds us that all technology involves the art of compromise.

One last short aside from the SMPTE event, with the HFR panelists still on stage: an engineer came to the open mic and described government S3D skunkworks experiments he had been part of, which indicated that something ‘popped’ at 53 fps, and wondered if anyone else had run into that phenomenon. The chair answered with a few speculations, then expressed regret that Douglas Trumbull wasn’t there to give insights from his experiments in the field, since he not only has the longest record of making high frame rate movies but just opened a new digital high frame rate studio that has made several technical breakthroughs. And just like the Annie Hall/Marshall McLuhan moment, Doug came to the mic and added a few quips… OK; so that’s all we geeks get for high-level entertainment.

Keep up the good work. I enjoy the show and don’t begrudge any extra minutes you take to get all the interesting news of the week to us.

[Author’s salutations]

References:

 

High Frame Rates – The New Black, Getting to Speed

Combine 3, Drop 2, 120 becomes 24

 

HFR-S3D Post SMPTE/CinemaCon Hobbit

Your analogy Michael, of going from standard definition to high definition or from VHS to DVD is a good one, but it doesn’t inform one of how it is similar. It is not that there are more pixels, it is that more pixels are able to be a discernible part of the picture…or can be if the director chooses. Another analogy would be to say that there is more depth of field, but instead of talking about the amount of available focus behind the point of focus, we get more ability to focus in front of the point of focus. Normally all that area in front is not only out of focus, but during any kind of motion in the scene or the camera, there is a smearing that contributes to destroying contrast in the picture. It is also very tiring for the eyes.

Cameron’s demonstration a year ago at CinemaCon made use of sword fighting and sweeping the camera around a fairly large room. The most vivid shot that allowed the technology to prove itself was a long, medium speed pan of several actors and actresses sitting at a long table with food and candles in front of them. There was also another set of actors whose backs were to us, so we’d see the back of their heads, with the view of the candles going in and out. At 24 frames per second (fps) the scene was typical, in-focus faces and quasi-focused candle flames with smeared blurs of the actors backs in front. At 48 frames per second, the smearing left. It wasn’t so important that the backs of these heads and shoulders were in focus, but that the smearing was gone so the discernible luminance of the scene increased – loosely, more contrast means colors and more colors means more natural feeling. The candle flames were brighter without being any more in focus.

Now I will bring 3D into the conversation and tell you that you are wrong Michael. 3D is not just a gimmick, and not just another tool. In fact, each picture that we see has dozens of clues of dimensionality without the parallax clues that stereoscopy brings. Everything from colors fading as we see them in the distance, to a fuzziness at the intersection of two objects (notice the shoulders compared to their background) to comparative sizes and not seeing a person’s legs when a table is in front of them, all tell the human visual system of eyes and brain and mind that there is a third dimension in the scene we are looking at. On the other hand, my guess is that most technical people in the business generally dislike the current implementations of Stereoscopic 3D, but for reasons that don’t have to do with the ugliness of the glasses or the upcharge or whether a well written scene could have served just as well. Most dislike it because even with the inherent horrors of the combination of high gain and silver screens (each with their own set of insurmountable problems), there isn’t enough light to do the process justice. And again, less light means more in-the-mud colors and fewer colors overall, especially whites and the light subtle colors that we normally use to discern subtle things.

I also was not a fan of S-3D until I saw the ‘dimensionalization’ of the final scene of Casablanca. I thought it was marvelous. It was on the other side of compelling. It was as if there needed to be an excuse to leave out the parallax. That doesn’t make badly shot or poorly dimensionalized S-3D OK, but it does make any S-3D ‘Not Ready for Prime Time’ when it isn’t then presented correctly – and that mostly has to do with the amount of light from the screen to the eyes. Which brings us back to High Frame Rates.

There was a two day set of SMPTE seminars dealing with digital cinema before NAB, which was the week before CinemaCon. Several thousand engineers got full geek treatment with an hour of ‘why lasers in the projector’ then seeing 6 minutes of demonstrations of Sony projectors with retrofit Laser Light Engines, Inc.’ laser systems, and 40 minutes on the various problems that high frame rates bring to the post production workflow, then 20 minutes of presentation from a technical representative from Peter Jackson’s team who explained some technical considerations of HFR.

There is a commonly held misconception that 24 frames per second was chosen because testing determined that this speed had something to do with the natural flutter rate of the eyes. In fact, 24 fps was chosen because it brought a movies sound to the point where it was not horrible. Similarly, there is some magic above 50 something fps and as we also learned (while Sperling was at Coachella missing the SMPTE event), there are potential problems to be wary of at 48 frames per second, demonstrated by Dr. Marty Banks of Cal Berkeley. So…

To answer one of your questions, Michael: 48 frames was chosen at the time because they weren’t certain that equipment manufacturers could have a working high frame rate system available by the time The Hobbit was going to be released. But anyone who reads the trades most certainly knows that frame rates up to 60 have been in the specifications and doable since Series II projectors became available from Texas Instruments. Ah! but not in S-3D. As Sperling pointed out, this requires “in the projector” electronics to be fitted (or retrofitted) and a whole new way of thinking about servers for the projector. An example: Sony announced to their 13,000 customers – give us $3,000 and we will retrofit your software to do S3D-HFR.

Going back, though, to the demonstrations that Cameron did a year ago: in addition to 48fps S3D – which got rid of the front-of-focus blur – there were also identical shots taken at 60fps S3D. They were less WOW! but still importantly beneficial. Because of time constraints and the fact that we were sitting among Cameron’s other several thousand friends in the auditorium, there wasn’t a lot of time to look at these shots, but they reminded me of the arguments that George Massenburg made in a famous 3-part article entitled Lace and Filigree, written during audio’s transition to digital in the mid-’80s. There is something special as the speed improves, akin to the benefits of increasing signal to noise in its various forms. Perhaps it all serves to put technology into the sphere of philosophy where it belongs. It certainly reminds us that all technology involves the art of compromise.

One last short aside from the SMPTE event, with the HFR panelists still on stage: an engineer came to the open mic and made a statement about government S3D skunkworks experiments that he had been part of, which indicated that there was something that ‘popped’ at 53 fps, and wondered if anyone else had run into that phenomenon. The chair answered with a few speculations, then expressed regret that Douglas Trumbull wasn’t there to give insights into his experiments in the field, since he not only has the longest record of making high frame rate movies but had just opened a new digital high frame rate studio that has made several technical breakthroughs. And just like the Annie Hall/Marshall McLuhan moment, Doug came to the mic and added a few quips… OK; so that’s all we geeks get for high-level entertainment.

Keep up the good work. I enjoy the show and don’t begrudge any extra minutes you take to get all the interesting news of the week to us.

[Author’s salutations]

References:


High Frame Rates – The New Black, Getting to Speed

Combine 3, Drop 2, 120 becomes 24


3D@Home Content Creation Pushing Quality

Message from Steering Team 1 Chair, Jon Shapiro

Dr. Jim Cameron’s 10 Rules for Good Stereo

Rob Engle’s Top Recommendations for Creating Quality 3D

Ray Hannisian, Head Stereographer, 3ality Digital

Bernard Mendiburu’s Ten Rules for Quality 3D

See also: Mendiburu’s Introduction to 3D Cinematography

Ray Zone’s 10 Tips

3D@Home’s white paper page includes such topics as MPEG’s 3DTV standards and a paper on 3D Subjective Testing.


All this is fine for TV, but it is also important for getting 3D to the big screen, if only for film festivals and alternative content.


All 3DAvatar™, AllThe3DTime™ [Updated]

News Corp. chairman and CEO Rupert Murdoch previously said …(excerpted)

FirstShowing.Net —James Cameron Delivers Updates on Avatar 2 and Re-Release

Yep, James Cameron and Avatar are back in the news again, but … First, he confirmed that he is producing Guillermo del Toro’s At the Mountains of Madness (announced a few weeks ago) and that they’ll shoot it in native 3D using next generation 3D cameras. [Surprise?]

We don’t exactly know what Cameron will be directing next, … he’s been getting inspiration for Avatar 2 by traveling down to South America and meeting with native tribes. “I have an overall narrative arc for [Avatar] 2 and 3, and there are some modifications to that based on my experiences in the last few months from having gone down to the Amazon and actually hung out with various indigenous groups who are actually living this type of story for real… but it’s not changing the overall pattern,” he said.

Finally, Cameron talked about converting Titanic to 3D and also complained about how terrible the Clash of the Titans 3D conversion was (as we all know). …

Marketsaw.blogspot — EXCLUSIVE: James Cameron Interview! Talks AVATAR Re-release, Sequels, 3D Conversions & Working With Del Toro!

[Listen to the audio interview on this page]


0:40 – Cameron confirms he is producing Guillermo del Toro’s AT THE MOUNTAINS OF MADNESS. The movie will be shot in native 3D using next generation FUSION 3D cameras from Pace. …

2:30 – Cameron talks about 3D conversions. TITANIC’s conversion is taking 8 months to a year to complete, not a fast turnaround like CLASH OF THE TITANS. Cameron: “(TITANS) showed a fundamental lack of knowledge about stereo space, …

5:00 – Cameron on how they are technically converting TITANIC. “You just can’t cut out edges, you’re going to get flat people moving around.” He will be using all his knowledge to put things on their right depth planes. They had tests for TITANIC from seven different conversion vendors on the exact same shots, and they got back seven different answers as to where they thought things were spatially. “Some of them were not bad guesses and some of them were ridiculous.”

6:50 – The whole argument about conversion will go away for high end, first run 3D. Two years from now, when there are thousands of 3D cameras out there shooting live feeds to 3D broadcast networks, how can a producer go to a studio and say…

9:05 – Cameron on talking with Steven Spielberg about converting his classic movies to 3D. …

11:20 – Cameron talks about AVATAR 2’s current status. …

12:04 – He is focusing his writing right now on the AVATAR novel (corresponds to the first film)…

12:45 – The AVATAR re-release will have 9 extra minutes, not 8, and it will all be CG. No extra footage of live action characters drinking coffee. Rainforest; some at night; a hunt sequence – …

15:45 – Cameron does not have the release timing of the 3D Blu-ray as …