DCP bandwidth questions and answers - test results

Anything and everything to do with DCP-o-matic.
IoannisSyrogiannis
Posts: 194
Joined: Mon Nov 13, 2017 8:40 pm

DCP bandwidth questions and answers - test results

Post by IoannisSyrogiannis »

A matter that comes up often in regard to DCP creation. It is brought up on numerous occasions, but there is not much concrete, well-backed information from reliable sources. Some established facts apply to 2K, 24fps DCPs regardless of aspect ratio, keeping in mind that, of the two container standards in use, Flat would be the more demanding one. Within those boundaries, know the following:
- The original bandwidth limit of the media blocks is 250 Mb/s. That was planned as the maximum needed for 3D movies of the 2K kind, which require twice the bandwidth of a two-dimensional 24fps presentation. Also, the 250 Mb/s limit is not safe on all media blocks; many people avoid it and prefer a peak of ~235 Mb/s. With the second generation of DCI-approved equipment came media blocks that support much more bandwidth, namely twice that. Yet good practice forbids content creators to go above the original limit for non-HFR, 2D content, even if it is 4K.
- The JPEG2000 compression used for digital cinema purposes is the so-called Visually Lossless kind, which means that even a trained eye wouldn't distinguish the compressed version of a frame from the original. That compression is not mathematically lossless, though, where no image information is lost and the compressed version is identical to the original. The Visually Lossless compression ratio is usually cited as 10:1 to 20:1.
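As a rough illustration of what those caps mean per frame (my own arithmetic, not a quote from any specification), the byte budget available to a single J2K frame at a given bandwidth limit can be sketched like this:

```python
def max_frame_bytes(bitrate_mbps: float, fps: float) -> int:
    """Bytes available per frame at a given J2K bandwidth cap.

    bitrate_mbps is in megabits per second (10**6 bits).
    """
    return int(bitrate_mbps * 1_000_000 / 8 / fps)

# 250 Mb/s at 24 fps: the classic per-frame ceiling
print(max_frame_bytes(250, 24))   # 1302083 bytes, about 1.24 MiB per frame
# The more conservative ~235 Mb/s peak many people use in practice
print(max_frame_bytes(235, 24))   # 1223958 bytes
```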

Now, there is a small detail in the phrasing here: "the trained eye", a person who can recognize compression artifacts. A nice piece of information that isn't shared widely: a group of "trained eyes", looking for bandwidth-related degradation in DCPs, didn't find any at the equivalent of 75 Mb/s for 2K 24fps. And, most interestingly, they simply didn't try to go lower, in order to establish where the turning point was.

According to the above, one may ask: if that 75 Mb/s is a fair bitrate for 2K 24fps video, and 125 Mb/s is more like the set limit, what would those numbers translate to for 25fps, or for 4K?

The answer concerning 25fps is a rather simple one. Staying with the "fair" bitrate for the sake of brevity (you may do the math for 125 Mb/s): if the bitrate for 24fps is 75 Mb/s, for 25fps it would be 25/24 times that, i.e. 75 x 25/24 = 78 and 1/8 Mb/s. Let's round it up to 80 Mb/s, making it easy to remember.
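The frame-rate scaling above is just a linear rule (same bits per frame, more frames per second), which a one-liner makes explicit:

```python
def scale_bitrate(bitrate_mbps: float, from_fps: float, to_fps: float) -> float:
    """Scale a per-second bitrate linearly with frame rate,
    keeping the bits-per-frame budget constant."""
    return bitrate_mbps * to_fps / from_fps

print(scale_bitrate(75, 24, 25))    # 78.125 Mb/s, rounded up to ~80 in the text
print(scale_bitrate(125, 24, 25))   # 130.208... Mb/s
```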

Yet, what about 4K? There is no simple analogy or formula that gives a fixed ratio between the JPEG2000 bandwidth of 2K and 4K.
Why? Because we are talking about compression, and a very efficient one at that. So, quadrupling the number of pixels does not translate into quadrupling the data. In fact, it doesn't even mean doubling it, as many intuitively assume.
And why is that, then? Because of the nature of the compression used. JPEG2000 compresses the image by creating "layers" of it, where each layer is half the width and half the height of the previous one (just like 4K and 2K). In addition to that downscaled image, it holds the data needed to recompose the "upper" layer: the differences in the horizontal, vertical and diagonal directions, forming a so-called decomposition level, with that extra information stored compressed and quantized. The first decomposition level then undergoes the same procedure to create the second decomposition level, at half the width and half the height of the first, and so on. Five decomposition levels are created in JPEG2000 compression when making a 2K DCP frame, and six for 4K. In other words, each DCP frame contains six resolution layers if the image is 2K, and seven if it is 4K, with each layer/resolution (but the first) having its so-called high-frequency subbands (the extra information) compressed (and quantized) accordingly.
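The layer structure described above can be sketched as follows (a minimal illustration of the resolution pyramid only, not of the subband data; JPEG2000 rounds odd dimensions up when halving):

```python
def resolution_layers(width: int, height: int, levels: int):
    """List the image resolutions present in a J2K codestream:
    the full image plus one halved resolution per decomposition level."""
    layers = [(width, height)]
    for _ in range(levels):
        w, h = layers[-1]
        # each decomposition level is half the width and half the height,
        # rounding odd dimensions up
        layers.append(((w + 1) // 2, (h + 1) // 2))
    return layers

# 2K DCP container: 5 decomposition levels -> 6 resolution layers
print(resolution_layers(2048, 1080, 5))
# 4K DCP container: 6 decomposition levels -> 7 resolution layers;
# the second entry is exactly the 2K image a 2K media block decodes
print(resolution_layers(4096, 2160, 6))
```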

Cutting to the chase, what is the ratio between a 4K and a 2K DCP?

The file size ratio between two DCPs of the same movie at equivalent quality in 4K and 2K can be found by measuring the difference in their image files when the highest-resolution high-frequency subbands are discarded from the 4K DCP. That procedure is common: the higher resolution of a 4K DCP is discarded when it is screened on a 2K system. The cinema screen server and media block do not decode the whole 4K image; they decode only the 2K portion and feed that to the projector, sparing the extra processing and bandwidth.

Here is what I found, when I was looking to answer the question:

I took 8.000 4K frames out of the DCP of choice and exported them as .j2c files. Then the higher resolution was discarded without re-encoding the image, downscaling it to 2K.

----------------------------------------------------------------------------------

Original folder (4K):
6,76 GB (7.268.132.125 bytes)
Average frame size: 908.517 bytes
Largest frame size: 910.425 bytes (Frame0200.j2c which is 413.590 bytes when reduced to 2K by 54,572%)
Smallest frame size: 651.496 bytes (Frame4353.j2c which is 645.696 bytes when reduced to 2K by 0,89%)

Reduced JPEG2000 folder (2K):
4,90 GB (5.263.438.590 bytes)
Average frame size: 657.930 bytes
Largest frame size: 876.438 bytes (ReducedFrame1923.j2c which was 910.291 bytes in original 4K before reduced by 3,719%)
Smallest frame size: 407.213 bytes (ReducedFrame0198.j2c which was 910.332 bytes in original 4K before reduced by 55,268%)

2K is 72,418 % of 4K, put otherwise,
2K is 27,582 % smaller than 4K, or,
4K is 38,087 % bigger than 2K

maximum reduction: 55,268 % (Frame0198.j2c 910332 bytes ReducedFrame0198.j2c 407213 bytes)
minimum reduction: 0,775 % (Frame4354.j2c 652422 bytes ReducedFrame4354.j2c 647364 bytes)
average reduction: 27,547 % (between individual files, not in total - differences on filesystem, spreadsheet rounding, or my rounding to the one thousandth of the percentage may account for that total difference of 0,035%)

Resulting DCP video mxf files
4K of 8000 frames 24fps repackaged:
6,76 GB (7.268.397.129 bytes)
different from .j2c files by 265.004 bytes
Overall bit rate (media info): 174 Mb/s
Bits/(Pixel*Frame) (media info): 1.034

Reduced 2K of 8000 frames 24fps packaged:
4,90 GB (5.263.703.594 bytes)
different from .j2c files by 265.004 bytes
Bit rate (media info) : 126 Mb/s
Bits/(Pixel*Frame) (media info) : 2.995

----------------------------------------------------------------------------------
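The headline figures above can be cross-checked from the reported totals. A small sketch of that arithmetic follows; note that the Scope picture dimensions (4096x1716 and 2048x858) used for the bits/(pixel*frame) check are my inference from the MediaInfo figures, not something stated in the listing:

```python
# .j2c folder totals, from the listing above
j2c_4k = 7_268_132_125
j2c_2k = 5_263_438_590
print(f"2K is {j2c_2k / j2c_4k * 100:.3f} % of 4K")          # 72.418 %
print(f"4K is {(j2c_4k / j2c_2k - 1) * 100:.3f} % bigger")   # 38.087 %

# MXF totals: 8000 frames at 24 fps = 333.33 s of video
frames, fps = 8000, 24
for name, size, (w, h) in [("4K", 7_268_397_129, (4096, 1716)),
                           ("2K", 5_263_703_594, (2048, 858))]:
    bitrate = size * 8 / (frames / fps)   # bits per second
    bpp = size * 8 / frames / (w * h)     # bits per pixel per frame
    print(f"{name}: {bitrate / 1e6:.0f} Mb/s, {bpp:.3f} bits/(pixel*frame)")
# 4K: 174 Mb/s, 1.034
# 2K: 126 Mb/s, 2.995
```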

Where did those numbers come from?
a) An excerpt of eight thousand frames was extracted from a test DCP almost three times that length, skipping the first ten thousand frames. The asdcplib library was used for that purpose.
b) Those frames were stripped of the highest resolution using Kakadu's transcode tool with the '-reduce' option. A number of open source and/or freeware JPEG2000 codecs could not encode .j2c files directly from .j2c files. Some implementations could drop a resolution layer (or more), but only while transcoding into another image format. Retaining the original file's compression was not possible with any tested software but Kakadu.
c) Lists of the files and file sizes for each version, 4K original and 2K, were created, and a spreadsheet was used to document and derive the maximum, minimum and average values.
d) The frames were (re-)packaged into 24fps DCP MXF files and the bitrate of the resulting files was calculated.
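Step c) above can be sketched in a few lines; the function below mirrors the spreadsheet, taking pairs of (original, reduced) byte counts and returning the per-file reduction statistics. The two pairs used as toy input are real frame pairs reported earlier:

```python
def reduction_stats(pairs):
    """Per-file size reduction statistics from (original, reduced)
    byte counts, as done in the spreadsheet step above.

    Returns (max %, min %, average %) reduction across files."""
    reductions = [(orig - red) / orig * 100 for orig, red in pairs]
    return max(reductions), min(reductions), sum(reductions) / len(reductions)

pairs = [(910_332, 407_213),   # Frame0198: the maximum reduction
         (652_422, 647_364)]   # Frame4354: the minimum reduction
hi, lo, avg = reduction_stats(pairs)
print(f"max {hi:.3f} %, min {lo:.3f} %")   # max 55.268 %, min 0.775 %
```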

The DCP included a title and credits. Those are images of low complexity, which result in smaller files to start with. To focus on what concerns most people in terms of bitrate allocation, namely higher-complexity images that produce bigger image files, I used only a part of the test DCP. (Of those low-complexity frames, the smallest 4K frame file was 861 bytes, and the largest just 35 bytes bigger.)
I do intend to examine other cases and check the results, but I don't expect significant deviations, given that the original DCPs are live-action videos.

This DCP is a short film created by the American Society of Cinematographers Motion Imaging Technology Council, designed to be used both as reference material and for stress-testing color pipelines, image processing, monitors and projection systems. It seemed a good starting point to take advantage of a video with a variety of characteristics that may appear in different cases. On top of that, the site that is the source of this DCP states that it is permitted to use it for educational purposes (I copy the entire notice at the end).

Bottom line, between a 2K and a 4K version of the same video encoding, the difference in file size (and therefore bandwidth) is:

On average over the video excerpt:
4K is 38% larger than the 2K version

Between individual images, 4K ranges from 1,008 to 2,236 times the size of the 2K version.

I leave you with the given numbers and, for the time being, avoid writing my conclusions about what I would consider a fair bandwidth for a 4K feature DCP. Being a more personal estimation, I will leave that for the comments.

Feel free to comment on what would have made these estimations better reflect how the JPEG2000 compression used in digital cinema is most effectively put to use. The motive behind this effort was, and is, to figure out the most effective way to balance quality against compression/storage.

I do understand that there is no single correct answer; particularities in image complexity will always appear that call for weighing up each situation. Yet I also understand that people tend to think "one can't go wrong by going for more", and that turns out to be problematic at one point or another in a DCP's lifetime.

Also, let me know if there is any open source, non-proprietary software that can do that reduction-without-re-encoding on cinema .j2c files, or (even better) if there are DCP authoring programs that do it. I would sincerely like to find that feature available.

I hope you will find this as insightful as I did, and I wish everyone the best for these festive days.

Edit: This text is a version of an article at LinkedIn, if one cares to read that there, follow this link.
----------------------------------------------------------------------------------
Terms of the DCP license (December 24, 2024, https://dpel.aswf.io/asc-stem2/):
ASWF Digital Assets License v1.1
StEM2 -- Copyright 2022 -- American Society of Cinematographers -- All rights reserved.
Redistribution and use of these digital assets, with or without modification, solely for education, training, research, software and hardware development, performance benchmarking (including publication of benchmark results and permitting reproducibility of the benchmark results by third parties), or software and hardware product demonstrations, are permitted provided that the following conditions are met:
1. Redistributions of these digital assets or any part of them must include the above copyright notice, this list of conditions and the disclaimer below, and if applicable, a description of how the redistributed versions of the digital assets differ from the originals.
2. Publications showing images derived from these digital assets must include the above copyright notice.
3. The names of copyright holder or the names of its contributors may NOT be used to promote or to imply endorsement, sponsorship, or affiliation with products developed or tested utilizing these digital assets or benchmarking results obtained from these digital assets, without prior written permission from copyright holder.
4. The assets and their output may only be referred to as the Asset Name listed above, and your use of the Asset Name shall be solely to identify the digital assets. Other than as expressly permitted by this License, you may NOT use any trade names, trademarks, service marks, or product names of the copyright holder for any purpose.
Carsten
Posts: 2815
Joined: Tue Apr 15, 2014 9:11 pm
Location: Germany

Re: DCP bandwidth questions and answers - test results

Post by Carsten »

Added: there is no single default compression scheme for J2K (unlike, e.g., simple compressors such as ZIP or RLE). This means that different J2K compressors will deliver different output data from the same input data, simply because transform encoders like J2K do not follow a simple, fixed data encoding; they weight intermediate computing results perceptually and differently, obtaining the compression through different parameters and performance optimisations. There are also options for different pre- or post-processors.
Therefore, different J2K encoders deliver better or worse loss figures at the same compression rate, making judgements even more complicated. Also, even the slightest filtering or scaling/interpolation of the signal components will produce very different data rates with the exact same compression parameters.


There used to be a time when storage space or transmission bandwidth dictated lower data rates for DCPs. I don't think these constraints still apply today, except in some very specific project-based applications. Film makers tend to overestimate the need for high compressed-video data rates, and often feel limited by the 250 MBit/s cap, simply because higher numbers are possible in software settings.

However, 'common' quality assumptions ('higher numbers are always better') go wrong with J2K, especially when the risk of introducing severe playback errors is underestimated.
When I set up a new instance of DCP-o-matic, I usually set the default J2K data rate to 220 MBit/s and never touch it again. Even 200 MBit/s would be more than enough for all 24fps applications.

HFR of course may need additional consideration, as may the fact that the perception of compression artefacts differs between still and moving images.
There are also well-known limitations in J2K with highly saturated, high-contrast/highly detailed images (usually of synthetic origin), and of course with high-frequency color noise.