Hi all,
The test download page now contains versions in the DCP-o-matic 2.17.x series. These are "unstable" versions that will eventually become 2.18.0.
2.17.x already has one quite deep change to the processing of video timestamps, which will hopefully help with a common and frustrating class of bugs that cause audio/video sync problems in DCPs.
This change has not, however, been widely tested and may cause other problems. If you feel like testing it, that would be very welcome, but please be careful if you choose to use it for production work!
Best,
Carl
2.17.x test versions
Re: 2.17.x test versions
Can you elaborate on the nature of the timing change? Is it more complex than reverting to the previous behavior of respecting video timestamps?
Re: 2.17.x test versions
This problem has plagued me for some time, and I think over the last 10 years I have gone through a few different ways of getting it wrong. In a nutshell, though, 2.16.x and earlier versions pretty much assumed that content had a constant video frame rate; 2.17.x copes much better with variable-frame-rate inputs.
2.16.x would get a video frame from some content and (more or less) just put it in the DCP.
2.17.x has a different approach where it thinks about what DCP frames it needs to create and picks the best content frames based on their timestamps.
My own tests suggest that this is more robust, especially for a) odd videos made by slide-show software, and b) odd videos made with pullup/down/other strange frame rate conversions.
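To make the contrast concrete, here is a minimal sketch of the timestamp-driven idea. It is an illustration only, not DCP-o-matic's actual code; the ContentFrame struct and pick_frame_for function are made-up names. For each DCP frame to be built, it selects the decoded content frame whose timestamp is nearest that frame's ideal time, instead of assuming content frame N maps to DCP frame N.

#include <cmath>
#include <cstddef>
#include <vector>

struct ContentFrame {
    double timestamp;  // presentation time of the decoded frame, in seconds
    // ... decoded image data would live here ...
};

// Return the index of the content frame whose timestamp is nearest to
// `target` (in seconds).  `frames` is assumed non-empty and sorted by
// timestamp, as decoded frames normally are.
std::size_t pick_frame_for(std::vector<ContentFrame> const& frames, double target)
{
    std::size_t best = 0;
    double best_error = std::abs(frames[0].timestamp - target);
    for (std::size_t i = 1; i < frames.size(); ++i) {
        double const error = std::abs(frames[i].timestamp - target);
        if (error < best_error) {
            best = i;
            best_error = error;
        }
    }
    return best;
}

int main()
{
    // A variable-frame-rate input: timestamps are not evenly spaced.
    std::vector<ContentFrame> const content = {
        {0.000}, {0.033}, {0.100}, {0.133}, {0.200}
    };

    double const dcp_rate = 24.0;  // DCP frames per second

    // Build each DCP frame from the content frame nearest its ideal time,
    // rather than assuming content frame N maps to DCP frame N.
    for (int n = 0; n < 5; ++n) {
        double const target = n / dcp_rate;
        std::size_t const chosen = pick_frame_for(content, target);
        (void)chosen;  // a real program would write content[chosen] to DCP frame n
    }
}

With evenly spaced timestamps this reduces to the old one-frame-in, one-frame-out behavior; the difference only shows up when timestamps drift or jump, as they do with variable-frame-rate inputs.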
I have a small set of test content which looks OK but it would be great to get some wider testing.