
Re: GPU based DCP encoding

Posted: Thu Oct 22, 2020 6:05 pm
by carl
It's still CPU-only but there is now some definite work in the pipeline to add GPU support. Unfortunately I don't know when this will be; DCP-o-matic is still a spare-time project and the amount of spare time is limited and quite variable.

Kind regards,
Carl

Re: GPU based DCP encoding

Posted: Fri Oct 23, 2020 9:13 am
by Carsten
@checcontr: Absolutely NO GPU encoding currently, and as Carl says, it may take a while until it arrives. Can you tell me the specs of your current setup - mainboard, CPU, amount of memory? What are you encoding from: compressed video files (e.g. MP4, ProRes, DNxHD), or still-image sequences (e.g. TIFF)?


- Carsten

Re: GPU based DCP encoding

Posted: Tue Nov 10, 2020 4:29 am
by rlsound
When GPU encoding arrives, what GPU will it be optimized for? I'm building a new machine at home and want to buy the correct hardware :)

Re: GPU based DCP encoding

Posted: Tue Nov 10, 2020 12:35 pm
by carl
At the moment, the most likely first "version" of GPU encoding will be for NVIDIA, on Windows/Linux only (not macOS), and not free.

Re: GPU based DCP encoding

Posted: Tue Nov 10, 2020 8:22 pm
by Carsten
carl wrote: Tue Nov 10, 2020 12:35 pm (not macOS)
BOOOOOOOOOHHHHHHHH!

Re: GPU based DCP encoding

Posted: Thu Nov 12, 2020 5:31 pm
by rlsound
Thanks Carl. I understand it won't be free, more than happy to splurge for it. :) Thanks!

Re: GPU based DCP encoding

Posted: Mon Feb 01, 2021 11:24 pm
by jamiegau
I have delved into GPU encoding.
Talking to the main developer of Kakadu, the advantages of GPU encoding are minimal, as there are other bottlenecks involved: getting the data in and out of GPU memory can be slower than just doing it all on the CPU (though with unified memory and PCIe 4.0 in the newer GPUs that may be a different story now).
They are more focused on general CPU encoding as it's more universal and can be applied to any platform. And in the world of cloud computing, you can just add typical VMs to a cloud compute solution to achieve whatever throughput you want - no custom server/CPU/GPU instances.

Just optimizing J2K for current CPUs would be 5-10 times faster than OpenJPEG.
I would suggest looking at licensing Kakadu, as I know the guys and they are incredibly understanding about this market.

This thread is MANY years old and nothing has happened, so: would you pay USD 70 for an add-on that makes it render 5-10 times faster?
I would.

The licensing must be reasonable, as Kakadu comes with Resolve now, and Resolve is cheap for what it does.

Re: GPU based DCP encoding

Posted: Tue Feb 02, 2021 11:22 am
by Carsten
We already see bottlenecks now with very fast CPU setups, e.g. in content examination, audio analysis and hashing. On a very fast multicore CPU the parallel J2K encode itself finishes quickly, but post-processing/hashing is so far single-threaded and can add considerable time to the overall encode. I don't know if there are ways to speed this up other than segmenting into multiple reels. But we have to wait for Carl to finish his work on the GPU encoder to see how important that will become, and I guess then Carl will need to take a look at these bottlenecks. Using fast M.2 SSDs will probably be one solution, as some of these operations are I/O-bound.
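To make the single-threaded hashing point concrete, here is a back-of-the-envelope sketch. All throughput numbers below are invented for illustration - they are not measurements of DCP-o-matic:

```python
# Amdahl-style estimate: a parallel J2K encode followed by a
# single-threaded hashing pass. All figures are illustrative.

def total_time(frames, encode_fps_per_core, cores, hash_fps):
    """Wall-clock seconds for encode (parallel) + hashing (serial)."""
    encode = frames / (encode_fps_per_core * cores)
    hashing = frames / hash_fps
    return encode + hashing

frames = 24 * 60 * 60 * 3   # 3-hour feature at 24 fps = 259200 frames

one_core = total_time(frames, 2.0, 1, 200.0)
many_core = total_time(frames, 2.0, 32, 200.0)

print(round(one_core / 3600, 1), "h on 1 core")    # 36.4 h
print(round(many_core / 3600, 1), "h on 32 cores") # 1.5 h
# The serial hashing pass (259200 / 200 s, about 0.36 h) is noise
# next to a 36-hour single-core encode, but it is a quarter of the
# 32-core total - exactly the bottleneck described above.
```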

Re: GPU based DCP encoding

Posted: Wed Feb 17, 2021 4:21 am
by kimballstheater
I too am looking for ways to speed up this process. I run a small art house in Colorado and tying up my PC for 13-33 hours PER movie is rough. And since the DCPs take up massive amounts of space once converted, it's tough to keep many of them even on large external storage systems. The 33-hour one was Dune (Ext) [1984], which clocks in at 3 hours. Here's what I'm figuring, based on what I've read in this thread, to get maximum speed out of this process right now:
  • Fastest single-core CPU available.
  • M.2 drive for Windows & DCP-o-matic.
  • SSD to compile the DCP to.
  • [later on] A high-end video card to speed up the calculations.
  • -OR- multiple older PCs that we have no use for, all converting movies to DCP, to double/triple/quadruple/etc. how many you can make at once.
Other things I do: disable anything that could eat up CPU (as far as possible), turn off screensaver/sleep mode, and set DCP-o-matic to use 100% of the CPU at all times. I also set it and forget it - I walk away while it's converting and don't use that PC for anything else, not even browsing the Internet.
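For what it's worth, the 33-hour Dune figure implies an encode rate of only about 2 frames per second. A quick sanity check, assuming 24 fps source material (an assumption - the post doesn't state the frame rate):

```python
# Effective J2K encode rate implied by the figures in the post above.
# Assumes 24 fps source material.

runtime_h = 3    # Dune (Ext) [1984] runs about 3 hours
encode_h = 33    # reported conversion time

frames = runtime_h * 3600 * 24        # 259200 frames to encode
fps = frames / (encode_h * 3600)      # sustained encode rate
print(f"{fps:.2f} frames/s encoded")  # about 2.18 frames/s
```

Anything that lifts that sustained rate - more cores, a faster J2K library, or eventually a GPU - shrinks the 33 hours proportionally.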

My biggest issue is the time. I L-O-V-E this software and have no thought of using any other. But with it taking so long, I can't do auditorium rentals 'spur of the moment' unless they choose a movie we already have on DCP that I haven't deleted due to space constraints. On that note, I have a Synology DS920+ with (2) 4TB WD Red NAS drives in RAID 1 as my "big storage". I will be adding dual 12TB drives soon for 20TB in total. These take too long to convert to risk a drive failure losing the 17-22 features you can fit in 4TB.
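The 17-22 features per 4 TB matches what you'd expect if the pictures are encoded near the maximum bitrate. A rough estimate - the 250 Mbit/s ceiling is the DCI specification value, while the 2-hour runtime is just an assumed average:

```python
# Rough DCP size estimate. 250 Mbit/s is the DCI maximum picture
# bitrate; audio and packaging overhead are ignored here.

bitrate_bps = 250e6    # bits per second
feature_s = 2 * 3600   # assume an average 2-hour feature

size_bytes = bitrate_bps / 8 * feature_s
print(f"{size_bytes / 1e9:.0f} GB per feature")  # 225 GB

per_4tb = int(4e12 // size_bytes)
print(per_4tb, "such features per 4 TB")         # 17
```

Shorter features or lower encode bitrates push the count toward the upper end of the 17-22 range.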

In the short term, I think using multiple old PCs is the best option. We've been in business almost 27 years and have TONS of older PCs just lying around that we don't use any more. I'm positive I can build at least 2-3 by Frankensteining parts from all the rest. If we can't really do "faster" yet, maybe we should just look into "more simultaneously" instead.

Re: GPU based DCP encoding

Posted: Wed Feb 17, 2021 9:19 pm
by Carsten
You need as many cores as possible to speed up the conversion. Even very old CPUs with many cores can be faster than modern, higher-clocked CPUs with fewer cores.

It is possible to create a feature DCP at around real time - that is, 3 hours of conversion for a 3-hour feature - at modest cost.

https://dcpomatic.com/benchmarks/input.php?id=1



- Carsten