I'm trying to fix a Blu-ray of a programme that was originally broadcast as 25p (in 50i) but which, annoyingly, was only released on Blu-ray as 60i with hard pulldown. (I originally returned it to Amazon and expressed my displeasure, but recently realised I might be able to save it, so I bought it again.)
I've been able to replicate the pulldown pattern (a mix of 2-2-2-3-2-3 and 2-2-3-2-2-3), so I know the luma can be recovered, but it looks to me as though the chroma on the interlaced fields has been blended. I just wanted to check with those more knowledgeable than myself that there really is nothing recoverable here and that I'm not missing something, as interlacing can be a bit of a minefield. All I've got for decoding VC-1 (in the sense of never having needed anything else!) is DGDecNV (2040/2042 - upgraded mid-project - using the 64-bit .exe and the 32-bit .dll), which I assume is doing everything by the book? Or could something my video card is doing be to blame? (NVIDIA NVS 4200M.)
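For anyone following along, here's a minimal sketch of how a repeating pulldown cadence like the ones above expands progressive frames into a field stream. It's illustrative only: real hard pulldown repeats individual fields of alternating parity rather than whole frames, and the function name and letter-frames are just made up for the example, but the cadence arithmetic is the same.

```python
import itertools

def apply_pulldown(frames, pattern):
    """Expand progressive frames into a field sequence using the given
    per-frame field counts, e.g. [2, 2, 3, 2, 2, 3]. The pattern repeats
    for as many frames as are supplied."""
    fields = []
    for frame, count in zip(frames, itertools.cycle(pattern)):
        fields.extend([frame] * count)  # emit `count` fields from this frame
    return fields

# Six progressive frames under a 2-2-3-2-2-3 cadence yield 14 fields:
print(apply_pulldown(list("ABCDEF"), [2, 2, 3, 2, 2, 3]))
```

Reversing this for luma is just a matter of spotting which fields are duplicates and keeping one field pair per original frame; the problem in my case is that the chroma of those "duplicate" fields isn't a clean duplicate but a blend.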
Here's a sample: http://horman.net/cut.demuxed.vc1
And here's a tiny crop, deinterlaced to even and odd fields in Photoshop:
The "white" bit is supposed to be a blue shirt collar.
I think that in theory I can recover per-frame chroma by a combination of subtracting and averaging neighbouring frames, but perhaps someone out there knows a cleaner way to recover the information?
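To make the subtract-and-average idea concrete, here's a rough sketch of what I have in mind, assuming (and it is only an assumption) that each blended chroma frame is a straight 50/50 average of two consecutive originals. Given one known-clean frame to anchor on (say, at a scene cut), the blend can be unwound recursively; the function name and the toy frames are mine, not anything from DGDecNV.

```python
import numpy as np

def unblend_chroma(blended, first):
    """Recover chroma frames, assuming blended[n] = (orig[n] + orig[n+1]) / 2.

    `blended` is a list of blended chroma planes (arrays); `first` is a
    known-clean plane for the frame before blended[0], e.g. from a scene cut.
    Rearranging the blend equation gives orig[n+1] = 2*blended[n] - orig[n],
    which we apply frame by frame.
    """
    recovered = [first.astype(np.float64)]
    for b in blended:
        recovered.append(2.0 * np.asarray(b, dtype=np.float64) - recovered[-1])
    # Clamp back to 8-bit range before returning.
    return [np.clip(f, 0, 255).astype(np.uint8) for f in recovered]
```

The obvious catch is that any error or noise in the anchor frame propagates (and alternates in sign) down the chain, so in practice you'd want to re-anchor at every cut and perhaps average forward and backward passes - hence my "combination of subtracting and averaging" hand-waving above.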
Support forum for DGDecNV