Hi all, this is my first post here.
After months of studying and googling, I've managed to build a desktop video player:
- WPF/D3DImage for the frontend
- DirectX 11 for the rendering backend
The two worlds communicate through surface sharing: DirectX 11 renders into a shared texture so that D3DImage (D3D9) on the WPF side can present the frame.
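A trimmed-down sketch of how that sharing can be set up (names are illustrative; device creation and error handling are omitted):

```cpp
// Create a shared BGRA render target on the D3D11 side and open it
// on the D3D9Ex device used by WPF's D3DImage.
#include <d3d11.h>
#include <d3d9.h>
#include <dxgi.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateSharedRenderTarget(ID3D11Device* d3d11Device,
                              IDirect3DDevice9Ex* d3d9Device,
                              UINT width, UINT height,
                              ComPtr<ID3D11Texture2D>& d3d11Target,
                              ComPtr<IDirect3DSurface9>& d3d9Surface)
{
    // D3D11 side: a BGRA render target flagged as shared.
    // BGRA is used because it maps to D3DFMT_A8R8G8B8 on the D3D9 side.
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED;
    d3d11Device->CreateTexture2D(&desc, nullptr, &d3d11Target);

    // Retrieve the shared handle through the DXGI resource interface.
    ComPtr<IDXGIResource> dxgiResource;
    d3d11Target.As(&dxgiResource);
    HANDLE sharedHandle = nullptr;
    dxgiResource->GetSharedHandle(&sharedHandle);

    // D3D9Ex side: open the same texture by passing the shared handle.
    ComPtr<IDirect3DTexture9> d3d9Texture;
    d3d9Device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                              D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT,
                              &d3d9Texture, &sharedHandle);

    // The level-0 surface is what gets handed to D3DImage.SetBackBuffer
    // (through its IntPtr overload) on the WPF side.
    d3d9Texture->GetSurfaceLevel(0, &d3d9Surface);
}
```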
DirectX 11 is fed by a GStreamer decoder, which outputs I420 planar buffers.
During my research, I've come to understand that the preferred approach is to split the buffer into 3 textures (separating the Y, U and V components) and then …
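As an example of that split, the I420 buffer gets carved into its three planes and each plane is uploaded into its own DXGI_FORMAT_R8_UNORM texture (the YUV-to-RGB conversion then typically happens in a pixel shader that samples the three planes). A rough sketch under those assumptions, with illustrative names and no error handling:

```cpp
// Split one tightly packed I420 frame into three R8 textures (Y, U, V).
// Texture creation is shown once; per-frame upload uses UpdateSubresource.
#include <cstdint>
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

struct I420Planes
{
    ComPtr<ID3D11Texture2D> y, u, v;
};

static ComPtr<ID3D11Texture2D> CreatePlaneTexture(ID3D11Device* device,
                                                  UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8_UNORM;   // one byte per sample
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    ComPtr<ID3D11Texture2D> tex;
    device->CreateTexture2D(&desc, nullptr, &tex);
    return tex;
}

I420Planes CreateI420Textures(ID3D11Device* device, UINT width, UINT height)
{
    // I420: full-resolution Y plane, then U and V at half resolution each.
    I420Planes planes;
    planes.y = CreatePlaneTexture(device, width, height);
    planes.u = CreatePlaneTexture(device, width / 2, height / 2);
    planes.v = CreatePlaneTexture(device, width / 2, height / 2);
    return planes;
}

void UploadI420Frame(ID3D11DeviceContext* ctx, const I420Planes& planes,
                     const uint8_t* buffer, UINT width, UINT height)
{
    // Plane layout inside a tightly packed I420 buffer:
    //   [ Y: width*height ][ U: (width/2)*(height/2) ][ V: same size as U ]
    const uint8_t* yPlane = buffer;
    const uint8_t* uPlane = yPlane + width * height;
    const uint8_t* vPlane = uPlane + (width / 2) * (height / 2);

    // Row pitches assume tightly packed data; real GStreamer buffers may
    // carry per-plane strides (GstVideoInfo), which should be used instead.
    ctx->UpdateSubresource(planes.y.Get(), 0, nullptr, yPlane, width, 0);
    ctx->UpdateSubresource(planes.u.Get(), 0, nullptr, uPlane, width / 2, 0);
    ctx->UpdateSubresource(planes.v.Get(), 0, nullptr, vPlane, width / 2, 0);
}
```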