SEI data lags behind image frames when stored with PyNvEncoder and read out using PyNvDecoder #525
Ok, at least I found something like a workaround by running:

```python
import numpy as np
import PyNvCodec as nvc

def demux_video(video_file, gpu_id=0):
    print(f"Decoding {video_file} with standalone demuxer...")
    # Demux the container on the CPU, decode the elementary stream on the GPU.
    nvdemux = nvc.PyFFmpegDemuxer(video_file)
    nvdec = nvc.PyNvDecoder(nvdemux.Width(), nvdemux.Height(),
                            nvdemux.Format(), nvdemux.Codec(), gpu_id)

    frame_nv12 = np.ndarray((0,), dtype=np.uint8)
    sei = np.ndarray(0, dtype=np.uint8)
    packet = np.ndarray(0, dtype=np.uint8)
    enc_packet = nvc.PacketData()
    dec_packet = nvc.PacketData()

    num_decoded = 0
    while nvdemux.DemuxSinglePacket(packet, sei):
        # The demuxer hands back the SEI attached to this packet; the decoder
        # may not have a frame ready yet because of its internal delay.
        if nvdec.DecodeFrameFromPacket(frame_nv12, enc_packet, packet, dec_packet):
            num_decoded += 1
            frame_nv12 = frame_nv12.reshape(-1, nvdemux.Width())
            print(f" frame {num_decoded} = {frame_nv12[0, :15]} <-> SEI = {sei}")
        else:
            print(f" frame not ready <-> SEI = {sei}")

    # Drain the frames still buffered inside the decoder.
    while nvdec.FlushSingleFrame(frame_nv12, dec_packet):
        num_decoded += 1
        frame_nv12 = frame_nv12.reshape(-1, nvdemux.Width())
        print(f" frame {num_decoded} = {frame_nv12[0, :15]}")
```

which now prints:
It seems like a better explanation of why the SEI data lags behind the frames would still be helpful, though.
Hi @sakoay, it looks like frame reordering is happening: the encoder compresses video frames in an order different from the input order, which is done for better compression efficiency. To check this assumption you can initialize your encoder with options that avoid reordering. You can get the list of supported options here:

VideoProcessingFramework/src/PyNvCodec/src/PyNvCodec.cpp, lines 431 to 433 (commit 82b51e7)
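One way to test this hypothesis (a minimal sketch, not necessarily the option the comment intended; the `bf` key for the number of B-frames is an assumption about the option table referenced above) would be to create the encoder so that it cannot reorder frames:

```python
import PyNvCodec as nvc

gpu_id = 0
# Placeholder preset/codec/resolution; the key part is 'bf': '0'. With zero
# B-frames the encoder has no reason to emit frames out of input order, so if
# reordering is the cause, SEI and decoded frames should come back aligned.
nvenc = nvc.PyNvEncoder(
    {"preset": "P4", "codec": "h264", "s": "640x480", "bf": "0"},
    gpu_id,
)
```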
This might be related to issue 1262669666, but as there seems to be no resolution in that thread I am posting it again in case somebody can help.
I am trying to store user-data SEI paired with each encoded frame using `PyNvEncoder`, but upon reading out this SEI message using `PyNvDecoder` it seems to be out of sync with the frames. Specifically, the SEI data encoded for frame 3 is returned when decoding frame 1, the SEI encoded for frame 4 is returned when decoding frame 2, and so forth, and the last few decoded frames are returned with no SEI message. The code I am using to produce this effect is as follows:

Running this prints the following output (on a machine with an NVIDIA GeForce RTX 3070, but the same happens on an RTX 4090 GPU):

The frame image data is as expected, but as you can see the SEI data originally entered for frames 1 to 3 seems to have been "lost" upon decoding. I am not sure what is happening, but if I switch the `nvenc.EncodeSingleFrame()` parameter `sync` to `False`, this is what is printed instead:

As expected from asynchronous running, there are now extra packets at the end of the encoding loop that need to be flushed to the output file, and interestingly the number of packets flushed is the same as the number of "lost" SEI data.
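For reference, the encode/decode pattern being described corresponds roughly to the sketch below. It is not the original script: in particular the `sei=`/`sync=` keyword form of `EncodeSingleFrame`, the `DecodeSingleFrame(frame, sei)` overload, `FlushSinglePacket`, and the encoder option values are assumptions about the bindings used here.

```python
import numpy as np
import PyNvCodec as nvc

def sei_roundtrip(enc_file, width, height, num_frames, gpu_id=0):
    # Placeholder encoder options, not the author's settings.
    nvenc = nvc.PyNvEncoder(
        {"preset": "P4", "codec": "h264", "s": f"{width}x{height}"}, gpu_id)
    packet = np.ndarray(0, dtype=np.uint8)

    with open(enc_file, "wb") as dst:
        for idx in range(1, num_frames + 1):
            # Dummy NV12 frame; real code would supply camera or file frames.
            raw_nv12 = np.full((height * 3 // 2, width), idx, dtype=np.uint8)
            # User-data SEI tagged with the frame index.
            sei = np.frombuffer(f"frame {idx}".encode(), dtype=np.uint8)
            # With sync=True a packet is expected for every submitted frame;
            # with sync=False the encoder may hold frames back for reordering.
            if nvenc.EncodeSingleFrame(raw_nv12, packet, sei=sei, sync=True):
                dst.write(packet.tobytes())
        # With sync=False the held-back packets come out here; their count
        # matched the number of "lost" SEI messages described above.
        while nvenc.FlushSinglePacket(packet):
            dst.write(packet.tobytes())

    # Decode the file back and read the SEI attached to each decoded frame.
    nvdec = nvc.PyNvDecoder(enc_file, gpu_id)
    frame = np.ndarray(0, dtype=np.uint8)
    sei_out = np.ndarray(0, dtype=np.uint8)
    num_decoded = 0
    while nvdec.DecodeSingleFrame(frame, sei_out):
        num_decoded += 1
        # According to the report, the SEI printed here belongs to a later frame.
        print(f"frame {num_decoded}: SEI = {bytes(sei_out)}")
```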
I would be much obliged to receive any help on this issue! The VideoProcessingFramework has been a game changer in my efforts at high-speed encoding from multiple video sources, and I was so delighted that it supports SEI message storage.