
I have a framegrabber that I connect to a Windows PC. The capture input device is 1080p, 60 fps, 4:4:4 raw video. I want to use this in a test setup where a user works with an existing Windows application while I introduce image compression, reduced resolution, and a reduced framerate. The idea is to evaluate the user experience and let the user rate the different settings. My plan is to use ffmpeg to take the raw input stream, transcode it to MJPEG at different JPEG quality levels, and add a delay. While I can already access the capture device, change the resolution and reduce the framerate, I am not sure whether it's possible at all to introduce a delay, since most of the time people either want to add a video delay to compensate for an audio delay, or they want to reduce the video delay, not intentionally add one :-) This is how I currently handle the stream:

ffmpeg -f dshow -i video="framegrabber_capture_1" -vf format=yuv420p,scale=1280x1024,fps=5 -f sdl test

As a side note: I only need the pure video, audio is not needed in the test setup.

I haven't figured out how to transcode to MJPEG yet, but I guess that's doable. What I couldn't find at all was information about adding a video delay, as this would also mean the stream has to be cached somewhere. It would be great if somebody could point me in the right direction.

Thanks!

J.

1 Answer


FFmpeg is not designed for delaying displayed video, because FFmpeg is not a video player.

We can force FFmpeg to delay the video by concatenating a short synthetic video in front of the video from the camera, using the concat filter.
We also have to add the realtime filter to force FFmpeg to match the output rate to the input rate (without it, FFmpeg sends the video as fast as possible).

With MJPEG encoding we can't use -f sdl test, but we may pipe FFmpeg's output to FFplay for displaying the video.


Example for delaying the captured video by 5 seconds:

ffmpeg -an -f dshow -rtbufsize 1G -i video="framegrabber_capture_1" -filter_complex "color=white:size=1280x1024:rate=5:duration=5[w];[0:v]scale=1280x1024,setsar=1,fps=5[v0];[w][v0]concat,realtime=limit=10" -c:v mjpeg -pix_fmt yuvj420p -f mjpeg pipe: | ffplay pipe:


  • -rtbufsize 1G - Increases the size of the input buffer to 1 GByte (allows storing "many" input video frames).
  • color=white:size=1280x1024:rate=5:duration=5[w] - Creates a synthetic white video with a duration of 5 seconds at 5 fps, stored in [w].
  • [0:v]scale=1280x1024,setsar=1,fps=5[v0] - Scales the input video and sets the framerate, stored in [v0].
  • [w][v0]concat - Concatenates the 5-second white video with the scaled input video (the white video comes first, which is what produces the delay).
  • realtime=limit=10 - Slows the output down to the input rate (limit=10 sets the maximum pause, in seconds, that the filter tolerates).
  • -c:v mjpeg -f mjpeg - Encodes the output video with the MJPEG codec, and stores it in the raw MJPEG container format.
  • pipe: - Uses stdout as output.
  • | ffplay pipe: - Passes FFmpeg's stdout to FFplay's stdin (used as input).
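Conceptually, the concat trick works because FFmpeg has to buffer the live frames while the white lead-in clip plays out: a delay of D seconds at R fps just means holding D×R frames in a FIFO before releasing them. A toy sketch of that idea in Python (this is only an illustration of the buffering principle, not how FFmpeg is implemented; the frame values and parameters are made up):

```python
from collections import deque

def delayed_stream(frames, fps, delay_seconds):
    """Yield frames delayed by delay_seconds: each incoming frame is
    queued, and a filler frame is emitted until the queue holds enough
    frames, after which the oldest buffered frame is released."""
    delay_frames = int(fps * delay_seconds)  # number of frames to hold back
    buffer = deque()
    for frame in frames:
        buffer.append(frame)
        if len(buffer) > delay_frames:
            yield buffer.popleft()  # a real frame, delay_frames late
        else:
            yield "white"           # filler, like the white lead-in clip

# 5 fps with a 1-second delay: the first 5 outputs are filler frames,
# then the buffered input frames follow in their original order.
out = list(delayed_stream(range(10), fps=5, delay_seconds=1))
# → ['white', 'white', 'white', 'white', 'white', 0, 1, 2, 3, 4]
```

This also makes clear why -rtbufsize matters: the delayed frames have to live somewhere, and at high resolutions the FIFO grows large quickly.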

Note:
The delay is not accurate, because there is an additional built-in delay.

We may (try to) improve the accuracy of the delay by using a higher framerate and adding setpts=0 to the FFplay command:

ffmpeg -an -f dshow -rtbufsize 1G -i video="Microsoft® LifeCam HD-3000" -filter_complex "color=white:size=1280x1024:rate=25:duration=5[w];[0:v]scale=1280x1024,setsar=1,fps=25[v0];[w][v0]concat,realtime=limit=10" -c:v mjpeg -pix_fmt yuvj420p -f mjpeg pipe: | ffplay -vf setpts=0 pipe:

  • The input is live, so realtime filter is superfluous.
    – Gyan
    Commented Feb 27 at 3:55
  • @Gyan without the realtime filter, the delay gets shorter over time. It's not working without it.
    – Rotem
    Commented Feb 27 at 8:35
  • I tried this, but the experience is different than what I was looking for. The UI from ffplay doesn't show up for 30 s. Then it looks like the buffered 30 s webcam stream is played back within 5 seconds. Afterwards I see the realtime webcam stream without delay.
    – Juergen
    Commented Feb 27 at 9:31
  • @Juergen, You are right that it is taking forever for FFplay to start playing, but after it starts (with my webcam), the delay is 5 seconds constantly. What is the framerate of your input video source? Is it constant or variable framerate? Is it a real webcam, or simulated webcam? The solution is working with my "Microsoft® LifeCam HD-3000" webcam. I can't guess the reason your system behavior is different. I tested using FFmpeg and FFplay version 5.1.2-full_build-www.gyan.dev.
    – Rotem
    Commented Feb 27 at 10:34
  • Try: ffmpeg -an -f dshow -rtbufsize 1G -framerate 10 -i video="Microsoft® LifeCam HD-3000" -f lavfi -i testsrc=size=1280x1024:rate=5:duration=5 -filter_complex "[0:v]fps=5,setpts='(N+25)/5/TB',scale=1280x1024,setsar=1[v0];[1:v][v0]concat,realtime=10" -c:v mjpeg -pix_fmt yuvj420p -f mjpeg pipe: | ffplay pipe:
    – Rotem
    Commented Feb 27 at 11:20
