
For a video monitoring solution, I want to push thumbnails of an incoming video (TS/UDP) to an S3 bucket, from where those thumbnails are shown on a web page.

Currently the following bash loop is working:

while true; do
  ffmpeg -i udp://:5010 -vf "select='eq(pict_type,PICT_TYPE_I)'" -frames:v 1 -vsync vfr -qscale:v 15 -an -s 320x180 -f image2pipe - | aws s3 cp - s3://bucketname/prefix/preview.jpg
done

So it syncs to the incoming TS stream, grabs an I-frame, scales it to 320x180 and pipes it to the AWS command line, which uploads it to the bucket. ffmpeg stops after 1 frame and the while loop then starts it over again.

I'm now looking for a way to optimize this and run it continuously, without the loop, in order to get a smoother flow of thumbnails.

I tried the following:

ffmpeg -i udp://:5010 -vf "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr -qscale:v 15 -an -s 320x180 -f image2pipe - | aws s3 cp - s3://bucketname/prefix/preview.jpg

This of course pipes the thumbnails as one continuous stream into a single S3 object, which is never closed and therefore never appears.
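One workaround I'm considering is to put a small script between ffmpeg and S3: it would read the continuous image2pipe output, split it into individual JPEGs on the JPEG start/end-of-image markers, and upload each frame as its own object. A rough, untested Python sketch of that idea (assuming boto3; bucket and key are placeholders, and the marker-based splitting is naive):

#!/usr/bin/env python3
# Rough sketch (untested): run ffmpeg, split its image2pipe output into
# individual JPEGs on the SOI/EOI markers, and overwrite the same S3 key
# for every frame. Bucket and key are placeholders.
import subprocess
import boto3

BUCKET = "bucketname"
KEY = "prefix/preview.jpg"
SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

s3 = boto3.client("s3")

proc = subprocess.Popen(
    ["ffmpeg", "-i", "udp://:5010",
     "-vf", "select='eq(pict_type,PICT_TYPE_I)'",
     "-vsync", "vfr", "-qscale:v", "15", "-an", "-s", "320x180",
     "-f", "image2pipe", "-"],
    stdout=subprocess.PIPE,
)

buf = b""
while True:
    chunk = proc.stdout.read(65536)
    if not chunk:              # ffmpeg exited / stream ended
        break
    buf += chunk
    while True:                # upload every complete JPEG currently in the buffer
        start = buf.find(SOI)
        if start < 0:
            buf = b""
            break
        end = buf.find(EOI, start + 2)
        if end < 0:
            buf = buf[start:]  # keep the partial frame for the next read
            break
        frame = buf[start:end + 2]
        buf = buf[end + 2:]
        # Naive assumption: the EOI marker does not occur inside a frame.
        s3.put_object(Bucket=BUCKET, Key=KEY, Body=frame,
                      ContentType="image/jpeg")

I haven't measured whether the uploads can keep up with the I-frame rate, so I'm not sure this is actually better than the loop.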

If I write the file locally instead of piping it:

ffmpeg -i udp://:5010 -vf "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr -qscale:v 15 -an -s 320x180 -update 1 preview.jpg

I can see that the file is closed after each thumbnail and overwritten by the next picture.

So I'm looking for a similar way to push each thumbnail to S3 as an individual object (same name, so it gets overwritten).

I have considered using rclone/s3fs to mount the S3 bucket as a local path, but I'm concerned about how it would handle such a continuous stream of files: would it add latency, or upload only some of the files?

I can do some Python if needed.
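For example, one thing I have in mind is a small watcher that runs next to the "-update 1 preview.jpg" command above and re-uploads the file each time ffmpeg overwrites it. Another rough, untested sketch (assuming boto3; bucket, key and local path are placeholders, and the mtime polling can race with ffmpeg mid-write, so it is only a starting point):

#!/usr/bin/env python3
# Rough sketch (untested): re-upload preview.jpg whenever its mtime changes.
# Assumes the "-update 1 preview.jpg" ffmpeg command runs separately.
# Bucket, key and path are placeholders.
import os
import time
import boto3

BUCKET = "bucketname"
KEY = "prefix/preview.jpg"
PATH = "preview.jpg"

s3 = boto3.client("s3")
last_mtime = None

while True:
    try:
        mtime = os.path.getmtime(PATH)
    except FileNotFoundError:
        mtime = None           # ffmpeg has not written the file yet
    if mtime is not None and mtime != last_mtime:
        last_mtime = mtime
        s3.upload_file(PATH, BUCKET, KEY,
                       ExtraArgs={"ContentType": "image/jpeg"})
    time.sleep(0.2)

But I'm not sure either of these ideas is the right approach, hence the question.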

Thanks in advance
