
I am trying to capture video from my webcam and send it over an RTSP stream using OpenCV in C++. I have not worked much with C++ before, so please excuse any mistakes. Below is my code that writes the webcam stream to a file, but I want to stream it to an RTSP server instead.

cv::VideoWriter virtualWebcam;
HRESULT hr = CoInitialize(NULL);
if (SUCCEEDED(hr)) {
    virtualWebcam.open(
        "./file.avi", cv::CAP_ANY,
        cv::VideoWriter::fourcc('M', 'J', 'P', 'G'),
        camera_frame_rate,
        cv::Size(static_cast<int>(camera_frame_width), static_cast<int>(camera_frame_height)),
        true);

    if (!virtualWebcam.isOpened()) {
        cerr << "Error opening virtual webcam\n";
        return 1;
    }
} else {
    cerr << "Error initializing COM library for virtual webcam\n";
    return 1;
}

I also tried virtualWebcam.open("rtsp://localhost:8554/stream", cv::CAP_FFMPEG, cv::VideoWriter::fourcc('M', 'J', 'P', 'G'), camera_frame_rate, cv::Size(static_cast<int>(camera_frame_width), static_cast<int>(camera_frame_height)), true); and also tried cv::CAP_GSTREAMER, but neither is working.

Any help is much appreciated. Thanks.

I want to send the webcam video stream to an RTSP server, or create an RTSP server here and send the stream over it.

  • You should take a look at GStreamer. AFAIK OpenCV alone is not enough for creating an RTSP server. Commented Apr 11, 2023 at 6:49
  • According to the following post, it's supposed to work with cv::CAP_GSTREAMER. Last time I tested, I could only capture the stream using GStreamer (it was not working with FFplay, for example). We don't have to execute CoInitialize. And I don't think RTSP supports the MJPG codec. I think I can post an answer that uses ffmpeg.exe as a sub-process if you want (writing OpenCV frames to the stdin pipe of FFmpeg).
    – Rotem
    Commented Apr 11, 2023 at 10:36
  • @Rotem Any answer would be helpful. If you can post an answer with FFmpeg, it could definitely help. Commented Apr 11, 2023 at 11:31
  • @AitreyaVerma Can you please respond to my answer? I did some reading and figured out that with the GStreamer backend, the transport protocol is RTP and not RTSP. For RTSP support we have to use GStreamer RTSP Server, as described in the following answer.
    – Rotem
    Commented Apr 18, 2023 at 11:59

1 Answer


Creating an RTSP stream with cv::VideoWriter is supported using the cv::CAP_GSTREAMER backend, but not with the cv::CAP_FFMPEG backend.
Using the GStreamer backend is complicated, and requires building OpenCV with GStreamer support.
The following post shows an example of creating an RTSP stream with the GStreamer backend.
For some reason the created stream can be captured using GStreamer, but can't be captured by other applications (I can't find what's missing).

Instead of cv::VideoWriter, we may use the FFmpeg CLI.
We may execute FFmpeg as a sub-process and write the video frames to its stdin pipe, using the same technique described in my following answer.


We may use FFmpeg command line as follows:

ffmpeg -re -f rawvideo -r 10 -video_size 640x480 -pixel_format bgr24 -i pipe: -vcodec libx264 -crf 24 -pix_fmt yuv420p -f rtsp rtsp://localhost:8554/stream

  • -re is used for "live streaming": it slows the transmission down to the frame rate (for simulating a "virtual webcam").
  • -r 10 -video_size 640x480 -pixel_format bgr24 defines 10 fps, 640x480 resolution, and the BGR pixel format.
  • -i pipe: defines the stdin pipe as FFmpeg's input.
  • -vcodec libx264 -crf 24 -pix_fmt yuv420p selects the H.264 codec with nominal quality (and YUV 4:2:0 chroma subsampling).
  • -f rtsp rtsp://localhost:8554/stream defines the RTSP output format, the output stream port, and the stream name.

For capturing the RTSP stream using FFplay, execute FFplay before executing the C++ code:

ffplay -rtsp_flags listen -i rtsp://localhost:8554/stream


The following C++ code sample sends synthetic OpenCV images to the RTSP video stream:

#include <stdio.h>
#include "opencv2/opencv.hpp"
#include <string>

//For receiving the RTSP video stream with FFplay, execute: "ffplay -rtsp_flags listen -i rtsp://localhost:8554/stream" before executing this program.
int main()
{
    // 10000 frames, resolution 640x480, and 10 fps
    int width = 640;
    int height = 480;
    int n_frames = 10000;
    int fps = 10;

    const std::string output_stream = "rtsp://localhost:8554/stream";   //Send RTSP to port 8554 of "localhost", with stream named "stream".

    //Open FFmpeg application as sub-process.
    //FFmpeg input PIPE : RAW images in BGR color format.
    //FFmpeg output: RTSP stream encoded with H.264 codec (using libx264 encoder).
    //Adding '-re' slows down the transmission to rate of the fps (for simulating a "virtual webcam").
    std::string ffmpeg_cmd = std::string("ffmpeg -re -f rawvideo -r ") + std::to_string(fps) +
        " -video_size " + std::to_string(width) + "x" + std::to_string(height) +
        " -pixel_format bgr24 -i pipe: -vcodec libx264 -crf 24 -pix_fmt yuv420p -f rtsp " + output_stream;

    //Execute FFmpeg as sub-process, open stdin pipe (of FFmpeg sub-process) for writing.
    //In Windows we need to use _popen and in Linux popen
#ifdef _MSC_VER
    FILE* pipeout = _popen(ffmpeg_cmd.c_str(), "wb");   //Windows (ffmpeg.exe must be in the execution path)
#else
    //https://batchloaf.wordpress.com/2017/02/12/a-simple-way-to-read-and-write-audio-and-video-files-in-c-using-ffmpeg-part-2-video/
    FILE* pipeout = popen(ffmpeg_cmd.c_str(), "w");     //Linux (assume ffmpeg exists in the execution path).
#endif

    if (pipeout == nullptr) {
        //popen/_popen return NULL when the sub-process could not be started.
        fprintf(stderr, "Error executing FFmpeg as sub-process\n");
        return 1;
    }

    cv::Mat frame = cv::Mat(height, width, CV_8UC3); //Initialize frame.

    for (int i = 0; i < n_frames; i++)
    {
        //Build synthetic image for testing ("render" a video frame):
        frame = cv::Scalar(60, 60, 60); //Fill background with dark gray
        cv::putText(frame, std::to_string(i+1), cv::Point(width/2 - 100*(int)(std::to_string(i+1).length()), height/2+100), cv::FONT_HERSHEY_DUPLEX, 10, cv::Scalar(255, 30, 30), 20);  // Draw a blue number
        //cv::imshow("frame", frame); cv::waitKey(1); //Show the frame for testing

        //Write width*height*3 bytes to stdin pipe of FFmpeg sub-process (assume frame data is continuous in the RAM).
        fwrite(frame.data, 1, (size_t)width*height*3, pipeout);
    }

#ifdef _MSC_VER
    _pclose(pipeout);   //Windows
#else
    pclose(pipeout);    //Linux
#endif

    return 0;
}

Note:
To avoid the need to execute FFplay in advance, we may run MediaMTX (formerly rtsp-simple-server) and keep it running in the background.
Then it's also possible to receive the stream using VLC media player (for example).

VLC example:
(screenshot of VLC playing the RTSP stream)
