【Title】: gstreamer, rendering rtspsrc in winforms (and WPF)
【Posted】: 2018-04-25 14:24:02
【Question】:

The application I am trying to write takes a video stream from a video server on my network and displays it in a WinForms window (later I want to host the same kind of control in WPF). I am using gstreamer-sharp, since my app is C#/.NET based.

I got videotestsrc working based on the code sample in this answer, and was able to create several instances of videotestsrc displayed in a window using a VideoOverlayAdapter and a set of WinForms panels.

When I moved on to doing the same with rtspsrc I naturally hit some obstacles that I am trying to overcome; the code for my class is below.

I believe that instead of linking rtspsrc in the initialisation code, I need to link rtspsrc's newly created pad to the next element (rtph264depay in this case) when the pad appears, and that is where I am running into trouble.

The PadAdded event sometimes fires within a few seconds of starting the program, and sometimes not at all. The server works fine with the gstreamer-sharp port of the basic tutorial (part 1), with good latency (comfortably under 300 ms, though I still need to do a glass-to-glass test once my application is running).

Once the PadAdded event does eventually fire, I get a NOFORMAT status when trying to link the new pad to the rtph264depay sink pad.
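A NOFORMAT result from a pad link usually means the caps of the two pads cannot intersect, for example because the pad rtspsrc produced is not carrying H.264 RTP at all. A minimal diagnostic sketch, assuming gstreamer-sharp exposes `Pad.CurrentCaps` and `Pad.CanLink` (bindings of `gst_pad_get_current_caps` and `gst_pad_can_link`; member names may differ slightly between binding versions):

```csharp
// Hedged diagnostic sketch: dump the caps of the pad rtspsrc hands us, so we
// can see whether it really is application/x-rtp with an H.264 payload.
private void Rtspsrc_PadAdded_Diag(object o, PadAddedArgs args)
{
    Pad newPad = args.NewPad;
    Caps caps = newPad.CurrentCaps;   // may be null before negotiation completes
    System.Diagnostics.Debug.WriteLine(
        "New pad '" + newPad.Name + "' caps: " +
        (caps != null ? caps.ToString() : "(not yet negotiated)"));

    // NOFORMAT from Pad.Link usually means the two pads' caps cannot
    // intersect - e.g. the stream is not H.264, so rtph264depay refuses it.
    Pad sinkPad = rtph264depay.GetStaticPad("sink");
    if (newPad.CanLink(sinkPad))
        System.Diagnostics.Debug.WriteLine("Pads report they can link");
    else
        System.Diagnostics.Debug.WriteLine("Pads report they CANNOT link (caps mismatch?)");
}
```

If the printed caps show something other than H.264 (or are still unnegotiated), that points at the depayloader choice rather than the linking code.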

I have also noticed that I never seem to receive the prepare-window-handle bus sync message, in which I would set up the VideoOverlayAdapter as in the gstVideoOverlay sample (even when the pad link succeeds).

I have not been able to find this particular problem discussed elsewhere (an rtspsrc pad failing to link to an rtph264depay sink pad), as similar questions seem to be about linking other elements together.

According to the debug output, the initial linking of the remaining elements in the initialisation code succeeds.

The end goal is to get the frames into OpenCV/Emgu for some analysis and basic overlay work.

Any help with this would be greatly appreciated.

Many thanks!

/// <summary>
/// class to create a gstreamer pipeline based on an rtsp stream at the provided URL
/// </summary>
class gstPipeline2
{
    // elements for the pipeline
    private Element rtspsrc, rtph264depay, decoder, videoConv, videoSink;
    private System.Threading.Thread mainGLibThread;
    private GLib.MainLoop mainLoop;

    // the window handle (passed in)
    private IntPtr windowHandle;
    // our pipeline
    private Pipeline currentPipeline = null;

    /// <summary>
    /// Create a new gstreamer pipeline rendering the stream at URL into the provided window handle 
    /// </summary>
    /// <param name="WindowHandle">The handle of the window to render to </param>
    /// <param name="Url">The url of the video stream</param>
    public gstPipeline2(IntPtr WindowHandle, string Url)
    {
        windowHandle = WindowHandle;    // get the handle and save it locally

        // initialise the gstreamer library and associated threads (for diagnostics)
        Gst.Application.Init(); 
        mainLoop = new GLib.MainLoop();
        mainGLibThread = new System.Threading.Thread(mainLoop.Run);
        mainGLibThread.Start();

        // create each element now for the pipeline
        // starting with the rtspsrc
        rtspsrc = ElementFactory.Make("rtspsrc", "udpsrc0");  // create an rtsp source
        rtspsrc["location"] = Url;   // and set its location (the source of the data)
        rtph264depay = ElementFactory.Make("rtph264depay", "rtph264depay0");    
        decoder = ElementFactory.Make("avdec_h264", "decoder0");    
        videoConv = ElementFactory.Make("videoconvert", "videoconvert0");   
        videoSink = ElementFactory.Make("autovideosink", "sink0");  // and finally the sink to render the video (redirected to the required window handle below in Bus_SyncMessage() ) 

        // create our pipeline which links all the elements together into a valid data flow
        currentPipeline = new Pipeline("pipeline");
        currentPipeline.Add(rtspsrc, rtph264depay, decoder, videoConv, videoSink); // add the required elements into it

        // link the various bits together in the correct order
        if(!rtph264depay.Link(decoder))
            System.Diagnostics.Debug.WriteLine("rtph264depay could not be linked to decoder (bad)");
        else
            System.Diagnostics.Debug.WriteLine("rtph264depay linked to decoder (good)");

        if (!decoder.Link(videoConv))
            System.Diagnostics.Debug.WriteLine("decoder could not be linked to videoconvert (bad)");
        else
            System.Diagnostics.Debug.WriteLine("decoder linked to videoconvert (good)");

        if (!videoConv.Link(videoSink))
            System.Diagnostics.Debug.WriteLine("videoconvert could not be linked to autovideosink (bad)");
        else
            System.Diagnostics.Debug.WriteLine("videoconvert linked to autovideosink (good)");

        rtspsrc.PadAdded += Rtspsrc_PadAdded; // subscribe to the PadAdded event so we can link new pads (sources of data?) to the depayloader when they arrive

        // subscribe to the messaging system of the bus and pipeline so we can monitor status as we go
        Bus bus = currentPipeline.Bus;
        bus.AddSignalWatch();
        bus.Message += Bus_Message;

        bus.EnableSyncMessageEmission();
        bus.SyncMessage += Bus_SyncMessage;

        // finally set the state of the pipeline running so we can get data
        var setStateReturn = currentPipeline.SetState(State.Null);
        System.Diagnostics.Debug.WriteLine("SetStateNULL returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Ready);
        System.Diagnostics.Debug.WriteLine("SetStateReady returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Playing);
        System.Diagnostics.Debug.WriteLine("SetStatePlaying returned: " + setStateReturn.ToString());
    }

    private void Rtspsrc_PadAdded(object o, PadAddedArgs args)
    {
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: called with new pad named: " + args.NewPad.Name);

        // a pad has been added to the source so we need to link it to the rest of the pipeline to ultimately display it onscreen
        Pad sinkPad = rtph264depay.GetStaticPad("sink");   // get the depayloader's sink pad so we can link the pad we have received to it
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: rtph264depay sink pad returned: " + sinkPad.Name);

        PadLinkReturn ret = args.NewPad.Link(sinkPad);
        System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: link attempt returned: " + ret.ToString());
    }

    public void killProcess()
    {
        mainLoop.Quit();
    }

    private void Bus_SyncMessage(object o, SyncMessageArgs args)
    {
        if (Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(args.Message))
        {
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Message prepare window handle received by: " + args.Message.Src.Name + " " + args.Message.Src.GetType().ToString());

            if (args.Message.Src != null)
            {
                // these checks were in the testvideosrc example and failed, args.Message.Src is always Gst.Element???
                if (args.Message.Src is Gst.Video.VideoSink)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is VideoSink");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT VideoSink");

                if (args.Message.Src is Gst.Bin)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is Bin");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT Bin");

                try
                {
                    args.Message.Src["force-aspect-ratio"] = true;
                }
                catch (PropertyNotFoundException) { }

                try
                {
                    Gst.Video.VideoOverlayAdapter adapter = new VideoOverlayAdapter(args.Message.Src.Handle);
                    adapter.WindowHandle = windowHandle;
                    adapter.HandleEvents(true);
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Handle passed to adapter: " + windowHandle.ToString());
                }
                catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Exception Thrown (overlay stage): " + ex.Message); }
            }
        }
        else
        {
            string info;
            IntPtr prt;
            args.Message.ParseInfo(out prt, out info);
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: " + args.Message.Type.ToString() + " - " + info);
        }
    }

    private void Bus_Message(object o, MessageArgs args)
    {
        var msg = args.Message;
        //System.Diagnostics.Debug.WriteLine("HandleMessage received msg of type: {0}", msg.Type);
        switch (msg.Type)
        {
            case MessageType.Error:
                GLib.GException err;
                string debug;
                msg.ParseError(out err, out debug);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Error received: " + err.Message + " ; debug info: " + debug);
                break;
            case MessageType.StreamStatus:
                Gst.StreamStatusType status;
                Element theOwner;
                msg.ParseStreamStatus(out status, out theOwner);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Case StreamStatus: status is: " + status + " ; Owner is: " + theOwner.Name);
                break;
            case MessageType.StateChanged:
                State oldState, newState, pendingState;
                msg.ParseStateChanged(out oldState, out newState, out pendingState);
                if (newState == State.Paused)
                    args.RetVal = false;
                System.Diagnostics.Debug.WriteLine("Bus_Message: Pipeline state changed from {0} to {1}: ; Pending: {2}", Element.StateGetName(oldState), Element.StateGetName(newState), Element.StateGetName(pendingState));
                break;
            case MessageType.Element:
                System.Diagnostics.Debug.WriteLine("Bus_Message: Element message: {0}", args.Message.ToString());
                break;
            default:
                System.Diagnostics.Debug.WriteLine("Bus_Message: HandleMessage received msg of type: {0}", msg.Type);
                break;
        }
        args.RetVal = true;
    }
}

【Comments】:

  • I think the main reason is that the elements don't know whether they can be connected until they are actually playing, simply because the format being received is unknown. Maybe try a decodebin or playbin element in this case? I imagine they exist for this particular reason. You should also then receive the sync/prepare message from the renderer. You also get some control over which video sink is used (if you use playbin).
  • Hi Florian, I did wonder whether I was approaching this the right way. My end goal is to analyse the images with Emgu/OpenCV, so I may skip ahead and use an appsink to get at the data. I know I can display frames in WPF with Emgu, as I've already done some work with a USB camera; the latency problem is what brought me to gstreamer, where I have seen some good results. I'll probably stick at it for a few more days, as it would be nice to answer my own question to close this off :) Thanks.
  • If you are interested in appsink, you may want to try playbin, set its uri property to your RTSP uri, and set its video-sink property to a previously instantiated appsink.
  • Hi Florian, yes I've just started experimenting with appsink. I managed to get the first stage working (see my answer below). I'll do some further work to get access to individual frames/samples so I can do the overlay/recording/analysis work I want to do next. Thanks for the tip!
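For anyone wanting to follow the appsink route suggested in these comments, a rough sketch of the playbin + appsink wiring. This is an untested outline: it assumes the Gst.App bindings are available and that ElementFactory.Make("appsink", ...) can be cast to Gst.App.AppSink in your binding version; the URL is a placeholder.

```csharp
// Hedged sketch of the appsink route: playbin delivers decoded frames to an
// appsink instead of rendering them, so they can be handed to Emgu/OpenCV.
var playbin = ElementFactory.Make("playbin", "playbin0");
var appSink = (Gst.App.AppSink)ElementFactory.Make("appsink", "appsink0");
appSink["emit-signals"] = true;                         // enable the NewSample signal
// Ask for raw BGR frames so they drop straight into an Emgu/OpenCV image.
appSink.Caps = Caps.FromString("video/x-raw,format=BGR");

playbin["uri"] = "rtsp://camera.example/stream";        // placeholder RTSP url
playbin["video-sink"] = appSink;

appSink.NewSample += (o, e) =>
{
    using (var sample = appSink.PullSample())
    {
        var buffer = sample.Buffer;
        MapInfo map;
        if (buffer.Map(out map, MapFlags.Read))
        {
            // map gives access to the raw BGR pixel data for this frame here
            buffer.Unmap(map);
        }
    }
};

playbin.SetState(State.Playing);
```

The exact MapInfo members (raw pointer vs managed byte array) differ between gstreamer-sharp releases, so check your binding before copying this verbatim.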

Tags: c# wpf winforms gstreamer


【Solution 1】:

OK, I managed to get past the problems I was having.

The first problem (PadAdded not being raised consistently) appears to have been resolved by building for x64 instead of Any CPU or x86. I suspect my installation of the gstreamer libraries was not done correctly.

The second problem (NOFORMAT when linking the new pad) took a bit more work. In the end I followed Florian's advice and looked into using uridecodebin as the source, linking its new pad directly to the autovideosink... no elements in between.

I now get a new pad added consistently, and the prepare-window-handle bus sync message is sent every time. I now have four independent IP streams going to four WinForms panels with good latency (glass-to-glass still being tested).

To make sure the latency was (somewhat) tuned, I had to dig into uridecodebin's source-setup signal and, assuming the source is of type rtspsrc, set its "latency" property. The code below does not validate the source type, so YMMV and you may get an exception here.
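A slightly more defensive variant of that handler could check what kind of element uridecodebin actually created before touching the property. This is a hedged sketch, assuming gstreamer-sharp exposes the element's factory via an `Element.Factory` property (binding of `gst_element_get_factory`); adjust the member name to your binding version:

```csharp
// Hedged sketch: only set "latency" if the source really is an rtspsrc,
// identified by its factory name, instead of assuming it and risking an
// exception for non-RTSP uris.
void SourceSetup(object sender, GLib.SignalArgs args)
{
    var source = (Element)args.Args[0];
    if (source.Factory != null && source.Factory.Name == "rtspsrc")
        source["latency"] = 0;   // jitterbuffer latency in milliseconds
    else
        System.Diagnostics.Debug.WriteLine(
            "SourceSetup: source is not rtspsrc, leaving latency alone");
}
```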

Please see the source code below for the class that worked for me (compiled for x64).

Hope this helps someone out there.

Now on to the application!! :)

/// <summary>
/// class to create a gstreamer pipeline based on an rtsp stream at the provided URL
/// </summary>
class gstPipeline2
{
    // elements for the pipeline
    private Element uriDecodeBin, videoSink;
    private System.Threading.Thread mainGLibThread;
    private GLib.MainLoop mainLoop;

    // the window handle (passed in)
    private IntPtr windowHandle;
    // our pipeline
    private Pipeline currentPipeline = null;

    /// <summary>
    /// Create a new gstreamer pipeline rendering the stream at URL into the provided window handle 
    /// </summary>
    /// <param name="WindowHandle">The handle of the window to render to </param>
    /// <param name="Url">The url of the video stream</param>
    public gstPipeline2(string Url, IntPtr WindowHandle)
    {
        windowHandle = WindowHandle;    // get the handle and save it locally

        // initialise the gstreamer library and associated threads (for diagnostics)
        Gst.Application.Init();

        mainLoop = new GLib.MainLoop();
        mainGLibThread = new System.Threading.Thread(mainLoop.Run);
        mainGLibThread.Start();

        // create each element now for the pipeline
        uriDecodeBin = ElementFactory.Make("uridecodebin", "uriDecodeBin0");  // create an uridecodebin (which handles most of the work for us!!)
        uriDecodeBin["uri"] = Url;   // and set its location (the source of the data)
        videoSink = ElementFactory.Make("autovideosink", "sink0");  // and finally the sink to render the video (redirected to the required window handle below in Bus_SyncMessage() ) 

        // create our pipeline which links all the elements together into a valid data flow
        currentPipeline = new Pipeline("pipeline");
        currentPipeline.Add(uriDecodeBin, videoSink); // add the required elements into it

        uriDecodeBin.PadAdded += uriDecodeBin_PadAdded; // subscribe to the PadAdded event so we can link new pads to the video sink when they arrive
        uriDecodeBin.Connect("source-setup", SourceSetup);  // subscribe to the "source-setup" signal; not quite the usual C# eventing pattern but treat it as essentially the same

        // subscribe to the messaging system of the bus and pipeline so we can monitor status as we go
        Bus bus = currentPipeline.Bus;
        bus.AddSignalWatch();
        bus.Message += Bus_Message;

        bus.EnableSyncMessageEmission();
        bus.SyncMessage += Bus_SyncMessage;

        // finally set the state of the pipeline running so we can get data
        var setStateReturn = currentPipeline.SetState(State.Null);
        System.Diagnostics.Debug.WriteLine("SetStateNULL returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Ready);
        System.Diagnostics.Debug.WriteLine("SetStateReady returned: " + setStateReturn.ToString());
        setStateReturn = currentPipeline.SetState(State.Playing);
        System.Diagnostics.Debug.WriteLine("SetStatePlaying returned: " + setStateReturn.ToString());
    }

    private void uriDecodeBin_PadAdded(object o, PadAddedArgs args)
    {
        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: called with new pad named: " + args.NewPad.Name);

        // a pad has been added to the source so we need to link it to the rest of the pipeline to ultimately display it onscreen
        Pad sinkPad = videoSink.GetStaticPad("sink");   // get the video sink's sink pad so we can link the pad we have received to it
        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: video sink pad returned: " + sinkPad.Name);

        PadLinkReturn ret = args.NewPad.Link(sinkPad);

        System.Diagnostics.Debug.WriteLine("uriDecodeBin_PadAdded: link attempt returned: " + ret.ToString());
    }

    void SourceSetup(object sender, GLib.SignalArgs args)
    {
        // we need to delve into the source portion of the uridecodebin to modify the "latency" property, need to add some validation here to ensure this is an rtspsrc
        var source = (Element)args.Args[0];
        System.Diagnostics.Debug.WriteLine("SourceSetup: source is named: " + source.Name + ", and is of type: " + source.NativeType.ToString());
        source["latency"] = 0;  // this COULD throw an exception if the source is not rtspsrc or similar with a "latency" property
    }

    public void killProcess()
    {
        mainLoop.Quit();
    }

    private void Bus_SyncMessage(object o, SyncMessageArgs args)
    {
        if (Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(args.Message))
        {
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Message prepare window handle received by: " + args.Message.Src.Name + " " + args.Message.Src.GetType().ToString());

            if (args.Message.Src != null)
            {
                // these checks were in the testvideosrc example and failed, args.Message.Src is always Gst.Element???
                if (args.Message.Src is Gst.Video.VideoSink)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is VideoSink");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT VideoSink");

                if (args.Message.Src is Gst.Bin)
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is Bin");
                else
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT Bin");

                try
                {
                    args.Message.Src["force-aspect-ratio"] = true;
                }
                catch (PropertyNotFoundException) { }

                try
                {
                    Gst.Video.VideoOverlayAdapter adapter = new VideoOverlayAdapter(args.Message.Src.Handle);
                    adapter.WindowHandle = windowHandle;
                    adapter.HandleEvents(true);
                    System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Handle passed to adapter: " + windowHandle.ToString());
                }
                catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Exception Thrown (overlay stage): " + ex.Message); }
            }
        }
        else
        {
            string info;
            IntPtr prt;
            args.Message.ParseInfo(out prt, out info);
            System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: " + args.Message.Type.ToString() + " - " + info);
        }
    }

    private void Bus_Message(object o, MessageArgs args)
    {
        var msg = args.Message;
        //System.Diagnostics.Debug.WriteLine("HandleMessage received msg of type: {0}", msg.Type);
        switch (msg.Type)
        {
            case MessageType.Error:
                GLib.GException err;
                string debug;
                msg.ParseError(out err, out debug);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Error received: " + err.Message + " ; debug info: " + debug);
                break;
            case MessageType.StreamStatus:
                Gst.StreamStatusType status;
                Element theOwner;
                msg.ParseStreamStatus(out status, out theOwner);
                System.Diagnostics.Debug.WriteLine("Bus_Message: Case StreamStatus: status is: " + status + " ; Owner is: " + theOwner.Name);
                break;
            case MessageType.StateChanged:
                State oldState, newState, pendingState;
                msg.ParseStateChanged(out oldState, out newState, out pendingState);
                if (newState == State.Paused)
                    args.RetVal = false;
                System.Diagnostics.Debug.WriteLine("Bus_Message: Pipeline state changed from {0} to {1}: ; Pending: {2}", Element.StateGetName(oldState), Element.StateGetName(newState), Element.StateGetName(pendingState));
                break;
            case MessageType.Element:
                System.Diagnostics.Debug.WriteLine("Bus_Message: Element message: {0}", args.Message.ToString());
                break;
            default:
                System.Diagnostics.Debug.WriteLine("Bus_Message: HandleMessage received msg of type: {0}", msg.Type);
                break;
        }
        args.RetVal = true;
    }
}

【Discussion】:
