【Posted】: 2018-04-25 14:24:02
【Problem Description】:
The application I'm trying to write takes a video stream from a video server on my network and displays it in a WinForms window (later I'd like to host the same kind of control in WPF). I'm using gstreamer-sharp since my app is C#/.NET based.
I got videotestsrc working from the code sample in this answer, and was able to create several videotestsrc instances displayed in a window using a VideoOverlayAdapter and a set of WinForms panels.
When I started to make rtspsrc do the same thing I naturally hit a few roadblocks that I'm trying to work through; the code for my class is below.
I believe that, rather than linking rtspsrc in the initialisation code, I need to link rtspsrc's new pad to the next element (rtph264depay in this case) when it appears, and this is where I'm having trouble.
The PadAdded event sometimes seems to fire within a few seconds of starting the program, and sometimes not at all. The server works well with the gstreamer-sharp version of the basic tutorial (part 1) and has good latency (easily under 300 ms, though I need to do a glass-to-glass test once my app is working).
Once the PadAdded event does finally fire, I get a NOFORMAT status when trying to link the new pad to the rtph264depay sink pad.
I've also noticed that I never seem to receive the prepare-window-handle bus sync message, in which (as in the gstVideoOverlay example) I would set up the VideoOverlayAdapter (so even if the pad link succeeded, the video would have nowhere to render).
I haven't been able to find this particular problem (an rtspsrc pad failing to link to the rtph264depay sink pad) elsewhere; similar questions seem to be about linking other elements together.
According to the debug output, the initial links between the remaining elements in the initialisation code all succeed.
The end goal is to get the frames into OpenCV/Emgu and do some analysis and basic overlay work.
Any help with this would be greatly appreciated.
Many thanks!
using System;
using Gst;
using Gst.Video;

/// <summary>
/// Class to create a GStreamer pipeline that renders the RTSP stream at the provided URL
/// </summary>
class gstPipeline2
{
// elements for the pipeline
private Element rtspsrc, rtph264depay, decoder, videoConv, videoSink;
private System.Threading.Thread mainGLibThread;
private GLib.MainLoop mainLoop;
// the window handle (passed in)
private IntPtr windowHandle;
// our pipeline
private Pipeline currentPipeline = null;
/// <summary>
/// Create a new gstreamer pipeline rendering the stream at URL into the provided window handle
/// </summary>
/// <param name="WindowHandle">The handle of the window to render to </param>
/// <param name="Url">The url of the video stream</param>
public gstPipeline2(IntPtr WindowHandle, string Url)
{
windowHandle = WindowHandle; // get the handle and save it locally
// initialise the gstreamer library and associated threads (for diagnostics)
Gst.Application.Init();
mainLoop = new GLib.MainLoop();
mainGLibThread = new System.Threading.Thread(mainLoop.Run);
mainGLibThread.Start();
// create each element now for the pipeline
// starting with the rtspsrc
rtspsrc = ElementFactory.Make("rtspsrc", "rtspsrc0"); // create an rtsp source
rtspsrc["location"] = Url; // and set its location (the source of the data)
rtph264depay = ElementFactory.Make("rtph264depay", "rtph264depay0");
decoder = ElementFactory.Make("avdec_h264", "decoder0");
videoConv = ElementFactory.Make("videoconvert", "videoconvert0");
videoSink = ElementFactory.Make("autovideosink", "sink0"); // and finally the sink to render the video (redirected to the required window handle below in Bus_SyncMessage() )
// create our pipeline which links all the elements together into a valid data flow
currentPipeline = new Pipeline("pipeline");
currentPipeline.Add(rtspsrc, rtph264depay, decoder, videoConv, videoSink); // add the required elements into it
// link the various bits together in the correct order
if(!rtph264depay.Link(decoder))
System.Diagnostics.Debug.WriteLine("rtph264depay could not be linked to decoder (bad)");
else
System.Diagnostics.Debug.WriteLine("rtph264depay linked to decoder (good)");
if (!decoder.Link(videoConv))
System.Diagnostics.Debug.WriteLine("decoder could not be linked to videoconvert (bad)");
else
System.Diagnostics.Debug.WriteLine("decoder linked to videoconvert (good)");
if (!videoConv.Link(videoSink))
System.Diagnostics.Debug.WriteLine("videoconvert could not be linked to autovideosink (bad)");
else
System.Diagnostics.Debug.WriteLine("videoconvert linked to autovideosink (good)");
rtspsrc.PadAdded += Rtspsrc_PadAdded; // subscribe to the PadAdded event so we can link new pads to the depayloader as they arrive
// subscribe to the messaging system of the bus and pipeline so we can monitor status as we go
Bus bus = currentPipeline.Bus;
bus.AddSignalWatch();
bus.Message += Bus_Message;
bus.EnableSyncMessageEmission();
bus.SyncMessage += Bus_SyncMessage;
// finally set the state of the pipeline running so we can get data
var setStateReturn = currentPipeline.SetState(State.Null);
System.Diagnostics.Debug.WriteLine("SetStateNULL returned: " + setStateReturn.ToString());
setStateReturn = currentPipeline.SetState(State.Ready);
System.Diagnostics.Debug.WriteLine("SetStateReady returned: " + setStateReturn.ToString());
setStateReturn = currentPipeline.SetState(State.Playing);
System.Diagnostics.Debug.WriteLine("SetStatePlaying returned: " + setStateReturn.ToString());
}
private void Rtspsrc_PadAdded(object o, PadAddedArgs args)
{
System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: called with new pad named: " + args.NewPad.Name);
// a pad has been added to the source so we need to link it to the rest of the pipeline to ultimately display it onscreen
Pad sinkPad = rtph264depay.GetStaticPad("sink"); // get the depayloader's sink pad so we can link the newly received pad to it
System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: rtph264depay sink pad returned: " + sinkPad.Name);
PadLinkReturn ret = args.NewPad.Link(sinkPad);
System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAdded: link attempt returned: " + ret.ToString());
}
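// Untested sketch (not part of the original question): rtspsrc typically exposes one
// recv_rtp_src pad per RTP session (video, audio, ...), and linking a pad whose caps
// are not application/x-rtp H264 video to rtph264depay fails with NOFORMAT.
// Checking the new pad's caps before linking, and linking only once, avoids that:
private void Rtspsrc_PadAddedChecked(object o, PadAddedArgs args)
{
Caps caps = args.NewPad.CurrentCaps; // caps of the newly created pad (null if not yet known)
if (caps == null || caps.Size == 0)
return;
Structure s = caps.GetStructure(0);
// only link pads that actually carry H264 video RTP payload
if (s.GetString("media") != "video" || s.GetString("encoding-name") != "H264")
return;
Pad sinkPad = rtph264depay.GetStaticPad("sink");
if (!sinkPad.IsLinked) // rtspsrc can add several pads; link the depayloader only once
System.Diagnostics.Debug.WriteLine("Rtspsrc_PadAddedChecked: link returned: " + args.NewPad.Link(sinkPad).ToString());
}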
public void killProcess()
{
currentPipeline.SetState(State.Null); // bring the pipeline down cleanly before quitting the loop
mainLoop.Quit();
}
private void Bus_SyncMessage(object o, SyncMessageArgs args)
{
if (Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(args.Message))
{
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Message prepare window handle received by: " + args.Message.Src.Name + " " + args.Message.Src.GetType().ToString());
if (args.Message.Src != null)
{
// these checks were in the videotestsrc example and failed; args.Message.Src is always Gst.Element???
if (args.Message.Src is Gst.Video.VideoSink)
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is VideoSink");
else
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT VideoSink");
if (args.Message.Src is Gst.Bin)
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is Bin");
else
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: source is NOT Bin");
try
{
args.Message.Src["force-aspect-ratio"] = true;
}
catch (PropertyNotFoundException) { }
try
{
Gst.Video.VideoOverlayAdapter adapter = new VideoOverlayAdapter(args.Message.Src.Handle);
adapter.WindowHandle = windowHandle;
adapter.HandleEvents(true);
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Handle passed to adapter: " + windowHandle.ToString());
}
catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: Exception Thrown (overlay stage): " + ex.Message); }
}
}
else
{
string info;
IntPtr ptr;
args.Message.ParseInfo(out ptr, out info);
System.Diagnostics.Debug.WriteLine("Bus_SyncMessage: " + args.Message.Type.ToString() + " - " + info);
}
}
private void Bus_Message(object o, MessageArgs args)
{
var msg = args.Message;
//System.Diagnostics.Debug.WriteLine("HandleMessage received msg of type: {0}", msg.Type);
switch (msg.Type)
{
case MessageType.Error:
GLib.GException err;
string debug;
msg.ParseError(out err, out debug); // extract the error and debug detail from the message
System.Diagnostics.Debug.WriteLine("Bus_Message: Error received: " + err.Message + " - " + debug);
break;
case MessageType.StreamStatus:
Gst.StreamStatusType status;
Element theOwner;
msg.ParseStreamStatus(out status, out theOwner);
System.Diagnostics.Debug.WriteLine("Bus_Message: Case StreamStatus: status is: " + status + " ; Owner is: " + theOwner.Name);
break;
case MessageType.StateChanged:
State oldState, newState, pendingState;
msg.ParseStateChanged(out oldState, out newState, out pendingState);
if (newState == State.Paused)
args.RetVal = false;
System.Diagnostics.Debug.WriteLine("Bus_Message: Pipeline state changed from {0} to {1}: ; Pending: {2}", Element.StateGetName(oldState), Element.StateGetName(newState), Element.StateGetName(pendingState));
break;
case MessageType.Element:
System.Diagnostics.Debug.WriteLine("Bus_Message: Element message: {0}", args.Message.ToString());
break;
default:
System.Diagnostics.Debug.WriteLine("Bus_Message: HandleMessage received msg of type: {0}", msg.Type);
break;
}
args.RetVal = true;
}
}
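A minimal usage sketch for the class above (panel1 and the RTSP URL are placeholders, not part of the original code):
// inside a WinForms form, e.g. in the Load event handler
gstPipeline2 pipeline = new gstPipeline2(panel1.Handle, "rtsp://192.168.1.100:8554/stream");
// ... and when the form closes:
pipeline.killProcess();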
【Discussion】:
-
I think the main reason is that the elements don't know whether they can be connected until they are actually playing, simply because the incoming format is unknown until then. Maybe try a decodebin or playbin element in this case? I imagine they exist for exactly this reason. You should then also receive the sync/prepare message from the renderer. You also get some control over which video sink is used (if you use playbin).
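A rough sketch of that decodebin variant (untested; decodebin, like rtspsrc, only exposes its source pads once it has seen the stream, so both dynamic links happen in handlers, and the caps/linked checks from above still apply):
// untested sketch: rtspsrc -> decodebin -> videoconvert -> autovideosink
Element src = ElementFactory.Make("rtspsrc", "src0");
Element decode = ElementFactory.Make("decodebin", "decode0");
Element conv = ElementFactory.Make("videoconvert", "conv0");
Element sink = ElementFactory.Make("autovideosink", "sink0");
src["location"] = "rtsp://..."; // your stream URL here
Pipeline p = new Pipeline("pipeline0");
p.Add(src, decode, conv, sink);
conv.Link(sink); // only the static part of the chain can be linked up front
src.PadAdded += (o, a) => a.NewPad.Link(decode.GetStaticPad("sink"));
decode.PadAdded += (o, a) => a.NewPad.Link(conv.GetStaticPad("sink"));
p.SetState(State.Playing);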
-
Hi Florian, I did wonder whether I was approaching it the right way; my end goal is to run analysis on the images with Emgu/OpenCV, so I may skip ahead and use an appsink to get at the data. I know I can display the frames in WPF using Emgu, as I've already done some work with a USB camera. The latency problem is what sent me to gstreamer, and I've seen some good results. I'll probably stick with this a few more days, as it would be nice to answer my own question to close it off :) Thanks.
-
If you're interested in appsink, you might want to try playbin: set its uri property to your RTSP URI, and its video-sink property to a previously instantiated appsink.
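For example (an untested sketch; Gst.App.AppSink, the emit-signals/uri/video-sink properties and PullSample are standard GStreamer API, but the URL is a placeholder):
// untested sketch: playbin decoding the stream and handing frames to an appsink
Gst.App.AppSink appSink = (Gst.App.AppSink)ElementFactory.Make("appsink", "appsink0");
appSink["emit-signals"] = true; // make appsink raise NewSample for each frame
Element playbin = ElementFactory.Make("playbin", "playbin0");
playbin["uri"] = "rtsp://..."; // your stream URL here
playbin["video-sink"] = appSink; // divert decoded video into the appsink
appSink.NewSample += (o, a) =>
{
using (Gst.Sample sample = appSink.PullSample()) // one decoded video frame
{
// sample.Buffer holds the raw frame data; map it and hand it to Emgu/OpenCV here
}
a.RetVal = FlowReturn.Ok;
};
playbin.SetState(State.Playing);
-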
Hi Florian, yes, I've just started experimenting with appsink. I've managed to get the first stage working (see my answer below). I'll do some further work on accessing the individual frames/samples so I can do the overlay/recording/analysis work I want to do next. Thanks for the tips!