[Posted]: 2022-01-20 03:19:16
[Question]:
I want to manage image acquisition from a camera directly with a GStreamer pipeline and OpenCV. I don't have the camera yet, so in the meantime I have been trying to grab video from a URI and from local files. I am using a Jetson AGX Xavier with L4T (Ubuntu 18.04); my OpenCV build includes GStreamer support, and both libraries seem to work fine on their own.
The problem I have is that when I pass a string defining the pipeline to the VideoCapture class with cv2.CAP_GSTREAMER, I get warnings like the following:
[WARN:0] global /home/nvidia/opencv/modules/videoio/src/cap_gstreamer.cpp (854) open OpenCV | GStreamer warning: Error opening bin: could not link playbin0 to whatever sink I define
[WARN:0] global /home/nvidia/opencv/modules/videoio/src/cap_gstreamer.cpp (597) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline has not been created
I have already tried several options, which you can see in the following code:
bool receiver(const char* context)
{
    VideoCapture cap(context, CAP_GSTREAMER);
    int fail = 0;
    while (!cap.isOpened())
    {
        cout << "VideoCapture not opened" << endl;
        fail++;
        if (fail > 10) {
            return false;
        }
        continue;
    }
    Mat frame;
    while (true) {
        cap.read(frame);
        if (frame.empty())
            return true;
        imshow("Receiver", frame);
        if (waitKey(1) == 'r')
            return false;
    }
    destroyWindow("Receiver");
    return true;
}
int main(int argc, char *argv[])
{
    GstElement *pipeline;
    const char* context = "gst-launch -v udpsrc port=5000 caps=\"application/x-rtp\" ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! ximagesink sync=false"; // Command for the camera that I don't have yet
    const char* test_context = "gst-launch playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm";
    const char* thermal_context = "playbin uri=file:///home/nvidia/repos/vidtest/thermalVideo.avi ! appsink name=thermalsink";
    const char* local_context = "playbin uri=file:///home/nvidia/repos/flir/Video.avi";
    // gst_init(&argc, &argv);
    // pipeline = gst_parse_launch(test_context, NULL);
    bool correct_execution = receiver(thermal_context);
    if (correct_execution) {
        cout << "openCV - gstreamer works!" << endl;
    } else {
        cout << "openCV - gstreamer FAILED" << endl;
    }
}
For the commands I have tested, the error isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline has not been created is persistent, and if I don't define an appsink, the error shown above changes to open OpenCV | GStreamer warning: cannot find appsink in manual pipeline. From the warnings I understand that the pipeline is incomplete or not created correctly, but I don't know why; I followed the examples I found online and they don't include any additional steps.
Also, when visualizing the stream with a GStreamer pipeline directly, everything seems to work when I open a local video, except that playback freezes on the first frame and the video never advances; it just stays stuck on that first frame. Do you know why this happens? With playbin uri pointing to an Internet address, everything works fine... The code is the following:
#include <gst/gst.h>
#include <unistd.h> // for the sleep function
#include <iostream>

using namespace std;

int main(int argc, char *argv[])
{
    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;
    const char* context = "gst-launch -v udpsrc port=5000 caps=\"application/x-rtp\" ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! ximagesink sync=false";
    const char* local_context = "gst-launch-1.0 -v playbin uri=file:///home/nvidia/repos/APPIDE/vidtest/THERMAL/thermalVideo.avi";
    const char* test_context = "gst-launch playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm";
    // Initialize GStreamer
    gst_init(&argc, &argv);
    // Create the pipeline from a terminal-style command (context)
    pipeline = gst_parse_launch(local_context, NULL);
    // Start the pipeline
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    // Wait until error or EOS (the returned message must be stored, otherwise
    // msg below is read uninitialized)
    bus = gst_element_get_bus(pipeline);
    msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE, (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    /* Free resources */
    if (msg != NULL)
        gst_message_unref(msg);
    // g_print(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
}
[Comments]: