【Question Title】: Android WebRTC: record video from the stream coming from the other peer
【Posted】: 2020-07-29 14:31:17
【Question】:

I am developing a WebRTC video-call Android application, and it works fine. I need to record the video of the other peer (remoteVideoStream) and of my own stream (localVideoStream), and convert it to some saveable format such as mp4 (or any other format). I have really searched, but I couldn't figure out how to get this done.

I have read about VideoFileRenderer and tried to add it to my code to save the video, but I couldn't use it either: it has no callable method such as record() or save(), although it does have a method called release() that is used to finish saving the video. Here is the class, in case anyone has any ideas:

@JNINamespace("webrtc::jni")
public class VideoFileRenderer implements Callbacks, VideoSink {
private static final String TAG = "VideoFileRenderer";
private final HandlerThread renderThread;
private final Handler renderThreadHandler;
private final FileOutputStream videoOutFile;
private final String outputFileName;
private final int outputFileWidth;
private final int outputFileHeight;
private final int outputFrameSize;
private final ByteBuffer outputFrameBuffer;
private EglBase eglBase;
private YuvConverter yuvConverter;
private ArrayList<ByteBuffer> rawFrames = new ArrayList<>();

public VideoFileRenderer(String outputFile, int outputFileWidth, int outputFileHeight, final Context sharedContext) throws IOException {
    if (outputFileWidth % 2 != 0 || outputFileHeight % 2 != 0) {
        throw new IllegalArgumentException("Does not support uneven width or height");
    }
    this.outputFileName = outputFile;
    this.outputFileWidth = outputFileWidth;
    this.outputFileHeight = outputFileHeight;
    // One I420 frame: full-size Y plane plus quarter-size U and V planes.
    this.outputFrameSize = outputFileWidth * outputFileHeight * 3 / 2;
    this.outputFrameBuffer = ByteBuffer.allocateDirect(this.outputFrameSize);
    this.videoOutFile = new FileOutputStream(outputFile);
    // Y4M stream header: C420 raw frames, 30 fps, square pixels.
    this.videoOutFile.write(("YUV4MPEG2 C420 W" + outputFileWidth + " H" + outputFileHeight + " Ip F30:1 A1:1\n").getBytes(Charset.forName("US-ASCII")));
    this.renderThread = new HandlerThread("VideoFileRenderer");
    this.renderThread.start();
    this.renderThreadHandler = new Handler(this.renderThread.getLooper());
    ThreadUtils.invokeAtFrontUninterruptibly(this.renderThreadHandler, new Runnable() {
        public void run() {
            VideoFileRenderer.this.eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
            VideoFileRenderer.this.eglBase.createDummyPbufferSurface();
            VideoFileRenderer.this.eglBase.makeCurrent();
            VideoFileRenderer.this.yuvConverter = new YuvConverter();
        }
    });
}

public void renderFrame(I420Frame i420Frame) {
    VideoFrame frame = i420Frame.toVideoFrame();
    this.onFrame(frame);
    frame.release();
}

public void onFrame(VideoFrame frame) {
    frame.retain();
    this.renderThreadHandler.post(() -> {
        this.renderFrameOnRenderThread(frame);
    });
}

private void renderFrameOnRenderThread(VideoFrame frame) {
    Buffer buffer = frame.getBuffer();
    int targetWidth = frame.getRotation() % 180 == 0 ? this.outputFileWidth : this.outputFileHeight;
    int targetHeight = frame.getRotation() % 180 == 0 ? this.outputFileHeight : this.outputFileWidth;
    float frameAspectRatio = (float)buffer.getWidth() / (float)buffer.getHeight();
    float fileAspectRatio = (float)targetWidth / (float)targetHeight;
    int cropWidth = buffer.getWidth();
    int cropHeight = buffer.getHeight();
    if (fileAspectRatio > frameAspectRatio) {
        cropHeight = (int)((float)cropHeight * (frameAspectRatio / fileAspectRatio));
    } else {
        cropWidth = (int)((float)cropWidth * (fileAspectRatio / frameAspectRatio));
    }

    int cropX = (buffer.getWidth() - cropWidth) / 2;
    int cropY = (buffer.getHeight() - cropHeight) / 2;
    Buffer scaledBuffer = buffer.cropAndScale(cropX, cropY, cropWidth, cropHeight, targetWidth, targetHeight);
    frame.release();
    I420Buffer i420 = scaledBuffer.toI420();
    scaledBuffer.release();
    ByteBuffer byteBuffer = JniCommon.nativeAllocateByteBuffer(this.outputFrameSize);
    YuvHelper.I420Rotate(i420.getDataY(), i420.getStrideY(), i420.getDataU(), i420.getStrideU(), i420.getDataV(), i420.getStrideV(), byteBuffer, i420.getWidth(), i420.getHeight(), frame.getRotation());
    i420.release();
    byteBuffer.rewind();
    this.rawFrames.add(byteBuffer);
}

public void release() {
    CountDownLatch cleanupBarrier = new CountDownLatch(1);
    this.renderThreadHandler.post(() -> {
        this.yuvConverter.release();
        this.eglBase.release();
        this.renderThread.quit();
        cleanupBarrier.countDown();
    });
    ThreadUtils.awaitUninterruptibly(cleanupBarrier);

    try {
        for (ByteBuffer buffer : this.rawFrames) {
            this.videoOutFile.write("FRAME\n".getBytes(Charset.forName("US-ASCII")));
            byte[] data = new byte[this.outputFrameSize];
            buffer.get(data);
            this.videoOutFile.write(data);
            JniCommon.nativeFreeByteBuffer(buffer);
        }

        this.videoOutFile.close();
        Logging.d("VideoFileRenderer", "Video written to disk as " + this.outputFileName + ". Number frames are " + this.rawFrames.size() + " and the dimension of the frames are " + this.outputFileWidth + "x" + this.outputFileHeight + ".");
    } catch (IOException var5) {
        Logging.e("VideoFileRenderer", "Error writing video to disk", var5);
    }

}

}

I couldn't find any useful method in it.

【Comments】:

  • Did you ever get this working?

Tags: android webrtc


【Solution 1】:

The VideoFileRenderer class only demonstrates how to get access to the decoded raw video frames of the remote/local peer. It does not record a valid video file.
You should implement the logic of encoding the raw video frames and muxing them into a container, e.g. mp4, manually.
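To make the distinction concrete: what the class above writes is a Y4M file, i.e. a plain-text header followed by concatenated uncompressed I420 frames of w*h*3/2 bytes each, with no encoding at all. A minimal pure-Java sketch of that layout (helper names here are illustrative, not part of the class):

```java
import java.nio.charset.StandardCharsets;

public class Y4mLayout {
    // One I420 frame: full-resolution Y plane plus quarter-resolution U and V planes.
    static int i420FrameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // The stream header written once at the start of the file (30 fps, square pixels),
    // matching the header string in the VideoFileRenderer constructor.
    static byte[] y4mHeader(int width, int height) {
        String header = "YUV4MPEG2 C420 W" + width + " H" + height + " Ip F30:1 A1:1\n";
        return header.getBytes(StandardCharsets.US_ASCII);
    }

    public static void main(String[] args) {
        System.out.println(i420FrameSize(640, 480)); // 460800 bytes per raw frame
        System.out.println(new String(y4mHeader(640, 480), StandardCharsets.US_ASCII).trim());
    }
}
```

At 30 fps that is roughly 13 MB per second of video, which is why the raw frames have to go through an encoder before they become a useful mp4.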

The main flow looks like this:

  • Switch to the newest webrtc version (v1.0.25331 at the moment)
  • Create a video container; e.g. see MediaMuxer from the Android SDK
  • Implement the interface VideoSink to obtain raw frames from a certain video source; e.g. see the class ProxyVideoSink in apprtc/CallActivity.java
  • Encode every frame using MediaCodec and write it into the video container
  • Finalize the muxer
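The steps above can be sketched roughly as follows. This is an illustrative outline, not the actual apprtc code: the class name RecordingSink is mine, stride handling when copying the I420 planes is omitted, and bitrate/fps values are arbitrary. It assumes the org.webrtc VideoSink/VideoFrame API and android.media MediaCodec/MediaMuxer:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import org.webrtc.VideoFrame;
import org.webrtc.VideoSink;

import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch: receive raw frames via VideoSink, encode them to H.264 with
// MediaCodec, and mux the encoded samples into an mp4 with MediaMuxer.
class RecordingSink implements VideoSink {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private int trackIndex = -1;
    private boolean muxerStarted = false;
    private long frameIndex = 0;

    RecordingSink(String outputPath, int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain();
        VideoFrame.I420Buffer i420 = frame.getBuffer().toI420();
        int inputIndex = encoder.dequeueInputBuffer(10_000);
        if (inputIndex >= 0) {
            ByteBuffer input = encoder.getInputBuffer(inputIndex);
            // Copy the Y, U and V planes into the codec input (stride handling omitted).
            input.put(i420.getDataY());
            input.put(i420.getDataU());
            input.put(i420.getDataV());
            long ptsUs = 1_000_000L * frameIndex++ / 30; // 30 fps timestamps
            encoder.queueInputBuffer(inputIndex, 0, input.position(), ptsUs, 0);
        }
        i420.release();
        frame.release();
        drainEncoder();
    }

    private void drainEncoder() {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = encoder.dequeueOutputBuffer(info, 0);
        while (outIndex >= 0 || outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The track can only be added once the encoder reports its real format.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else {
                ByteBuffer out = encoder.getOutputBuffer(outIndex);
                if (muxerStarted && info.size > 0) {
                    muxer.writeSampleData(trackIndex, out, info);
                }
                encoder.releaseOutputBuffer(outIndex, false);
            }
            outIndex = encoder.dequeueOutputBuffer(info, 0);
        }
    }

    void stop() {
        encoder.stop();
        encoder.release();
        if (muxerStarted) {
            muxer.stop();
        }
        muxer.release();
    }
}
```

The sink would then be attached with videoTrack.addSink(recordingSink) and detached before calling stop().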

【Comments】:

  • If you had come 3 days earlier I would have given you the 50-reputation bounty; unfortunately it has already expired. Thanks anyway
  • Any idea how the equivalent works for the audio stream? I have a related question.
  • @jackz314 Check apprtc/RecordedAudioToFileController.java and how it is used in PeerConnectionClient.java
  • @Onix please, could you answer this question: stackoverflow.com/questions/55201774/…
【Solution 2】:

To be able to record the video I had to do what @Onix said, but luckily I found that there are already many implementations out there; this is the one I chose: https://chromium.googlesource.com/external/webrtc/+/master/sdk/android/api/org/webrtc/VideoFileRenderer.java

You can find another implementation here: https://chromium.googlesource.com/external/webrtc/+/f33970b15e0eeb46548fa602f6d0c1fcfd44dd19/webrtc/api/android/java/src/org/webrtc/VideoFileRenderer.java but it doesn't work with newer versions of webrtc, so I chose the former.

Now all that is left is to create an instance of the new class VideoFileRenderer (the implementation of VideoSink that I am attaching) once the stream is ready and working, and as soon as I want to stop recording the video I just call its method release().
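In other words, the usage boils down to something like the sketch below. The file path and dimensions are placeholders, and eglBase/remoteVideoTrack are assumed to come from the existing call setup:

```java
// Attach the renderer to the remote track once the stream is ready;
// detach it and call release() to flush the buffered frames to disk.
VideoFileRenderer recorder = new VideoFileRenderer(
        "/sdcard/record.y4m", 640, 480, eglBase.getEglBaseContext());
remoteVideoTrack.addSink(recorder);    // frames start flowing into the file

// ... later, to stop recording:
remoteVideoTrack.removeSink(recorder);
recorder.release();
```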

【Comments】:

  • How large are your videos? I don't think any compression is applied.