【Question Title】: android - How to mux audio file and video file?
【Posted】: 2015-10-12 20:32:30
【Question】:

I have a 3gp file recorded from the microphone and an mp4 video file. I want to mux the audio file and the video file into a single mp4 file and save it. I searched a lot, but found nothing helpful on using Android's MediaMuxer api: MediaMuxer api

Update: this is my method for muxing the two files, and I get an exception. The cause is that the destination mp4 file does not have any audio track! Can someone help me add the audio and video tracks to the muxer?

Exception:

java.lang.IllegalStateException: Failed to stop the muxer

My code:

private void cloneMediaUsingMuxer( String dstMediaPath) throws IOException {
    // Set up MediaExtractor to read from the source.
    MediaExtractor soundExtractor = new MediaExtractor();
    soundExtractor.setDataSource(audioFilePath);
    MediaExtractor videoExtractor = new MediaExtractor();
    AssetFileDescriptor afd2 = getAssets().openFd("Produce.MP4");
    videoExtractor.setDataSource(afd2.getFileDescriptor() , afd2.getStartOffset(),afd2.getLength());


    //PATH
    //extractor.setDataSource();
    int trackCount = soundExtractor.getTrackCount();
    int trackCount2 = soundExtractor.getTrackCount();

    //assertEquals("wrong number of tracks", expectedTrackCount, trackCount);
    // Set up MediaMuxer for the destination.
    MediaMuxer muxer;
    muxer = new MediaMuxer(dstMediaPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    // Set up the tracks.
    HashMap<Integer, Integer> indexMap = new HashMap<Integer, Integer>(trackCount);
    for (int i = 0; i < trackCount; i++) {
        soundExtractor.selectTrack(i);
        MediaFormat SoundFormat = soundExtractor.getTrackFormat(i);
        int dstIndex = muxer.addTrack(SoundFormat);
        indexMap.put(i, dstIndex);
    }

    HashMap<Integer, Integer> indexMap2 = new HashMap<Integer, Integer>(trackCount2);
    for (int i = 0; i < trackCount2; i++) {
        videoExtractor.selectTrack(i);
        MediaFormat videoFormat = videoExtractor.getTrackFormat(i);
        int dstIndex2 = muxer.addTrack(videoFormat);
        indexMap.put(i, dstIndex2);
    }


    // Copy the samples from MediaExtractor to MediaMuxer.
    boolean sawEOS = false;
    int bufferSize = MAX_SAMPLE_SIZE;
    int frameCount = 0;
    int offset = 100;
    ByteBuffer dstBuf = ByteBuffer.allocate(bufferSize);
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    MediaCodec.BufferInfo bufferInfo2 = new MediaCodec.BufferInfo();

    muxer.start();
    while (!sawEOS) {
        bufferInfo.offset = offset;
        bufferInfo.size = soundExtractor.readSampleData(dstBuf, offset);
        bufferInfo2.offset = offset;
        bufferInfo2.size = videoExtractor.readSampleData(dstBuf, offset);

        if (bufferInfo.size < 0) {
            sawEOS = true;
            bufferInfo.size = 0;
            bufferInfo2.size = 0;
        }else if(bufferInfo2.size < 0){
            sawEOS = true;
            bufferInfo.size = 0;
            bufferInfo2.size = 0;
        }
        else {
            bufferInfo.presentationTimeUs = soundExtractor.getSampleTime();
            bufferInfo2.presentationTimeUs = videoExtractor.getSampleTime();
            //bufferInfo.flags = extractor.getSampleFlags();
            int trackIndex = soundExtractor.getSampleTrackIndex();
            int trackIndex2 = videoExtractor.getSampleTrackIndex();
            muxer.writeSampleData(indexMap.get(trackIndex), dstBuf,
                    bufferInfo);

            soundExtractor.advance();
            videoExtractor.advance();
            frameCount++;

        }
    }

    Toast.makeText(getApplicationContext(),"f:"+frameCount,Toast.LENGTH_SHORT).show();

    muxer.stop();
    muxer.release();

}

Update 2: Problem solved! Check my answer to my question.

Thanks for your help.

【Discussion】:

  • Are you willing to use the NDK, or would you prefer a pure Java solution?
  • Any way of solving the problem would be perfect. I think pure Java with MediaMuxer is better.
  • Can you provide more details of the exception? There should be an exception code from MediaMuxer in logcat while you review your detailed setup.

Tags: android video mediamuxer


【Solution 1】:

I had some problems with the tracks of my audio and video files. Once those were gone, everything in my code worked fine. Now you can use it to merge an audio file and a video file together.

Code:

private void muxing() {

String outputFile = "";

try {

    File file = new File(Environment.getExternalStorageDirectory() + File.separator + "final2.mp4");
    file.createNewFile();
    outputFile = file.getAbsolutePath();

    MediaExtractor videoExtractor = new MediaExtractor();
    AssetFileDescriptor afdd = getAssets().openFd("Produce.MP4");
    videoExtractor.setDataSource(afdd.getFileDescriptor() ,afdd.getStartOffset(),afdd.getLength());

    MediaExtractor audioExtractor = new MediaExtractor();
    audioExtractor.setDataSource(audioFilePath);

    Log.d(TAG, "Video Extractor Track Count " + videoExtractor.getTrackCount() );
    Log.d(TAG, "Audio Extractor Track Count " + audioExtractor.getTrackCount() );

    MediaMuxer muxer = new MediaMuxer(outputFile, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

    videoExtractor.selectTrack(0);
    MediaFormat videoFormat = videoExtractor.getTrackFormat(0);
    int videoTrack = muxer.addTrack(videoFormat);

    audioExtractor.selectTrack(0);
    MediaFormat audioFormat = audioExtractor.getTrackFormat(0);
    int audioTrack = muxer.addTrack(audioFormat);

    Log.d(TAG, "Video Format " + videoFormat.toString() );
    Log.d(TAG, "Audio Format " + audioFormat.toString() );

    boolean sawEOS = false;
    int frameCount = 0;
    int offset = 100;
    int sampleSize = 256 * 1024;
    ByteBuffer videoBuf = ByteBuffer.allocate(sampleSize);
    ByteBuffer audioBuf = ByteBuffer.allocate(sampleSize);
    MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
    MediaCodec.BufferInfo audioBufferInfo = new MediaCodec.BufferInfo();


    videoExtractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
    audioExtractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);

    muxer.start();

    while (!sawEOS)
    {
        videoBufferInfo.offset = offset;
        videoBufferInfo.size = videoExtractor.readSampleData(videoBuf, offset);


        // audioBufferInfo is not filled in this loop; only the video read can hit EOS here
        if (videoBufferInfo.size < 0)
        {
            Log.d(TAG, "saw input EOS.");
            sawEOS = true;
            videoBufferInfo.size = 0;

        }
        else
        {
            videoBufferInfo.presentationTimeUs = videoExtractor.getSampleTime();
            videoBufferInfo.flags = videoExtractor.getSampleFlags();
            muxer.writeSampleData(videoTrack, videoBuf, videoBufferInfo);
            videoExtractor.advance();


            frameCount++;
            Log.d(TAG, "Frame (" + frameCount + ") Video PresentationTimeUs:" + videoBufferInfo.presentationTimeUs +" Flags:" + videoBufferInfo.flags +" Size(KB) " + videoBufferInfo.size / 1024);
            Log.d(TAG, "Frame (" + frameCount + ") Audio PresentationTimeUs:" + audioBufferInfo.presentationTimeUs +" Flags:" + audioBufferInfo.flags +" Size(KB) " + audioBufferInfo.size / 1024);

        }
    }

    Toast.makeText(getApplicationContext() , "frame:" + frameCount , Toast.LENGTH_SHORT).show();



    boolean sawEOS2 = false;
    int frameCount2 =0;
    while (!sawEOS2)
    {
        frameCount2++;

        audioBufferInfo.offset = offset;
        audioBufferInfo.size = audioExtractor.readSampleData(audioBuf, offset);

        // the video loop has already finished; only the audio read can hit EOS here
        if (audioBufferInfo.size < 0)
        {
            Log.d(TAG, "saw input EOS.");
            sawEOS2 = true;
            audioBufferInfo.size = 0;
        }
        else
        {
            audioBufferInfo.presentationTimeUs = audioExtractor.getSampleTime();
            audioBufferInfo.flags = audioExtractor.getSampleFlags();
            muxer.writeSampleData(audioTrack, audioBuf, audioBufferInfo);
            audioExtractor.advance();


            Log.d(TAG, "Frame (" + frameCount + ") Video PresentationTimeUs:" + videoBufferInfo.presentationTimeUs +" Flags:" + videoBufferInfo.flags +" Size(KB) " + videoBufferInfo.size / 1024);
            Log.d(TAG, "Frame (" + frameCount + ") Audio PresentationTimeUs:" + audioBufferInfo.presentationTimeUs +" Flags:" + audioBufferInfo.flags +" Size(KB) " + audioBufferInfo.size / 1024);

        }
    }

    Toast.makeText(getApplicationContext() , "frame:" + frameCount2 , Toast.LENGTH_SHORT).show();

    muxer.stop();
    muxer.release();


} catch (IOException e) {
    Log.d(TAG, "Mixer Error 1 " + e.getMessage());
} catch (Exception e) {
    Log.d(TAG, "Mixer Error 2 " + e.getMessage());
}

}

Thanks to this sample code: MediaMuxer Sample Codes - really perfect

【Discussion】:

  • I copied your code, but it doesn't work. Error: Failed to add the track to the muxer. How can I solve it? Thanks.
  • Nice example. To add something extra: I ran into problems trying to mux in an audio track with an unsupported format. Basically, if you are making an mp4 video from encoded video and audio tracks, these must come from a specific range of MIME types (formats). For audio, these must be MIMETYPE_AUDIO_AMR_NB, MIMETYPE_AUDIO_AMR_WB, or MIMETYPE_AUDIO_AAC. Otherwise you will hit an "Unknown mime type 'audio/whatever'" error.
  • Hello @mohamad ali gharat, how can I add video and audio from the sdcard without using the assets folder?
  • @AnandDiamond Use this library to pick files: github.com/bartwell/ExFilePicker. It gives you the path of the selected file; take that path and pass it to videoExtractor.setDataSource(), which accepts a path as a parameter. I'm 90% sure this will work.
  • But in my case, first I capture a video; second, while viewing this video, I tap mp3 and pick an mp3 from the sd card via a bottom sheet, and then merge them. Any solution?
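The MIME-type constraint mentioned in the comments above can be checked up front, before addTrack() throws. The following is a minimal sketch in plain Java (no Android dependency); the string literals are the documented values of MediaFormat.MIMETYPE_AUDIO_AAC, MIMETYPE_AUDIO_AMR_NB, and MIMETYPE_AUDIO_AMR_WB, written out so the helper can run anywhere:

```java
import java.util.Arrays;
import java.util.List;

public class MuxerMimeCheck {
    // Documented values of MediaFormat.MIMETYPE_AUDIO_AAC,
    // MIMETYPE_AUDIO_AMR_NB and MIMETYPE_AUDIO_AMR_WB, inlined
    // so this helper has no Android dependency.
    private static final List<String> MP4_AUDIO_MIMES = Arrays.asList(
            "audio/mp4a-latm",   // AAC
            "audio/3gpp",        // AMR-NB
            "audio/amr-wb");     // AMR-WB

    /** Returns true if an audio track with this MIME type can go into an MPEG-4 muxer. */
    public static boolean isMuxableAudio(String mime) {
        return mime != null && MP4_AUDIO_MIMES.contains(mime);
    }
}
```

Call it with the value of `format.getString(MediaFormat.KEY_MIME)` before `muxer.addTrack(format)`, and fail with a clear message instead of an "Unknown mime type" error from the muxer.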
【Solution 2】:

Thanks to mohamad ali gharat for his answer, which helped me a lot. But I made a few changes to the code to get it to work. First: I changed it to

videoExtractor.setDataSource(Environment.getExternalStorageDirectory().getPath() + "/Produce.MP4");

to load the video from the SDCard. Second: I got an error on

videoBufferInfo.flags = videoExtractor.getSampleFlags();

so I changed it to

videoBufferInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME;

to make it work, as described in this link: Android MediaMuxer failed to stop
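One likely reason getSampleFlags() causes trouble here is that MediaExtractor's SAMPLE_FLAG_* constants are not the same flag space as the MediaCodec.BUFFER_FLAG_* values that BufferInfo.flags expects: the sync bit happens to be 1 in both, but the partial-frame bits differ. A small translation helper, sketched in plain Java with the documented constant values inlined (an assumption to keep it Android-free; check them against your SDK):

```java
public class SampleFlags {
    // Documented constant values, inlined so this compiles without Android:
    static final int SAMPLE_FLAG_SYNC = 1;          // MediaExtractor.SAMPLE_FLAG_SYNC
    static final int SAMPLE_FLAG_PARTIAL_FRAME = 4; // MediaExtractor.SAMPLE_FLAG_PARTIAL_FRAME
    static final int BUFFER_FLAG_KEY_FRAME = 1;     // MediaCodec.BUFFER_FLAG_KEY_FRAME
    static final int BUFFER_FLAG_PARTIAL_FRAME = 8; // MediaCodec.BUFFER_FLAG_PARTIAL_FRAME

    /** Translate MediaExtractor sample flags into MediaCodec buffer flags. */
    public static int toBufferFlags(int sampleFlags) {
        int out = 0;
        if ((sampleFlags & SAMPLE_FLAG_SYNC) != 0) {
            out |= BUFFER_FLAG_KEY_FRAME;           // same bit value, different constant
        }
        if ((sampleFlags & SAMPLE_FLAG_PARTIAL_FRAME) != 0) {
            out |= BUFFER_FLAG_PARTIAL_FRAME;       // 4 on the extractor side, 8 on the codec side
        }
        return out;
    }
}
```

With this, `bufferInfo.flags = SampleFlags.toBufferFlags(extractor.getSampleFlags())` preserves the keyframe information per sample, instead of marking every sample as a sync frame the way the hard-coded BUFFER_FLAG_SYNC_FRAME does.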

【Discussion】:

【Solution 3】:

You need ffmpeg for this to work. Here is a link that helps with that:

FFmpeg on Android

ffmpeg requires the NDK on Android.

Once you have that working, you can use ffmpeg to mux the audio and video together. Here is a link to a question that uses 2 video files (the answer should be similar):

FFMPEG mux video and audio (from another video) - mapping issue
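For the ffmpeg route, the mux itself is a stream copy with explicit -map options. As a sketch, here is a plain-Java helper that assembles the argument list (the file names are placeholders): -c copy avoids re-encoding, -map 0:v:0 takes the video stream from the first input, -map 1:a:0 the audio stream from the second, and -shortest stops at the shorter of the two:

```java
import java.util.Arrays;
import java.util.List;

public class FfmpegMuxCommand {
    /** Build an ffmpeg argument list that muxes video from one file with audio from another. */
    public static List<String> build(String videoIn, String audioIn, String out) {
        return Arrays.asList(
                "ffmpeg",
                "-i", videoIn,    // input 0: source of the video stream
                "-i", audioIn,    // input 1: source of the audio stream
                "-c", "copy",     // stream copy, no re-encoding
                "-map", "0:v:0",  // first video stream of input 0
                "-map", "1:a:0",  // first audio stream of input 1
                "-shortest",      // stop at the shorter of the two streams
                out);
    }
}
```

The resulting list can be fed to ProcessBuilder once an ffmpeg binary is available on the device.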

【Discussion】:

  • Thanks. I found source code that is closer and easier than ffmpeg. But I don't have any experience with the MediaMuxer class and its behavior either; could you help me with an easy-to-use answer?
【Solution 4】:
    private const val MAX_SAMPLE_SIZE = 256 * 1024
    
    fun muxAudioVideo(destination: File, audioSource: File, videoSource: File): Boolean {
    
        var result : Boolean
        var muxer : MediaMuxer? = null
    
        try {
    
            // Set up MediaMuxer for the destination.
    
            muxer = MediaMuxer(destination.path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
    
            // Copy the samples from MediaExtractor to MediaMuxer.
    
            var videoFormat : MediaFormat? = null
            var audioFormat : MediaFormat? = null
        
            var muxerStarted : Boolean = false
    
            var videoTrackIndex = -1
            var audioTrackIndex = -1
    
            // extractorVideo
    
            var extractorVideo = MediaExtractor()
    
            extractorVideo.setDataSource(videoSource.path)
    
            val tracks = extractorVideo.trackCount
    
            for (i in 0 until tracks) {
    
                val mf = extractorVideo.getTrackFormat(i)
    
                val mime = mf.getString(MediaFormat.KEY_MIME)
        
                if (mime!!.startsWith("video/")) {
    
                    extractorVideo.selectTrack(i)
                    videoFormat = extractorVideo.getTrackFormat(i)
    
                    break
                }
            }
    
    
            // extractorAudio
    
            var extractorAudio = MediaExtractor()
    
            extractorAudio.setDataSource(audioSource.path)
    
            // iterate the AUDIO extractor's tracks, not the video extractor's
            for (i in 0 until extractorAudio.trackCount) {
    
                val mf = extractorAudio.getTrackFormat(i)
    
                val mime = mf.getString(MediaFormat.KEY_MIME)
    
                if (mime!!.startsWith("audio/")) {
    
                    extractorAudio.selectTrack(i)
                    audioFormat = extractorAudio.getTrackFormat(i)
    
                    break
    
                }
    
            }
    
    
            // videoTrackIndex
    
            if (videoTrackIndex == -1) {
    
                videoTrackIndex = muxer.addTrack(videoFormat!!)
    
            }
    
            // audioTrackIndex
    
            if (audioTrackIndex == -1) {
    
                audioTrackIndex = muxer.addTrack(audioFormat!!)
    
            }
    
            var sawEOS = false
            var sawAudioEOS = false
            val bufferSize = MAX_SAMPLE_SIZE
            val dstBuf = ByteBuffer.allocate(bufferSize)
            val offset = 0
            val bufferInfo = MediaCodec.BufferInfo()
    
            // start muxer
        
            if (!muxerStarted) {
    
                muxer.start()
    
                muxerStarted = true
    
            }
    
            // write video
        
            while (!sawEOS) {
    
                bufferInfo.offset = offset
                bufferInfo.size = extractorVideo.readSampleData(dstBuf, offset)
    
                if (bufferInfo.size < 0) {
        
                    sawEOS = true
                    bufferInfo.size = 0
    
                } else {
    
                    bufferInfo.presentationTimeUs = extractorVideo.sampleTime
                    bufferInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME
                    muxer.writeSampleData(videoTrackIndex, dstBuf, bufferInfo)
                    extractorVideo.advance()
    
                }
    
            }
    
            // write audio
        
            val audioBuf = ByteBuffer.allocate(bufferSize)
    
            while (!sawAudioEOS) {
    
                bufferInfo.offset = offset
                bufferInfo.size = extractorAudio.readSampleData(audioBuf, offset)
    
                if (bufferInfo.size < 0) {
        
                    sawAudioEOS = true
                    bufferInfo.size = 0
    
                } else {
    
                    bufferInfo.presentationTimeUs = extractorAudio.sampleTime
                    bufferInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME
                    muxer.writeSampleData(audioTrackIndex, audioBuf, bufferInfo)
                    extractorAudio.advance()
    
                }
    
            }
    
            extractorVideo.release()
            extractorAudio.release()
    
            result = true
    
        } catch (e: IOException) {
    
            result = false
    
        } finally {
    
            // stop() throws IllegalStateException if start() was never called
            // or no samples were written, so guard it before releasing.
            try {
                muxer?.stop()
            } catch (e: IllegalStateException) {
                // muxer was never started; nothing to finalize
            }
            muxer?.release()
    
        }
    
        return result
    
    }
    

【Discussion】:
