【Title】: EXC_BAD_ACCESS KERN_INVALID_ADDRESS in my iOS app
【Posted】: 2014-04-16 22:50:23
【Question】:

I am recording video in my iOS app, and sometimes (very unpredictably) it crashes during recording with EXC_BAD_ACCESS KERN_INVALID_ADDRESS. (Edit: the project uses ARC.)

Thread : Crashed: com.myapp.myapp
0  libobjc.A.dylib                0x3b1cc622 objc_msgSend + 1
1  com.myapp.myap                 0x00156faf -[Encoder encodeFrame:isVideo:] (Encoder.m:129)
2  com.myapp.myap                 0x001342ab -[CameraController captureOutput:didOutputSampleBuffer:fromConnection:] (CameraController.m:423)
3  AVFoundation                   0x2f918327 __74-[AVCaptureAudioDataOutput _AVCaptureAudioDataOutput_AudioDataBecameReady]_block_invoke + 282
4  libdispatch.dylib              0x3b6abd53 _dispatch_call_block_and_release + 10
5  libdispatch.dylib              0x3b6b0cbd _dispatch_queue_drain + 488
6  libdispatch.dylib              0x3b6adc6f _dispatch_queue_invoke + 42
7  libdispatch.dylib              0x3b6b15f1 _dispatch_root_queue_drain + 76
8  libdispatch.dylib              0x3b6b18dd _dispatch_worker_thread2 + 56
9  libsystem_pthread.dylib        0x3b7dcc17 _pthread_wqthread + 298

My variable declarations:

@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _session;
    AVCaptureVideoPreviewLayer* _preview;
    dispatch_queue_t _captureQueue;
    AVCaptureConnection* _audioConnection;
    AVCaptureConnection* _videoConnection;

    Encoder* _encoder;
    BOOL _isRecording;
    BOOL _isPaused;
    BOOL _discont;
    int _currentFile;
    CMTime _timeOffset;
    CMTime _lastVideo;
    CMTime _lastAudio;

    int _cx;
    int _cy;
    int _channels;
    Float64 _samplerate;
}
@end

Here is the call to [Encoder encodeFrame:isVideo:] in context (frame 1 in the trace):

- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;

    @synchronized(self)
    {
        if (!self.isCapturing || self.isPaused)
        {
            return;
        }
        if (connection != _videoConnection)
        {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo)
        {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont)
        {
            if (bVideo)
            {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid)
            {
                if (_timeOffset.flags & kCMTimeFlags_Valid)
                {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value)/pts.timescale);

                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0)
                {
                    _timeOffset = offset;
                }
                else
                {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }

        // retain so that we can release either this or the modified one
        CFRetain(sampleBuffer);

        if (_timeOffset.value > 0)
        {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }

        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0)
        {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo)
        {
            _lastVideo = pts;
        }
        else
        {
            _lastAudio = pts;
        }
    }

    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo]; // this is line 129
    CFRelease(sampleBuffer);
}

For the complete code I am using, see http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html - I based my video recording on this control. I know this kind of problem is hard to solve, but where should I start debugging it? Thanks for your help.

【Comments】:

  • Google "How to debug EXC_BAD_ACCESS" and you will find plenty of resources, e.g. raywenderlich.com/10209/my-app-crashed-now-what-part-1
  • Which line is line 129 of Encoder.m? You are probably referencing a deallocated object there. (A common cause of objc_msgSend crashes.)
  • It is: // pass frame to encoder [_encoder encodeFrame:sampleBuffer isVideo:bVideo]; // this is line 129 CFRelease(sampleBuffer);
  • What is the declaration/scope of _encoder? If it is the backing ivar of a property, try changing references to the self.encoder form. Otherwise, I would run Instruments to look at the allocation lifetime of the object. (Perhaps first run the app with zombies enabled to be completely sure which object is the problem.)
  • Sorry the scope was unclear. I have updated my code. Thanks.

Tags: ios objective-c avfoundation


【Answer 1】:

In your method you have the following...

CFRetain(sampleBuffer);

if (_timeOffset.value > 0)
{
    CFRelease(sampleBuffer);
    sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
}

and then at the end you have another

CFRelease(sampleBuffer);

In the case where _timeOffset.value is greater than 0, aren't you over-releasing? Or do you retain somewhere else? Should you retain it again inside the if block?

【Discussion】:

  • No, I don't retain it anywhere else. That may well be causing the problem. I have added CFRetain(sampleBuffer); at the end of the if block. Thanks for the suggestion.
  • @martin let us know whether it fixes it
  • I certainly will; I have made the change to my code and need to test it properly. I will let you know within a day at most. Thanks for your help and attention.