[Question Title]: Passing Sound (wav) file to javascript from objective c
[Posted]: 2013-03-28 19:21:57
[Question Description]:

I am recording a sound file (wav format) in Objective C. I want to pass it back to JavaScript using Objective C's stringByEvaluatingJavaScriptFromString. I am thinking that I will have to convert the wav file into a base64 string to pass it to this function, and then convert the base64 string back to (wav/blob) format in JavaScript to feed it to an audio tag to play it. I don't know how I would do that, and I'm also not sure whether this is the best way to pass the wave file back to JavaScript. Any ideas will be appreciated.
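(For reference, a minimal sketch of the javascript half of the base64 idea described above; the function names and the `player` element are illustrative, and it assumes the webview supports atob, Blob and object URLs:)

    // Decode a base64-encoded wav payload (received via
    // stringByEvaluatingJavaScriptFromString) into a Blob and play it.
    function base64ToBlob(base64, mimeType) {
        var byteString = atob(base64);                 // base64 -> binary string
        var bytes = new Uint8Array(byteString.length);
        for (var i = 0; i < byteString.length; i++) {
            bytes[i] = byteString.charCodeAt(i);
        }
        return new Blob([bytes], { type: mimeType });
    }

    function playWavFromBase64(base64) {
        var blob = base64ToBlob(base64, 'audio/wav');
        var url = (window.URL || window.webkitURL).createObjectURL(blob);
        document.getElementById('player').src = url;   // assumes <audio id="player" controls>
    }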

[Question Comments]:

    Tags: javascript objective-c audio ios6


    [Solution 1]:

    Well, this wasn't as straightforward as I expected. So here is how I was able to do it.

    Step 1: I record the audio in caf format using AVAudioRecorder.

    NSArray *dirPaths;
    NSString *docsDir;
    
    dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    
    docsDir = [dirPaths objectAtIndex:0];
    
    soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];
    
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    
    NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
        [NSNumber numberWithInt:16], AVEncoderBitRateKey,
        [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100], AVSampleRateKey,
        nil];
    
    NSError *error = nil;
    
    audioRecorder = [[AVAudioRecorder alloc]
                     initWithURL:soundFileURL
                     settings:recordSettings error:&error];
    
    if(error)
    {
        NSLog(@"error: %@", [error localizedDescription]);
    } else {
        [audioRecorder prepareToRecord];
    }
    

    After that you just need to call audioRecorder.record to record the audio. It will be recorded in caf format. If you want to see my recordAudio function, here it is.

    - (void)recordAudio
    {
        if (!audioRecorder.recording)
        {
            _playButton.enabled = NO;
            _recordButton.title = @"Stop";
            [audioRecorder record];
            [self animate1:nil finished:nil context:nil];
        }
        else
        {
            [_recordingImage stopAnimating];
            [audioRecorder stop];
            _playButton.enabled = YES;
            _recordButton.title = @"Record";
        }
    }
    

    Step 2: Convert the caf format to wav format. I was able to do this using the following function.

- (BOOL)exportAssetAsWaveFormat:(NSString *)filePath
{
    NSError *error = nil;

    NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                  [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                                  [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                  [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                                  [NSData data], AVChannelLayoutKey, nil];
    
    NSString *audioFilePath = filePath;
    AVURLAsset * URLAsset = [[AVURLAsset alloc]  initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];
    
    if (!URLAsset) return NO;

    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
    if (error) return NO;

    NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
    if (![tracks count]) return NO;
    
    AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
                                                   assetReaderAudioMixOutputWithAudioTracks:tracks
                                                   audioSettings:audioSetting];

    if (![assetReader canAddOutput:audioMixOutput]) return NO;

    [assetReader addOutput:audioMixOutput];

    if (![assetReader startReading]) return NO;

    NSString *title = @"WavConverted";
    NSArray *docDirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docDir = [docDirs objectAtIndex:0];
    NSString *outPath = [[docDir stringByAppendingPathComponent:title]
                         stringByAppendingPathExtension:@"wav"];

    // Remove any previous output file. On the first run there is nothing to
    // remove, so a failed removal must not abort the export.
    if ([[NSFileManager defaultManager] fileExistsAtPath:outPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL];
    }
    
    soundFilePath = outPath;
    
    NSURL *outURL = [NSURL fileURLWithPath:outPath];
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
                                                          fileType:AVFileTypeWAVE
                                                             error:&error];
    if (error) return NO;
    
    AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                              outputSettings:audioSetting];
    assetWriterInput.expectsMediaDataInRealTime = NO;

    if (![assetWriter canAddInput:assetWriterInput]) return NO;

    [assetWriter addInput:assetWriterInput];

    if (![assetWriter startWriting]) return NO;

    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    
    dispatch_queue_t queue = dispatch_queue_create( "assetWriterQueue", NULL );
    
    [assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{

        NSLog(@"start");

        while (1)
        {
            if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {

                CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];

                if (sampleBuffer) {
                    [assetWriterInput appendSampleBuffer:sampleBuffer];
                    CFRelease(sampleBuffer);
                } else {
                    // No more samples to read: close the writer input and stop looping.
                    [assetWriterInput markAsFinished];
                    break;
                }
            }
        }

        [assetWriter finishWriting];

        NSError *err;
        NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options:0 error:&err];
        [self.audioDelegate doneRecording:audioData];
        NSLog(@"soundFilePath=%@", soundFilePath);
        NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
        NSLog(@"size of wav file = %@", [dict objectForKey:NSFileSize]);
    }];

    return YES;
}
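    For example (a hypothetical call site, not from the original post), the export can be kicked off right after [audioRecorder stop], while soundFilePath still points at the caf file from step 1:

        // Hypothetical usage: convert the caf recording to wav.
        if (![self exportAssetAsWaveFormat:soundFilePath]) {
            NSLog(@"wav export failed");
        }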
    

    In this function I call the audioDelegate function doneRecording with audioData in wav format. Here is the code for doneRecording.

    - (void)doneRecording:(NSData *)contents
    {
        myContents = [[NSData dataWithData:contents] retain];
        [self returnResult:alertCallbackId args:@"Recording Done.", nil];
    }
    
    // Call this function when you have results to send back to javascript callbacks
    // callbackId : int comes from handleCall function
    // args : list of objects to send to the javascript callback
    - (void)returnResult:(int)callbackId args:(id)arg, ...
    {
        if (callbackId == 0) return;

        va_list argsList;
        NSMutableArray *resultArray = [[NSMutableArray alloc] init];

        if (arg != nil) {
            [resultArray addObject:arg];
            va_start(argsList, arg);
            while ((arg = va_arg(argsList, id)) != nil)
                [resultArray addObject:arg];
            va_end(argsList);
        }

        NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
        [self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:)
                               withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);", callbackId, resultArrayString]
                            waitUntilDone:NO];
        [resultArray release];
    }
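    To make the bridge concrete: with callbackId 1, the doneRecording call above ends up evaluating a javascript string of roughly this shape inside the UIWebView (illustrative; the exact JSON text comes from SBJSON):

        // What stringByEvaluatingJavaScriptFromString: executes on the main thread:
        NativeBridge.resultForCallback(1, ["Recording Done."]);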
    

    Step 3: Now it is time to tell the javascript inside the UIWebView that we are done recording the audio, so it can start accepting our data in chunks. I am using websockets to transfer the data back to javascript. The data will be transferred in chunks because the server (https://github.com/benlodotcom/BLWebSocketsServer) that I was using was built using libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).

    This is how you start the server in the delegate class.

    - (id)initWithFrame:(CGRect)frame
    {
        if (self = [super initWithFrame:frame]) {

            [self _createServer];
            [self.server start];
            myContents = [NSData data];

            // Set delegate in order for "shouldStartLoadWithRequest" to be called
            self.delegate = self;

            // Set non-opaque in order to make "body{background-color:transparent}" work!
            self.opaque = NO;

            // Instantiate JSON parser library
            json = [SBJSON new];

            // load our html file
            NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
            [self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
        }
        return self;
    }
    -(void) _createServer
    {
        /*Create a simple echo server*/
        self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
        [self.server setHandleRequestBlock:^NSData *(NSData *data) {
    
            NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            NSLog(@"Received Request...%@",convertedString);
    
            if([convertedString isEqualToString:@"start"])
            {
                NSLog(@"myContents size: %d",[myContents length]);
    
                int contentSize = [myContents length];
                int chunkSize = 64*1023;
                chunksCount = ([myContents length]/(64*1023))+1;
    
                NSLog(@"ChunkSize=%d",chunkSize);
                NSLog(@"chunksCount=%d",chunksCount);
    
                chunksArray =  [[NSMutableArray array] retain];
    
                int index = 0;

                for (int i = 1; i <= chunksCount; i++)
                {
                    if (i == chunksCount)
                    {
                        // Final chunk: whatever remains after the full-size chunks.
                        NSRange chunkRange = {index, contentSize - index};
                        NSLog(@"chunk# = %d, chunkRange=(%d,%d)", i, index, contentSize - index);
                        NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                        [chunksArray addObject:dataChunk];
                        break;
                    }
                    else
                    {
                        NSRange chunkRange = {index, chunkSize};
                        NSLog(@"chunk# = %d, chunkRange=(%d,%d)", i, index, chunkSize);
                        NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                        index += chunkSize;
                        [chunksArray addObject:dataChunk];
                    }
                }
    
                return [chunksArray objectAtIndex:0];
    
            }
            else
            {
                int chunkNumber = [convertedString intValue];

                if (chunkNumber > 0 && (chunkNumber + 1) <= chunksCount)
                {
                    return [chunksArray objectAtIndex:chunkNumber];
                }
            }

            NSLog(@"Releasing Array");
            [chunksArray release];
            chunksCount = 0;
            // "Stop" is 4 base64 characters and decodes to exactly 3 bytes;
            // the javascript side treats any 3-byte message as the stop signal.
            return [NSData dataWithBase64EncodedString:@"Stop"];
        }];
    }
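    As a sanity check on the chunk math (illustrative numbers, not from the original post): a 1,048,576-byte recording with chunkSize = 64*1023 = 65,472 bytes gives chunksCount = 1048576/65472 + 1 = 17, i.e. 16 full 65,472-byte chunks followed by a final chunk of 1,048,576 - 16*65,472 = 1,024 bytes.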
    

    The code on the javascript side is

    var socket;
    var chunkCount = 0;
    var soundBlob, soundUrl;
    var smallBlobs = new Array();
    
    function captureMovieCallback(response)
    {
        if(socket)
        {
            try{
                socket.send('start');
            }
            catch(e)
            {
                log('Socket is not valid object');
            }
    
        }
        else
        {
            log('socket is null');
        }
    }
    
    function closeSocket(response)
    {
        socket.close();
    }
    
    
    function connect(){
        try{
            window.WebSocket = window.WebSocket || window.MozWebSocket;
    
            socket = new WebSocket('ws://127.0.0.1:9000',
                                          'echo-protocol');
    
            socket.onopen = function(){
            }
    
        socket.onmessage = function(e){
            if(e.data instanceof ArrayBuffer)
            {
                log('its arrayBuffer');
            }
                else if(e.data instanceof Blob)
                {
                    if(soundBlob)
                       log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size);
    
                // anything other than the 3-byte stop signal is a data chunk
                if(e.data.size != 3)
                    {
                        //log('its Blob of size = '+ e.data.size);
                        smallBlobs[chunkCount]= e.data;
                        chunkCount = chunkCount +1;
                        socket.send(''+chunkCount);
                    }
                    else
                    {
                        //alert('End Received');
                        try{
                        soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
                        var myURL = window.URL || window.webkitURL;
                        soundUrl = myURL.createObjectURL(soundBlob);
                        log('soundURL='+soundUrl);
                        }
                        catch(e)
                        {
                            log('Problem creating blob and url.');
                        }
    
                        try{
                            var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
                            var xhr = new XMLHttpRequest();
                            xhr.open('POST',serverUrl,true);
                            xhr.setRequestHeader("content-type","multipart/form-data");
                            xhr.send(soundBlob);
                        }
                        catch(e)
                        {
                            log('error uploading blob file');
                        }
    
                        socket.close();
                    }
    
                    //alert(JSON.stringify(msg, null, 4));
                }
                else
                {
                    log('dont know');
                }
            }
    
            socket.onclose = function(){
                //message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
                log('final blob size:'+soundBlob.size);
            }
    
        } catch(exception){
           log('<p>Error: '+exception);
        }
    }
    
    function log(msg) {
        NativeBridge.log(msg);
    }
    function stopCapture() {
        NativeBridge.call("stopMovie", null,null);
    }
    
    function startCapture() {
        NativeBridge.call("captureMovie",null,captureMovieCallback);
    }
    

    NativeBridge.js

    var NativeBridge = {
      callbacksCount : 1,
      callbacks : {},

      // Automatically called by native layer when a result is available
      resultForCallback : function resultForCallback(callbackId, resultArray) {
        try {
          var callback = NativeBridge.callbacks[callbackId];
          if (!callback) return;
          console.log("calling callback for "+callbackId);
          callback.apply(null,resultArray);
        } catch(e) {alert(e)}
      },

      // Use this in javascript to request native objective-c code
      // functionName : string (I think the name is explicit :p)
      // args : array of arguments
      // callback : function with n-arguments that is going to be called when the native code returns
      call : function call(functionName, args, callback) {

        var hasCallback = callback && typeof callback == "function";
        var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;

        if (hasCallback)
          NativeBridge.callbacks[callbackId] = callback;

        // The native side intercepts this iframe load (see shouldStartLoadWithRequest)
        // and parses the function name, callback id and arguments out of the url.
        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId + ":" + encodeURIComponent(JSON.stringify(args)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;
      },

      log : function log(message) {
        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "ios-log:" + encodeURIComponent(JSON.stringify("#iOS#" + message)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;
      }
    };
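    The answer never shows the UIWebView delegate method that receives those js-frame: urls. A minimal sketch of that missing piece might look like this (the handleCall name comes from the comment in returnResult above; the method signature and parsing are assumptions, not the original author's code):

        // Hypothetical delegate method: intercept js-frame: urls created by NativeBridge.call.
        - (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request
                                                         navigationType:(UIWebViewNavigationType)navigationType
        {
            NSString *url = [[request URL] absoluteString];
            if ([url hasPrefix:@"js-frame:"]) {
                // format: js-frame:<functionName>:<callbackId>:<encoded JSON args>
                // (encodeURIComponent percent-escapes colons inside the JSON)
                NSArray *components = [url componentsSeparatedByString:@":"];
                NSString *functionName = [components objectAtIndex:1];
                int callbackId = [[components objectAtIndex:2] intValue];
                NSString *argsJSON = [[components objectAtIndex:3]
                    stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
                [self handleCall:functionName callbackId:callbackId args:argsJSON]; // hypothetical helper
                return NO;   // don't actually load the iframe url
            }
            return YES;
        }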
    
    1. We call connect() on the javascript side on body load of the html page.

    2. As soon as we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.

    3. The server on the objective c side splits the wav audio data into small chunks of chunkSize = 64*1023 bytes and stores them in an array.

    4. It sends the first chunk back to the javascript side.

    5. javascript accepts this chunk and sends back the number of the next chunk it needs from the server.

    6. The server sends the chunk indicated by this number. This process is repeated until we send the last chunk to javascript.

    7. At the end we send a stop message back to the javascript side indicating that we are done. It is 3 bytes in size (which is used as the criterion to break this loop).

    8. Each chunk is stored as a small blob in an array. Now we create a bigger blob from these small blobs using the following line

      soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });

      This blob is uploaded to the server, which writes it out as a wav file. We can pass the url of this wav file as the src of an audio tag to replay it on the javascript side (see the playback sketch after this list).

    9. We close the websocket connection once we are done sending the blob to the server.

      Hope this is clear enough to understand.
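    A minimal sketch of that final playback step (the element id is an assumption; soundUrl is the object url created above, or the url of the wav file returned by the upload server):

        // Hypothetical playback: point an <audio> element at the blob we assembled.
        var audio = document.getElementById('player');   // assumes <audio id="player" controls>
        audio.src = soundUrl;
        audio.play();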

    [Comments]:

      [Solution 2]:

      If you just want to play the sound then you'd be much better off using one of the native audio playback systems in iOS rather than an HTML audio tag.
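      For what it's worth, a minimal native playback sketch using AVAudioPlayer (assuming the soundFilePath from the answer above; in real code the player should be kept in an ivar so it isn't deallocated mid-playback):

          NSError *error = nil;
          NSURL *wavURL = [NSURL fileURLWithPath:soundFilePath];
          AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:wavURL
                                                                         error:&error];
          if (error) {
              NSLog(@"player error: %@", [error localizedDescription]);
          } else {
              [player prepareToPlay];   // preload the audio buffers
              [player play];
          }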

      [Comments]:

      • I am part of a development project where we are building a system that will be used across different platforms like Android, PC, Mac and iOS. I want to return the audio ('caf') file from objective c (on the iPad) to javascript so that the code for uploading the audio to a server can be shared across the different platforms. I want to minimize the involvement of the native platform. I know platforms like Apache Cordova do the same kind of thing, but I don't know how they do it.
      • Cordova doesn't pass audio data between JavaScript and Native the way you are asking; it just uses the native platform audio playback libraries to play an audio file at the file path requested via JavaScript.
      • Ben, I have posted the solution above. The performance of this solution is not bad at all.