I'm trying to stream microphone audio between devices on iOS, so that someone can speak into their iPhone and have it play on a speaker somewhere else, with the connection made over sockets. I found a similar question here, but it covers the networking part in Objective-C, while I'm handling the networking with Python and hoping to record with Objective-C and AVFoundation. I've got the networking part sorted out; I'm just not sure how to do the recording in Objective-C. Most examples I've found do something along the lines of [recorder start];, but how am I supposed to use that to send audio through a socket object? I was hoping for something like
var = mic.read(); socket.send(var, addr)
or just any object I can send through the socket that represents audio data and can be handled accordingly on the other side. Maybe even something like AudioObjectOfSomeSort *var = [recorder start];, and then socket.send(var, addr) in Python?
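For reference, here's roughly what I imagine the iOS side could look like, pieced together from the AVAudioEngine docs. This is completely untested, and sendBytesOverSocket is a made-up placeholder for whatever actually writes to my socket:

```objectivec
// Untested sketch: tap the mic input with AVAudioEngine so I get raw
// sample buffers in a callback instead of a file on disk.
// sendBytesOverSocket() is a placeholder I invented, not a real API.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioInputNode *input = engine.inputNode;
AVAudioFormat *format = [input outputFormatForBus:0];

[input installTapOnBus:0
            bufferSize:4096
                format:format
                 block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    // Raw float samples for channel 0; this is what I'd push into the socket
    NSUInteger byteCount = buffer.frameLength * sizeof(float);
    sendBytesOverSocket(buffer.floatChannelData[0], byteCount);
}];

NSError *error = nil;
[engine prepare];
[engine startAndReturnError:&error];
```

Is that roughly the right direction for getting a continuous stream of audio bytes?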
I found this code online:
- (IBAction)startRecord:(id)sender {
    // Record to myRecording.m4a in the app's Documents directory
    NSArray *path = [NSArray arrayWithObjects:[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject], @"myRecording.m4a", nil];
    NSURL *url = [NSURL fileURLWithPathComponents:path];

    // AVAudioSession is a singleton; use sharedInstance, not alloc/init
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    [session setActive:YES error:nil];

    // AAC, 44.1 kHz, mono
    NSMutableDictionary *setting = [[NSMutableDictionary alloc] init];
    [setting setValue:[NSNumber numberWithInteger:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
    [setting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [setting setValue:[NSNumber numberWithInteger:1] forKey:AVNumberOfChannelsKey];

    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:setting error:nil];
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];
    [recorder record];
}

- (IBAction)stopRecord:(id)sender {
    [recorder stop];
    [[AVAudioSession sharedInstance] setActive:NO error:nil];
}

- (IBAction)play:(id)sender {
    if (!recorder.recording) {
        player = [[AVPlayer alloc] initWithURL:recorder.url];
        [player play];
    }
}
This code just records and saves to a file, and I need to stream audio across a socket continuously. I did find this and this, but I don't understand Swift all that well and don't have the know-how to translate it into Objective-C; I'm a Python programmer. Also, could the data sent through the socket be decoded and played by Java on an Android device in some way? I haven't found anything else so far. Any help would be greatly appreciated.
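In case it helps clarify what I'm after, this is a sketch of what I'm picturing on the Python side: raw PCM bytes arrive over a connected socket and get saved as a playable WAV. The format details (16-bit mono at 44.1 kHz) are my own assumption and would have to match whatever the iOS side actually sends:

```python
import socket
import wave

# Assumed audio format; the iOS sender must produce matching raw PCM
SAMPLE_RATE = 44100
CHANNELS = 1
SAMPLE_WIDTH = 2   # bytes per sample, i.e. 16-bit PCM

def receive_to_wav(conn, out_path):
    """Read raw PCM bytes from a connected socket until the sender closes,
    and save them as a playable WAV file. Returns the byte count received."""
    total = 0
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(CHANNELS)
        wav.setsampwidth(SAMPLE_WIDTH)
        wav.setframerate(SAMPLE_RATE)
        while True:
            chunk = conn.recv(4096)
            if not chunk:          # empty read means the sender hung up
                break
            wav.writeframes(chunk)
            total += len(chunk)
    return total
```

In a real run this would sit behind accept() on a listening TCP socket, and for live playback I'd presumably feed the same chunks to an audio library instead of a file, though I haven't tried that part yet.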