Detecting a blow into the iPhone mic with Cocos2D - audio

I'm using Cocos2D and the system particles (and I hope I've written this correctly in English).
I'm having trouble recognizing a particular kind of sound with the iPhone mic.
My application has several sections; in one of them I use the mic to detect whether someone is blowing air into it. This part works fine at first, but if you go to another section of the app that plays sound and later return to this area and try to blow, it no longer works.
I debugged the code, and levelTimerCallback keeps firing even when I'm not in this scene. I don't really know what's happening. I've stopped all sounds using
[[SimpleAudioEngine sharedEngine] stopBackgroundMusic];
Does anyone know what I'm doing wrong? By the way, this works perfectly in the simulator, but not on the iPhone.
The recorder is set up in the onEnter method:
-(void) onEnter {
    [super onEnter];
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
        NSLog(@"I'm in the recorder");
    } else {
        NSLog(@"recorder error");
    }
}
This is the levelTimerCallback method where the sound is checked:
- (void)levelTimerCallback:(NSTimer *)timer {
    [recorder updateMeters];
    const double ALPHA = 0.05;
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
    if (lowPassResults > 0.55) {
        NSLog(@"Mic blow detected");
        self.emitter = [[CCParticleExplosion alloc] initWithTotalParticles:5];
        [self addChild: emitter z:1];
        emitter.texture = [[CCTextureCache sharedTextureCache] addImage: @"hoja.png"];
        emitter.autoRemoveOnFinish = YES;
    }
    NSLog(@"Inside levelTimerCallback");
}

Same problem here.
There are two ways to solve it:
1: Use AVAudioPlayer to play music or sound effects.
2: Add #import "CDAudioManager.h" to AppDelegate.m and add
[CDAudioManager initAsynchronously:kAMM_PlayAndRecord];
to
- (void) applicationDidFinishLaunching:(UIApplication*)application
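Put together, option 2 would look roughly like this in AppDelegate.m (a sketch; the rest of the cocos2d launch code is omitted):
#import "CDAudioManager.h"

- (void) applicationDidFinishLaunching:(UIApplication*)application {
    // Initialize CocosDenshion's audio manager in play-and-record mode
    // so sound playback and the microphone can coexist.
    [CDAudioManager initAsynchronously:kAMM_PlayAndRecord];
    // ... the usual cocos2d setup ...
}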

Finally, after reading the Audio Session cookbook and various related posts, I found the solution. Just put this code block in the init method:
UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
AudioSessionSetActive(true);
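For what it's worth, the C-based AudioSession API used above has since been deprecated; a roughly equivalent setup with AVAudioSession (my translation, not part of the original answer) would be:
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
// Play-and-record category, with output routed to the speaker
// instead of the receiver.
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
               error:&error];
[session setActive:YES error:&error];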

Related

How can I play a custom sound in WatchOS 3 that will playback on the watch speakers

I've read that we can now play custom sounds on the Apple Watch in watchOS 3.
According to the announcement from Apple there apparently is a way, but I don't have an example to test it out: "3D spatial audio implemented using SCNAudioSource or SCNAudioPlayer. Instead, use playAudioSource:waitForCompletion: or the WatchKit sound or haptic APIs." Found here: https://developer.apple.com/library/prerelease/content/releasenotes/General/WhatsNewInwatchOS/Articles/watchOS3.html
Can someone post a simple example of this? I'm not using SceneKit in my app, since I don't need it, but if that's the only way to play a custom sound then I'd like to know the minimum code required to accomplish it. Preferably in Objective-C, but I'll take it in whatever shape. I'm OK using SpriteKit too, if that's easier.
Here's what I have so far, but it doesn't work:
SCNNode *audioNode = [[SCNNode alloc] init];
SCNAudioSource *audioSource = [SCNAudioSource audioSourceNamed:@"mysound.mp3"];
SCNAudioPlayer *audioPlayer = [SCNAudioPlayer audioPlayerWithSource:audioSource];
[audioNode addAudioPlayer:audioPlayer];
SCNAction *play = [SCNAction playAudioSource:audioSource waitForCompletion:YES];
[audioNode runAction:play];
I can confirm that @ApperleyA's solution really works!
Here is the Swift version:
var _audioPlayer : AVAudioPlayerNode!
var _audioEngine : AVAudioEngine!

func playAudio() {
    if _audioPlayer == nil {
        _audioPlayer = AVAudioPlayerNode()
        _audioEngine = AVAudioEngine()
        _audioEngine.attach(_audioPlayer)
        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)
        _audioEngine.connect(_audioPlayer, to: _audioEngine.mainMixerNode, format: stereoFormat)
        do {
            if !_audioEngine.isRunning {
                try _audioEngine.start()
            }
        } catch {}
    }
    if let path = Bundle.main.path(forResource: "test", ofType: "mp3") {
        let fileUrl = URL(fileURLWithPath: path)
        do {
            let asset = try AVAudioFile(forReading: fileUrl)
            _audioPlayer.scheduleFile(asset, at: nil, completionHandler: nil)
            _audioPlayer.play()
        } catch {
            print("asset error")
        }
    }
}
This is Objective-C but can be translated into Swift.
I ended up using AVAudioEngine and AVAudioPlayerNode to play audio on the Apple Watch.
The gist of how to do this is as follows:
I call the following inside the init method of my AudioPlayer (an NSObject subclass that encapsulates the functionality):
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];
AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];
if (!_audioEngine.isRunning) {
    NSError* error;
    [_audioEngine startAndReturnError:&error];
}
I have a cache set up so I don't recreate the AVAudioFile assets every time I want to play a sound, but you don't need one.
So next, create an AVAudioFile object:
NSError *error;
NSBundle* appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:key ofType:@"aifc"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];
Then play that file:
[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];
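For reference, the asset cache mentioned above can be as simple as a dictionary keyed by resource name. A hypothetical sketch (audioFileCache, an NSMutableDictionary property, is my invention, not part of the original answer):
- (AVAudioFile *)audioFileForKey:(NSString *)key {
    // Look up a previously loaded file; create and store it on first use.
    AVAudioFile *asset = self.audioFileCache[key];
    if (!asset) {
        NSError *error;
        NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:key ofType:@"aifc"]];
        asset = [[AVAudioFile alloc] initForReading:url error:&error];
        if (asset) {
            self.audioFileCache[key] = asset;
        }
    }
    return asset;
}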
UPDATE: If the app goes to sleep or is put into the background, there is a chance the audio will stop playing or fade out. Activating an audio session prevents this.
NSError *error;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
    NSLog(@"AVAudioSession setCategory ERROR: %@", error.localizedDescription);
}
[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
    NSLog(@"AVAudioSession setActive ERROR: %@", error.localizedDescription);
}
I didn't go over handling any errors, but this should work. Don't forget to #import <AVFoundation/AVFoundation.h> at the top of your implementation file.
This worked for me in the simulator:
let soundPath = Bundle.main.path(forResource: "cheerMP3", ofType: "mp3")
let soundPathURL = URL(fileURLWithPath: soundPath!)
let audioFile = WKAudioFileAsset(url: soundPathURL)
let audioItem = WKAudioFilePlayerItem(asset: audioFile)
let audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
if audioPlayer.status == .readyToPlay {
    audioPlayer.play()
} else {
    print("Not ready!!")
}
but only if I had a breakpoint both at audioPlayer.play() and after the last }.
dunqan, what did you put at the top of the file, i.e. the import statements? I wasn't able to include
import AVFoundation
without an error using Xcode 8.2.1.

GPUImageVideoCamera save video on ios 7

I'm trying to achieve square video recording (like 300*300), so I chose GPUImage, but it's not working on iOS 7: it gives errors like [UIView nextAvailableTextureIndex]: unrecognized selector sent to instance, and the errors start when building even the sample code.
The problem occurs when trying to save from the GPUImageVideoCamera; sometimes it gets stuck at [movieWriter startRecording];.
Is GPUImage compatible with iOS 7, or do we need to make some changes?
Here is the code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;
    filter = [[GPUImageSepiaFilter alloc] init];
    [videoCamera addTarget:filter];
    GPUImageView *filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    [filter addTarget:movieWriter];
}

- (IBAction)stopRecording:(id)sender {
    [filter removeTarget:movieWriter];
    videoCamera.audioEncodingTarget = nil;
    [movieWriter finishRecording];
}

- (IBAction)startRecording:(id)sender {
    videoCamera.audioEncodingTarget = movieWriter;
    [movieWriter startRecording];
    [videoCamera startCameraCapture];
}
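As an aside, for the square 300*300 recording the question is after, one approach (my suggestion, not from the original thread) is to insert a GPUImageCropFilter ahead of the sepia filter and give the movie writer a square size:
// Hypothetical: crop the portrait 480x640 frame to a centered square
// (the crop region is in normalized 0..1 coordinates), then write at 300x300.
GPUImageCropFilter *cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0.0, 0.125, 1.0, 0.75)];
[videoCamera addTarget:cropFilter];
[cropFilter addTarget:filter];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(300.0, 300.0)];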
My guess is that you modified the .xib or storyboard and didn't set the class of the view that shows the camera preview to GPUImageView.
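A quick way to verify that guess at runtime is an assertion in viewDidLoad (a diagnostic sketch, not part of the original answer):
// If this fires, the view in the .xib/storyboard is still a plain UIView;
// set its custom class to GPUImageView in Interface Builder.
NSAssert([self.view isKindOfClass:[GPUImageView class]],
         @"self.view must be a GPUImageView for the camera preview to work");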

Error recording to movie file with AVFoundation

This is a strange problem. I haven't changed any code related to this in my project, but my video recording has randomly stopped working. When I try to save a movie to a file, I get the following error:
Error Domain=NSOSStatusErrorDomain Code=-12780 "The operation couldn’t be completed. (OSStatus error -12780.)"
I start my capture with the following code:
- (void)initVideoCapture {
    self.captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *videoCaptureDevice = [self frontFacingCameraIfAvailable];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:nil];
    [self.captureSession addInput:videoInput];
    aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self.captureSession addOutput:aMovieFileOutput];
    [self detectVideoOrientation:aMovieFileOutput];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetMedium];
    [self.captureSession startRunning];
}
I then call this method from the view controller to start recording:
- (void) startRecord {
    NSDateFormatter *outputFormatter = [[NSDateFormatter alloc] init];
    [outputFormatter setDateFormat:@"yyyyMMddHHmmss"];
    NSString *newDateString = [outputFormatter stringFromDate:[NSDate date]];
    [outputFormatter release];
    NSString *fileString = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", newDateString]];
    recordFileURL = [[NSURL alloc] initFileURLWithPath:fileString];
    [aMovieFileOutput startRecordingToOutputFileURL:recordFileURL recordingDelegate:self];
}
At this point I get the error in this delegate method:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections error:(NSError *)error
What is really weird is that it only works sometimes. I'll compile the project and it will work 100% of the time; the next time I compile, it will work 0% of the time. What could I be doing wrong? Anything obvious?
I've gotten -12780 when the orientation of the device was UIDeviceOrientationFaceUp, UIDeviceOrientationFaceDown, or UIDeviceOrientationUnknown. Since a recorded video's orientation has to be portrait or landscape, it errors out on you. I had to write a quick method that checks for those three and just translates them to portrait.
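That quick method could look something like this (a sketch of the approach described, not the answerer's exact code):
// Map device orientations that have no video equivalent
// (face up, face down, unknown) to portrait before recording.
- (AVCaptureVideoOrientation)videoOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation {
    switch (deviceOrientation) {
        case UIDeviceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeRight;
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        default:
            // FaceUp, FaceDown, Unknown, and Portrait all become portrait.
            return AVCaptureVideoOrientationPortrait;
    }
}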
This seems to be a bug with Apple. I solved it by using AVAssetWriter and AVAssetWriterInput.

AVAssetWriter startWriting fails on iPod Touch 3rd Gen / OS 4.2.1

My app creates an mp4 file. I've verified that the code I have works on the following devices:
iPad (OS 4.3.2)
iPhone4 (OS 4.2.1)
iPhone 3GS (OS 4.2.1)
...but the init fails on my iPod Touch 3rd Gen running OS 4.2.1.
This is related to another question on here, but I'm seeing it on a different iOS device than he was, and I've included my init code here. Like the other question, I've tried different pixel formats as well as bitrates, but the AVAssetWriter's status always changes to AVAssetWriterStatusFailed after calling its startWriting function.
Any ideas would be greatly appreciated. I know that mp4 creation is possible on this device because I've downloaded another app that does it just fine on the same device my code fails on.
Here is the minimal code to do the video setup:
#import "AVFoundation/AVFoundation.h"
#import "AVFoundation/AVAssetWriterInput.h"
#import "AVFoundation/AVAssetReader.h"
#import "Foundation/NSUrl.h"
void VideoSetupTest()
{
    int width = 320;
    int height = 480;

    // Set up the codec settings.
    int nBitsPerSecond = 100000;
    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt: nBitsPerSecond], AVVideoAverageBitRateKey,
                                   nil];

    // Create the AVAssetWriterInput.
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    AVVideoCodecH264, AVVideoCodecKey,
                                    codecSettings, AVVideoCompressionPropertiesKey,
                                    [NSNumber numberWithInt: width], AVVideoWidthKey,
                                    [NSNumber numberWithInt: height], AVVideoHeightKey,
                                    nil];
    AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
    [assetWriterInput retain];

    // Create the AVAssetWriterInputPixelBufferAdaptor.
    NSDictionary *pixelBufferAdaptorAttribs = [NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInt: kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                               [NSNumber numberWithInt: width], kCVPixelBufferWidthKey,
                                               [NSNumber numberWithInt: height], kCVPixelBufferHeightKey,
                                               nil];
    AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput: assetWriterInput sourcePixelBufferAttributes: pixelBufferAdaptorAttribs];

    // Figure out a filename.
    NSArray *paths = NSSearchPathForDirectoriesInDomains( NSDocumentDirectory, NSUserDomainMask, YES );
    NSString *documentsDirectory = [paths objectAtIndex:0];
    char szFilename[256];
    sprintf( szFilename, "%s/test.mp4", [documentsDirectory UTF8String] );
    unlink( szFilename );
    printf( "Filename:\n%s\n\n", szFilename );

    // Create the AVAssetWriter.
    NSError *pError = nil;
    NSURL *url = [NSURL fileURLWithPath: [NSString stringWithUTF8String: szFilename]];
    AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeMPEG4 error:&pError];

    // Bind these things together and start!
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];

    //
    // ** NOTE: Here's the call where [assetWriter status] starts returning AVAssetWriterStatusFailed
    // on an iPod 3rd Gen running iOS 4.2.1.
    //
    [assetWriter startWriting];
}
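(When startWriting fails like this, checking the writer's error property immediately afterwards usually names the exact reason; a small diagnostic sketch using the assetWriter above:)
if ([assetWriter status] == AVAssetWriterStatusFailed) {
    // The underlying NSError explains why the writer failed.
    NSLog(@"AVAssetWriter failed: %@", [assetWriter error]);
}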
I've come to the conclusion that the iPod Touch 3rd Gen simply can't create .mp4 videos, period. This would make sense: why include video-encoding hardware on a device that doesn't even have a camera to capture videos with?
What made me think it could create .mp4 videos was another app that was able to create them on my iPod Touch 3. But upon analyzing that app's .mp4 video, it turned out to be an H.263 (MPEG-4 Part 2) video created with ffmpeg, whereas AVFoundation creates H.264 videos, if it can create them at all on a given device.

AVPlayer does not retain AVPlayerItem

Does somebody know why this code crashes somewhere in the release pool (after eject is called)?
I saw in the AVPlayer class reference that the currentItem property is NOT declared as retain: http://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009530-CH1-SW21
Is it a bug in the AVPlayer class, or should I retain it somewhere else?
Thanks!
- (void) viewDidLoad {
    NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
    playerItem = [[AVPlayerItem alloc] initWithURL:url];
    player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
}

- (IBAction) eject {
    [player release];
    [playerItem release];
}
I typically use this to set up a player:
if (!self.player) {
    player = [[AVPlayer alloc] init];
}
[self.player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:videoURL]];
I believe that AVPlayer retains the AVPlayerItem in its initWithPlayerItem: initializer, so you are possibly leaking memory with your AVPlayerItem. currentItem is a readonly property and should not be marked retain, which applies only to writable properties.
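Also, under manual reference counting it's a good habit to nil out the ivars after releasing them, so nothing can message a dangling pointer later (my suggestion, not part of the original answer):
- (IBAction) eject {
    [player release];
    player = nil;
    [playerItem release];
    playerItem = nil;
}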
