How can I play a custom sound in watchOS 3 that will play back on the watch speakers?

I've read that we can now play custom sounds on the Apple Watch in watchOS 3.
According to the announcement from Apple there apparently is a way, but I don't have an example to test it out: "3D spatial audio implemented using SCNAudioSource or SCNAudioPlayer is not supported. Instead, use playAudioSource:waitForCompletion: or the WatchKit sound or haptic APIs." Found here: https://developer.apple.com/library/prerelease/content/releasenotes/General/WhatsNewInwatchOS/Articles/watchOS3.html
Can someone post a simple example of this? I'm not using SceneKit in my app as I don't need it, but if that's the only way to play a custom sound then I'd like to know the minimum code required to accomplish this. Preferably in Objective-C, but I'll take it in whatever shape. I'm OK with using SpriteKit if that's easier, too.
Here's what I have so far but it doesn't work:
SCNNode * audioNode = [[SCNNode alloc] init];
SCNAudioSource * audioSource = [SCNAudioSource audioSourceNamed:@"mysound.mp3"];
SCNAudioPlayer * audioPlayer = [SCNAudioPlayer audioPlayerWithSource:audioSource];
[audioNode addAudioPlayer:audioPlayer];
SCNAction * play = [SCNAction playAudioSource:audioSource waitForCompletion:YES];
[audioNode runAction:play];

I can confirm that @ApperleyA's solution really works!
Here is the Swift version:
import AVFoundation

var _audioPlayer : AVAudioPlayerNode!
var _audioEngine : AVAudioEngine!

func playAudio() {
    if _audioPlayer == nil {
        // Set up the engine and player node once, then reuse them
        _audioPlayer = AVAudioPlayerNode()
        _audioEngine = AVAudioEngine()
        _audioEngine.attach(_audioPlayer)
        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)
        _audioEngine.connect(_audioPlayer, to: _audioEngine.mainMixerNode, format: stereoFormat)
        do {
            if !_audioEngine.isRunning {
                try _audioEngine.start()
            }
        } catch {}
    }
    if let path = Bundle.main.path(forResource: "test", ofType: "mp3") {
        let fileUrl = URL(fileURLWithPath: path)
        do {
            let asset = try AVAudioFile(forReading: fileUrl)
            // Schedule the file and start playback immediately
            _audioPlayer.scheduleFile(asset, at: nil, completionHandler: nil)
            _audioPlayer.play()
        } catch {
            print("asset error")
        }
    }
}

This is Objective-C but it can be translated into Swift.
I ended up using AVAudioEngine and AVAudioPlayerNode to play audio on the Apple Watch.
The gist of how to do this is as follows:
I call the following inside the init method of my AudioPlayer (an NSObject subclass that encapsulates the functionality):
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];
AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];
if (!_audioEngine.isRunning) {
    NSError *error;
    [_audioEngine startAndReturnError:&error];
}
I have a cache set up so I don't recreate the AVAudioFile assets every time I want to play a sound, but you don't need one.
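For illustration, a minimal version of that cache could look like this (a sketch only: the _audioFiles NSMutableDictionary ivar and the audioFileForKey: method name are illustrative, not from the original code):
- (AVAudioFile *)audioFileForKey:(NSString *)key {
    // Reuse a previously loaded file if we have one
    AVAudioFile *asset = _audioFiles[key];
    if (asset == nil) {
        NSURL *url = [[NSBundle mainBundle] URLForResource:key withExtension:@"aifc"];
        NSError *error = nil;
        asset = [[AVAudioFile alloc] initForReading:url error:&error];
        if (asset != nil) {
            _audioFiles[key] = asset; // cache for next time
        }
    }
    return asset;
}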
So next create an AVAudioFile object:
NSError *error;
NSBundle* appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:key ofType:@"aifc"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];
Then play that file:
[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];
UPDATE: If the app goes to sleep or is put into the background, there is a chance the audio will stop playing or fade out. Activating an audio session prevents this.
NSError *error;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
    NSLog(@"AVAudioSession setCategory ERROR: %@", error.localizedDescription);
}
[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
    NSLog(@"AVAudioSession setActive ERROR: %@", error.localizedDescription);
}
I didn't go over handling any errors, but this should work. Don't forget to #import <AVFoundation/AVFoundation.h> at the top of your implementation file.

This worked for me in the simulator
let soundPath = Bundle.main.path(forResource: "cheerMP3", ofType: "mp3")
let soundPathURL = URL(fileURLWithPath: soundPath!)
let audioFile = WKAudioFileAsset(url: soundPathURL)
let audioItem = WKAudioFilePlayerItem(asset: audioFile)
let audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
if audioPlayer.status == .readyToPlay {
    audioPlayer.play()
} else {
    print("Not ready!!")
}
but only if I had a breakpoint at both audioPlayer.play() and after the last closing brace.
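The breakpoint most likely works because it gives the WKAudioFilePlayerItem time to load; immediately after creation the player's status is still .unknown. One workaround, sketched here in Objective-C since the original question asked for it (the 0.5 s delay is arbitrary), is to check readiness again a moment later:
WKAudioFilePlayer *audioPlayer = [WKAudioFilePlayer playerWithPlayerItem:audioItem];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    // Re-check readiness once the player item has had time to load
    if (audioPlayer.status == WKAudioFilePlayerStatusReadyToPlay) {
        [audioPlayer play];
    } else {
        NSLog(@"Still not ready");
    }
});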
dunqan, what did you put at the top of the file for the import statements? I wasn't able to include
import AVFoundation
without an error using Xcode 8.2.1.

Related

How do we remove sounds from video through SCPlayer?

I am trying to remove audio from video using the SCRecorder class, but the audio still plays. Is there a way to remove audio from video using the SCRecorder class? I tried the following code in my project:
SCRecorder *recorder = [SCRecorder recorder]; // You can also use +[SCRecorder sharedRecorder]
SCAudioConfiguration *audio = recorder.audioConfiguration;
// Whether the audio should be enabled or not
audio.enabled = NO;
[_player play];
In the SCRecorder class you need to disable or comment out this bunch of code:
// if (self.audioConfiguration.enabled) {
//     if (_audioOutput == nil) {
//         _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
//         [_audioOutput setSampleBufferDelegate:self queue:_audioQueue];
//     }
//
//     if ([session canAddOutput:_audioOutput]) {
//         [session addOutput:_audioOutput];
//         _audioOutputAdded = YES;
//     } else {
//         audioError = [SCRecorder createError:@"Cannot add audioOutput inside the sesssion"];
//     }
// }
and you can find this code in the method below:
- (void)openSession:(void(^)(NSError *sessionError, NSError *audioError, NSError *videoError, NSError *photoError))completionHandler {

GPUImage saving video in background issue

I'm having an issue with saving video from a GPUImage videoCamera to the Camera Roll when my app goes into the background. The file is only saved to the Camera Roll when the app returns to the foreground / is restarted. I'm no doubt making a beginner's coding error; if anyone can point it out, that would be appreciated.
- (void)applicationDidEnterBackground:(UIApplication *)application {
    if (isRecording) {
        [self stopRecording];
    }
    if (self.isViewLoaded && self.view.window) {
        [videoCamera stopCameraCapture];
    }
    runSynchronouslyOnVideoProcessingQueue(^{
        glFinish();
    });
    NSLog(@"applicationDidEnterBackground");
}
and then
-(void)stopRecording {
    [filterBlend removeTarget:movieWriter];
    videoCamera.audioEncodingTarget = nil;
    [movieWriter finishRecording];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"file.mov"];
    ALAssetsLibrary *al = [[ALAssetsLibrary alloc] init];
    [al writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path] completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error %@", error);
        } else {
            NSLog(@"Success");
        }
    }];
    isRecording = NO;
    NSLog(@"Stop recording");
}
It was exactly as Brad pointed out in his, as usual, insightful comment: -writeVideoAtPathToSavedPhotosAlbum:completionBlock: wasn't completing until after the app returned to the foreground. I solved it by adding
self.backgroundTask = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    NSLog(@"Background handler called. Not running background tasks anymore.");
    [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];
and
@property (nonatomic) UIBackgroundTaskIdentifier backgroundTask;
Found this solution at http://www.raywenderlich.com/29948/backgrounding-for-ios
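One detail worth making explicit: the background task should also be ended once the write completes, otherwise the expiration handler is the only thing that ever ends it. A sketch of how the completion block could pair with the code above (not the poster's exact code):
[al writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:path] completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Error %@", error);
    } else {
        NSLog(@"Success");
    }
    // Balance beginBackgroundTaskWithExpirationHandler: now that the work is done
    [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTask];
    self.backgroundTask = UIBackgroundTaskInvalid;
}];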

Using GCD to Parse KML With Apple's KMLViewer

I'm using Apple's KMLViewer to load a KML file and display it in a MapView. There are over 50,000 lines of coordinates in the KML file which, of course, causes it to load slowly. In an attempt to speed things up, I'm trying to perform the parsing in another thread using GCD.
I have it working reasonably well as far as displaying properly, and the speed is acceptable. However, I'm getting intermittent runtime errors when loading the map. I suspect it is because, the way I have things laid out, the UI is being updated within the GCD block. Everything I'm reading says the UI should be updated on the main thread, or else runtime errors can occur that are intermittent and hard to track down. Well, that's what I'm seeing.
The problem is, I can't figure out how to update the UI in the main thread. I'm still new to iOS programming so I'm just throwing things against the wall to see what works. Here is my code, which is basically Apple's KMLViewerViewController.m with some modifications:
#import "KMLViewerViewController.h"
#implementation KMLViewerViewController
- (void)viewDidLoad
{
[super viewDidLoad];
activityIndicator.hidden = TRUE;
dispatch_queue_t myQueue = dispatch_queue_create("My Queue",NULL);
dispatch_async(myQueue, ^{
// Locate the path to the route.kml file in the application's bundle
// and parse it with the KMLParser.
NSString *path = [[NSBundle mainBundle] pathForResource:#"BigMap" ofType:#"kml"];
NSURL *url = [NSURL fileURLWithPath:path];
kmlParser = [[KMLParser alloc] initWithURL:url];
[kmlParser parseKML];
dispatch_async(dispatch_get_main_queue(), ^{
// Update the UI
// Add all of the MKOverlay objects parsed from the KML file to the map.
NSArray *overlays = [kmlParser overlays];
[map addOverlays:overlays];
// Add all of the MKAnnotation objects parsed from the KML file to the map.
NSArray *annotations = [kmlParser points];
[map addAnnotations:annotations];
// Walk the list of overlays and annotations and create a MKMapRect that
// bounds all of them and store it into flyTo.
MKMapRect flyTo = MKMapRectNull;
for (id <MKOverlay> overlay in overlays) {
if (MKMapRectIsNull(flyTo)) {
flyTo = [overlay boundingMapRect];
} else {
flyTo = MKMapRectUnion(flyTo, [overlay boundingMapRect]);
}
}
for (id <MKAnnotation> annotation in annotations) {
MKMapPoint annotationPoint = MKMapPointForCoordinate(annotation.coordinate);
MKMapRect pointRect = MKMapRectMake(annotationPoint.x, annotationPoint.y, 0, 0);
if (MKMapRectIsNull(flyTo)) {
flyTo = pointRect;
} else {
flyTo = MKMapRectUnion(flyTo, pointRect);
}
}
// Position the map so that all overlays and annotations are visible on screen.
map.visibleMapRect = flyTo;
});
});
}
-(void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
activityIndicator.hidden = FALSE;
[activityIndicator startAnimating];
}
#pragma mark MKMapViewDelegate
- (MKOverlayView *)mapView:(MKMapView *)mapView viewForOverlay:(id <MKOverlay>)overlay
{
return [kmlParser viewForOverlay:overlay];
}
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation
{
return [kmlParser viewForAnnotation:annotation];
}
- (void)mapViewDidFinishLoadingMap:(MKMapView *)mapView
{
[activityIndicator stopAnimating];
activityIndicator.hidden = TRUE;
}
#end
Suggestions?

Detecting blow in the iPhone Mic with Cocos2D

I'm using Cocos2D and the system particles, and I hope I wrote this correctly in English.
I'm having problems recognizing sounds with the iPhone mic in a certain way.
I have different sections in my application; in one of them I use the mic to detect if someone is blowing air into it. This part works fine at the beginning, but if you go to another section of the app that plays sound and later return to this area and try to blow air, it won't work.
I debugged the code, and levelTimerCallback is always firing even when I'm not in this scene. I don't really know what's happening. I've stopped all sounds using
[[SimpleAudioEngine sharedEngine] stopBackgroundMusic];
Does anyone know what I'm doing wrong? BTW, it works perfectly in the simulator, but not on the iPhone.
The recorder is set up in the onEnter method:
-(void) onEnter {
    [super onEnter];
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
        NSLog(@"I'm in the recorder");
    } else {
        NSLog(@"recorder error");
    }
}
This is the levelTimerCallback method where the sound is checked:
- (void)levelTimerCallback:(NSTimer *)timer {
    [recorder updateMeters];
    const double ALPHA = 0.05;
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
    if (lowPassResults > 0.55) {
        NSLog(@"Mic blow detected");
        self.emitter = [[CCParticleExplosion alloc] initWithTotalParticles:5];
        [self addChild:emitter z:1];
        emitter.texture = [[CCTextureCache sharedTextureCache] addImage:@"hoja.png"];
        emitter.autoRemoveOnFinish = YES;
    }
    NSLog(@"Inside levelTimerCallback");
}
Same problem.
There are two ways to solve this:
1: Use AVAudioPlayer to play music or sound effects.
2: Add #import "CDAudioManager.h" to AppDelegate.m and add
[CDAudioManager initAsynchronously:kAMM_PlayAndRecord];
to
- (void) applicationDidFinishLaunching:(UIApplication*)application
source
Finally, after reading the Audio Session cookbook and various related posts, I found the solution. Just put this code block in the init method:
UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
AudioSessionSetActive(true);
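For reference, roughly the same configuration written against the higher-level AVAudioSession API (a sketch; the original answer used the C functions above) looks like this:
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
// Play-and-record category, with output routed to the speaker by default
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
               error:&error];
[session setActive:YES error:&error];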

Find assets in library - add to a AVMutableComposition - export = crash

I've been struggling with adding assets from the iPhone photo library to an AVMutableComposition and then exporting them. Here is what I've got.
Finding the assets (here I grab the AVURLAsset):
-(void) findAssets {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Within the group enumeration block, filter to enumerate just videos.
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        [group enumerateAssetsUsingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
            // The end of the enumeration is signaled by asset == nil.
            if (alAsset) {
                ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                NSURL *url = [representation url];
                AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
                // Do something interesting with the AV asset.
                [thumbs addObject:alAsset];
                [assets addObject:avAsset];
            } else if (alAsset == nil) {
                [self createScroll];
            }
        }];
    }
    failureBlock: ^(NSError *error) {
        // Typically you should handle an error more gracefully than this.
        NSLog(@"No groups");
    }];
    [library release];
}
Here I add an asset to my composition (I use the first object in the array for testing only):
-(void) addToCompositionWithAsset:(AVURLAsset*)_asset {
    NSError *editError = nil;
    composition = [AVMutableComposition composition];
    AVURLAsset *sourceAsset = [assets objectAtIndex:0];
    Float64 inSeconds = 1.0;
    Float64 outSeconds = 2.0;
    // calculate time
    CMTime inTime = CMTimeMakeWithSeconds(inSeconds, 600);
    CMTime outTime = CMTimeMakeWithSeconds(outSeconds, 600);
    CMTime duration = CMTimeSubtract(outTime, inTime);
    CMTimeRange editRange = CMTimeRangeMake(inTime, duration);
    [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
    if (!editError) {
        CMTimeGetSeconds(composition.duration);
    }
}
And finally I export the composition, and here is where it crashes:
-(void)exportComposition {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
    NSLog(@"can export: %@", exportSession.supportedFileTypes);
    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME];
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    exportSession.outputURL = exportURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie; // @"com.apple.quicktime-movie"
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"i is in your block, exportin. status is %d", exportSession.status);
        switch (exportSession.status) {
            case AVAssetExportSessionStatusFailed:
            case AVAssetExportSessionStatusCompleted: {
                [self performSelectorOnMainThread:@selector(exportDone:)
                                       withObject:nil
                                    waitUntilDone:NO];
                break;
            }
        }
    }];
}
Does anyone have an idea of what it might be? It crashes on
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
And I tried different presets and outputFileTypes.
Thanks
* SOLVED *
I have to answer my own question now that I have solved it. It's amazing that I struggled with this for a whole day and then fixed it right after posting the question :)
I changed and moved:
composition = [AVMutableComposition composition];
to:
composition = [[AVMutableComposition alloc] init];
Under manual reference counting, the convenience constructor returns an autoreleased object, so the composition could be deallocated before the export session used it; alloc/init returns an owned reference. I think I was too tired when I was working on this yesterday. Thanks guys!
