My app creates an mp4 file. I've verified that the code I have works on the following devices:
iPad (OS 4.3.2)
iPhone4 (OS 4.2.1)
iPhone 3GS (OS 4.2.1)
... but the init fails on my iPod Touch 3rd Gen running OS 4.2.1.
This is related to another question on here, but I'm seeing it on a different iOS device than the asker was, and I've included my init code here. Like the other question, I've tried different pixel formats as well as bitrates, but the AVAssetWriter's status always changes to AVAssetWriterStatusFailed after calling its startWriting method.
Any ideas would be greatly appreciated. I know that mp4 creation is possible on this device because I've downloaded another app that does it just fine on the same device that my code fails on.
Here is the minimal code to do the video setup.
#import "AVFoundation/AVFoundation.h"
#import "AVFoundation/AVAssetWriterInput.h"
#import "AVFoundation/AVAssetReader.h"
#import "Foundation/NSUrl.h"
void VideoSetupTest()
{
int width = 320;
int height = 480;
// Setup the codec settings.
int nBitsPerSecond = 100000;
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt: nBitsPerSecond], AVVideoAverageBitRateKey,
nil];
// Create the AVAssetWriterInput.
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings, AVVideoCompressionPropertiesKey,
[NSNumber numberWithInt: width], AVVideoWidthKey,
[NSNumber numberWithInt: height], AVVideoHeightKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
[assetWriterInput retain];
// Create the AVAssetWriterPixelBufferAdaptor.
NSDictionary *pixelBufferAdaptorAttribs = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt: kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
[NSNumber numberWithInt: width], kCVPixelBufferWidthKey,
[NSNumber numberWithInt: height], kCVPixelBufferHeightKey,
nil];
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput: assetWriterInput sourcePixelBufferAttributes: pixelBufferAdaptorAttribs];
// Figure out a filename.
NSArray *paths = NSSearchPathForDirectoriesInDomains( NSDocumentDirectory, NSUserDomainMask, YES );
NSString *documentsDirectory = [paths objectAtIndex:0];
char szFilename[256];
sprintf( szFilename, "%s/test.mp4", [documentsDirectory UTF8String] );
unlink( szFilename );
printf( "Filename:\n%s\n\n", szFilename );
// Create the AVAssetWriter.
NSError *pError = nil;
NSURL *url = [NSURL fileURLWithPath: [NSString stringWithUTF8String: szFilename]];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeMPEG4 error:&pError];
// Bind these things together and start!
assetWriterInput.expectsMediaDataInRealTime = YES;
[assetWriter addInput:assetWriterInput];
//
// ** NOTE: Here's the call where [assetWriter status] starts returning AVAssetWriterStatusFailed
// on an iPod 3rd Gen running iOS 4.2.1.
//
[assetWriter startWriting];
}
I've come to the conclusion that the iPod Touch 3rd Gen simply can't create .mp4 videos, period. That would make sense: why include video encoding hardware on a device that doesn't even have a camera to capture video with?
The thing that made me think it could create .mp4 videos was another app that was able to create them on my iPod Touch 3. But upon analyzing that app's .mp4 video, it was actually an H.263 (MPEG-4 Part 2) video created with ffmpeg, whereas AVFoundation creates H.264 videos if it can create video at all on a given device.
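If you'd rather detect this at runtime than hard-code a device list, here is a minimal sketch (reusing the assetWriter from the setup code above; the fallback branch is purely illustrative): check the return value of startWriting and inspect the writer's error.
// Sketch: surface the failure from startWriting instead of silently
// polling [assetWriter status] later.
if (![assetWriter startWriting] || [assetWriter status] == AVAssetWriterStatusFailed)
{
    NSLog(@"AVAssetWriter failed: %@", [assetWriter error]);
    // Illustrative fallback: disable video export on this device,
    // or retry with different output settings.
}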
I'm trying to achieve square video recording (like 300x300), so I chose GPUImage, but it's not working on iOS 7 and gives errors like [UIView nextAvailableTextureIndex]: unrecognized selector sent to instance. The error starts when we build even the sample code, when trying to save from the GPUImageVideoCamera; sometimes it gets stuck at [movieWriter startRecording];.
Is GPUImage compatible with iOS 7, or do we have to make some changes?
Here is the code:
- (void)viewDidLoad
{
[super viewDidLoad];
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
videoCamera.horizontallyMirrorFrontFacingCamera = NO;
videoCamera.horizontallyMirrorRearFacingCamera = NO;
filter = [[GPUImageSepiaFilter alloc] init];
// alternatively (reconstructed, unverified): filter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRightFlipVertical];
[videoCamera addTarget:filter];
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];
// sharing
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];
}
- (IBAction)stopRecording:(id)sender {
[filter removeTarget:movieWriter];
videoCamera.audioEncodingTarget = nil;
[movieWriter finishRecording];
}
- (IBAction)startRecording:(id)sender {
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
[videoCamera startCameraCapture];
}
My guess is that you modified the .xib or storyboard and didn't set the class of the view that is showing the camera preview to GPUImageView.
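If you'd rather not depend on the nib at all, an alternative sketch is to create the GPUImageView in code and add it as a subview (the frame here is illustrative):
// Sketch: build the preview view programmatically instead of relying on
// the .xib/storyboard class setting.
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filterView];
[filter addTarget:filterView];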
I'm currently working on an iPhone app that only plays the audio track of an .mp4. The player starts playing, but I can't hear any sound.
Here is the code:
NSURL *videoUrl = [NSURL URLWithString:@"http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4"];
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVURLAsset* videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:nil];
AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio]objectAtIndex:0];
NSError *error = nil;
BOOL success = [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:audioTrack atTime:kCMTimeZero error:&error];
if (!success)
{
NSLog(#"error: %#", error);
}
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[self.player play];
I had a similar problem and found that AVFoundation is very unforgiving about how you format your composition. I eventually answered my own question, which was the same as yours here.
Check out my post:
iOS AVFoundation Export Session is missing audio
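One thing worth double-checking in the code above, sketched here under the assumption that the .mp4 may carry more than one audio track (or that track 0 isn't the audible one): insert every audio track and surface any insertion error.
// Sketch: add a composition track per source audio track and log failures,
// rather than assuming the first track inserts cleanly.
for (AVAssetTrack *audioTrack in [videoAsset tracksWithMediaType:AVMediaTypeAudio]) {
    AVMutableCompositionTrack *compTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *insertError = nil;
    if (![compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:audioTrack atTime:kCMTimeZero error:&insertError]) {
        NSLog(@"audio insert failed: %@", insertError);
    }
}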
I am unable to post my photos on Instagram using my iOS app. I got the following code from a post here on Stack Overflow, but it does not work. When Instagram starts, it displays the error "unable to open file". Kindly help me.
NSURL *instagramURL = [NSURL URLWithString:@"instagram://location?id=1"];
if ([[UIApplication sharedApplication] canOpenURL:instagramURL]) {
NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"Image.ig"];
NSData *imageData = UIImagePNGRepresentation([UIImage imageNamed:@"Icon.png"]);
[imageData writeToFile:savedImagePath atomically:YES];
NSURL *imageUrl = [NSURL fileURLWithPath:savedImagePath];
docController = [[UIDocumentInteractionController alloc] init];
docController.delegate = self;
[docController retain];
docController.UTI = @"com.instagram.photo";
[docController setURL:imageUrl];
[docController presentOpenInMenuFromRect:CGRectZero inView:self.view animated:YES];
}else{
UIAlertView *errorToShare = [[UIAlertView alloc] initWithTitle:@"Instagram unavailable " message:@"You need to install Instagram in your device in order to share this image" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[errorToShare show];
[errorToShare release];
}
The minimum resolution for Instagram is 612x612, and the image must always be a JPG.
For the sake of completeness, the minimum resolution remains at 612x612 (this is due to displaying images on Retina devices), but Instagram now supports both JPEG and PNG images. All non-square images sent to Instagram must be cropped before posting.
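Putting those constraints together, here is a minimal sketch (assuming a scale-1 source image, and reusing savedImagePath from the question) that center-crops to a square, scales to 612x612, and writes JPEG data:
// Sketch: center-crop, scale to 612x612, and save as JPEG.
// Assumes source.scale == 1, since the crop rect is in pixels.
UIImage *source = [UIImage imageNamed:@"Icon.png"];
CGFloat side = MIN(source.size.width, source.size.height);
CGRect cropRect = CGRectMake((source.size.width - side) / 2.0, (source.size.height - side) / 2.0, side, side);
CGImageRef croppedRef = CGImageCreateWithImageInRect(source.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(612.0, 612.0), YES, 1.0);
[cropped drawInRect:CGRectMake(0, 0, 612.0, 612.0)];
UIImage *squared = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *jpegData = UIImageJPEGRepresentation(squared, 0.9);
[jpegData writeToFile:savedImagePath atomically:YES];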
I'm trying to copy an item from the iPod Library to my local storage space - for later playback. I've got the item URl but it's (ipod-library://item/item.mp3?id=2398084975506389321) any idea how to access the actual file?
Thanks,
Rick
This will work https://gist.github.com/3304992
-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection{
NSString *tempPath = NSTemporaryDirectory();
int i=1;
for (MPMediaItem *theItem in mediaItemCollection.items) {
NSURL *url = [theItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset: songAsset presetName: AVAssetExportPresetPassthrough];
exporter.outputFileType = @"com.apple.coreaudio-format";
NSString *fname = [[NSString stringWithFormat:@"%d",i] stringByAppendingString:@".caf"];
++i;
NSString *exportFile = [tempPath stringByAppendingPathComponent: fname];
exporter.outputURL = [NSURL fileURLWithPath:exportFile];
[exporter exportAsynchronouslyWithCompletionHandler:^{
//Code for completion Handler
}];
}
[mediaPicker dismissViewControllerAnimated:YES completion:nil];
}
Use MPMediaPickerController to pick the media; a minimal sketch of presenting it follows.
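This sketch assumes self is a view controller that adopts MPMediaPickerControllerDelegate:
// Sketch: present a picker whose selections land in the
// mediaPicker:didPickMediaItems: method above.
MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
mediaPicker.delegate = self;
mediaPicker.allowsPickingMultipleItems = YES;
[self presentViewController:mediaPicker animated:YES completion:nil];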
This is how I'm doing it in Objective-C:
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudio.h>
// or [NSURL URLWithString:@"ipod-library://item/item.mp3?id=2398084975506389321"]
NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
NSMutableData *data = [[NSMutableData alloc] init];
const uint32_t sampleRate = 16000;
const uint16_t bitDepth = 16;
const uint16_t channels = 2;
NSDictionary *opts = [NSDictionary dictionary];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:assetURL options:opts];
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:NULL];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[NSNumber numberWithFloat:(float)sampleRate], AVSampleRateKey,
[NSNumber numberWithInt:bitDepth], AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
nil];
AVAssetReaderTrackOutput *output = [[AVAssetReaderTrackOutput alloc] initWithTrack:[[asset tracks] objectAtIndex:0] outputSettings:settings];
[asset release];
[reader addOutput:output];
[reader startReading];
// read the samples from the asset and append them subsequently
while ([reader status] == AVAssetReaderStatusReading) {
CMSampleBufferRef buffer = [output copyNextSampleBuffer];
if (buffer == NULL) continue;
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(buffer);
size_t size = CMBlockBufferGetDataLength(blockBuffer);
uint8_t *outBytes = malloc(size);
CMBlockBufferCopyDataBytes(blockBuffer, 0, size, outBytes);
CMSampleBufferInvalidate(buffer);
CFRelease(buffer);
[data appendBytes:outBytes length:size];
free(outBytes);
}
[output release];
Here data will contain the raw PCM data of the track. Note that you cannot directly access the file of a song or video; you can only read its data through this method. You can then compress it using e.g. FLAC (that's how I'm processing it in my tweak).
Since MonoTouch has a 1:1 mapping to Objective-C class and method names, this should be fairly easy to copy over. :)
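If you want that raw PCM in a directly playable container, one option (a sketch using the sampleRate / bitDepth / channels constants from the code above; it assumes a little-endian host, which iOS devices are) is to prepend a standard 44-byte WAV header:
// Sketch: wrap the interleaved 16-bit PCM in a canonical WAV header.
static NSData *WAVDataFromPCM(NSData *pcm, uint32_t sampleRate, uint16_t bitDepth, uint16_t channels)
{
    uint32_t dataLen = (uint32_t)[pcm length];
    uint16_t blockAlign = channels * (bitDepth / 8);
    uint32_t byteRate = sampleRate * blockAlign;
    uint32_t riffLen = 36 + dataLen; // file size minus the 8-byte RIFF preamble
    uint32_t fmtLen = 16;            // PCM fmt chunk is 16 bytes
    uint16_t fmtTag = 1;             // 1 = linear PCM
    NSMutableData *wav = [NSMutableData dataWithCapacity:44 + dataLen];
    [wav appendBytes:"RIFF" length:4];
    [wav appendBytes:&riffLen length:4];
    [wav appendBytes:"WAVEfmt " length:8]; // "WAVE" id + "fmt " chunk id
    [wav appendBytes:&fmtLen length:4];
    [wav appendBytes:&fmtTag length:2];
    [wav appendBytes:&channels length:2];
    [wav appendBytes:&sampleRate length:4];
    [wav appendBytes:&byteRate length:4];
    [wav appendBytes:&blockAlign length:2];
    [wav appendBytes:&bitDepth length:2];
    [wav appendBytes:"data" length:4];
    [wav appendBytes:&dataLen length:4];
    [wav appendData:pcm];
    return wav;
}
Writing WAVDataFromPCM(data, sampleRate, bitDepth, channels) out with writeToFile:atomically: then yields a file that AVAudioPlayer can open.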
I'm using Cocos2D and its particle system (I hope I wrote this correctly in English).
I'm having problems recognizing sounds with the iPhone mic in a certain situation.
I have different sections in my application; in one of them I use the mic to detect if someone is blowing air into it. This part works fine at the beginning, but if you go to another section of the app that plays sound and later return to this area and try to blow air, it won't work.
I debugged the code, and levelTimerCallback is always firing, even when I'm not in this scene. I don't really know what's happening. I've stopped all sounds using
[[SimpleAudioEngine sharedEngine] stopBackgroundMusic];
Does anyone know what I'm doing wrong? BTW, it works perfectly in the simulator, but not on the iPhone.
The recorder is set up in the onEnter method:
-(void) onEnter {
[super onEnter];
NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
[NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
[NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
[NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
nil];
NSError *error;
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
if (recorder) {
[recorder prepareToRecord];
recorder.meteringEnabled = YES;
[recorder record];
levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
NSLog(@"I'm in the recorder");
} else
NSLog(#"recorder error");
}
This is the levelTimerCallback method where the sound is "checked":
- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];
const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
if (lowPassResults > 0.55)
{
NSLog(#"Mic blow detected");
self.emitter = [[CCParticleExplosion alloc] initWithTotalParticles:5];
[self addChild: emitter z:1];
emitter.texture = [[CCTextureCache sharedTextureCache] addImage: @"hoja.png"];
emitter.autoRemoveOnFinish = YES;
}
NSLog(#"Inside levelTimerCallback");}
I had the same problem. There are two ways to solve it:
1: Use AVAudioPlayer to play music or sound effects (see the sketch after this list).
2: Add #import "CDAudioManager.h" to AppDelegate.m and add
[CDAudioManager initAsynchronously:kAMM_PlayAndRecord];
to
- (void) applicationDidFinishLaunching:(UIApplication*)application
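A minimal sketch of option 1, assuming a bundled sound file (effect.caf is an illustrative name):
// Sketch: play a sound effect with AVAudioPlayer instead of the
// CocosDenshion engine.
NSError *playError = nil;
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"effect" withExtension:@"caf"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&playError];
if (player) {
    [player play];
} else {
    NSLog(@"player error: %@", playError);
}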
Finally, after reading the Audio Session cookbook and various related posts, I found the solution. Just put this code block in the init method:
UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
AudioSessionSetActive(true);
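For reference, those C Audio Session calls were later deprecated; a rough AVAudioSession equivalent (iOS 6+) would be:
// Sketch: the AVAudioSession counterpart of the property calls above.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&sessionError];
[session setActive:YES error:&sessionError];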