I developed an iOS app for iOS 4.0. It is a navigation-based application. Now I also want it to support the iPhone 5. I thought I had handled this by switching the xib after checking the device, but I'm facing a problem: the xib is changed, yet its view's height does not change. How can that happen? If someone else has faced this problem, please share your ideas with me. Thanks.
In the app delegate:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    CGSize iOSDeviceScreenSize = [[UIScreen mainScreen] bounds].size;

    //----------------HERE WE SETUP FOR IPHONE 4/4s/iPod----------------------
    if (iOSDeviceScreenSize.height == 480) {
        self.viewController = [[ViewController alloc] initWithNibName:@"ViewController_4inch" bundle:nil];
        NSLog(@"iPhone 4: %f", iOSDeviceScreenSize.height);
    }

    //----------------HERE WE SETUP FOR IPHONE 5----------------------
    if (iOSDeviceScreenSize.height == 568) {
        self.viewController = [[ViewController alloc] initWithNibName:@"ViewController_5inch" bundle:nil];
        NSLog(@"iPhone 5: %f", iOSDeviceScreenSize.height);
    }

    return YES;
}
It works!
Set iOS 6 as the Base SDK and use the Auto Layout feature to build screens that scale for all screen sizes. You'll need Xcode 4.5 to do this.
Get started with Auto Layout here:
http://www.raywenderlich.com/20881/beginning-auto-layout-part-1-of-2
http://www.raywenderlich.com/20897/beginning-auto-layout-part-2-of-2
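If you end up creating constraints in code rather than in Interface Builder, a minimal sketch looks like this (contentView is a hypothetical subview that should hug the bottom edge; this example is mine, not from the tutorials above):

// Pin a hypothetical contentView to the bottom of its superview so it
// stretches on the taller 4-inch screen instead of leaving a gap.
contentView.translatesAutoresizingMaskIntoConstraints = NO;
[self.view addConstraint:[NSLayoutConstraint constraintWithItem:contentView
                                                      attribute:NSLayoutAttributeBottom
                                                      relatedBy:NSLayoutRelationEqual
                                                         toItem:self.view
                                                      attribute:NSLayoutAttributeBottom
                                                     multiplier:1.0
                                                       constant:0.0]];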
If you still want to support iOS 4.0, have separate .xib files for different screen sizes and load them appropriately at launch.
To load different nib files based on your screen size, in your app delegate, you will need to add/replace the following code in - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
CGRect screenBounds = [[UIScreen mainScreen] bounds];
if (screenBounds.size.height == 568) {
    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController_4inch" bundle:nil];
} else {
    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
}
where ViewController_4inch is the name of the nib file designed for the 4-inch iPhone 5 screen.
Related
I'm trying to achieve square video recording (300x300), so I chose GPUImage, but it's not working on iOS 7 and gives errors like [UIView nextAvailableTextureIndex]: unrecognized selector sent to instance. The error appears as soon as we build even the sample code,
when trying to save with GPUImageVideoCamera.
Sometimes it gets stuck at [movieWriter startRecording];.
Is GPUImage compatible with iOS 7, or do we have to make some changes?
Here is the code:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Capture 640x480 from the back camera in portrait orientation
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;

    // Filter the camera output and render it into the view
    filter = [[GPUImageSepiaFilter alloc] init];
    [videoCamera addTarget:filter];
    GPUImageView *filterView = (GPUImageView *)self.view;
    [filter addTarget:filterView];

    // Movie writing / sharing
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    [filter addTarget:movieWriter];
}
- (IBAction)stopRecording:(id)sender {
    [filter removeTarget:movieWriter];
    videoCamera.audioEncodingTarget = nil;
    [movieWriter finishRecording];
}

- (IBAction)startRecording:(id)sender {
    videoCamera.audioEncodingTarget = movieWriter;
    [movieWriter startRecording];
    [videoCamera startCameraCapture];
}
My guess is that you modified the .xib or storyboard and didn't set the class of the view that is showing the camera preview to GPUImageView.
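If the nib keeps resetting the class, one alternative (a sketch of my own, not from the original answer) is to create the GPUImageView in code so the cast in viewDidLoad can never point at a plain UIView:

// Sketch: override loadView in the camera view controller so self.view
// is guaranteed to be a GPUImageView, matching the cast in viewDidLoad.
- (void)loadView {
    self.view = [[GPUImageView alloc] initWithFrame:[UIScreen mainScreen].bounds];
}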
My app uses UIWebView, and it works well in iOS 5 and iOS 6. However, it doesn't load the webpage in iOS 7 when I build in Xcode 5 and run the same code.
- (void)webViewDidFinishLoad:(UIWebView *)webView {}
- (void)webViewDidStartLoad:(UIWebView *)webView {}
- (void)webView:(UIWebView *)webView didFailLoadWithError:(NSError *)error {}
None of the delegate methods are called, even though I set the delegate both in the xib file and in code:
self.theWebView.delegate = self;
I didn't find any information via Google. Thank you for your help.
I moved the loadRequest method to the completion handler of a presentViewController and it works in iOS 5, 6 and 7:
[self presentViewController:gwvc animated:YES completion:^(void){
    [gwvc.wv loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"http://www.walkjogrun.net/about/eukanuba.html"]]];
}];
I found the root cause.
Maybe I was using UIWebView incorrectly, but the old code works in iOS 5 and iOS 6, and I don't know why it works in those earlier iOS versions.
Moreover, it also works on iOS 7 when I build against the 6.1 SDK.
Here's my old code:
RechargeWebPageViewController *webPageViewController;
webPageViewController = [[RechargeWebPageViewController alloc] initWithNibName:@"WebPage" bundle:nil];
if (webPageViewController != nil) {
    webPageViewController.hidesBottomBarWhenPushed = YES;
    webPageViewController.delegate = self;
    [self.navigationController pushViewController:webPageViewController animated:YES];

    NSURL *url = [NSURL URLWithString:@"https://xxx.php"];
    NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url];
    [webPageViewController loadRequest:request];
    [request release];
}
I moved the loadRequest call from the viewDidLoad method to the viewWillAppear method, and then it worked.
I think the UIWebView may not have been fully initialized yet in my old code on iOS 7.
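For reference, a minimal sketch of that change inside the web page view controller; pendingRequest and theWebView are assumed names, since the controller's internals aren't shown in the post:

// Sketch: store the request passed to loadRequest: and only hand it to the
// UIWebView in viewWillAppear, once the view hierarchy is fully set up on iOS 7.
- (void)loadRequest:(NSURLRequest *)request {
    self.pendingRequest = request; // assumed property
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    if (self.pendingRequest) {
        [self.theWebView loadRequest:self.pendingRequest];
        self.pendingRequest = nil;
    }
}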
In my iOS app I have a Map View that I called mappa.
I just added a UIActivityViewController that appears when a callout accessory is tapped on an annotation view.
I use this code to display it:
- (void)mapView:(MKMapView *)mapView
 annotationView:(MKAnnotationView *)view
 calloutAccessoryControlTapped:(UIControl *)control {

    [mappa deselectAnnotation:view.annotation animated:YES];

    if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 6.0) {
        UIActivityViewController *activityViewController =
            [[UIActivityViewController alloc] initWithActivityItems:@[self, [NSURL URLWithString:@"http://www.AppStore.com/iToretto"]]
                                              applicationActivities:nil];
        activityViewController.excludedActivityTypes = @[UIActivityTypePostToWeibo, UIActivityTypeAssignToContact, UIActivityTypeCopyToPasteboard];

        if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad) {
            UIPopoverController *pop = [[UIPopoverController alloc] initWithContentViewController:activityViewController];
            self.annotationPopoverController = pop;
            // Show the popover next to the annotation view (pin)
            [pop presentPopoverFromRect:view.bounds
                                 inView:view
               permittedArrowDirections:UIPopoverArrowDirectionAny
                               animated:YES];
            self.annotationPopoverController.delegate = self;
        }
    }
}
My problem is that when I rotate the device while the popover is visible, I get a frame issue (like this). I understand that the solution is to override:
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
duration:(NSTimeInterval)duration;
But when I try to re-present the popover in landscape mode like this, I get multiple errors:
if (UIInterfaceOrientationLandscapeLeft && [self.annotationPopoverController isPopoverVisible]) {
    [self.annotationPopoverController presentPopoverFromRect:view.bounds
                                                      inView:view
                                    permittedArrowDirections:UIPopoverArrowDirectionAny
                                                    animated:YES];
}
Because "view.bounds" and "view" is a local declaration of MKAnnotation
So, How can I fix it?! Does anyone have a solution?! Thanks in advance.
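One way around the "view is local" problem, sketched here under my own assumptions (selectedAnnotationView is a hypothetical property set when the popover is first presented, and I re-present in didRotateFromInterfaceOrientation: rather than the willRotate variant), is to keep a reference to the tapped annotation view and re-anchor the popover after rotation:

// Hypothetical sketch: store the tapped annotation view in a property when presenting,
// then re-anchor the popover once the rotation has finished.
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation {
    if ([self.annotationPopoverController isPopoverVisible] && self.selectedAnnotationView != nil) {
        [self.annotationPopoverController presentPopoverFromRect:self.selectedAnnotationView.bounds
                                                          inView:self.selectedAnnotationView
                                        permittedArrowDirections:UIPopoverArrowDirectionAny
                                                        animated:YES];
    }
}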
I'm using Cocos2D and the system particles, and I hope I wrote this correctly in English.
I'm having problems recognizing sounds with the iPhone mic in a certain situation.
My application has different sections; in one of them I use the mic to detect whether someone is blowing air into it. This works fine at first, but if you go to another section of the app that plays sound and then return to this area and try to blow, it no longer works.
I debugged the code, and levelTimerCallback keeps running even when I'm not in this scene. I don't really know what's happening. I've stopped all sounds using
[[SimpleAudioEngine sharedEngine] stopBackgroundMusic];
Does anyone know what I'm doing wrong? By the way, it works perfectly in the simulator, but not on the iPhone.
The recorder is set up in the onEnter method:
-(void) onEnter {
    [super onEnter];

    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;

    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.03 target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
        NSLog(@"I'm in the recorder");
    } else {
        NSLog(@"recorder error");
    }
}
This is the levelTimerCallback method where the sound is checked:
- (void)levelTimerCallback:(NSTimer *)timer {
    [recorder updateMeters];

    const double ALPHA = 0.05;
    double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
    lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

    if (lowPassResults > 0.55) {
        NSLog(@"Mic blow detected");
        self.emitter = [[CCParticleExplosion alloc] initWithTotalParticles:5];
        [self addChild:emitter z:1];
        emitter.texture = [[CCTextureCache sharedTextureCache] addImage:@"hoja.png"];
        emitter.autoRemoveOnFinish = YES;
    }

    NSLog(@"Inside levelTimerCallback");
}
I had the same problem.
There are two ways to solve it:
1: Use AVAudioPlayer to play music or sound effects.
2: Add #import "CDAudioManager.h" to AppDelegate.m and add
[CDAudioManager initAsynchronously:kAMM_PlayAndRecord];
to
- (void) applicationDidFinishLaunching:(UIApplication *)application
as sketched below.
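For clarity, a minimal sketch of option 2, assuming a standard Cocos2D app delegate (the only line taken from the answer is the initAsynchronously call; the rest is placeholder):

#import "CDAudioManager.h"

- (void) applicationDidFinishLaunching:(UIApplication *)application {
    // Configure the audio session for simultaneous playback and recording
    // before any CocosDenshion sound plays, so the mic keeps working afterwards.
    [CDAudioManager initAsynchronously:kAMM_PlayAndRecord];

    // ... the usual Cocos2D startup (window, director, first scene) goes here ...
}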
Finally, after reading the Audio Session programming guide and various related posts, I found the solution. Just put this code block in the init method:
UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
AudioSessionSetActive(true);
The movie plays just fine, but there is a quick black flash right before it plays. Is this a quirk resulting from setting the controlStyle to MPMovieControlStyleNone?
NSString *url = [[NSBundle mainBundle] pathForResource:@"00" ofType:@"mov"];

MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
                                   initWithContentURL:[NSURL fileURLWithPath:url]];

[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(movieFinishedCallback:)
           name:MPMoviePlayerPlaybackDidFinishNotification
         object:player];

//---play video in implicit size---
player.view.frame = CGRectMake(80, 64, 163, 246);
[self.view addSubview:player.view];

// Hide video controls
player.controlStyle = MPMovieControlStyleNone;

//---play movie---
[player play];
I just had this problem and fixed it by adding an observer to the default NSNotificationCenter to find out when the movie was completely ready to play, and THEN adding the view as a subview to my main view.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(checkMovieStatus:) name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
...
if (moviePlayer.loadState & (MPMovieLoadStatePlayable | MPMovieLoadStatePlaythroughOK))
{
    [pageShown.view addSubview:moviePlayer.view];
    [moviePlayer play];
}
In iOS 6, MPMoviePlayerController added a new property: readyForDisplay.
This is what I'm playing around with, and so far so good:
Create the MPMoviePlayerController, add it to the stage, and hide it.
Add a notification for the playback state of the movie controller.
Wait for readyForDisplay to change, and once it's ready, show the video controller:
- (void)moviePlayerPlayState:(NSNotification *)noti {
    if (noti.object == self.movieController) {
        MPMoviePlaybackState reason = self.movieController.playbackState;
        if (reason == MPMoviePlaybackStatePlaying) {
            [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackStateDidChangeNotification object:nil];
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
                while (self.movieController.view.hidden)
                {
                    NSLog(@"not ready");
                    if (self.movieController.readyForDisplay) {
                        dispatch_async(dispatch_get_main_queue(), ^(void) {
                            NSLog(@"show");
                            self.movieController.view.hidden = NO;
                        });
                    }
                    usleep(50);
                }
            });
        }
    }
}
When the play state changes to MPMoviePlaybackStatePlaying, we start polling for readyForDisplay to change.
Evidently there is a flash of black in the movie rect until enough of the movie has loaded for playback to start. Here is my workaround:
Create a UIImageView and put the MPMoviePlayerController's view in it. That way you can set the alpha to 0.
As soon as you call [player play]; to play the video, set up a 0.5-second timer.
When the timer fires, change the alpha to 1.
This makes the player invisible for half a second (which hides the black flash), as sketched below.
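A rough sketch of that workaround; the frame values are just reused from the question's snippet, and I use dispatch_after here instead of an NSTimer:

// Hypothetical sketch of the "hide it for half a second" workaround.
UIImageView *containerView = [[UIImageView alloc] initWithFrame:CGRectMake(80, 64, 163, 246)];
containerView.alpha = 0.0;                  // start invisible to hide the black flash
containerView.userInteractionEnabled = YES; // UIImageView ignores touches by default
[containerView addSubview:player.view];
player.view.frame = containerView.bounds;
[self.view addSubview:containerView];

[player play];

// Half a second later, reveal the player; the black flash has passed by then.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    containerView.alpha = 1.0;
});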
Create the video player without the addSubview and play instructions:
NSString *filepath = [[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"];
NSURL *fileURL = [NSURL fileURLWithPath:filepath];
MPMoviePlayerController *moviePlayerController = [[MPMoviePlayerController alloc] initWithContentURL:fileURL];
[moviePlayerController.view setFrame:CGRectMake(80, 64, 163, 246)];
moviePlayerController.controlStyle = MPMovieControlStyleNone;
Prepare video to play and add notification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(checkMovieStatus:) name:MPMoviePlayerLoadStateDidChangeNotification object:nil];
[moviePlayerController prepareToPlay];
Create the checkMovieStatus method with the addSubview and play instructions:
- (void)checkMovieStatus:(NSNotification *)notification {
    if (moviePlayerController.loadState & (MPMovieLoadStatePlayable | MPMovieLoadStatePlaythroughOK)) {
        [self.view addSubview:moviePlayerController.view];
        [moviePlayerController play];
    }
}
Or simply change the color of the view; that is what you're actually seeing:
player.view.backgroundColor = [UIColor colorWithRed:1 green:1 blue:1 alpha:0];
I believe the black flash may be related to the movieSourceType property of MPMoviePlayerController.
If you don't set it, it defaults to MPMovieSourceTypeUnknown which causes the UI to be delayed until the file is loaded.
Try adding this line:
player.movieSourceType = MPMovieSourceTypeFile;
Right after initializing player.
To avoid the black flash, use an MPMoviePlayerViewController instead of an MPMoviePlayerController. I think that this class creates the background on view display, rather than on video load (like MPMoviePlayerController does).
Before adding the moviePlayerViewController.moviePlayer.view to your display view, you have to add a white subview (or a subview appropriate for your content) to the backgroundView, like this:
[moviePlayerViewController.moviePlayer.view setFrame:[displayView bounds]];
UIView *movieBackgroundView = [[UIView alloc] initWithFrame:[displayView bounds]];
movieBackgroundView.backgroundColor = [UIColor whiteColor];
[moviePlayerViewController.moviePlayer.backgroundView addSubview:movieBackgroundView];
[movieBackgroundView release];
Found the solution here: http://joris.kluivers.nl/blog/2010/01/04/mpmovieplayercontroller-handle-with-care/
From iOS 6 on,
you need to call [self.moviePlayer prepareToPlay]; and catch MPMoviePlayerReadyForDisplayDidChangeNotification before calling [self.moviePlayer play];
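A minimal sketch of that flow; the handler name and the readyForDisplay check are my own additions, not from the original answer:

// e.g. in viewDidLoad: register for the readiness notification, then start preparing.
- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(moviePlayerReadyForDisplayDidChange:)
                                                 name:MPMoviePlayerReadyForDisplayDidChangeNotification
                                               object:self.moviePlayer];
    [self.moviePlayer prepareToPlay];
}

// Only add the view and start playback once the player reports it is ready for display.
- (void)moviePlayerReadyForDisplayDidChange:(NSNotification *)notification {
    if (self.moviePlayer.readyForDisplay) {
        [self.view addSubview:self.moviePlayer.view];
        [self.moviePlayer play];
    }
}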