My iPhone application downloads image files from a server, stores them in NSTemporaryDirectory(), and then loads the images into the UI asynchronously. The code flow is like this:
Show the view with a loading activity indicator and run an image downloader in the background.
Once the image is downloaded, it is written to a file.
A timer in the loading view keeps checking for the availability of the file in the temp directory; once it is available, it loads the image from the file and adds it to the UI.
Before adding the image, it is scaled to the required size.
The problem is that I use UIGraphicsGetImageFromCurrentImageContext to scale the image, and it looks like the memory used by the image context is not getting cleaned up. The app's memory just keeps increasing as more files get downloaded.
Some code below:
Code to scale the image:
-(UIImage*)scaleToSize:(CGSize)size image:(UIImage *)imageref
{
    UIGraphicsBeginImageContext(size);
    [imageref drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
Loading image from temp directory:
-(void)loadImageFromFile: (NSString *) path
{
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
    UIImage * imm = [[[UIImage alloc] initWithContentsOfFile:path] autorelease];
    [self performSelectorOnMainThread:@selector(insertImage:) withObject:imm waitUntilDone:YES];
    [pool release];
}
Adding image to view (subset of code):
self.imageContainer = [[UIImageView alloc] initWithFrame:CGRectMake(0,80,320,250)];
[self addSubview:self.imageContainer];
self.imageContainer.image = [self scaleToSize:CGSizeMake(320.0f, 250.0f) image:imm];
[imageContainer release];
What am I missing here?
One way of avoiding the leak from UIGraphicsGetImageFromCurrentImageContext is to not call it at all by resizing the container instead of resizing image directly:
self.imageContainer.contentMode = UIViewContentModeScaleAspectFit;
self.imageContainer.frame = CGRectMake(self.imageContainer.frame.origin.x, self.imageContainer.frame.origin.y, 320.0f, 250.0f);
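If you do need an actual resized bitmap rather than a scaled container, another common remedy is to drain an autorelease pool around the scaling work, since UIGraphicsGetImageFromCurrentImageContext returns an autoreleased image that can accumulate when the method is called repeatedly from a timer or background thread. A minimal sketch under manual reference counting (retaining the result so it survives the pool drain):
-(UIImage*)scaleToSize:(CGSize)size image:(UIImage *)imageref
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIGraphicsBeginImageContext(size);
    [imageref drawInRect:CGRectMake(0, 0, size.width, size.height)];
    // Retain the autoreleased result so it outlives the pool drain below.
    UIImage *scaledImage = [UIGraphicsGetImageFromCurrentImageContext() retain];
    UIGraphicsEndImageContext();
    [pool release]; // frees the intermediate autoreleased objects immediately
    return [scaledImage autorelease]; // hand ownership back to the caller's pool
}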
I've setup my AVAudioEngine in its own method like this:
AVAudioSession* session = [AVAudioSession sharedInstance];
[session setPreferredSampleRate:[session sampleRate] error:nil];
[session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:nil];
[session setActive:YES error:nil];
audioEngine_ = [[AVAudioEngine alloc] init];
//tap the outputNode to create the singleton
[audioEngine_ outputNode];
NSError* error;
[audioEngine_ startAndReturnError:&error];
There is no error starting it up. I have another method to attach an AVAudioUnitMIDIInstrument from an AudioComponentDescription I've received from an Inter-App Audio instrument. My connect method looks like this:
-(void)connectInstrument:(AudioComponentDescription)desc {
    instNode_ = nil;
    instNode_ = [[AVAudioUnitMIDIInstrument alloc] initWithAudioComponentDescription:desc];
    [audioEngine_ attachNode:instNode_];
    //Crashes here
    [audioEngine_ connect:instNode_ to:[audioEngine_ outputNode] format:nil];
    [audioEngine_ startAndReturnError:nil];
    NSLog(@"done connecting");
}
The crash gives me no useful information. I get this:
invalid mode 'kCFRunLoopCommonModes' provided to CFRunLoopRunSpecific - break on _CFRunLoopError_RunCalledWithInvalidMode to debug. This message will only appear once per execution.
*** Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArrayM objectAtIndexedSubscript:]: index 0 beyond bounds for empty array'
If I do a test and try to create a new mixer node and attach/connect it, there is no crash.
I have more relevant information
If I do this:
AVAudioFormat* instFormat = [instUnit_ outputFormatForBus:0];
I get the same error:
-[__NSArrayM objectAtIndexedSubscript:]: index 0 beyond bounds for empty array
It's almost as though there is no AVAudioFormat set for any output busses, nor any input busses (I checked inputFormatForBus as well). This is odd because if I do:
AudioStreamBasicDescription f1;
UInt32 outsize = sizeof(f1);
AudioUnitGetProperty(instNode_.audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &f1, &outsize);
Then f1 is a valid AudioStreamBasicDescription showing a standard 44.1kHz sample rate, 32-bit float, 2 channels. So there is an output stream format, but it doesn't seem to be attached to any output bus of the AVAudioUnitMIDIInstrument instance.
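As a diagnostic (not a fix), you can wrap that raw stream description in an AVAudioFormat yourself to confirm the underlying unit reports a usable format; this sketch just reuses the f1 retrieved above:
// Diagnostic sketch: wrap the ASBD from AudioUnitGetProperty in an
// AVAudioFormat to verify the unit-level output format is sane.
AVAudioFormat *probeFormat = [[AVAudioFormat alloc] initWithStreamDescription:&f1];
NSLog(@"unit-level output format: %@", probeFormat);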
EDIT - Further information
Even if I just try to access instUnit_.name I get the NSRangeException. I'm wondering now if this is a problem with how I'm getting the component description. I'm attempting to use Inter-App Audio (I have done all of the proper entitlements, capabilities, and app ID setup). This is how I am discovering Inter-App Audio components:
NSMutableArray* units = [[NSMutableArray alloc] init];
AudioComponentDescription searchDesc = { kAudioUnitType_RemoteInstrument, 0, 0, 0, 0 };
NSArray* componentArray = [[AVAudioUnitComponentManager sharedAudioUnitComponentManager] componentsMatchingDescription:searchDesc];
for(AVAudioUnitComponent* component in componentArray) {
    AudioComponentDescription description = component.audioComponentDescription;
    InterAppAudioUnit* unit = [[InterAppAudioUnit alloc] init];
    unit.componentDescription = description;
    unit.icon = AudioComponentGetIcon(component.audioComponent, 44.0f);
    unit.name = component.name;
    unit.avComponent = component;
    [units addObject:unit];
}
_units = units;
[self.tableView reloadData];
This is all in a presented UITableViewController. After clicking one I simply execute:
[self connectInstrument:unit.componentDescription];
If I instead hand build a component description for a local unit, the AVAudioUnit is instantiated just fine and nothing crashes.
I found the answer. The AUAudioUnit object inside the AVAudioUnit object has no output busses upon creation. I'm not sure why. You can fix it by doing this:
AUAudioUnit* auInst = instUnit_.AUAudioUnit;
NSError* err = nil;
NSMutableArray<AUAudioUnitBus*>* busArray = [[NSMutableArray alloc] init];
AVAudioFormat* outFormat = [[self->audioEngine_ outputNode] outputFormatForBus:0];
AUAudioUnitBus* bus = [[AUAudioUnitBus alloc] initWithFormat:outFormat error:&err];
[busArray addObject:bus];
[[auInst outputBusses] replaceBusses:busArray];
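With the busses in place, the original connect call should succeed. A sketch of where the workaround would sit inside the connect method shown earlier (same instNode_ and audioEngine_ as above; error handling kept minimal):
-(void)connectInstrument:(AudioComponentDescription)desc {
    instNode_ = [[AVAudioUnitMIDIInstrument alloc] initWithAudioComponentDescription:desc];
    [audioEngine_ attachNode:instNode_];
    // Workaround: give the wrapped AUAudioUnit an output bus before connecting.
    NSError *err = nil;
    AVAudioFormat *outFormat = [[audioEngine_ outputNode] outputFormatForBus:0];
    AUAudioUnitBus *bus = [[AUAudioUnitBus alloc] initWithFormat:outFormat error:&err];
    [[instNode_.AUAudioUnit outputBusses] replaceBusses:@[bus]];
    [audioEngine_ connect:instNode_ to:[audioEngine_ outputNode] format:nil];
    [audioEngine_ startAndReturnError:nil];
}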
Working on an iPhone OSM maps app (route-me). Initialising and downloading online maps was easy, but the real problem lies in saving the tiles through code while you are online and reusing them while you are offline. I checked blogs regarding the same, but everyone is saving the images externally, importing them into the project and then showing them, which is not my requirement. Please help me save the tile images route-me picks from the online source.
Here is how I am using online route-me maps:
-(void) viewDidLoad
{
    [RMMapView class];
    mapView.contents.tileSource = [[RMOpenStreetMapSource alloc] init];
    currentMarker = [[RMMarker alloc] initWithUIImage:[UIImage imageNamed:@"radarLocatorLite.png"] anchorPoint:CGPointMake(0.5, 0.5)];
    markerManager = [mapView markerManager];
    locationManager.delegate = self;
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    locationManager.distanceFilter = 0;
    [mapView.contents setZoom:17.0f];
    [markerManager addMarker:currentMarker AtLatLong:currentLocation.coordinate];
    [self initCompassView];
    [locationManager startUpdatingLocation];
    [locationManager startUpdatingHeading];
}
-(void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation
{
    currentLocation = newLocation;
    [mapView moveToLatLong:newLocation.coordinate];
    [markerManager moveMarker:currentMarker AtLatLon:newLocation.coordinate];
    [currentRoutePath addLineToLatLong:newLocation.coordinate];
    [[mapView.contents overlay] addSublayer:currentRoutePath];
    // NSLog(@"i reached inside location update %f", currentRoutePath.lineWidth);
}
I have an iOS app that uses static map images saved in a sqlite database. There are some references as to how to do that, but it took me lots of trial-and-error effort to make sense of them and make it work.
It seems that you should be able to have a sqlite database and save the downloaded images into it as your app downloads them. Then you'd have to know what tile source to use: the sqlite database if the app is offline, the OSM site when online.
The structure of the database is:
tilekey text    // a hash that route-me uses to locate the correct tile
zoom integer
row integer
col integer
image blob      // stores the actual image of the map
I use a Python script to populate the database, as I want the app to always use the static map images from the database, never to use a real-time download from OSM.
Please let me know if you'd like more information, but if you search for using static maps with route-me, you should find how this is done. Good luck!
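For illustration, here is a rough sketch of what a tile lookup against such a database could look like using the raw sqlite3 C API. The table name tiles is an assumption on my part, and the tilekey parameter stands in for whatever hash route-me computes:
#import <sqlite3.h>
// Hypothetical lookup: fetch a tile image blob by its route-me tile key.
- (NSData *)tileDataForKey:(NSString *)tilekey fromDatabase:(sqlite3 *)db
{
    NSData *data = nil;
    sqlite3_stmt *stmt = NULL;
    const char *sql = "SELECT image FROM tiles WHERE tilekey = ?"; // assumed table name
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
        sqlite3_bind_text(stmt, 1, [tilekey UTF8String], -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW) {
            const void *blob = sqlite3_column_blob(stmt, 0);
            int length = sqlite3_column_bytes(stmt, 0);
            data = [NSData dataWithBytes:blob length:length];
        }
        sqlite3_finalize(stmt);
    }
    return data;
}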
Finally resolved the problem with just a minor change in a few places.
Step 1: Go to this site "http://shiki.me/blog/offline-maps-in-ios-using-openstreetmap-and-route-me/" and follow the instructions to download tile images and create a zip of the folder. Remember the tile image folders are nested in order: zoom level folder -> x coordinate folder -> y coordinate image, respectively.
Step 2: Unzip the zip file in your app at some folder.
Step 3: Go to the file "RMAbstractMercatorWebSource.m" in the map view project and replace the following methods:
-(NSString*) tileFile:(RMTile)tile
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0]; // Get documents folder
    NSString *path = [documentsDirectory stringByAppendingPathComponent:@"Tiles"];
    NSString *absPath = [NSString stringWithFormat:@"%@/%d/%d/%d.png", path, tile.zoom, tile.x, tile.y];
    NSLog(@"file path >>>.............%@", absPath);
    return absPath;
} // I unzipped the zip file at the Tiles folder
-(NSString*) tilePath
{
    return nil;
}
-(RMTileImage *)tileImage:(RMTile)tile
{
    RMTileImage *image;
    tile = [tileProjection normaliseTile:tile];
    NSString *file = [self tileFile:tile];
    if (file && [[NSFileManager defaultManager] fileExistsAtPath:file])
    {
        image = [RMTileImage imageForTile:tile fromFile:file];
    }
    else if (networkOperations)
    {
        image = [RMTileImage imageForTile:tile withURL:[self tileURL:tile]];
    }
    else
    {
        image = [RMTileImage dummyTile:tile];
    }
    return image;
}
This in turn first looks in the cache, then checks the specified directory, and finally goes for the online OSM tile images.
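Note that as written, tiles fetched over the network are not persisted for later offline use. If you also want to save them as they arrive, one approach (a sketch, assuming you have the raw PNG data for a downloaded tile in hand) is to write it to the same tileFile: path so the offline branch finds it next time:
// Sketch: persist a freshly downloaded tile so the offline branch above
// can serve it later. Assumes tileData holds the tile's PNG bytes.
- (void)cacheTileData:(NSData *)tileData forTile:(RMTile)tile
{
    NSString *file = [self tileFile:tile];
    NSString *dir = [file stringByDeletingLastPathComponent];
    // Create the zoom/x directory hierarchy on first use.
    [[NSFileManager defaultManager] createDirectoryAtPath:dir
                              withIntermediateDirectories:YES
                                               attributes:nil
                                                    error:NULL];
    [tileData writeToFile:file atomically:YES];
}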
I am currently trying to save video files to iCloud. I am using Core Data to save filename strings (filename.MOV) for each video, to then retrieve them from the ubiquity container. It all works locally (files save, and can be accessed from their URLs), but I am struggling to obtain the videos over iCloud. The Core Data syncs, so I have access to the file names, but when I try to obtain the video from the URL, I am unable to.
This is how I save the video after obtaining its url (videoURL below) from UIImagePicker, and creating a unique string from the current date:
NSString *videoFileName = [stringFromDate stringByAppendingPathExtension:@"MOV"];
NSURL *ubiquityContainer = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:nil];
NSURL *saveToURL = [ubiquityContainer URLByAppendingPathComponent:videoFileName];
BOOL ok;
ok = [[NSFileManager defaultManager] setUbiquitous:YES itemAtURL:videoURL destinationURL:saveToURL error:nil];
if (!ok) NSLog(@"error saving");
I then have a Core Data table view to list all of the videos. Here I observe changes in the Core Data to sync with iCloud and reload (this all still works fine):
- (void)viewDidLoad {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(reloadFetchedResults:)
                                                 name:@"SomethingChanged"
                                               object:[[UIApplication sharedApplication] delegate]];
}
- (void)reloadFetchedResults:(NSNotification*)note {
    [self performFetch];
}
At this point, I want the ubiquity container to update, so that when I choose a video and segue to a view controller to watch it, the video file can be found. (self.video is my Core Data video entity; asset is the video asset, which I can play back.)
NSURL *ubiquityContainer = [[NSFileManager defaultManager] URLForUbiquityContainerIdentifier:nil];
ubiquityContainer = [ubiquityContainer URLByAppendingPathComponent:self.video.url];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:ubiquityContainer options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
This is where I run in to trouble. On the device where I took the video it works, but on another device, no file is found (objectAtIndex:0 is beyond bounds).
This is the metadata query I call in viewDidLoad:
NSMetadataQuery * query = [[NSMetadataQuery alloc] init];
NSString * filePattern = [NSString stringWithFormat:@"%@", self.video.url];
[query setPredicate:[NSPredicate predicateWithFormat:@"%K LIKE %@",
                     NSMetadataItemFSNameKey, filePattern]];
[query startQuery];
My metadata query may be at fault, or there may be more issues. Any help would be greatly appreciated!
I assume you are on iOS, which means on your second device the media file hasn't been downloaded (iOS doesn't download iCloud files until you actually access them, OS X downloads everything - see the docs).
To ensure the file is on the device, use startDownloadingUbiquitousItemAtURL:error:, or coordinateReadingItemAtURL:options:error:byAccessor: if you want to know when it's done (in the accessor block). You will need to call the latter anyway to do your coordinated read, so the first method has limited usefulness.
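A rough sketch of what that could look like for your case, reusing the ubiquityContainer URL built above (error handling elided; note the accessor blocks until the item is readable):
// Kick off the download, then read under file coordination so the item
// is guaranteed to be local before AVURLAsset touches it.
[[NSFileManager defaultManager] startDownloadingUbiquitousItemAtURL:ubiquityContainer error:NULL];
NSFileCoordinator *coordinator = [[NSFileCoordinator alloc] initWithFilePresenter:nil];
[coordinator coordinateReadingItemAtURL:ubiquityContainer
                                options:0
                                  error:NULL
                             byAccessor:^(NSURL *readURL) {
    // By the time the accessor runs, the file is available locally.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:readURL options:nil];
    NSLog(@"video tracks: %@", [asset tracksWithMediaType:AVMediaTypeVideo]);
}];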
The code below is the image picker callback that runs after the user takes a photo with the camera / or picks a photo from the library.
Can someone explain to me why the first version works and second version throws an error?
The first version passes a UIImage from a synthesised UIImageView to method scaleAndRotateImage.
The second version declares a local UIImageView and passes the image to method scaleAndRotateImage.
This is the first version that works:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissModalViewControllerAnimated:YES];
    VC_Create_Preview *vc_create_preview = [[VC_Create_Preview alloc] initWithNibName:@"VC_Create_Preview" bundle:nil];
    //UIImageView *temp = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    //temp.image = [self scaleAndRotateImage:temp.image];
    //vc_create_preview.origImage = temp;
    srcImageView.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    srcImageView.image = [self scaleAndRotateImage:srcImageView.image];
    vc_create_preview.origImage = srcImageView;
    [self.navigationController pushViewController:vc_create_preview animated:YES];
}
But the 2nd version below does not, and throws an error when calling method scaleAndRotateImage (note that in debug I cannot even step "into" scaleAndRotateImage):
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissModalViewControllerAnimated:YES];
    VC_Create_Preview *vc_create_preview = [[VC_Create_Preview alloc] initWithNibName:@"VC_Create_Preview" bundle:nil];
    UIImageView *temp = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    temp.image = [self scaleAndRotateImage:temp.image];
    vc_create_preview.origImage = temp;
    //srcImageView.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    //srcImageView.image = [self scaleAndRotateImage:srcImageView.image];
    //vc_create_preview.origImage = srcImageView;
    [self.navigationController pushViewController:vc_create_preview animated:YES];
}
The error thrown by the 2nd version is:
Pending breakpoint 1 - ""VC_Create_Capture.m":97" resolved
2012-01-04 20:33:52.674 MultiInterfaceTest[430:f803] -[UIImage image]: unrecognized selector sent to instance 0x68aac00
2012-01-04 20:33:52.717 MultiInterfaceTest[430:f803] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[UIImage image]: unrecognized selector sent to instance 0x68aac00'
*** First throw call stack:
(0x13c2052 0x1553d0a 0x13c3ced 0x1328f00 0x1328ce2 0x35df 0x211c52 0xbfaa305 0xbfbe5fd 0xc022fef 0x2dde39 0x2dd143 0x2de3cf 0x2e0a31 0x2e098c 0x2d93e7 0x41812 0x41ba2 0x28384 0x1baa9 0x12acfa9 0x13961c5 0x12fb022 0x12f990a 0x12f8db4 0x12f8ccb 0x12ab879 0x12ab93e 0x19a9b 0x23b8 0x2315)
terminate called throwing an exception
Current language: auto; currently objective-c
"-[UIImage image]: unrecognized selector sent to instance 0x68aac00"
It seems that the [UIImageView image] accessor method gets called on a UIImage object. Most likely it's a memory management error: the UIImageView gets deallocated, and its memory gets reused for the UIImage instance, erroneously. (Unrelated: you're also leaking memory in the first approach. -pushViewController:animated: retains the view controller, so you must -release it.)
This is because somewhere you have called the image API on the UIImage class. image is a method of the UIImageView class, not UIImage; that's why Xcode is throwing this error.
To resolve it, make sure that you are calling the image API on a UIImageView instance, not a UIImage. Check all the variables you have declared in your .h file and make sure you have used the UIImageView class instead of UIImage where appropriate.
An unrecognized selector means that your class does not support the image API that you have called inside your code.
Sorry guys, I found the answer.
The following is wrong because [info objectForKey:...] returns a UIImage, not a UIImageView.
Hence temp wasn't properly created, which caused the error on the second line of the code.
UIImageView *temp = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
temp.image = [self scaleAndRotateImage:temp.image];
vc_create_preview.origImage = temp;
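For anyone else hitting this, a sketch of the fixed snippet: keep the variable typed as the UIImage the dictionary actually returns, and wrap it in a UIImageView only if the destination property expects one (here I assume origImage is a UIImageView, as in the first version):
// The picker dictionary returns a UIImage, not a UIImageView.
UIImage *picked = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
picked = [self scaleAndRotateImage:picked];
// Wrap it in an image view only because origImage expects one.
UIImageView *temp = [[UIImageView alloc] initWithImage:picked];
vc_create_preview.origImage = temp;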
Thanks everyone!
I have UIImagePicker to select the image. After selecting the image I am editing it and now I want to save that image.
Can anyone please tell me how can I save the image to Photo Album?
Take a look at this tutorial:
http://iosdevelopertips.com/camera/camera-application-to-take-pictures-and-save-images-to-photo-album.html
Here is my code to get the image. If the image was edited, then the edited image will be used; if not, the original image will be used.
-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image;
    if (picker.editing == YES) {
        image = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    }
    else {
        image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    }
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    // [picker release];
    // [self dismissModalViewControllerAnimated:YES];
}
You can use this function
UIImageWriteToSavedPhotosAlbum(UIImage *yourImage, id completionTarget, SEL completionSelector, void *contextInfo);
See the answer to this question... And about completionTarget and completionSelector, from the documentation:
The use of the completionTarget, completionSelector, and contextInfo parameters is optional and necessary only if you want to be notified asynchronously when the function finishes writing the image to the user’s Camera Roll or Saved Photos album. If you do not want to be notified, pass nil for these parameters.
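If you do want the notification, the completion selector must have the exact signature the documentation specifies; the method name below is your choice, but the parameter list is fixed:
// Save the image and get told when writing finishes.
UIImageWriteToSavedPhotosAlbum(image, self,
    @selector(image:didFinishSavingWithError:contextInfo:), NULL);

// Callback invoked by UIKit once the save completes (or fails).
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error
    contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"error saving image: %@", error);
    }
}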