I have an MKAnnotationView where I want to rotate the image property only.
If I use the following...
self.image = image;
[self setTransform:CGAffineTransformMakeRotation(DEG2RAD([item.heading floatValue]))];
It rotates the whole annotation view and the subsequent callouts, which I do not want. Is there a way to rotate just the image property?
The simplest way is probably to put the image in a UIImageView, add that as a subview to the MKAnnotationView, and apply the affine transform to that subview instead.
If you are targeting iOS 5.0 you could look at CIAffineTransform but I don't know enough about CoreImage to advise.
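A rough sketch of the UIImageView approach inside the annotation view subclass, reusing the DEG2RAD macro and item.heading from the question (the imageView variable and the frame sizing are illustrative assumptions):
// Instead of setting self.image, keep the rotating image in a dedicated subview.
// Give the annotation view a non-zero frame so the callout still anchors correctly.
self.frame = CGRectMake(0, 0, image.size.width, image.size.height);

UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
imageView.center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
[self addSubview:imageView];

// Rotate only the subview; the annotation view itself (and its callout) keeps the identity transform.
imageView.transform = CGAffineTransformMakeRotation(DEG2RAD([item.heading floatValue]));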
Related
I have a Qt3D scene in which I have only one 3D object. I would like to set the center of rotation of the scene (camera) to the center of this 3D object. Currently, the 3D model goes out of view when the scene is rotated with the mouse.
There is also an OrbitCameraController, whose purpose is to keep the camera looking at a certain position. You could let the camera track your object's position.
QML example code:
Camera {
    id: myCamera
    viewCenter: YOUROBJECTPOSITION
}
OrbitCameraController { camera: myCamera }
// FirstPersonCameraController { camera: myCamera }
I'm not using PyQt like you do. Hope this helps.
I am using a Mesh with a custom 3D model as "source"; the Qt docs don't really indicate any property of the Mesh object that will return its 3D vector position.
If you import your obj file into a scene, the origin of your mesh is placed at the origin of the scene. If you didn't transform it, that origin is where you want the camera to look.
If you used a transform, then use that new position to look at.
Qt3D uses an ECS (Entity Component System). Basically, you make an entity and add components to it, like a mesh and a transform in your case. That's why the mesh doesn't have a property that reflects its position; the transform component holds that information.
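A rough QML sketch of that entity/component structure (the ids and obj path are placeholders, not from your project); the camera's viewCenter can then be bound to the transform's translation:
Entity {
    id: myObject
    components: [
        Mesh { source: "qrc:/models/myModel.obj" },                      // placeholder path
        Transform { id: myTransform; translation: Qt.vector3d(0, 0, 0) } // holds the position
    ]
}

Camera {
    id: myCamera
    viewCenter: myTransform.translation   // look at the object's position
}

OrbitCameraController { camera: myCamera }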
I suggest you read the following in the Qt docs: Qt3D Architecture
The above solution works when you have smaller objects and your camera is far enough away. But if you import a larger mesh, e.g. a spaceship, you might need the coordinates of the specific spot you want to look at. You can get those coordinates by using an object picker.
I'm following the instructions for adding StyleKit images to storyboards by adding an object, setting its class to the StyleKit class, and then right-click dragging to the UIImage from the StyleKit object. However, I'm not seeing the images show up in the UIImageView preview in the storyboard. The image is there when I run the code, however.
Is there a different series of steps I need to take? I'm using iOS 9 and Xcode 7.
Interface Builder doesn't allow previewing this kind of connection. To see the image in IB, you will need to subclass the view, mark it as @IBDesignable (IB_DESIGNABLE in Objective-C), and override the drawRect: method, where you call the StyleKit drawing method.
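For example, a minimal Objective-C sketch of such a subclass; StyleKit and drawMyImage are placeholders for whatever class and drawing method PaintCode generated for you:
#import <UIKit/UIKit.h>
#import "StyleKit.h"   // the PaintCode-generated file (name is a placeholder)

IB_DESIGNABLE
@interface MyStyleKitImageView : UIView
@end

@implementation MyStyleKitImageView

- (void)drawRect:(CGRect)rect
{
    // Calling the generated drawing method here lets IB render the preview at design time.
    [StyleKit drawMyImage];
}

@end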
I am attempting to rotate an MKMapView using MapKit.
I can display a map and rotate it, however not very efficiently. I create an MKMapView larger than the view and rotate it using CGAffineTransformMakeRotation, so the grey areas behind the view are not visible. Although I have clip subviews checked in Interface Builder, I still have the feeling this is not the correct implementation.
This method does allow me to rotate any annotations displayed, since MKPinAnnotationView accepts a transform created with CGAffineTransformMakeRotation, but I run into problems when trying to add an overlay to the map.
I can place an overlay using the boundingMapRect property in the class declaration, but the image remains unrotated on the display. Is there a way to achieve this? Or alternatively, should I be rotating the MKMapView and annotations using a different method?
Thanks in advance for any advice or information.
Are you rotating the map so that it is facing the same way the user is (i.e. not just North = Up)? If so, you don't need to do any transform work at all; just set the map view's user tracking mode to MKUserTrackingModeFollowWithHeading.
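For example, assuming a mapView outlet:
// Rotates the map content (including annotations and overlays) to follow the device heading.
[self.mapView setUserTrackingMode:MKUserTrackingModeFollowWithHeading animated:YES];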
Does anyone have an example of how to zoom an MKMapView to the area of all visible annotations using the annotationVisibleRect property on MKMapView? I have seen this post, which offers a decent solution, but it seems that the annotationVisibleRect property would be the simpler solution.
Short answer: There is not a solution to this problem using annotationVisibleRect.
There is no example because this property cannot be used in this way. The limited documentation provided for it is certainly misleading for someone who is looking for something convenient from MapKit to do a somewhat common task.
annotationVisibleRect is the rect with regard to the MKAnnotationContainerView coordinate system. MKAnnotationContainerView is the superview for your annotations. If you look in MKMapView.h, you'll find this:
// annotationVisibleRect is the visible rect where the annotations views are currently displayed.
// The delegate can use annotationVisibleRect when animating the adding of the annotations views in mapView:didAddAnnotationViews:
@property (nonatomic, readonly) CGRect annotationVisibleRect;
Its specific purpose is for manipulation (animation) of the annotation views by providing a rectangle in their superview's coordinate system that matches the map view's viewport.
You might think (as I did) that this or similar will do the trick:
CGRect visibleRect = self.mapView.annotationVisibleRect;
MKCoordinateRegion visibleRegion = [self.mapView convertRect:visibleRect toRegionFromView:self.mapView];
[self.mapView setRegion:visibleRegion animated:YES];
It won't. Calling setRegion:animated: may crash the application because the "fromView" is in the wrong coordinate system, which can push the latitude or longitude past their min/max values. You'd actually have to do something like this:
- (void)mapView:(MKMapView *)mapView didAddAnnotationViews:(NSArray *)views
{
    if (views.count > 0) {
        MKAnnotationView *view = [views objectAtIndex:0];
        CGRect visibleRect = self.mapView.annotationVisibleRect;
        MKCoordinateRegion visibleRegion = [self.mapView convertRect:visibleRect toRegionFromView:view.superview];
        [self.mapView setRegion:visibleRegion animated:YES];
    }
}
This won't crash the application, but it won't change the region either. If you compare visibleRegion to self.mapView.region, you will find that they are identical. That is because the annotationVisibleRect represents the same area that is visible in the map view -- just in a different coordinate system to make it convenient for you to do things like make the map pins come flying in from the edge of the view. See this answer for details on how it is used.
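For reference, a rough sketch of the kind of fly-in animation annotationVisibleRect is intended for, dropping newly added annotation views in from the top edge of the visible rect (the duration is arbitrary):
- (void)mapView:(MKMapView *)mapView didAddAnnotationViews:(NSArray *)views
{
    CGRect visibleRect = mapView.annotationVisibleRect;
    for (MKAnnotationView *view in views) {
        CGRect endFrame = view.frame;

        // Start each view just above the visible rect, in the superview's coordinate system.
        CGRect startFrame = endFrame;
        startFrame.origin.y = CGRectGetMinY(visibleRect) - CGRectGetHeight(endFrame);
        view.frame = startFrame;

        // Animate it down into its final position.
        [UIView animateWithDuration:0.3 animations:^{
            view.frame = endFrame;
        }];
    }
}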
Also, for reference, here's where the MKAnnotationView sits in relation to MKMapView:
MKMapView
  +-UIView
    +-MKScrollContainerView
      +-MKAnnotationContainerView  <-- coordinate system of annotationVisibleRect
        +-MKAnnotationView
Hope that helps clear some things up -- if not, ask away.
Swift 4
I think what you're looking for is:
func showAnnotations(_ annotations: [MKAnnotation], animated: Bool)
You simply pass in an array of all the annotations you're trying to show, which here is the map view's own annotations property:
mapView.showAnnotations(mapView.annotations, animated: true)
Does anyone have an idea how to access the UIScroller class, which is the default subview of UIWebView?
I want to handle the touches, zooming, panning, and scrolling features inside the web view.
Thanks.
I know this thread is old but if anyone comes across it there's a new way.
As of iOS 5, UIWebView has a property called scrollView, which you can't replace, but whose properties you can set. Most people just want to disable zooming/bouncing/scrolling altogether, which can be done by setting the properties of the scrollView. For example, if webview is a UIWebView:
webview.scrollView.bounces = NO; //Disables webview from bouncing
webview.scrollView.minimumZoomScale = webview.scrollView.maximumZoomScale = 1.0; //Forces zoom to be at 1 (can be whatever you fancy) and disables zooming
webview.scrollView.bouncesZoom = NO; //Disables bouncing when zooming exceeds minimum or maximum zoom
I suppose you could set the delegate for the scrollView if you want more control, though to be on the safe side you might want to store the original delegate and call its methods appropriately in your custom delegate.
Handling the touches would be more difficult, since you can't replace the scrollView to provide your own handlers. The best you could do is add some gesture recognizers to the UIView and try to handle them there, but I think UIWebView will still receive the events. Alternatively, in iOS 5 you can access the gesture recognizers directly on the UIScrollView.
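For example, a small sketch (iOS 5+) that attaches an extra recognizer to the web view's scroll view; handleDoubleTap: is a placeholder selector for your own handler:
// Add your own recognizer alongside the scroll view's built-in ones.
UITapGestureRecognizer *doubleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;
[webview.scrollView addGestureRecognizer:doubleTap];

// The scroll view's existing recognizers (pan, pinch, etc.) can also be inspected:
for (UIGestureRecognizer *recognizer in webview.scrollView.gestureRecognizers) {
    NSLog(@"scrollView gesture recognizer: %@", recognizer);
}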
You can find it like this:
[[webView subviews] objectAtIndex:0]
That should be where it is. If not, put this in your code somewhere and run the program to find the index of the UIScroller, and replace the 0 above with that index:
for (UIView *subview in [webView subviews]) {
    NSLog(@"subviews of webView : %@", [[subview class] description]);
}