In Objective-C I made a switch statement which I used to move around in my iPad app using UISplitViewController.
Now I want to do the same in Swift. I've tried for a couple of hours, but I haven't even been able to run the code because it gives some sort of compile error.
Here's what I've got in Objective-C:
-(void)initialSite:(int)viewId {
    UIViewController *viewController;
    switch (viewId) {
        case 0: {
            viewController = self.initital;
            NSString *star = [NSString stringWithFormat:@"Velkommen til %@'s Bog", [data valueForKey:@"navn"]];
            self.navigationItem.title = star;
            break;
        }
        case 1: {
            viewController = self.startSide;
            NSString *start = [NSString stringWithFormat:@"%@'s Bog, start-side", [data valueForKey:@"navn"]];
            self.navigationItem.title = start;
            break;
        }
    }
    [self showChildViewController:viewController];
}
And here's what I've come up with so far in Swift. I'm still new to this, and it's a little hard to understand even though I have The Swift Programming Language book:
let viewController = UIViewController()
switch viewController {
case "initial":
    let initial : UIStoryboard = UIStoryboard(name: "Main", bundle: nil)
    let vc0 : UIViewController = initial.instantiateViewControllerWithIdentifier("initial") as UIViewController
    self.presentViewController(vc0, animated: true, completion: nil)
    let rowData: NSDictionary = self.menuItemArray[indexPath.row] as NSDictionary!
    self.navigation.title = rowData["navn"] as? String
case "startSide":
    let startSide : UIStoryboard = UIStoryboard(name: "Main", bundle: nil)
    let vc1 : UIViewController = startSide.instantiateViewControllerWithIdentifier("startSide") as UIViewController
    let rowData: NSDictionary = self.manuItemArray[indexPath.row] as NSDictionary!
    self.presentViewController(vc1, animated: true, completion: nil)
    self.navigation.title = rowData["navn"] as? String
default:
}
The error is "Expected declaration" at the line with:
let viewController = UIViewController()
Let's start with your Obj-C implementation:
-(void)initialSite:(int)viewId
{
    UIViewController *viewController;
    switch (viewId)
    {
        case 0:
        {
            viewController = self.initital;
            NSString *star = [NSString stringWithFormat:@"Velkommen til %@'s Bog", [data valueForKey:@"navn"]];
            self.navigationItem.title = star;
        }
        break;
        case 1:
        {
            viewController = self.startSide;
            NSString *start = [NSString stringWithFormat:@"%@'s Bog, start-side", [data valueForKey:@"navn"]];
            self.navigationItem.title = start;
        }
        break;
    }
    [self showChildViewController:viewController];
}
Now this same snippet in Swift:
func initialSite(viewID: Int)
{
    var viewController : UIViewController?
    switch (viewID)
    {
    case 0:
        viewController = self.initial
        let navn = self.data["navn"] as? String
        let star = "Velkommen til \(navn)'s Bog"
        self.navigationItem.title = star
    case 1:
        viewController = self.startSide
        let navn = self.data["navn"] as? String
        let star = "\(navn)'s Bog, start-side"
        self.navigationItem.title = star
    default:
        viewController = nil
        // do nothing
    }
    self.showChildViewController(viewController)
}
The main thing you have to remember is the difference between var and let. Typically you use let to declare things unless their value will change later, in which case you use var.
The other thing is the use of optionals, with the ? suffix. An optional may be nil (unset); otherwise it must contain a value.
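For example, here is a minimal sketch of those two rules (the names are just for illustration):

let bookTitle = "Bog"        // constant: cannot be reassigned later
var currentPage = 1          // variable: can be changed later
currentPage = 2

var navn: String?            // optional: starts out nil (unset)
navn = "Anna"

// Unwrap the optional before using it, so you never read a nil value
if let unwrappedNavn = navn {
    print("Velkommen til \(unwrappedNavn)'s Bog")
}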
Looks like SiLo beat me to it. Anyway I have my solution so I will post it. This is how I would do it:
func initialSite(viewId: Int) -> () {
    var viewController: UIViewController?
    let dataValue = data["navn"]
    var start: String?
    switch viewId {
    case 1:
        viewController = self.initital
        start = "Velkommen til \(dataValue)'s Bog"
    case 2:
        viewController = self.startSide
        start = "\(dataValue)'s Bog, start-side"
    default:
        break
    }
    self.navigationItem.title = start!
    showChildViewController(viewController!)
}
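A hedged usage note: assuming this method lives on the detail side of your split view controller and you drive it from the master table's selection, the call site might look something like this (detailViewController and the row-to-case mapping are assumptions, not part of the question):

// e.g. in the master view controller (Swift 1/2-era table view delegate)
func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {
    // row 0 -> "initial", row 1 -> "startSide" in the 0/1 version above
    detailViewController?.initialSite(indexPath.row)
}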
Related
I've read that we can now play custom sounds on the Apple Watch in watchOS 3.
According to the announcement from Apple there apparently is a way, but I don't have an example to test it out: "3D spatial audio implemented using SCNAudioSource or SCNAudioPlayer is not supported. Instead, use playAudioSource:waitForCompletion: or the WatchKit sound or haptic APIs." Found here: https://developer.apple.com/library/prerelease/content/releasenotes/General/WhatsNewInwatchOS/Articles/watchOS3.html
Can someone post a simple example of this? I'm not using SceneKit in my app as I don't need it, but if that's the only way to play a custom sound then I'd like to know the minimum code required to accomplish this. Preferably in Objective-C, but I'll take it in whatever shape. I'm OK with using SpriteKit if that's easier, too.
Here's what I have so far, but it doesn't work:
SCNNode *audioNode = [[SCNNode alloc] init];
SCNAudioSource *audioSource = [SCNAudioSource audioSourceNamed:@"mysound.mp3"];
SCNAudioPlayer *audioPlayer = [SCNAudioPlayer audioPlayerWithSource:audioSource];
[audioNode addAudioPlayer:audioPlayer];
SCNAction *play = [SCNAction playAudioSource:audioSource waitForCompletion:YES];
[audioNode runAction:play];
I can confirm that @ApperleyA's solution really works!
Here is the Swift version:
var _audioPlayer : AVAudioPlayerNode!
var _audioEngine : AVAudioEngine!

func playAudio()
{
    if _audioPlayer == nil {
        _audioPlayer = AVAudioPlayerNode()
        _audioEngine = AVAudioEngine()
        _audioEngine.attach(_audioPlayer)

        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)
        _audioEngine.connect(_audioPlayer, to: _audioEngine.mainMixerNode, format: stereoFormat)

        do {
            if !_audioEngine.isRunning {
                try _audioEngine.start()
            }
        } catch {}
    }

    if let path = Bundle.main.path(forResource: "test", ofType: "mp3") {
        let fileUrl = URL(fileURLWithPath: path)
        do {
            let asset = try AVAudioFile(forReading: fileUrl)
            _audioPlayer.scheduleFile(asset, at: nil, completionHandler: nil)
            _audioPlayer.play()
        } catch {
            print("asset error")
        }
    }
}
This is Objective-C but it can be translated into Swift.
I ended up using AVAudioEngine and AVAudioPlayerNode to play audio on the Apple Watch.
The gist of how to do this is as follows:
I call the following inside the init method of my AudioPlayer (it's an NSObject subclass that encapsulates the functionality):
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];

AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];

if (!_audioEngine.isRunning) {
    NSError *error;
    [_audioEngine startAndReturnError:&error];
}
I have a cache setup so I don't recreate the AVAudioFile assets every time I want to play a sound but you don't need to.
So next create an AVAudioFile object:
NSError *error;
NSBundle *appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:key ofType:@"aifc"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];
Then play that file:
[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];
UPDATE: If the app goes to sleep or is put into the background, there is a chance the audio will stop playing or fade out. Activating an audio session will prevent this.
NSError *error;

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
    NSLog(@"AVAudioSession setCategory ERROR: %@", error.localizedDescription);
}

[[AVAudioSession sharedInstance] setActive:YES error:&error];
if (error) {
    NSLog(@"AVAudioSession setActive ERROR: %@", error.localizedDescription);
}
I didn't go over handling any errors but this should work. Don't forget to #import <AVFoundation/AVFoundation.h> at the top of your implementation file.
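For anyone following the Swift version above, a minimal sketch of the same session activation (assuming the same Swift 3-era AVFoundation APIs used in that answer) might look like this:

import AVFoundation

// Activate a playback audio session so the audio keeps playing
// when the app sleeps or goes to the background.
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("AVAudioSession error: \(error.localizedDescription)")
}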
This worked for me in the simulator
let soundPath = Bundle.main.path(forResource: "cheerMP3", ofType: "mp3")
let soundPathURL = URL(fileURLWithPath: soundPath!)
let audioFile = WKAudioFileAsset(url: soundPathURL)
let audioItem = WKAudioFilePlayerItem(asset: audioFile)
let audioPlayer = WKAudioFilePlayer.init(playerItem: audioItem)
if audioPlayer.status == .readyToPlay
{
audioPlayer.play()
}
else
{
print("Not ready!!")
}
but only if I had a breakpoint at both audioPlayer.play() and after the last }.
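That breakpoint-only behaviour usually points to a timing/lifetime issue rather than to the API itself: the player item's status becomes .readyToPlay asynchronously, and a locally scoped player can go away before it ever starts. A hedged sketch (the property and method names are illustrative) that keeps a strong reference and defers the status check:

import WatchKit

class InterfaceController: WKInterfaceController {
    // Keep a strong reference so the player isn't deallocated
    // as soon as the method returns.
    private var audioPlayer: WKAudioFilePlayer?

    func playCheer() {
        guard let soundPath = Bundle.main.path(forResource: "cheerMP3", ofType: "mp3") else { return }
        let asset = WKAudioFileAsset(url: URL(fileURLWithPath: soundPath))
        let item = WKAudioFilePlayerItem(asset: asset)
        let player = WKAudioFilePlayer(playerItem: item)
        audioPlayer = player

        // status is populated asynchronously; checking it immediately
        // after creating the player will usually report "not ready".
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
            if player.status == .readyToPlay {
                player.play()
            } else {
                print("Still not ready")
            }
        }
    }
}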
dunqan, what did you put at the top of the file, the import statements? I wasn't able to include
import AVFoundation
without an error using Xcode 8.2.1
My code worked fine from iOS 7 to 8. With the update yesterday the custom images on my pins were replaced by the standard pin image.
Any suggestions?
My code:
extension ViewController: MKMapViewDelegate {
    func mapView(mapView: MKMapView, viewForAnnotation annotation: MKAnnotation) -> MKAnnotationView! {
        if annotation is MKUserLocation {
            return nil
        }
        let reuseId = String(stringInterpolationSegment: annotation.coordinate.longitude)
        var pinView = mapView.dequeueReusableAnnotationViewWithIdentifier(reuseId) as? MKPinAnnotationView
        if pinView == nil {
            pinView = MKPinAnnotationView(annotation: annotation, reuseIdentifier: reuseId)
            pinView!.canShowCallout = true
            pinView!.image = getRightImage(annotation.title!!)
        }
        let button = UIButton(type: UIButtonType.DetailDisclosure)
        pinView?.rightCalloutAccessoryView = button
        return pinView
    }
}
The function to get the image returns a UIImage based on the name:
func getRightImage(shopName: String) -> UIImage {
    var correctImage = UIImage()
    switch shopName {
    case "Kaisers":
        correctImage = UIImage(named: "Kaisers.jpg")!
    default:
        correctImage = UIImage(named: "sopiconsmall.png")!
    }
    return correctImage
}
Now the map looks like this:
Instead of creating an MKPinAnnotationView, create a plain MKAnnotationView.
The MKPinAnnotationView subclass tends to ignore the image property since it's designed to show the standard red, green, purple pins only (via the pinColor property).
When you switch to MKAnnotationView, you'll have to comment out the animatesDrop line as well since that property is specific to MKPinAnnotationView.
The following code works perfectly on all iOS 6 to iOS 9 devices:
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation
{
    // create a proper annotation view, be lazy and don't use the reuse identifier
    MKAnnotationView *view = [[MKAnnotationView alloc] initWithAnnotation:annotation
                                                           reuseIdentifier:@"identifier"];

    // create a disclosure button for map kit
    UIButton *disclosure = [UIButton buttonWithType:UIButtonTypeContactAdd];
    [disclosure addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self
                                                                              action:@selector(disclosureTapped)]];
    view.rightCalloutAccessoryView = disclosure;
    view.enabled = YES;
    view.image = [UIImage imageNamed:@"map_pin"];
    return view;
}
For Swift 4
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
    if annotation is MKUserLocation {
        return nil
    }
    let reuseId = String(stringInterpolationSegment: annotation.coordinate.longitude)
    var pinView = mapView.dequeueReusableAnnotationView(withIdentifier: reuseId)
    if pinView == nil {
        pinView = MKAnnotationView(annotation: annotation, reuseIdentifier: reuseId)
        pinView!.canShowCallout = true
        pinView!.image = getRightImage(annotation.title!!)
    }
    let button = UIButton(type: UIButtonType.detailDisclosure)
    pinView?.rightCalloutAccessoryView = button
    return pinView
}
I have a UIWebView that loads text from an HTML string.
When the user selects part of the text and presses a button, I need to be able to extract the selection so I can use it elsewhere. I am using this code:
// The JS File
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"HighlightedString" ofType:@"js" inDirectory:@""];
NSData *fileData = [NSData dataWithContentsOfFile:filePath];
NSString *jsString = [[NSMutableString alloc] initWithData:fileData encoding:NSUTF8StringEncoding];
[WebV2 stringByEvaluatingJavaScriptFromString:jsString];

// The JS Function
NSString *startSearch = [NSString stringWithFormat:@"getHighlightedString()"];
[WebV2 stringByEvaluatingJavaScriptFromString:startSearch];

NSString *selectedText = [NSString stringWithFormat:@"selectedText"];
NSString *highlightedString = [WebV2 stringByEvaluatingJavaScriptFromString:selectedText];

UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Highlighted String"
                                                message:highlightedString
                                               delegate:nil
                                      cancelButtonTitle:@"Oh Yeah"
                                      otherButtonTitles:nil];
[alert show];
Along with HighlightedString.js:
/*!
------------------------------------------------------------------------
// Search Highlighted String
------------------------------------------------------------------------
*/
var selectedText = "";
function getHighlightedString() {
var text = window.getSelection();
selectedText = text.anchorNode.textContent.substr(text.anchorOffset, text.focusOffset - text.anchorOffset);
}
// ...
function stylizeHighlightedString() {
var range = window.getSelection().getRangeAt(0);
var selectionContents = range.extractContents();
var span = document.createElement("span");
span.appendChild(selectionContents);
span.setAttribute("class","uiWebviewHighlight");
span.style.backgroundColor = "black";
span.style.color = "white";
range.insertNode(span);
}
// helper function, recursively removes the highlights in elements and their children
function uiWebview_RemoveAllHighlightsForElement(element) {
if (element) {
if (element.nodeType == 1) {
if (element.getAttribute("class") == "uiWebviewHighlight") {
var text = element.removeChild(element.firstChild);
element.parentNode.insertBefore(text,element);
element.parentNode.removeChild(element);
return true;
} else {
var normalize = false;
for (var i=element.childNodes.length-1; i>=0; i--) {
if (uiWebview_RemoveAllHighlightsForElement(element.childNodes[i])) {
normalize = true;
}
}
if (normalize) {
element.normalize();
}
}
}
}
return false;
}
// the main entry point to remove the highlights
function uiWebview_RemoveAllHighlights() {
selectedText = "";
uiWebview_RemoveAllHighlightsForElement(document.body);
}
I always get nothing as a result; the alert view shows nothing. What's the problem with this code? Any help or ideas will be really appreciated.
The solution was actually pretty simple, and there's no need for all the above code!
Any future users can just use:
NSString *textToSpeech = [WebV2 stringByEvaluatingJavaScriptFromString:@"window.getSelection().toString()"];
NSLog(@" -**-*--****-*---**--*-* This is the new select text %@", [WebV2 stringByEvaluatingJavaScriptFromString:@"window.getSelection().toString()"]);

NSString *theSelectedText = [self.webView stringByEvaluatingJavaScriptFromString:@"window.getSelection().toString()"];
This will pass your selection to the string variable.
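For reference, a minimal Swift sketch of the same trick on a UIWebView (webView here is an assumed outlet):

// Grabs the current selection from the page as plain text.
let selectedText = webView.stringByEvaluatingJavaScript(from: "window.getSelection().toString()") ?? ""
print("Selected text: \(selectedText)")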
I am working with MapKit on SDK 4.2 and I'm having a strange bug here. I have 3 annotation types: blue.png, red.png, and black.png. I load these from a feed and, depending on the type, select the matching annotation image. Everything works fine when the map is loaded and I get the different annotation views, but when I move, zoom in, or zoom out, the annotation views change, i.e. where it was supposed to be blue.png it becomes black.png.
I am actually testing it on a device.
Thank you very much :)
Hey veer, the problem is that this method is called again if the user pans the map to view another location and then comes back to the place where the annotations are plotted:
- (MKAnnotationView *)mapView:(MKMapView *)mapview viewForAnnotation:(id <MKAnnotation>)annotation
I have seen many code samples for map applications, and this is what most people are using:
- (MKAnnotationView *)mapView:(MKMapView *)mapview viewForAnnotation:(id <MKAnnotation>)annotation
{
    if ([annotation isKindOfClass:[MKUserLocation class]])
        return nil;

    static NSString *AnnotationIdentifier = @"AnnotationIdentifier";
    MKAnnotationView *annotationView = [mapview dequeueReusableAnnotationViewWithIdentifier:AnnotationIdentifier];

    if (annotationView)
        return annotationView;
    else
    {
        MKPinAnnotationView *pinView = [[[MKPinAnnotationView alloc]
            initWithAnnotation:annotation reuseIdentifier:AnnotationIdentifier] autorelease];
        pinView.animatesDrop = YES;
        pinView.canShowCallout = YES;
        pinView.draggable = YES;
        pinView.pinColor = MKPinAnnotationColorGreen;
        return pinView;
    }
    return nil;
}
I found the solution. In fact, I am using a custom annotation view with 3 different types of images.
Solution:
- (AnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation
{
    AnnotationView *annotationView = nil;

    // determine the type of annotation, and produce the correct type of annotation view for it.
    AnnotationDetails *myAnnotation = (AnnotationDetails *)annotation;

    if (myAnnotation.annotationType == AnnotationTypeGeo)
    {
        // annotation for your current position
        NSString *identifier = @"geo";
        AnnotationView *newAnnotationView = (AnnotationView *)[self.mapView dequeueReusableAnnotationViewWithIdentifier:identifier];
        if (nil == newAnnotationView)
        {
            newAnnotationView = [[[AnnotationView alloc] initWithAnnotation:myAnnotation reuseIdentifier:identifier] autorelease];
        }
        annotationView = newAnnotationView;
    }
    else if (myAnnotation.annotationType == AnnotationTypeMyfriends)
    {
        NSString *identifier = @"friends";
        AnnotationView *newAnnotationView = (AnnotationView *)[self.mapView dequeueReusableAnnotationViewWithIdentifier:identifier];
        if (nil == newAnnotationView)
        {
            newAnnotationView = [[[AnnotationView alloc] initWithAnnotation:myAnnotation reuseIdentifier:identifier] autorelease];
        }
        annotationView = newAnnotationView;
    }

    // don't forget to return the view, otherwise no annotation is shown
    return annotationView;
}
I am trying to capture a tap event on my MKMapView so I can drop a pin annotation at the point where the user tapped. Basically, I have a map overlaid with MKOverlayViews (an overlay showing a building), and I would like to give the user more information about that overlay when they tap on it by dropping a pin annotation and showing more information in the callout.
Thank you.
You can use a UIGestureRecognizer to detect touches on the map view.
Instead of a single tap, however, I would suggest looking for a double tap (UITapGestureRecognizer) or a long press (UILongPressGestureRecognizer). A single tap might interfere with the user trying to single tap on the pin or callout itself.
In the place where you setup the map view (in viewDidLoad for example), attach the gesture recognizer to the map view:
UITapGestureRecognizer *tgr = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleGesture:)];
tgr.numberOfTapsRequired = 2;
tgr.numberOfTouchesRequired = 1;
[mapView addGestureRecognizer:tgr];
[tgr release];
or to use a long press:
UILongPressGestureRecognizer *lpgr = [[UILongPressGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleGesture:)];
lpgr.minimumPressDuration = 2.0; // user must press for 2 seconds
[mapView addGestureRecognizer:lpgr];
[lpgr release];
In the handleGesture: method:
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state != UIGestureRecognizerStateEnded)
        return;

    CGPoint touchPoint = [gestureRecognizer locationInView:mapView];
    CLLocationCoordinate2D touchMapCoordinate =
        [mapView convertPoint:touchPoint toCoordinateFromView:mapView];

    MKPointAnnotation *pa = [[MKPointAnnotation alloc] init];
    pa.coordinate = touchMapCoordinate;
    pa.title = @"Hello";
    [mapView addAnnotation:pa];
    [pa release];
}
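Since much of this thread asks for Swift, here is a hedged sketch of the same long-press approach inside a view controller (mapView is an assumed outlet; the 2-second duration mirrors the Objective-C example above):

import MapKit
import UIKit

// Attach the recognizer where you configure the map view, e.g. in viewDidLoad().
func addLongPressRecognizer() {
    let recognizer = UILongPressGestureRecognizer(target: self, action: #selector(handleGesture(_:)))
    recognizer.minimumPressDuration = 2.0   // user must press for 2 seconds
    mapView.addGestureRecognizer(recognizer)
}

@objc func handleGesture(_ gestureRecognizer: UIGestureRecognizer) {
    guard gestureRecognizer.state == .ended else { return }

    // Convert the touch point to a map coordinate and drop a pin there.
    let touchPoint = gestureRecognizer.location(in: mapView)
    let coordinate = mapView.convert(touchPoint, toCoordinateFrom: mapView)

    let annotation = MKPointAnnotation()
    annotation.coordinate = coordinate
    annotation.title = "Hello"
    mapView.addAnnotation(annotation)
}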
I set up a long press (UILongPressGestureRecognizer) in viewDidLoad:, but it only detects the first touch.
Where should I set up the long press so it detects every touch? (I mean the map should always be ready, waiting for the user to touch the screen to drop a pin.)
The viewDidLoad: method:
- (void)viewDidLoad {
    [super viewDidLoad];
    mapView.mapType = MKMapTypeStandard;

    UILongPressGestureRecognizer *longPressGesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPressGesture:)];
    [self.mapView addGestureRecognizer:longPressGesture];
    [longPressGesture release];

    mapAnnotations = [[NSMutableArray alloc] init];
    MyLocation *location = [[MyLocation alloc] init];
    [mapAnnotations addObject:location];

    [self gotoLocation];
    [self.mapView addAnnotations:self.mapAnnotations];
}
and the handleLongPressGesture method:
-(void)handleLongPressGesture:(UIGestureRecognizer *)sender {
    // This is important if you only want to receive one tap and hold event
    if (sender.state == UIGestureRecognizerStateEnded)
    {
        NSLog(@"Released!");
        [self.mapView removeGestureRecognizer:sender];
    }
    else
    {
        // Here we get the CGPoint for the touch and convert it to latitude and longitude coordinates to display on the map
        CGPoint point = [sender locationInView:self.mapView];
        CLLocationCoordinate2D locCoord = [self.mapView convertPoint:point toCoordinateFromView:self.mapView];

        // Then all you have to do is create the annotation and add it to the map
        MyLocation *dropPin = [[MyLocation alloc] init];
        dropPin.latitude = [NSNumber numberWithDouble:locCoord.latitude];
        dropPin.longitude = [NSNumber numberWithDouble:locCoord.longitude];
        // [self.mapView addAnnotation:dropPin];
        [mapAnnotations addObject:dropPin];
        [dropPin release];

        NSLog(@"Hold!!");
        NSLog(@"Count: %d", [mapAnnotations count]);
    }
}
If you want to use a single click/tap in the map view, here's a snippet of code I'm using. (Cocoa and Swift)
let gr = NSClickGestureRecognizer(target: self, action: "createPoint:")
gr.numberOfClicksRequired = 1
gr.delaysPrimaryMouseButtonEvents = false // allows +/- button press
gr.delegate = self
map.addGestureRecognizer(gr)
In the gesture delegate method, a simple test to prefer the double-click gesture:
func gestureRecognizer(gestureRecognizer: NSGestureRecognizer, shouldRequireFailureOfGestureRecognizer otherGestureRecognizer: NSGestureRecognizer) -> Bool {
    let other = otherGestureRecognizer as? NSClickGestureRecognizer
    if (other?.numberOfClicksRequired > 1) {
        return true  // allows double click
    }
    return false
}
You could also filter the gesture in other delegate methods if you wanted the map to be in various "states", only one of which allows the single tap/click; a sketch of that idea follows.
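A minimal sketch of that state filter, written with modern Swift selector syntax and assuming a hypothetical isPinningEnabled flag on the controller:

// Hypothetical state flag; only when it is true does the single-click
// gesture get a chance to begin.
var isPinningEnabled = false

func gestureRecognizerShouldBegin(_ gestureRecognizer: NSGestureRecognizer) -> Bool {
    if gestureRecognizer is NSClickGestureRecognizer {
        return isPinningEnabled
    }
    return true
}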
For some reason, the UIGestureRecognizer approach just didn't work for me in Swift. When I used the touchesEnded method, it returned an MKNewAnnotationContainerView, and it seems this MKNewAnnotationContainerView blocked my MKMapView. Fortunately, it's a subview of MKMapView, so I looped through MKNewAnnotationContainerView's superviews up to self.view to get the MKMapView, and I managed to drop a pin on the map by tapping.
Swift 4.1
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    let t = touches.first
    print(t?.location(in: self.view) as Any)
    print(t?.view?.superview?.superview.self as Any)
    print(mapView.self as Any)

    var tempView = t?.view
    while tempView != self.view {
        if tempView != mapView {
            tempView = tempView?.superview!
        } else if tempView == mapView {
            break
        }
    }

    let convertedCoor = mapView.convert((t?.location(in: mapView))!, toCoordinateFrom: mapView)
    let pin = MKPointAnnotation()
    pin.coordinate = convertedCoor
    mapView.addAnnotation(pin)
}