I made a custom iOS 8 keyboard, but I need to detect which app is using the keyboard.
For example, I need to know whether it's WhatsApp or just the normal Messages app so I can handle different logic.
I haven't found any related question on this. And frankly I don't think this is possible, or even allowed by Apple.
The closest other Q/A I could find was this.
I have read quite a bit on the internet and understood that it's not possible to use Bluetooth between two iOS devices in the background; it is only partially possible.
If the peripheral is in the background, its UUID is no longer visible. And if the central is in the background, it is in passive scanning mode, which means it can only scan for a specific UUID, and that UUID is not visible when the peripheral is in the background.
I checked the "new" Exposure Notification service from Apple and Google, and in that case I was able to see a 16-bit UUID with a custom ID. This also works really well in the background. Do you think there is any possibility of using this so that the iOS peripheral also works in the background? If it's possible to change the ID, or at least to read out which ID is currently being advertised, that would also be helpful.
What do you all think, is this possible?
I am developing a gtkmm application on Linux.
I was wondering if there is a way to provide audio feedback to the user when they execute some action?
I found a related post on audio feedback for gtkmm here, but it does not provide a proper solution.
There is libcanberra for triggering event sounds from the system's sound theme. I don't know if it is available from gtkmm.
If you want more flexibility to play music, etc., then you'll want gstreamermm.
I use OpenAL to play sound from a network stream in my NPAPI plugin (browser plugin). When I open more than one browser tab and then close one of them, the sound in the other tabs disappears too. Could anyone give me some help? Or can anyone give me some advice on playing streamed audio, such as audio from a mic or similar devices? When I use Audio Queue Services, it still cannot support multiple instances. For work reasons, I cannot paste my code here, sorry about that. I use OpenAL the same way as Apple's developer example, oalTouch:
https://developer.apple.com/library/ios/samplecode/oalTouch/Introduction/Intro.html
I use the system default device to play sound, which means I open the device with alcOpenDevice(NULL).
When a page with your plugin closes, the corresponding instance of your plugin is destroyed; there's one instance per page in which the plugin is simultaneously active in the browser. Presumably the problem is that your plugin instance's teardown is tearing down something global.
Without being able to see anything about what your code is doing, it's hard to see how anyone can help figure out what that thing is.
Now I have found the reason. OpenAL does support multiple audio sources, but since I used it just like the oalTouch example, I did not change anything. The reason the sound stopped when I closed one browser tab is that I was closing the device when destroying the plugin instance. So if I want to use it with multiple instances, I have to design it carefully, and this is my design: I use a static variable to track whether the default device is already open; if it is, I do not open it again (with alcOpenDevice(NULL)). For different tabs I use different source IDs (obtained with alGenSources(...)). When a browser tab is closed, its plugin instance is destroyed too; I use the static variable to check whether any plugin instance is still using the default audio device. If so, I do nothing; otherwise I close the device.
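That reference-counting design can be sketched as below. The OpenAL calls are stubbed out here with a boolean flag so the sharing logic stands on its own; in the real plugin, `openDefaultDevice` / `closeDefaultDevice` would wrap `alcOpenDevice(NULL)` / `alcCloseDevice`, and each instance would also create its own source with `alGenSources`.

```cpp
#include <cassert>
#include <cstddef>

// Stand-ins for the real OpenAL calls (alcOpenDevice / alcCloseDevice),
// stubbed here so the reference-counting logic can be shown on its own.
static bool g_deviceOpen = false;
static void openDefaultDevice()  { g_deviceOpen = true;  }  // alcOpenDevice(NULL)
static void closeDefaultDevice() { g_deviceOpen = false; }  // alcCloseDevice(dev)

class PluginInstance {
public:
    PluginInstance() {
        // Open the shared default device only for the first instance.
        if (s_instances == 0)
            openDefaultDevice();
        ++s_instances;
        // Each instance would also get its own source via alGenSources(1, &src).
    }
    ~PluginInstance() {
        // Close the device only when the last instance goes away.
        --s_instances;
        if (s_instances == 0)
            closeDefaultDevice();
    }
    static std::size_t instances() { return s_instances; }
private:
    static std::size_t s_instances;
};
std::size_t PluginInstance::s_instances = 0;
```

With this scheme, creating a second instance reuses the already-open device, and destroying one tab's instance leaves the sound in the other tabs untouched; only destroying the last instance closes the device.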
I've spent two days on this and have gotten nowhere. I'm trying to use [MPMusicPlayerController applicationMusicPlayer] to play audio chosen from the user's iPod library and have it run in the background as well as support remote control events. Getting the music actually playing is the easy part: get the instance, pick the songs, assign the music queue, and play. Done and done. BUT... a) I can't get it to play in the background, and b) even in the foreground I can't get the remote control events to work at all!
And before you ask: yes, I have set the plist entries, set the audio session category, made the call to say I'm interested in getting remote events, and set up a first responder to listen for them. So please know, yes, I've read every single document on the subject that I could find* (*a task I blame Apple for, for not being clear at all on this topic, nor having ANY example code for it!), and I've watched every one of the WWDC videos relating to it (even freezing the screen to copy the code exactly from their examples...), so unless I've missed something not in this list, replying with any of those answers is not going to help.
One more thing... I am explicitly talking about using MPMusicPlayerController, which according to the docs never uses an application session; it always uses the system session. (Maybe that in itself answers my question, but the docs don't clearly say so, hence this question.)
That said, after two days, my thoughts are this:
When using the MPMusicPlayerController, regardless of what methods you call or what plist entries you set, your app will never run in the background. Period. If you use the ipodMusicPlayer instance, the music keeps playing, but that's because it's the iPod that's playing, not your app. If you use the applicationMusicPlayer instance instead, when going to the background your music stops. In both cases, your app is suspended.
Regardless of whether you use the ipodMusicPlayer or applicationMusicPlayer instance, all remote events go to the iPod application itself, not yours, even if you've explicitly asked for them. If you are using the applicationMusicPlayer instance and you use the remote to select 'Play', the iPod app receives the command, so your audio ducks out and is interrupted and playback begins in the iPod app. If you've chosen the ipodMusicPlayer instead, then of course it doesn't matter, as you have explicitly said you're basically just interested in remotely controlling the iPod app, which again is what actually receives the remote events.
The icon in the quick-switch controls at the bottom never changes to your app's icon because again, your app is never actually set up to receive the events. The iPod application is, which is why its icon does appear there.
So what I want to know is... am I wrong here? Has anyone successfully been able to use MPMusicPlayerController and been able to intercept the remote events? While I'd prefer to use the applicationMusicPlayer with background music support so I don't muck with the user's iPod, the bigger thing is remote control notifications, meaning if I have to use the ipodMusicPlayer and keep my app in the foreground to intercept those messages, so be it. It's ugly that way, but at least it's something.
Code examples, or at least explicit steps against one of the built-in app templates would be GREATLY appreciated. (Don't even need the implementation... just the steps. Hopefully that will appease the inevitable 'It's still under NDA' thing that people keep answering questions with.)
Mark
I solved it. The info is in my other question over here...
Stack Overflow: Play iPod music while receiving remote control events
...but the short version is you have to use AVPlayer (but not AVAudioPlayer; no idea why that is!) with the asset URL from the MPMediaItem you got from the library, then set the audio session's category to Playback (do NOT enable mixing!) and add the appropriate keys to your Info.plist file telling the OS your app wants to support background audio.
This lets you play items from your iPod library (except Audible.com files, for some reason!) and still get remote events. Granted, you have to do more work, and since this is your own audio player it is separate from, and will interrupt, the iPod app (which may or may not be desirable; and again, don't enable mixing or the iPod app will hijack the remote control events), but those are the breaks!
For anyone who wants to know: I found out that to get audio playing in the background, you have to set the audio session's category to Playback, and then background audio works just fine. If you also want to play your own sounds at the same time, you have to mark the category as mixable. That solved the background music part. But what I've found is that any time the iPod is playing, it doesn't seem possible for you to get remote notifications.
Here's the updated thread...
How can you play music from the iPod app while still receiving remote control events in your app?
M
I want to send about 4 KB of data to a cellphone using Bluetooth.
First, to do this, I need to find out which stacks are on my phone and which stacks are active when I send data.
I have really struggled to find a way to do this; it is really hard.
If you know how to find it, please give me some information!
See http://32feet.NET if you are talking about Windows Mobile. It is a managed library for Bluetooth, OBEX, and IrDA. We support the Microsoft stack, but also Widcomm, and now Bluesoleil and Stonestreet One Bluetopia as well.
You can either send and receive the data as an OBEX message, or over a simple Bluetooth connection. See a copy of the user's guide at http://www.alanjmcf.me.uk/comms/bluetooth/32feet.NET%20—%20User’s%20Guide.html
Go to https://www.bluetooth.org/tpg/listings.cfm and type in the name of your mobile. This will give you information on the Bluetooth stack inside and the profiles supported.