Create MediaSource from AudioPlaybackConnection

I'm trying to make my Windows computer a valid output for Bluetooth audio from my phone. Enabling the actual audio was easy enough using the WinRT AudioPlaybackConnection, but now I'm trying to get metadata working and running into dead ends in the Windows UWP documentation. I'm familiar with the MediaPlayer class, but I can't see how to set its source to the AudioPlaybackConnection. My next thought was to create a MediaPlayer and handle the controls/metadata myself, but I can't see how to access the metadata for the AudioPlaybackConnection either.

I also tried getting the BluetoothDevice matching the same phone, since the properties for the actual device list AVRCP Transport and A2DP SNK as two separate hardware "devices" making up the phone, but I have no more luck accessing metadata through the BluetoothDevice. I know Windows 10 supports Bluetooth's AVRCP and can handle metadata/controls (source), but I'm beginning to think it's exposed under a different device in WinRT, and I don't have the WinRT know-how to track it down.
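For reference, the audio-only path that already works looks roughly like this (a minimal C++/WinRT sketch; error handling is omitted and the first matching device is picked blindly):

    // Minimal sketch: find the phone's A2DP source endpoint and open an
    // AudioPlaybackConnection so the PC acts as the audio sink. No metadata here.
    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.Devices.Enumeration.h>
    #include <winrt/Windows.Media.Audio.h>

    using namespace winrt;
    using namespace Windows::Foundation;
    using namespace Windows::Devices::Enumeration;
    using namespace Windows::Media::Audio;

    IAsyncAction StartPlaybackConnectionAsync()
    {
        // Enumerate devices that can act as an AudioPlaybackConnection source.
        auto selector = AudioPlaybackConnection::GetDeviceSelector();
        auto devices = co_await DeviceInformation::FindAllAsync(selector);
        if (devices.Size() == 0)
            co_return;

        // Keep the connection object alive for as long as audio should flow.
        auto connection = AudioPlaybackConnection::TryCreateFromId(devices.GetAt(0).Id());
        co_await connection.StartAsync();
        auto result = co_await connection.OpenAsync();
        // result.Status() reports success or failure; nothing on this object
        // exposes track metadata or anything usable as a MediaSource.
    }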

I've consulted the Bluetooth team about this, and control like this is not supported in Windows at this time. You could submit a feature request in the Feedback Hub; please select Developer Platform -> API Feedback as the category when you submit it, and the related team will review the request.

Related

Bluetooth data to HID for BLED112

We have bought a BLED112 to interface with our target over Bluetooth.
An Android app interacts with the target via Bluetooth and USB (HID).
We have written a program that uses Bluetooth communication to send data to the dongle.
Now, does anybody here have experience converting that Bluetooth data into a HID signal?
Has anybody tried that?
Is there any BGScript code which we need to write to achieve that?
Please let me know if the idea is completely wrong.
Referring to a comment above, which states:
We are writing an Android app which can send data to the BLED112 over the BLE/GATT interface. My question is how I can convert that data (basically a command) into an HID event (a key event); correct me if my understanding is wrong.
If I understand the use case correctly, then in the initial stages of development you will need to use the BLE-GUI utility that BlueGiga provides.
With that utility you can see the communication between the BLED112 dongle and the BLE112 module; the BLED112 will be simulating what the Android app would do.
First, you will need to know the GATT structure stored in the BLED112 in order to write to it or read from it.
Second, the BLE112 works in an event-driven way. Going through the API reference document for the BLE112 will help you understand which events are generated, and under what conditions, when a characteristic value is updated or read by the Android application. You get events for connection, disconnection, reads, writes, notifications being enabled, indications being enabled, etc.
On the BLE112 side, you need to write a suitable implementation in the event callback handlers, depending on which service, and which characteristics within that service, will be used for data transfer between the client (the Android app) and the server (the BLE112).
There is a standard service called Human Interface Device, which has the reserved UUID 0x1812.
Once you configure your BLE112 as a HID-over-GATT device, your Android app will see a service with UUID 0x1812. Parse the service descriptor and get the characteristics bundled into the service. You can read from or write to those characteristics depending on the access parameters set in gatt.xml.
As an example, if it is a keyboard, you can send the scancode (make and break) for whichever key is pressed. How to get a scancode is out of the scope of this question anyway; sadly I have only worked on PS/2 keyboards, so I don't really know how to get the scancode from a USB keyboard.
So, you have the scancode for the pressed key, and you know the characteristic to write it into. Write it; the application should enable notifications for that characteristic, so that it is notified whenever a key is pressed and the value is written into the characteristic. To let the application enable notifications or indications for a characteristic, study the developer guide that describes how to write a gatt.xml for BlueGiga-based BLE devices. I'll give you a hint: in the XML, the characteristic's configuration has to include notify="true".
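A rough sketch of what such a gatt.xml entry could look like (inside the <configuration> root; 0x1812 is the standard HID service UUID and 0x2A4D the Report characteristic; the id and value length are placeholders for your own design):

    <!-- Sketch only: HID-over-GATT service with a notifiable Report characteristic. -->
    <service uuid="1812">
        <description>Human Interface Device</description>
        <characteristic uuid="2a4d" id="hid_report">
            <properties read="true" notify="true" />
            <value length="8" />
        </characteristic>
    </service>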
As for parsing the service and characteristics in Android: unfortunately I am not an Android developer but an embedded developer, so I know how the BLE112 module side should be implemented but have no insight into how Android parses the data. There are plenty of questions and discussions about it online, which you might understand better than I do since you have an Android background.

iOS BLE - How to keep app active in the background?

I am trying to find a clever way to keep a BLE app active in the background on iOS 6, without breaking any of Apple's rules. I plan to use the phone as a peripheral device and another BLE circuit as the central. My app will automatically be opened when a user arrives at a building, using geofencing. After that, the iPhone will connect to the first BLE central device it sees (the device will be in its white list). The user will then be able to move throughout the building, switching to different BLE "nodes".
My question is: What do I need to do in the background when a user is stationary at their desk so that the app does not get suspended due to memory resources?
My idea is based on this solution for a separate problem: there could potentially (not regularly) be 10-50 users in an area with only a few BLE "nodes", and I read at bluetooth.org that I could set up a dynamic connection system, basically rotating connections through all the users.
My idea is to set up a similar dynamic system where the central device (not the iPhone) disconnects the device at regular intervals (30-40 minutes) and the iPhone then reconnects.
Is this feasible? Is it against the iOS development guidelines? I was unable to find anything explicit about this. I have also asked on the iOS developer forum, but unfortunately it is not as popular as this site.
Thanks in advance!
Xcode -> Project target -> Capabilities -> Enable background mode
Check Uses Bluetooth LE Accessories
Also add the following key to the Info.plist file:
Required background modes
App communicates using CoreBluetooth
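For reference, the raw Info.plist entry behind those Xcode labels is the UIBackgroundModes array; bluetooth-central is the value displayed as "App communicates using CoreBluetooth" (add bluetooth-peripheral as well if the app advertises as the peripheral):

    <key>UIBackgroundModes</key>
    <array>
        <string>bluetooth-central</string>
    </array>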

Record audio from various internal devices in Android (via undocumented API)

I was wondering whether it is possible to capture audio data from other sources like the system output, FM radio, Bluetooth headset, etc. I'm particularly interested in capturing audio from the FM radio and have already investigated all possibilities, including trying to sniff the raw Bluetooth communication between the phone and the radio device, with no luck. It's too bad Android only allows recording audio from the MIC.
I've looked at the Android source code and couldn't find a backdoor to allow me to do that without rooting the device. Do you at least have any idea how to use other devices (maybe by somehow accessing /dev/audio), say via the NDK or, even better, Java (maybe reflection?), to trick the system into capturing the audio stream from, say, the FM radio? (In my case I'm trying to develop the app for the HTC Desire.)
PS: For those of you who are against using undocumented APIs, please don't post here. I'm writing an app that will be for my personal use, and even if I ever publish it I will warn the user of possible incompatibilities.
I've spent quite some time deciphering the audio stack, and I think you could try hijacking libaudio. You'll have trouble speaking directly to the hardware (/dev/*) because many devices use proprietary audio drivers; there's no rule in this regard.
However, the audio hardware abstraction layer (HAL) provided by /system/lib/libaudio.so should expose the API described at http://source.android.com/porting/audio.html
The Android system, and especially audioflinger, uses this libaudio HAL to find available devices, deal with routing, and of course to read/write PCM data.
So, you could hijack the interaction between audioflinger and libaudio by renaming the latter and providing your own libaudio which decorates the real one. Doing so, you should be able to log what happens and quite possibly intercept the FM radio output, provided that this is not handled directly by the hardware.
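As a rough sketch of that decoration, and assuming the old libaudio HAL exports an extern "C" factory named createAudioHardware (verify the actual exports on your ROM; the renamed path is also just an example), the shim could look like this:

    // Hypothetical interposer built as the replacement /system/lib/libaudio.so.
    // It forwards the HAL factory call to the renamed original library and logs it.
    #include <dlfcn.h>
    #include <android/log.h>

    extern "C" void* createAudioHardware()
    {
        // Load the renamed original HAL once.
        static void* real = dlopen("/system/lib/libaudio-orig.so", RTLD_NOW);
        if (!real) {
            __android_log_print(ANDROID_LOG_ERROR, "audioshim", "failed to load original libaudio");
            return nullptr;
        }
        using factory_t = void* (*)();
        auto realFactory = reinterpret_cast<factory_t>(dlsym(real, "createAudioHardware"));
        __android_log_print(ANDROID_LOG_DEBUG, "audioshim", "createAudioHardware intercepted");
        return realFactory ? realFactory() : nullptr;
    }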
Of course, all this requires rooting. Please comment if you manage to do this; it interests me.

Is there a way to enumerate the video devices on a Java ME phone?

I recently downloaded a barcode reading application for my phone, an LG KU990i (AKA the Viewty). However, there's a problem that renders the application nearly useless: the Viewty has two cameras -- the main one, and a secondary camera located on the face of the unit -- and it is the secondary camera that is unfortunately set as the phone's default video capture device. As you can't point the secondary camera at anything and see what it's pointing at at the same time, it makes it a bit difficult to snap a barcode!
According to the JSR-135 spec, it is possible to specify a video capture device other than the default... if you know the device name. This does not appear to be documented anywhere on LG's Web site, nor does the JSR-135 spec describe any way of enumerating the devices on a phone... or is there? Failing that, are there any naming conventions for video devices commonly in use that LG might be using?
I've logged a ticket with LG, but as it's an old device, I don't imagine them breaking their backs in getting back to me... I should also point out that this is purely for my own curiosity so no-one here should feel obliged to break their backs either!
As far as I know there is no way to get a list of all available capture:// URLs.
All the URLs I know of:
capture://image
capture://video
capture://devcam0
capture://devcam1
Source:
http://www.forum.nokia.com/info/sw.nokia.com/id/bc00e4ce-7df3-4527-962c-d39843a808d0/MIDP_Mobile_Media_API_Support_In_Nokia_Devices_v1_0_en.pdf.html
LG responded to my support ticket. Apparently, it's not possible to access the primary camera on the Viewty from Java, making it pretty much useless for barcode scanning. Answer reproduced here for search engines.
Your support ticket has been answered. Please visit the LG Mobile Developer Network and log in to check the answer at [My Page > My Tickets].
KU990i default video capture device is the secondary camera
Answer:
Hi,
The KU990i has two camera modules, which are handled differently.
The main camera uses a Joran chipset and the sub (front) camera uses a Qualcomm chipset.
The Joran chip doesn't support JSR-135, so we couldn't support JSR-135 for the main camera (it is a H/W limitation).
The operator was informed of this already, and we remember the operator confirmed it.
So we only support the sub camera for JSR-135.
BR,

Auto Detect Windows Mobile Device programmatically

I am writing a Windows application (written entirely in C++) which reads files from a storage card on a mobile phone running Windows Mobile. The tough part is that I don't know how to make my application detect when a user has connected the mobile phone to the laptop's USB port. I did some reading on MSDN and have written a small piece of code using RegisterDeviceNotification, which detects whenever a USB disk is attached to or removed from the laptop. However, I am unable to tweak this to make it work for phone-type devices. Please help me out with any links/tutorials which explain this (preferably in C++, as I don't know .NET or C#).
Thanks
Alok
According to this article you can use RegisterDeviceNotification to get notifications when ActiveSync detects that a device has been plugged in or unplugged. (See option 3 at the end of the article.)
It may just be a matter of setting up the correct notification filter.
Windows Mobile devices use RNDIS, a network interface protocol, behind the scenes. Hence, the RegisterDeviceNotification method still works, but you're looking for a DEV_BROADCAST_DEVICEINTERFACE, not a DEV_BROADCAST_VOLUME (i.e. dbch_devicetype == DBT_DEVTYP_DEVICEINTERFACE).
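A minimal sketch of that registration (the interface class GUID is whatever your device exposes; handle WM_DEVICECHANGE in your window procedure):

    // Sketch: ask for arrival/removal notifications for a device interface class.
    #include <windows.h>
    #include <dbt.h>

    HDEVNOTIFY RegisterForPhoneNotifications(HWND hwnd, const GUID& interfaceClassGuid)
    {
        DEV_BROADCAST_DEVICEINTERFACE filter = {};
        filter.dbcc_size = sizeof(filter);
        filter.dbcc_devicetype = DBT_DEVTYP_DEVICEINTERFACE;
        filter.dbcc_classguid = interfaceClassGuid;   // placeholder: use the GUID your device exposes
        return RegisterDeviceNotification(hwnd, &filter, DEVICE_NOTIFY_WINDOW_HANDLE);
    }

    // In the window procedure:
    // case WM_DEVICECHANGE:
    //     if (wParam == DBT_DEVICEARRIVAL || wParam == DBT_DEVICEREMOVECOMPLETE) {
    //         auto* hdr = reinterpret_cast<DEV_BROADCAST_HDR*>(lParam);
    //         if (hdr->dbch_devicetype == DBT_DEVTYP_DEVICEINTERFACE) {
    //             // a matching device interface was connected or removed
    //         }
    //     }
    //     break;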
You can use RAPI or RAPI2 to detect when a Windows Mobile device connects to a PC via Active Sync or Windows Mobile Device Center. RAPI can also be used to read the files on the storage card and much more.
RAPI is simpler to program against because it is a C-based API. RAPI2 has more functionality than RAPI, but it is an object-oriented COM API. If your needs are simple and you only care about one device/connection at a time, then RAPI is good enough. There are two RAPI functions used to detect connections: CeRapiInit (blocking) and CeRapiInitEx (signals an event upon connection).
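For the RAPI route, CeRapiInitEx is the non-blocking variant: it fills a RAPIINIT structure whose event is signalled once a device connects. A minimal sketch (link against rapi.lib from the Windows Mobile SDK):

    // Sketch: wait up to timeoutMs for an ActiveSync/WMDC connection via RAPI.
    #include <windows.h>
    #include <rapi.h>

    bool WaitForDeviceConnection(DWORD timeoutMs)
    {
        RAPIINIT init = { sizeof(RAPIINIT) };
        if (FAILED(CeRapiInitEx(&init)))
            return false;

        // heRapiInit is signalled when the connection attempt completes;
        // hrRapiInit then holds the result of the initialisation.
        bool connected = WaitForSingleObject(init.heRapiInit, timeoutMs) == WAIT_OBJECT_0
                         && SUCCEEDED(init.hrRapiInit);
        if (!connected)
            CeRapiUninit();   // call CeRapiUninit() later too, once you're done with the device
        return connected;
    }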
