Xamarin Local Notification custom sound not playing - audio

I'm currently implementing local notifications in my app, and so far they work exactly as intended. However, when I try to replace the default sound with my own sound file, I no longer get any sound when the notification triggers, and I cannot figure out why.
The code for the local notification:
public void GetLocalNotification(string message)
{
    Android.Net.Uri sound = Android.Net.Uri.Parse("android.resource://" + Application.Context.PackageName + "/" + Resource.Raw.alarm);

    // Build the notification:
    NotificationCompat.Builder builder = new NotificationCompat.Builder(Application.Context)
        .SetAutoCancel(true)                  // Dismiss from the notif. area when clicked
        .SetContentTitle("Notification")      // Set its title
        .SetSmallIcon(Resource.Drawable.icon) // Display this icon
        .SetDefaults(1 | 2)                   // Sets sound and vibration
        .SetSound(sound)
        .SetContentText(String.Format(message)); // The message to display

    // Finally, publish the notification:
    NotificationManager notificationManager =
        (NotificationManager)Application.Context.GetSystemService(Context.NotificationService);
    notificationManager.Notify(ButtonClickNotificationId, builder.Build());
}

Try changing this line: .SetDefaults(1 | 2) // Sets sound and vibration to .SetDefaults(0), which clears the default flags.
I've seen cases where a custom sound only plays when the defaults are set to 0. I wish I could say why, I have no clue to be honest, but it works.

@Gerald Versluis: thanks, but I noticed that .SetDefaults(0) brings the custom sound back but also removes vibration. So to add to this, use .SetDefaults(0 | 2), with 2 being (int)NotificationDefaults.Vibrate.
Thus, my solution to get both the custom sound and vibration is:
.SetDefaults(0 | (int)NotificationDefaults.Vibrate)
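For reference, here is a minimal sketch of how the builder from the question might look with that change applied (assuming the same sound Uri, icon and message variables as above); leaving the DEFAULT_SOUND bit out of SetDefaults lets the Uri passed to SetSound take effect:
// Minimal sketch, assuming the same 'sound' Uri and resources as in the question above.
NotificationCompat.Builder builder = new NotificationCompat.Builder(Application.Context)
    .SetAutoCancel(true)
    .SetContentTitle("Notification")
    .SetSmallIcon(Resource.Drawable.icon)
    .SetSound(sound)                                // custom sound from Resource.Raw.alarm
    .SetDefaults((int)NotificationDefaults.Vibrate) // default vibration only, no DEFAULT_SOUND
    .SetContentText(message);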

Related

CameraX produces different images than Android's camera app and OpenCamera

I am using CameraX to capture 4032x3024 images, but I get different results than other apps such as OpenCamera or Android's camera app (different scale? different FOV?).
See the attached screenshots: one from Android's camera app and one from CameraX.
Both apps use the back camera and the same resolution.
Here is my code to bind the camera:
private void bindCamera(@NonNull ProcessCameraProvider cameraProvider, CameraSelector cameraSelector, Size captureResolution, ImageAnalysis.Analyzer imageAnalyzer) {
    var imageAnalysisBuilder = new ImageAnalysis.Builder();
    var imageAnalysis = imageAnalysisBuilder.setImageQueueDepth(1)
            .setTargetResolution(captureResolution)
            .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            .build();
    imageAnalysis.setAnalyzer(analyzerExecutor, imageAnalyzer);
    cameraProvider.unbindAll();
    if (lifecycleOwner.getLifecycle().getCurrentState() != DESTROYED) {
        this.camera = cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, imageAnalysis);
    }
}
How can I initialize CameraX differently?
I found the answer myself. It seems that on the Pixel 6 Pro, video stabilization is enabled by default in CameraX, which results in cropping and a smaller FOV (see https://issuetracker.google.com/issues/230013960?pli=1).
So to fix this, I do the following on the use-case builder before binding:
// Camera2Interop.Extender wraps the use-case builder (e.g. the imageAnalysisBuilder above) before build() is called
new Camera2Interop.Extender<>(imageAnalysisBuilder)
        .setCaptureRequestOption(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE,
                CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_OFF);

UWP: How to check incoming requests from a BLE device?

How can I check all incoming requests from a paired BLE device to the current device?
I think it is possible with events; maybe UWP has the event I need, or maybe I must implement a custom event, but what is the right way?
Microsoft has explanations about the GATT Server, but I don't think that's what I need, because I don't need a server with services and characteristics; I only need to check incoming requests and parse the passed data in my application.
I have not found a sure way to check incoming requests, but I worked around it with a trick.
The application can subscribe to notifications from the device (in my case a Mi Band 2) and receive data from it through ValueChanged.
I set the ValueChanged handler once in App.xaml.cs after connecting and pairing the device, and it works across the whole application; I don't need to set it again and again.
Here is the App.xaml.cs part of the code:
protected async override void OnLaunched(LaunchActivatedEventArgs e)
{
    Frame rootFrame = Window.Current.Content as Frame;
    MiBand2SDK.MiBand2 band = new MiBand2SDK.MiBand2();
    var page = typeof(Views.AuthPage);

    // Checking for device availability and current session
    if (_LocalSettings.Values["isAuthorized"] != null
        && await band.ConnectAsync())
    {
        if (e.PreviousExecutionState == ApplicationExecutionState.NotRunning && await band.Auth.AuthenticateAsync())
            page = typeof(Views.MainPage);
        else if (band.Auth.IsAuthenticated())
            page = typeof(Views.MainPage);

        // Here it is: the notification handler for responses from the band.
        band.HeartRate.SetNotificationHandler();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("Not Authenticated...");
    }
    // other part of code...
Here is the HeartRate.SetNotificationHandler() code:
public async void SetNotificationHandler()
{
    _heartRateMeasurementCharacteristic = await Gatt.GetCharacteristicByServiceUuid(HEART_RATE_SERVICE, HEART_RATE_MEASUREMENT_CHARACTERISTIC);
    Debug.WriteLine("Subscribe for HeartRate notifications from band...");

    if (await _heartRateMeasurementCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(GattClientCharacteristicConfigurationDescriptorValue.Notify) == GattCommunicationStatus.Success)
        // Just subscribe for notifications and set ValueChanged. That's all.
        _heartRateMeasurementCharacteristic.ValueChanged += HeartRateMeasurementCharacteristicValueChanged;
}
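The ValueChanged handler itself is where the incoming data gets parsed. It isn't shown above, so here is a minimal sketch of what it could look like; the payload layout (heart rate in the second byte) is an assumption based on the standard Heart Rate Measurement characteristic, so adjust the parsing to your device:
private void HeartRateMeasurementCharacteristicValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
{
    // Copy the raw bytes out of the buffer delivered with the notification
    var reader = Windows.Storage.Streams.DataReader.FromBuffer(args.CharacteristicValue);
    byte[] data = new byte[args.CharacteristicValue.Length];
    reader.ReadBytes(data);

    // Assumption: for the standard Heart Rate Measurement characteristic,
    // the second byte holds the heart-rate value when the flags byte is 0.
    if (data.Length >= 2)
        System.Diagnostics.Debug.WriteLine($"Heart rate: {data[1]}");
}
The same pattern applies to any characteristic you subscribe to: each incoming notification raises ValueChanged, and the buffer carries whatever bytes the device pushed.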
Hope it helps someone...

Can't publish data from CC3100 + MSP430F5529 on PUBNUB

I followed this tutorial: http://www.pubnub.com/blog/pubnub-streaming-texas-instruments-iot/
step by step, and I managed to compile the code and connect to my Wi-Fi access point.
I think I managed to connect to PubNub (the code prints "PubNub Set Up" on the terminal screen), but the code has no real verification that it was indeed set up.
I opened an account on PubNub and named my channel "testing" (the same name I used in the code I uploaded; I checked that a million times), and when I go to the Dev Console and click Subscribe I can't see anything! I can post messages through the Dev Console, but what I really want to see are the messages from the CC3100.
I checked the UART terminal on my computer and I see the data being printed constantly, so I know that part is working.
I went over the tutorial again and again and I'm doing the same thing, but it just doesn't work.
Any help would be appreciated!
What am I missing?
Thanks
First, verify that your PubNub account is properly configured and your local Wi-Fi connectivity is working: are you able to publish messages from the Dev Console in one browser and receive them in the Dev Console in another browser (both using the same channel name, of course)? If that works, please send a message to help (at) pubnub (dot) com with your sub-key info and information about your project, and we will try to assist you in tracking down the issue.
This answer is posted really late; I admit I forgot about this post, so I decided to update it (a few years late, though).
I started digging to see what the problem was, and I think I found it. First of all, I saw that PubNub.publish() wasn't working properly because the json_String was 90% gibberish. So I erased most of the code that constructed the json_String (the part that inserts the analog values) and made it simpler.
I then also added a piece of code at the end, needed for proper handling of the client variable, which I took from an Arduino-based project using the CC3100.
Anyway, the new code is the one below, and now it works fine. I finally see all the input streaming on PubNub! Thanks a lot! :D
/*PubNub sample JSON-parsing client with WiFi support
This combines two sketches: the PubNubJson example of PubNub library
and the WifiWebClientRepeating example of the WiFi library.
This sample client will properly parse JSON-encoded PubNub subscription
replies using the aJson library. It will send a simple message, then
properly parsing and inspecting a subscription message received back.
This is achieved by integration with the aJson library. You will need
a version featuring Wiring Stream integration, that can be found
at http://github.com/pasky/aJson as of 2013-05-30.
Please refer to the PubNubJson example description for some important
notes, especially regarding memory saving on Arduino Uno/Duemilanove.
You can also save some RAM by not using WiFi password protection.
created 30 May 2013
by Petr Baudis
https://github.com/pubnub/pubnub-api/tree/master/arduino
This code is in the public domain.
*/
#include <SPI.h>
#include <WiFi.h>
#include <PubNub.h>
#include <aJSON.h>
static char ssid[] = "NetSSID_Name"; // your network SSID (name)
static char pass[] = "NetworkdPassword"; // your network password
static int keyIndex = 0; // your network key Index number (needed only for WEP)
const static char pubkey[] = "pub-c-51eb45ec-b647-44da-b2aa-9bf6b0b98705";
const static char subkey[] = "sub-c-7e78ed9c-991d-11e4-9946-02ee2ddab7fe";
const static char channel[] = "testing";
#define NUM_CHANNELS 4 // How many analog channels do you want to read?
const static uint8_t analog_pins[] = {23, 24, 25, 26}; // which pins are you reading?
void setup()
{
    Serial.begin(9600);
    Serial.println("Start WiFi");
    WiFi.begin(ssid, pass);
    while (WiFi.localIP() == INADDR_NONE) {
        Serial.print(".");
        delay(300);
    }
    Serial.println("WiFi set up");
    PubNub.begin(pubkey, subkey);
    Serial.println("PubNub set up");
    delay(5000);
}
void loop()
{
    WiFiClient *client;

    // create JSON objects
    aJsonObject *msg, *analogReadings;
    msg = aJson.createObject();
    aJson.addItemToObject(msg, "analogReadings", analogReadings = aJson.createObject());

    // get latest sensor values then add to JSON message
    /*for (int i = 0; i < NUM_CHANNELS; i++) {
        String analogChannel = String(analog_pins[i]);
        char charBuf[analogChannel.length()+1];
        analogChannel.toCharArray(charBuf, analogChannel.length()+1);
        int analogValues = analogRead(analog_pins[i]);
        aJson.addNumberToObject(analogReadings, charBuf, analogValues);
    }*/

    // convert JSON object into char array, then delete JSON object
    char *json_String = aJson.print(msg);
    aJson.deleteItem(msg);

    // publish JSON formatted char array to PubNub
    Serial.print("publishing a message: ");
    Serial.println(json_String);
    Serial.println(channel);
    client = PubNub.publish(channel, json_String);
    free(json_String);
    if (!client) {
        Serial.println("publishing error");
        delay(1000);
        return;
    }
    // only dereference the client after the NULL check above
    Serial.println(*client);
    client->stop();
    delay(500);
}

How to get information/data from platformRequest() in J2ME?

I want to implement behavior similar to WhatsApp, where the user can upload an image. I tried opening the images in my app, but if the image is too large, I get an out-of-memory error.
To work around this, I forward the images to be opened in the phone's native image viewer using the platformRequest() method.
However, I want to know how WhatsApp modifies the phone's native image viewer to add a "Select" button, with which the user selects the image he wants to upload. How is that information sent back to the J2ME application, and how is the image resized?
Edit:
I tried this in two different ways, both of which gave me the OOME.
At first, I tried the more direct method:
FileConnection fc = (FileConnection) Connector.open("file://localhost/" + currDirName + fileName);
if (!fc.exists()) {
    throw new IOException("File does not exists");
}
InputStream fis = fc.openInputStream();
Image im = Image.createImage(fis);
fis.close();
When that didn't work, I tried a more "manual" approach, but that gave me an error as well.
FileConnection fc = (FileConnection) Connector.open("file://localhost/" + currDirName + fileName);
if (!fc.exists()) {
    throw new IOException("File does not exists");
}
InputStream fis = fc.openInputStream();
ByteArrayOutputStream file = new ByteArrayOutputStream();
int c;
byte[] data = new byte[1024];
while ((c = fis.read(data)) != -1) {
    file.write(data, 0, c);
}
byte[] fileData = null;
fileData = file.toByteArray();
fis.close();
fc.close();
file.close();
Image im = Image.createImage(fileData, 0, fileData.length);
When I call the createImage method, the out-of-memory error occurs in both cases.
This varies with the device: an E72 gives me the error with 3 MB images, while a newer device only gives the error with images larger than 10 MB.
MIDP 2 (JSR 118) does not have an API for that; you need to find another way to handle big images.
As for WhatsApp, it looks like they do not rely on MIDP to support this functionality. If you check the Wikipedia page you'll note that they don't claim general Java ME as a supported platform but instead list narrower platforms like Symbian, S40, BlackBerry, etc.
This most likely means that they implement "problematic features" like the one you're asking about using the platform-specific APIs of particular target devices, maintaining essentially separate projects/releases for every platform listed.
If this feature is really necessary in your application, you will likely have to do something similar.
In that case, also consider encapsulating the problematic features in a way that makes it easy to swap out just that part of your source code when building for different platforms. For example, Class.forName(String) can be used to load a platform-specific implementation depending on the target platform.
//...
Image getImage(String resourceName) throws Exception {
    // ImageUtil is an interface with method getImage
    ImageUtil imageUtil = (ImageUtil) Class.forName(
        // get platform-specific implementation, e.g.
        // "mypackage.platformspecific.s40.S40ImageUtil"
        // "mypackage.platformspecific.bb.BBImageUtil"
        // "mypackage.platformspecific.symbian.SymbianImageUtil"
        "mypackage.platformspecific.s40.S40ImageUtil").newInstance();
    return imageUtil.getImage(resourceName);
}
//...

Java ME video player realize() error with HTTP - MediaException

I'm using the code below (adapted from http://www.java-tips.org/java-me-tips/midp/playing-video-on-j2me-devices.html). It fails at realize() with javax.microedition.media.MediaException: "Unable to create native player". What is the problem here?
I tried this using both Eclipse and NetBeans. Am I missing some "internet" permission, or using an incorrect encoding? The video is an external 'mpg' test resource and works fine when downloaded through a desktop browser.
public void run()
{
    String url = "http://www.fileformat.info/format/mpeg/sample/05e7e78068f44f0ea748855ef33c9f4a/MELT.MPG";

    // Append the GUI to a form
    Form form = new Form("Video on java mobile!");
    Display display = Display.getDisplay(this);
    display.setCurrent(form);

    try
    {
        HttpConnection conn = (HttpConnection) Connector.open(url, Connector.READ_WRITE);
        InputStream is = conn.openInputStream();
        Player p = Manager.createPlayer(is, "video/mpeg");
        // I tried the below, but that didn't work either
        //Player p = Manager.createPlayer(url);
        p.realize();

        // Get the video controller
        VideoControl video = (VideoControl) p.getControl("VideoControl");
        if (video != null)
        {
            // Get a GUI to display the video
            Item videoItem = (Item) video.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
            form.append(videoItem);
        }

        // Start the video
        p.prefetch();
        p.start();
    }
    catch (Exception e)
    {
        form.append(url + " Error:" + e.getMessage());
    }
}
I've just started with Java, Eclipse, and NetBeans. Since similar samples are found everywhere, I believe I'm missing something very basic. Can someone please help?
The problem here was the video file. Although my source video seemed to be MPEG, it wasn't acceptable to the emulator. After searching around a bit, I found a converter and manually converted some sample MP4 files to MPEG. It finally worked with the same emulator once I downloaded and played these manually converted files.
One piece of advice if you are new to J2ME/Java ME apps (like me): keep playing with the input data sources/formats and the emulators. Switching emulators or input data formats is an easy way to identify the not-so-evident problems.
