Reading audio in a UWP application

Hi everyone, I'm trying to read an audio file in a UWP application using the method shown below:
MediaPlayer player = new MediaPlayer();
// Load the bundled asset and log its resolved path
StorageFile file = await StorageFile.GetFileFromApplicationUriAsync(new Uri("ms-appx:///Assets/" + fileName));
Debug.WriteLine(file.Path);
// Sanity check that the file is visible on disk
var f = File.Exists(file.Path);
player.AutoPlay = false;
player.Source = MediaSource.CreateFromStorageFile(file);
player.Play();
But I get the following exception: System.TypeInitializationException: 'The type initializer for 'Focus.Services.Utility' threw an exception.'
Can someone please help? I tried several methods for reading audio files, but none seems to work.
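One general debugging step, independent of the audio APIs: a TypeInitializationException is only a wrapper, and the real failure from the static constructor of Focus.Services.Utility sits in its InnerException. A minimal sketch, reusing the playback code above (wrap whichever line actually throws in your app):
try
{
    player.Source = MediaSource.CreateFromStorageFile(file);
    player.Play();
}
catch (TypeInitializationException ex)
{
    // The static constructor of Focus.Services.Utility is what actually failed;
    // its original exception is preserved here.
    Debug.WriteLine(ex.InnerException?.ToString());
}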

Related

Azure Text to Speech (Cognitive Services) in web app - how to stop it from outputting audio?

I'm using Azure Cognitive Services for Text to Speech in a web app.
I return the bytes to the browser and it works great; however, on the server (or local machine), the speechSynthesizer.SpeakTextAsync(inp) line outputs the audio to the speaker.
Is there a way to turn this off, since this runs on a web server? (Even if I ignore it, there's the delay while it outputs audio before sending back the data.)
Here's my code ...
var speechConfig = SpeechConfig.FromSubscription(speechKey, speechRegion);
speechConfig.SpeechSynthesisVoiceName = "fa-IR-FaridNeural";
speechConfig.OutputFormat = OutputFormat.Detailed;
using (var speechSynthesizer = new SpeechSynthesizer(speechConfig))
{
    // todo - how to disable it saying it here?
    var speechSynthesisResult = await speechSynthesizer.SpeakTextAsync(inp);
    return Convert.ToBase64String(speechSynthesisResult.AudioData);
}
What you can do is add an AudioConfig to the SpeechSynthesizer.
In this AudioConfig object you can specify a file path to a .wav file which already exists on the server.
Whenever you run SpeakTextAsync, it will redirect the data to the .wav file instead of the speaker.
You can then read this audio file and perform your logic later.
Just add the following code before creating the SpeechSynthesizer object:
var audioconfig = AudioConfig.FromWavFileOutput(filepath);
Here, filepath is the location of the .wav file as a string.
Complete code:
string filepath = "<file path>";
var speechConfig = SpeechConfig.FromSubscription(speechKey, speechRegion);
var audioconfig = AudioConfig.FromWavFileOutput(filepath);
speechConfig.SpeechSynthesisVoiceName = "fa-IR-FaridNeural";
speechConfig.OutputFormat = OutputFormat.Detailed;
using (var speechSynthesizer = new SpeechSynthesizer(speechConfig, audioconfig))
{
    // audio is now written to filepath instead of the speaker
    var speechSynthesisResult = await speechSynthesizer.SpeakTextAsync(inp);
    return Convert.ToBase64String(speechSynthesisResult.AudioData);
}
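As an alternative, if the audio is never needed on the server at all, the SpeechSynthesizer constructor also accepts null in place of the AudioConfig (the next question's code uses exactly this), which suppresses device output while still populating AudioData. A minimal sketch under that assumption:
using (var speechSynthesizer = new SpeechSynthesizer(speechConfig, null))
{
    // No audio device is attached; the synthesized bytes are still
    // returned on the result object.
    var speechSynthesisResult = await speechSynthesizer.SpeakTextAsync(inp);
    return Convert.ToBase64String(speechSynthesisResult.AudioData);
}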

Azure TTS neural voice audio file is created abnormally with a size of 1 byte

Azure TTS standard voice audio files are generated normally. However, for neural voices, the audio file is generated abnormally with a size of 1 byte. The code is below.
C# code
public static async Task SynthesizeAudioAsync()
{
    var config = SpeechConfig.FromSubscription("xxxxxxxxxKey", "xxxxxxxRegion");
    using var synthesizer = new SpeechSynthesizer(config, null);
    var ssml = File.ReadAllText("C:/ssml.xml");
    var result = await synthesizer.SpeakSsmlAsync(ssml);
    using var stream = AudioDataStream.FromResult(result);
    await stream.SaveToWaveFileAsync("C:/file.wav");
}
ssml.xml - The file below, set to a standard voice, works fine.
<speak version="1.0" xmlns="https://www.w3.org/2001/10/synthesis" xml:lang="en-US">
  <voice name="en-GB-George-Apollo">
    When you're on the motorway, it's a good idea to use a sat-nav.
  </voice>
</speak>
ssml.xml - However, the following file, set to a neural voice, does not work, and an empty audio file is created.
<speak version="1.0" xmlns="https://www.w3.org/2001/10/synthesis" xml:lang="en-US">
  <voice name="en-US-AriaNeural">
    When you're on the motorway, it's a good idea to use a sat-nav.
  </voice>
</speak>
Looking at the behavior you have described, the Speech service has returned no audio bytes due to some issue.
I have checked the SSML file at my end and it works completely fine, i.e. there are no issues with the SSML.
As a next step towards a solution, I would recommend you add error-handling code to give a better picture of the error so you can take action accordingly:
var config = SpeechConfig.FromSubscription("xxxxxxxxxKey", "xxxxxxxRegion");
using var synthesizer = new SpeechSynthesizer(config, null);
var ssml = File.ReadAllText("C:/ssml.xml");
var result = await synthesizer.SpeakSsmlAsync(ssml);
if (result.Reason == ResultReason.SynthesizingAudioCompleted)
{
    Console.WriteLine("No error");
    using var stream = AudioDataStream.FromResult(result);
    await stream.SaveToWaveFileAsync("C:/file.wav");
}
else if (result.Reason == ResultReason.Canceled)
{
    var cancellation = SpeechSynthesisCancellationDetails.FromResult(result);
    Console.WriteLine($"CANCELED: Reason={cancellation.Reason}");
    if (cancellation.Reason == CancellationReason.Error)
    {
        Console.WriteLine($"CANCELED: ErrorCode={cancellation.ErrorCode}");
        Console.WriteLine($"CANCELED: ErrorDetails=[{cancellation.ErrorDetails}]");
    }
}
The above modification will print a friendly error message in the console app.
Note: if you are not using a console app, you will have to modify the code.
Sample output: this is just a sample output; the error you see might be different.
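If you are not in a console app, a small guard on the result makes the 1-byte symptom detectable programmatically. A minimal sketch, reusing the result variable from the snippet above (the exception type and message are illustrative, not from the SDK):
// Hypothetical guard: a healthy result carries the synthesized bytes in AudioData.
if (result.AudioData == null || result.AudioData.Length <= 1)
{
    // Matches the 1-byte symptom described in the question.
    throw new InvalidOperationException("Speech service returned no audio; check the cancellation details.");
}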

Access denied using MvxFileStore, MvvmCross and Xamarin Touch on iOS 8.1

My app just got rejected by Apple, and I believe it might be caused by them changing to test with iOS 8.1. However, I cannot reproduce the error in any way. Their crash report states the app crashes on startup.
It seems that the exception (I have the crash log) comes from
<Warning> Unhandled managed exception: Access to the path "/var/mobile/Documents/settings" is denied. (System.UnauthorizedAccessException)
which originates from
Cirrious.MvvmCross.Plugins.File.MvxFileStore.WriteFileCommon
I am using the MvvmCross 3.11 MvxFileStore plugin. Deployment target is iOS 7, with the iOS 8.1 SDK.
I have been surfing the web, and some posts state that the Documents directory has moved in iOS 8, which might cause the exception.
But I can't wrap my head around the fact that I can't reproduce this error.
Does anyone have a similar issue, a suggestion for a fix, or an idea how to reproduce their crash?
Anything is appreciated.
Update:
From the post it is suggested to apply the following fix:
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
if (SystemVersion >= 8)
{
    // iOS 8 and later
    var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
    filename = Path.Combine(documents, sFile);
}
else
{
    // iOS 7 and earlier
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    filename = Path.Combine(documents, sFile);
}
I have tried adding it to our project. We used the MvxFileStore to create the path to the settings file:
var filestore = Mvx.Resolve<IMvxFileStore>();
string path = filestore.PathCombine(filestore.NativePath(string.Empty), FILENAME);
Now we do the following:
var filestore = Mvx.Resolve<IMvxFileStore>();
string path = this.DocsDir() + "/" + FILENAME;
public string DocsDir()
{
    var version = int.Parse(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
    string docsDir = "";
    if (version >= 8)
    {
        var docs = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0];
        docsDir = docs.Path;
        Console.WriteLine("ios 8++ " + docsDir);
    }
    else
    {
        docsDir = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
        Console.WriteLine("ios 7.1-- " + docsDir);
    }
    return docsDir;
}
I will resubmit our app and post the result.
Okay, the fix we did, separating iOS 8 from the rest of the iOS versions and using different implementations depending on the iOS version, worked.
Apple has approved our apps and all is love (y)
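For reference, the same fix can be folded into a single helper so call sites only ever deal with full paths. This is a hypothetical consolidation of the DocsDir() method above, not MvvmCross API:
// Hypothetical helper combining the version check with Path.Combine.
public static string SettingsPath(string fileName)
{
    var version = int.Parse(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
    var docs = version >= 8
        ? NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path
        : Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
    return Path.Combine(docs, fileName);
}
// Usage: string path = SettingsPath(FILENAME);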

How to get information/data from platformRequest() in J2ME?

I want to implement a behavior similar to WhatsApp, where the user can upload an image. I tried opening the images in my app, but if the image is too large, I get an out-of-memory error.
To work around this, I'm forwarding the images to be opened in the phone's native image viewer, using the platformRequest() method.
However, I want to know how WhatsApp modifies the phone's native image viewer to add a "Select" button, with which the user selects the image he wants to upload. How is that information sent back to the J2ME application, and how is the image resized?
Edit:
I tried this in two different ways, both of which gave me the OOME.
At first, I tried the more direct method:
FileConnection fc = (FileConnection) Connector.open("file://localhost/" + currDirName + fileName);
if (!fc.exists()) {
    throw new IOException("File does not exist");
}
InputStream fis = fc.openInputStream();
Image im = Image.createImage(fis);
fis.close();
When that didn't work, I tried a more "manual" approach, but that gave me an error as well.
FileConnection fc = (FileConnection) Connector.open("file://localhost/" + currDirName + fileName);
if (!fc.exists()) {
    throw new IOException("File does not exist");
}
InputStream fis = fc.openInputStream();
ByteArrayOutputStream file = new ByteArrayOutputStream();
int c;
byte[] data = new byte[1024];
while ((c = fis.read(data)) != -1) {
    file.write(data, 0, c);
}
byte[] fileData = file.toByteArray();
fis.close();
fc.close();
file.close();
Image im = Image.createImage(fileData, 0, fileData.length);
When I call the createImage method, the out-of-memory error occurs in both cases.
This varies with the device: an E72 gives me the error with 3 MB images, while a newer device only gives me the error with images larger than 10 MB.
MIDP 2 (JSR 118) does not have an API for that; you need to find another way to handle big images.
As for WhatsApp, it looks like they do not rely on MIDP to support this functionality. If you check the Wikipedia page, you'll note that they don't claim general Java ME as a supported platform, but instead list narrower platforms like Symbian, S40, BlackBerry, etc.
This most likely means that they implement "problematic features" like the one you're asking about using platform-specific APIs of particular target devices, having essentially separate projects/releases for every platform listed.
If this feature is really necessary in your application, you will likely have to do something similar.
In this case, consider also encapsulating problematic features in a way that makes it easier to switch just part of your source code when building for different platforms. For example, Class.forName(String) can be used to load a platform-specific implementation depending on the target platform:
//...
Image getImage(String resourceName) throws Exception {
    // ImageUtil is an interface with method getImage;
    // pick the platform-specific implementation, e.g.
    //   "mypackage.platformspecific.s40.S40ImageUtil"
    //   "mypackage.platformspecific.bb.BBImageUtil"
    //   "mypackage.platformspecific.symbian.SymbianImageUtil"
    ImageUtil imageUtil = (ImageUtil) Class.forName(
        "mypackage.platformspecific.s40.S40ImageUtil").newInstance();
    return imageUtil.getImage(resourceName);
}
//...

Java ME video player realize() error with HTTP - MediaException

I'm using the code below (referenced from http://www.java-tips.org/java-me-tips/midp/playing-video-on-j2me-devices.html). It fails at realize() with a javax.microedition.media.MediaException, "Unable to create native player". What is the problem here?
I tried this using both Eclipse and NetBeans. Am I missing some "internet" permission, or using an incorrect encoding? The video is an external 'mpg' test resource and works fine when downloaded through a desktop browser.
public void run()
{
    String url = "http://www.fileformat.info/format/mpeg/sample/05e7e78068f44f0ea748855ef33c9f4a/MELT.MPG";
    // Append the GUI to a form
    Form form = new Form("Video on java mobile!");
    Display display = Display.getDisplay(this);
    display.setCurrent(form);
    try
    {
        HttpConnection conn = (HttpConnection) Connector.open(url, Connector.READ_WRITE);
        InputStream is = conn.openInputStream();
        Player p = Manager.createPlayer(is, "video/mpeg");
        // I tried the below, but that didn't work either
        //Player p = Manager.createPlayer(url);
        p.realize();
        // Get the video controller
        VideoControl video = (VideoControl) p.getControl("VideoControl");
        if (video != null)
        {
            // Get a GUI item to display the video
            Item videoItem = (Item) video.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
            form.append(videoItem);
        }
        // Start the video
        p.prefetch();
        p.start();
    }
    catch (Exception e)
    {
        form.append(url + " Error:" + e.getMessage());
    }
}
I've just started with Java, Eclipse, and NetBeans. Since similar samples are found everywhere, I believe I'm missing something very basic. Can someone please help?
The problem here was the video file. Although my source video seemed to be "mpeg", it wasn't acceptable to the emulator. After searching around a bit, I found a converter and manually converted some sample mp4 files to "mpeg". After downloading and playing these manually converted files, it finally worked with the same emulator.
One piece of advice if you are new to J2ME/Java ME apps (like me): keep playing with the input data sources/formats and the emulators. Switching emulators or input data formats is an easy way to identify the not-so-evident problems.
