Ok, here's the code:
import java.io.*;
import javax.swing.JFileChooser;
import javax.swing.JOptionPane;
import sun.audio.*;
public class Sound {
    public static void main(String[] args) {
        JFileChooser openf = new JFileChooser();
        openf.showOpenDialog(null);
        File fl = openf.getSelectedFile();
        String sound = fl.getAbsolutePath();
        JOptionPane.showMessageDialog(null, sound);
        InputStream in;
        try {
            in = new FileInputStream(sound);
            AudioStream audio = new AudioStream(in);
            AudioPlayer.player.start(audio);
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, e.toString());
        }
    }
}
I'm working on this application to allow selecting audio files (through the JFileChooser), such as MP3, WMA, or WAV, and playing them back.
The exception I keep getting is: 'java.io.IOException: could not create audio stream from input stream'.
I've heard elsewhere that some of the sun.audio classes I'm importing have problems. Could that be the cause?
Thanks.
Miguel André.
I guess you are trying to play an MP3 file. Java doesn't support MP3 natively; your code can play wave (*.wav) files only. JavaFX supports MP3 out of the box, and plain Java can play MP3 with an external library (JMF, FMJ, JLayer, ...).
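For the *.wav case, here is a minimal sketch that avoids the unsupported sun.audio classes and uses the standard javax.sound.sampled API instead; the class name and dialog handling are just an illustration, not the original poster's code:

import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.swing.JFileChooser;
import javax.swing.JOptionPane;

public class WavSound {
    public static void main(String[] args) {
        JFileChooser chooser = new JFileChooser();
        if (chooser.showOpenDialog(null) != JFileChooser.APPROVE_OPTION) {
            return;
        }
        File file = chooser.getSelectedFile();
        // Java Sound handles WAV/AU/AIFF out of the box; MP3 and WMA still need an external library.
        try (AudioInputStream audioIn = AudioSystem.getAudioInputStream(file)) {
            Clip clip = AudioSystem.getClip();
            clip.open(audioIn);
            clip.start();
            // Keep the JVM alive until playback finishes.
            Thread.sleep(clip.getMicrosecondLength() / 1000);
        } catch (Exception e) {
            JOptionPane.showMessageDialog(null, e.toString());
        }
    }
}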
Related
How to get the current video frame while playing video with libvlcsharp?
I can play video with libvlcsharp using the code below:
public void OnAppearing()
{
    LibVLC = new LibVLC();
    var media = new LibVLCSharp.Shared.Media(LibVLC, new Uri("http://live.cgtn.com/1000/prog_index.m3u8"));
    MediaPlayer = new MediaPlayer(LibVLC)
    {
        Media = media
    };
    media.Dispose();
    Play();
}

private void Play()
{
    if (m_url != string.Empty)
    {
        MediaPlayer.Play(new LibVLCSharp.Shared.Media(LibVLC, new Uri(m_url)));
    }
}
You could use the TakeSnapshot method. However, please note:
The snapshot will be written to disk; there's no way to get the frame in memory in VLC 3 (a new API for that will likely come in VLC 4).
This is not meant to grab every frame; use it only for a snapshot.
If you need more frames, have a look at the Thumbnailer samples. They're not meant to grab all frames either.
I am trying to play an audio file that is stored locally. I am using Xam.Plugin.SimpleAudioPlayer for playing audio. For UWP and Android, I added the files to the Assets folder with the Build Action set to Content and AndroidAsset, respectively.
My Code:
private void PlayAudio()
{
    try
    {
        var stream = GetStreamFromFile("audio.mp3");
        var audio = Plugin.SimpleAudioPlayer.CrossSimpleAudioPlayer.Current;
        audio.Load(stream);
        audio.Play();
    }
    catch (Exception e)
    {
        Debug.WriteLine("exception:>>" + e);
    }
}

Stream GetStreamFromFile(string filename)
{
    var assembly = typeof(App).GetTypeInfo().Assembly;
    var stream = assembly.GetManifestResourceStream("AudioPlay." + filename);
    return stream;
}
But I'm getting an exception on Android and UWP; I haven't checked on iOS.
Android Exception:
[0:] exception:>>System.NullReferenceException: Object reference not set to an instance of an object.
at Plugin.SimpleAudioPlayer.SimpleAudioPlayerImplementation.Load (System.IO.Stream audioStream) [0x00050] in C:\dev\open\Xamarin-Plugins\SimpleAudioPlayer\SimpleAudioPlayer\Plugin.SimpleAudioPlayer.Android\SimpleAudioPlayerImplementation.cs:100
at AudioPlay.MainPage.PlayAudio () [0x00014] in F:\AudioPlay\AudioPlay\AudioPlay\MainPage.xaml.cs:63
I am using an .mp3 file. Am I missing something in this implementation? Please help me fix this issue.
You just need to put the mp3 file directly in the shared project and call the method:
var stream = GetStreamFromFile("xxx.mp3");
var audio = Plugin.SimpleAudioPlayer.CrossSimpleAudioPlayer.Current;
audio.Load(stream);
audio.Play();

Stream GetStreamFromFile(string filename)
{
    var assembly = typeof(App).GetTypeInfo().Assembly;
    var stream = assembly.GetManifestResourceStream("yourprojectname." + filename);
    return stream;
}
Update
I checked your demo; if you want to play an mp3 file in your project, you need to set the build action of the mp3 to Embedded resource.
Right-click the mp3 file -> Properties.
Try this for Xamarin.Android:
var u = Android.Net.Uri.Parse("filepath");
MediaPlayer _player = MediaPlayer.Create(this, u);
_player.Start();
or
use the XamarinMediaManager plugin:
https://github.com/martijn00/XamarinMediaManager
The accepted answer does not work when attempting to load a resource from a project that is not the main Xamarin Forms project.
The solution is to use Android's AssetManager.Open() to get a Stream (I'm sure iOS has something similar). You'll need to use DependencyService to access this outside the Android project.
The MP3 files would be put in the Assets folder of the Android project. The audio files need to have Build Action: "Android Asset", not "Embedded Resource" as above.
Example:
Android Project has Assets/Audio/click.mp3
AndroidProject/AndroidAssetManager.cs
[assembly: Dependency(typeof(AndroidAssetManager))]
namespace App.Droid
{
    public interface IAssetManager
    {
        Stream LoadAsset(string assetName);
        ...
    }

    public class AndroidAssetManager : IAssetManager
    {
        private readonly AssetManager _assets;

        public AndroidAssetManager(AssetManager assets)
        {
            _assets = assets;
        }

        public Stream LoadAsset(string assetName)
        {
            return _assets.Open(assetName);
        }

        ...
    }
}
Shared/AudioPlayer.cs:
class AudioPlayer : IAudioPlayer
{
    private IAssetManager AssetManager => DependencyService.Get<IAssetManager>();
    private ISimpleAudioPlayer _clickPlayer;

    public AudioPlayer()
    {
        _clickPlayer = CrossSimpleAudioPlayer.CreateSimpleAudioPlayer();
        _clickPlayer.Load(AssetManager.LoadAsset("Audio/click.mp3"));
    }

    public void PlayClick()
    {
        _clickPlayer.Play();
    }
}
OK, I'm currently working on a Trivial Pursuit game project with JavaFX, and my group wants me to add audio. The problem is that I have this method:
public static void playSoundEffect(Sound sfx) {
    Media media = null;
    try {
        media = new Media(GameAudio.class.getClassLoader().getResource(sfx.getSound()).toURI().toString());
        mediaPlayer = new MediaPlayer(media);
        mediaPlayer.play();
    } catch (URISyntaxException e) {
        e.printStackTrace();
    }
}
But it has issues: if I want to mute all the audio, only the last played sound gets muted, not the whole project's audio.
I was thinking of making two Lists of MediaPlayer (one for SFX and one for music) that contain each audio file, but I'm not sure how to set this up properly... My current attempt uses an enum for the constant strings containing the paths; then in some class I call the method above to play a sound at a certain point. But since I always create a new MediaPlayer instance, I no longer have any control over it afterwards, and that's why I'm so lost.
As James_D suggested, for the mute I will use a BooleanProperty muted and call mediaPlayer.muteProperty().bind(muted) on each MediaPlayer created.
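A minimal sketch of that idea, assuming the Sound enum and GameAudio class from the question (the players list, the end-of-media cleanup, and the setMuted helper are illustrative additions, not code from the original post):

import java.util.ArrayList;
import java.util.List;
import javafx.beans.property.BooleanProperty;
import javafx.beans.property.SimpleBooleanProperty;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;

public class GameAudio {
    // One shared flag; toggling it mutes or unmutes every player bound to it.
    private static final BooleanProperty muted = new SimpleBooleanProperty(false);
    // Keep references so players are not garbage-collected mid-playback.
    private static final List<MediaPlayer> players = new ArrayList<>();

    public static void playSoundEffect(Sound sfx) {
        Media media = new Media(
                GameAudio.class.getClassLoader().getResource(sfx.getSound()).toExternalForm());
        MediaPlayer player = new MediaPlayer(media);
        player.muteProperty().bind(muted); // every new player follows the shared flag
        player.setOnEndOfMedia(() -> players.remove(player));
        players.add(player);
        player.play();
    }

    public static void setMuted(boolean value) {
        muted.set(value);
    }
}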
I'm writing code to convert SVGs to PNGs:
package com.example;
import java.io.*;
import java.nio.file.Paths;
import org.apache.batik.transcoder.image.PNGTranscoder;
import org.apache.batik.transcoder.SVGAbstractTranscoder;
import org.apache.batik.transcoder.TranscoderInput;
import org.apache.batik.transcoder.TranscoderOutput;
public class Main {
    public static void main(String[] args) throws Exception {
        // read the input SVG document into TranscoderInput
        String svgURI = Paths.get(args[0]).toUri().toURL().toString();
        TranscoderInput input = new TranscoderInput(svgURI);
        // define OutputStream to PNG Image and attach to TranscoderOutput
        OutputStream ostream = new FileOutputStream("out.png");
        TranscoderOutput output = new TranscoderOutput(ostream);
        // create a PNG transcoder
        PNGTranscoder t = new PNGTranscoder();
        // set the transcoding hints
        t.addTranscodingHint(SVGAbstractTranscoder.KEY_HEIGHT, new Float(600));
        t.addTranscodingHint(SVGAbstractTranscoder.KEY_WIDTH, new Float(600));
        // convert and write output
        t.transcode(input, output);
        // flush and close the stream, then exit
        ostream.flush();
        ostream.close();
    }
}
I get the following exception when executing it with a variety of SVGs:
Exception in thread "main" org.apache.batik.transcoder.TranscoderException: null
Enclosed Exception:
Could not write PNG file because no WriteAdapter is availble
at org.apache.batik.transcoder.image.ImageTranscoder.transcode(ImageTranscoder.java:132)
at org.apache.batik.transcoder.XMLAbstractTranscoder.transcode(XMLAbstractTranscoder.java:142)
at org.apache.batik.transcoder.SVGAbstractTranscoder.transcode(SVGAbstractTranscoder.java:156)
at com.example.Main.main(Main.java:26)
Batik version (reported by Maven):
version=1.9
groupId=org.apache.xmlgraphics
artifactId=batik-transcoder
I get the same error with Batik 1.7.
Suggestions?
The problem was solved by Peter Coppens on the xmlgraphics-batik-users mailing list. The problem is that the Maven repository for Batik 1.9 is missing a dependency, which can be addressed by adding to pom.xml:
<dependency>
    <groupId>org.apache.xmlgraphics</groupId>
    <artifactId>batik-codec</artifactId>
    <version>1.9</version>
</dependency>
The cryptic exception disappears and the code functions as expected with this addition. This was reported as a bug for Batik 1.7 (https://bz.apache.org/bugzilla/show_bug.cgi?id=44682).
I'm trying to play a sound, but it's not playing
Here's my code:
public void Replay()
{
    playAudio();
    Application.LoadLevel(Application.loadedLevel);
}

void playAudio()
{
    AudioSource audio = GetComponent<AudioSource>();
    audio.Play();
}
When a button is clicked, I call Replay(), but the sound is not played.
If I comment out Application.LoadLevel(Application.loadedLevel);, the sound plays normally.
What should I do to make the sound play with Application.LoadLevel()?
The AudioSource playing the sound will get removed before it has time to finish.
Here is an alternative solution using yield to wait for the sound to finish.
public void Replay()
{
    StartCoroutine("ReplayRoutine");
}

IEnumerator ReplayRoutine()
{
    AudioSource audio = GetComponent<AudioSource>();
    audio.Play();
    yield return new WaitForSeconds(audio.clip.length);
    Application.LoadLevel(Application.loadedLevel);
}
You call the play method and load the scene right after it. Try calling playAudio() before loading the level:
public void Replay()
{
    Application.LoadLevel(Application.loadedLevel);
    playAudio();
}
I think you don't give the audio source a chance to play the sound: after executing Play, you immediately reload the scene, so the same instance of the audio source no longer exists, and a new instance is created that has never received the play command. You can use a small delay via a coroutine or some other way to give the audio source the time it needs (if you want to play the sound before loading the level; otherwise, just play the sound in a Start() callback).