I am currently working on a school J2ME project using Canvas, and I'm trying to get my Ping Pong ball app ready for tomorrow's exam. When loading a PNG image with Image.createImage(url) I get an IOException (the image is in the right source folder and is 32x32 pixels).
This is my code:
import java.io.IOException;
import javax.microedition.lcdui.Graphics;
import javax.microedition.lcdui.Image;
import javax.microedition.lcdui.game.GameCanvas;
import javax.microedition.lcdui.game.Sprite;

public class BallGame extends GameCanvas implements Runnable {

    private Image ballImg;
    private Sprite ballSprite;
    private String url = "/ball.PNG";
    private int ballX = getWidth() / 2;
    private int ballY = getHeight() / 2;

    public BallGame() {
        super(false);
    }

    public void run() {
        while (true) {
            try {
                updateScreen(getGraphics());
                Thread.sleep(20); // yield between frames; a tight loop starves the device
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    public void start() {
        try {
            ballImg = Image.createImage(url);
        } catch (IOException ex) {
            System.out.println("*********");
            ex.printStackTrace();
            System.out.println("********************");
        }
        ballSprite = new Sprite(ballImg, 32, 32);
        ballSprite.setRefPixelPosition(16, 16);
        ballSprite.setPosition(ballX, ballY);
        Thread runner = new Thread(this);
        runner.start();
    }

    public void createBackground(Graphics g) {
        g.setColor(0x000000);
        g.fillRect(0, 0, getWidth(), getHeight());
    }

    public void updateScreen(Graphics graphics) {
        createBackground(graphics);
        ballSprite.setRefPixelPosition(ballX, ballY);
        ballSprite.paint(graphics);
        flushGraphics();
    }

    public void moveBall() {
    }
}
Looks to me like you have a wrong path, or some problem with the file's permissions. Check whether some other program is editing it.
Have you tried a full absolute path?
Also, have you tried putting "ball.png" in the same folder and loading it without any folder path, like:
private String url = "ball.PNG";
If that works, it's definitely a path problem. Also note that resource names inside the JAR are case-sensitive: "/ball.PNG" will not find a file packaged as "ball.png".
On my J2ME device, when I open a file from the filesystem the path starts like this:
String path_start = "file:///a:/";
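If you suspect the resource is simply missing from the JAR (or has the wrong case), you can probe for it before decoding. A minimal sketch; ResourceProbe is a hypothetical helper and "/ball.PNG" is just the name from the question:

import java.io.IOException;
import java.io.InputStream;
import javax.microedition.lcdui.Image;

public final class ResourceProbe {
    // Probes the JAR for the resource before decoding it.
    // Resource names inside the JAR are case-sensitive:
    // "/ball.PNG" and "/ball.png" are different entries.
    public static Image loadOrReport(Object ctx, String name) throws IOException {
        InputStream in = ctx.getClass().getResourceAsStream(name);
        if (in == null) {
            System.out.println("Resource not found: " + name
                    + " -- check spelling, case, and JAR packaging");
            return null;
        }
        try {
            return Image.createImage(in); // decode from the verified stream
        } finally {
            in.close();
        }
    }
}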
In my app, images are saved to the gallery successfully after editing, but the quality is not up to the mark on a physical device: I get about 0.5 MP to 0.7 MP at best, while the same app in the emulator saves images at pretty good quality (about 1.5 MP to 3 MP). I haven't found the exact reason for this and would be glad of help finding it. My image-saving code is attached below.
public void saveAsFile(@NonNull final String str, @NonNull final SaveSettings saveSettings,
                       @NonNull final OnSaveListener onSaveListener) {
    Log.d(TAG, "Image Path: " + str);
    this.parentView.saveFilter((OnSaveBitmap) new OnSaveBitmap() {
        @Override
        public void onBitmapReady(Bitmap bitmap) {
            new AsyncTask<String, String, Exception>() {
                @Override
                public void onPreExecute() {
                    super.onPreExecute();
                    PhotoEditor.this.clearHelperBox();
                    PhotoEditor.this.parentView.setDrawingCacheEnabled(false);
                }

                @SuppressLint({"MissingPermission"})
                public Exception doInBackground(String... strArr) {
                    Bitmap bitmap;
                    try {
                        FileOutputStream fileOutputStream = new FileOutputStream(new File(str), false);
                        if (PhotoEditor.this.parentView != null) {
                            PhotoEditor.this.parentView.setDrawingCacheEnabled(true);
                            if (saveSettings.isTransparencyEnabled()) {
                                bitmap = BitmapUtil.removeTransparency(PhotoEditor.this.parentView.getDrawingCache());
                            } else {
                                bitmap = PhotoEditor.this.parentView.getDrawingCache();
                            }
                            bitmap.compress(Bitmap.CompressFormat.PNG, 100, fileOutputStream);
                        }
                        Log.d(PhotoEditor.TAG, "File Saved Successfully");
                        return null;
                    } catch (Exception e) {
                        e.printStackTrace();
                        Log.d(PhotoEditor.TAG, "Failed to save File");
                        return e;
                    }
                }

                @Override
                public void onPostExecute(Exception exc) {
                    super.onPostExecute(exc);
                    if (exc == null) {
                        if (saveSettings.isClearViewsEnabled()) {
                            PhotoEditor.this.clearAllViews();
                        }
                        onSaveListener.onSuccess(str);
                        return;
                    }
                    onSaveListener.onFailure(exc);
                }
            }.execute();
        }

        public void onFailure(Exception exc) {
            onSaveListener.onFailure(exc);
        }
    });
}
I have tried many approaches but couldn't find a solution.
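A possible explanation, hedged: getDrawingCache() returns a bitmap at the view's current on-screen size in pixels, so the saved resolution is bounded by the device's screen resolution, and an emulator with a larger virtual screen would then produce more megapixels than a small physical display. A minimal sketch of the same capture done without the drawing cache, using only standard android.graphics/android.view classes (ViewCapture is a hypothetical helper):

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

final class ViewCapture {
    // Renders a view into a fresh bitmap at its current on-screen size.
    // The output is still view.getWidth() x view.getHeight() pixels, which
    // is why a low-resolution device yields fewer megapixels than a
    // high-resolution emulator.
    static Bitmap capture(View view) {
        Bitmap bmp = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
                Bitmap.Config.ARGB_8888);
        view.draw(new Canvas(bmp));
        return bmp;
    }
}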
I've installed the PocketSphinx demo and it works fine under Ubuntu and Eclipse, but despite trying I can't work out how I would add recognition of multiple words.
All I want is for the code to recognize single words, which I can then switch() within the code, e.g. "up", "down", "left", "right". I don't want to recognize sentences, just single words.
Any help on this would be greatly appreciated. I have spotted other users having similar problems, but nobody seems to know the answer so far.
One thing which is baffling me is why do we need to use the "wakeup" constant at all?
private static final String KWS_SEARCH = "wakeup";
private static final String KEYPHRASE = "oh mighty computer";
...
recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);
What has wakeup got to do with anything?
I have made some progress (?) : Using addGrammarSearch I am able to use a .gram file to list my words, e.g. up,down,left,right,forwards,backwards, which seems to work well if all I say are those particular words. However, any other words will cause the system to match what is said to the "nearest" word from those stated. Ideally I don't want recognition to occur if words spoken are not in the .gram file...
Thanks to Nikolay's tip (see his answer below), I have developed the following code, which works fine and does not recognize words unless they're on the list. You can copy and paste this directly over the main class in the PocketSphinxDemo code:
import java.io.File;
import java.io.IOException;

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;
import android.widget.Toast;

import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.Hypothesis;
import edu.cmu.pocketsphinx.RecognitionListener;
import edu.cmu.pocketsphinx.SpeechRecognizer;

import static android.widget.Toast.makeText;
import static edu.cmu.pocketsphinx.SpeechRecognizerSetup.defaultSetup;

public class PocketSphinxActivity extends Activity implements RecognitionListener
{
    private static final String DIGITS_SEARCH = "digits";
    private SpeechRecognizer recognizer;

    @Override
    public void onCreate(Bundle state)
    {
        super.onCreate(state);
        setContentView(R.layout.main);
        ((TextView) findViewById(R.id.caption_text)).setText("Preparing the recognizer");
        try
        {
            Assets assets = new Assets(PocketSphinxActivity.this);
            File assetDir = assets.syncAssets();
            setupRecognizer(assetDir);
        }
        catch (IOException e)
        {
            // oops
        }
        ((TextView) findViewById(R.id.caption_text)).setText("Say up, down, left, right, forwards, backwards");
        reset();
    }

    @Override
    public void onPartialResult(Hypothesis hypothesis)
    {
    }

    @Override
    public void onResult(Hypothesis hypothesis)
    {
        ((TextView) findViewById(R.id.result_text)).setText("");
        if (hypothesis != null)
        {
            String text = hypothesis.getHypstr();
            makeText(getApplicationContext(), text, Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void onBeginningOfSpeech()
    {
    }

    @Override
    public void onEndOfSpeech()
    {
        reset();
    }

    private void setupRecognizer(File assetsDir)
    {
        File modelsDir = new File(assetsDir, "models");
        recognizer = defaultSetup().setAcousticModel(new File(modelsDir, "hmm/en-us-semi"))
                                   .setDictionary(new File(modelsDir, "dict/cmu07a.dic"))
                                   .setRawLogDir(assetsDir).setKeywordThreshold(1e-20f)
                                   .getRecognizer();
        recognizer.addListener(this);

        File digitsGrammar = new File(modelsDir, "grammar/digits.gram");
        recognizer.addKeywordSearch(DIGITS_SEARCH, digitsGrammar);
    }

    private void reset()
    {
        recognizer.stop();
        recognizer.startListening(DIGITS_SEARCH);
    }
}
Your digits.gram file should be something like:
up /1e-1/
down /1e-1/
left /1e-1/
right /1e-1/
forwards /1e-1/
backwards /1e-1/
You should experiment with the thresholds within the double slashes // for performance, where 1e-1 represents 0.1 (I think); the maximum seems to be 1.0.
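To branch on the recognized word (the switch() mentioned at the top of the question), the body of onResult can be extended like this; a plain-Java sketch with the action comments left as placeholders:

if (hypothesis != null)
{
    String text = hypothesis.getHypstr();
    switch (text) // switch on String requires Java 7+
    {
        case "up":        /* move up */        break;
        case "down":      /* move down */      break;
        case "left":      /* move left */      break;
        case "right":     /* move right */     break;
        case "forwards":  /* move forwards */  break;
        case "backwards": /* move backwards */ break;
        default:          /* not in the list: ignore */ break;
    }
}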
And it's 5.30pm so I can stop working now. Result.
You can use addKeywordSearch, which takes a file with keyphrases, one phrase per line with a threshold for each phrase in //, for example:
up /1.0/
down /1.0/
left /1.0/
right /1.0/
forwards /1e-1/
The threshold must be selected to avoid false alarms.
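Wiring that file in is then just a couple of calls; a sketch assuming the file is named digits.gram and sits in the synced assets directory, as in the code above:

// Register the keyword file as a named search, then start listening to it.
File keywordFile = new File(assetsDir, "digits.gram");
recognizer.addKeywordSearch("digits", keywordFile);
recognizer.startListening("digits");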
Working on updating Antinous' amendment to the PocketSphinx demo to allow it to run on Android Studio. This is what I have so far:
import java.io.File;
import java.io.IOException;

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.widget.TextView;
import android.widget.Toast;

import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.Hypothesis;
import edu.cmu.pocketsphinx.RecognitionListener;
import edu.cmu.pocketsphinx.SpeechRecognizer;

import static android.widget.Toast.makeText;
import static edu.cmu.pocketsphinx.SpeechRecognizerSetup.defaultSetup;

//Note: change MainActivity to PocketSphinxActivity for demo use...
public class MainActivity extends Activity implements RecognitionListener {

    private static final String DIGITS_SEARCH = "digits";
    private SpeechRecognizer recognizer;

    /* Used to handle permission request */
    private static final int PERMISSIONS_REQUEST_RECORD_AUDIO = 1;

    @Override
    public void onCreate(Bundle state) {
        super.onCreate(state);
        setContentView(R.layout.main);
        ((TextView) findViewById(R.id.caption_text))
                .setText("Preparing the recognizer");

        // Check if user has given permission to record audio
        int permissionCheck = ContextCompat.checkSelfPermission(
                getApplicationContext(), Manifest.permission.RECORD_AUDIO);
        if (permissionCheck != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this,
                    new String[]{Manifest.permission.RECORD_AUDIO},
                    PERMISSIONS_REQUEST_RECORD_AUDIO);
            return;
        }

        new AsyncTask<Void, Void, Exception>() {
            @Override
            protected Exception doInBackground(Void... params) {
                try {
                    Assets assets = new Assets(MainActivity.this);
                    File assetDir = assets.syncAssets();
                    setupRecognizer(assetDir);
                } catch (IOException e) {
                    return e;
                }
                return null;
            }

            @Override
            protected void onPostExecute(Exception result) {
                if (result != null) {
                    ((TextView) findViewById(R.id.caption_text))
                            .setText("Failed to init recognizer " + result);
                } else {
                    reset();
                }
            }
        }.execute();

        ((TextView) findViewById(R.id.caption_text)).setText("Say one, two, three, four, five, six...");
    }

    /**
     * In partial result we get quick updates about current hypothesis. In
     * keyword spotting mode we can react here, in other modes we need to wait
     * for final result in onResult.
     */
    @Override
    public void onPartialResult(Hypothesis hypothesis) {
        if (hypothesis == null || recognizer == null) {
            return;
        }
        //recognizer.rapidSphinxPartialResult(hypothesis.getHypstr());
        String text = hypothesis.getHypstr();
        if (text.equals(DIGITS_SEARCH)) {
            recognizer.cancel();
            performAction();
            recognizer.startListening(DIGITS_SEARCH);
        } else {
            //Toast.makeText(getApplicationContext(), "Partial result = " + text, Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void onResult(Hypothesis hypothesis) {
        ((TextView) findViewById(R.id.result_text)).setText("");
        if (hypothesis != null) {
            String text = hypothesis.getHypstr();
            makeText(getApplicationContext(), "Hypothesis: " + text, Toast.LENGTH_SHORT).show();
        } else {
            makeText(getApplicationContext(), "hypothesis = null", Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        recognizer.cancel();
        recognizer.shutdown();
    }

    @Override
    public void onBeginningOfSpeech() {
    }

    @Override
    public void onEndOfSpeech() {
        reset();
    }

    @Override
    public void onTimeout() {
    }

    private void setupRecognizer(File assetsDir) throws IOException {
        // The recognizer can be configured to perform multiple searches
        // of different kind and switch between them
        recognizer = defaultSetup()
                .setAcousticModel(new File(assetsDir, "en-us-ptm"))
                .setDictionary(new File(assetsDir, "cmudict-en-us.dict"))
                // .setRawLogDir(assetsDir).setKeywordThreshold(1e-20f)
                .getRecognizer();
        recognizer.addListener(this);

        File digitsGrammar = new File(assetsDir, "digits.gram");
        recognizer.addKeywordSearch(DIGITS_SEARCH, digitsGrammar);
    }

    private void reset() {
        recognizer.stop();
        recognizer.startListening(DIGITS_SEARCH);
    }

    @Override
    public void onError(Exception error) {
        ((TextView) findViewById(R.id.caption_text)).setText(error.getMessage());
    }

    public void performAction() {
        // do here whatever you want
        makeText(getApplicationContext(), "performAction done... ", Toast.LENGTH_SHORT).show();
    }
}
Caveat emptor: this is a work in progress. Check back later. Suggestions would be appreciated.
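One gap worth noting in the code above: after requestPermissions() the onCreate method returns, and nothing re-runs the initialization once the user grants access. A hedged sketch of the missing callback, reusing the request code defined above (recreate() is just the simplest way to re-enter onCreate; the official demo restarts its own setup method instead):

@Override
public void onRequestPermissionsResult(int requestCode,
        String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == PERMISSIONS_REQUEST_RECORD_AUDIO) {
        if (grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            recreate(); // re-run the setup path in onCreate
        } else {
            finish(); // no microphone permission, nothing useful to do
        }
    }
}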
I'm getting some buggy graphics behaviour when I run my AndEngine-based app on AVDs with the Android 7.0 (Nougat) image in Android Studio 2.2.2. The result is normal when I use Android 6 (link), but it is translated, enlarged and wrapped in the x-dimension, with distorted colours when I use Android 7 (link).
Has anyone had experience with this type of graphics distortion? Could it be caused by Android 7.0 not being backward-compatible with something in AndEngine/OpenGL, or a problem with the Nougat image in Android Studio's AVD?
The minimal app I used to recreate the behaviour uses AndEngine (GLES2-AnchorCenter) to display a sprite on the main scene. I've tested it with several AVDs and it is consistently buggy with Nougat regardless of the device.
public class MainActivity extends SimpleLayoutGameActivity {

    private static int CAMERA_WIDTH = 480;
    private static int CAMERA_HEIGHT = 800;
    private Scene mScene;
    private ITextureRegion myTexture;

    @Override
    protected int getLayoutID() {
        return R.layout.activity_main;
    }

    @Override
    protected int getRenderSurfaceViewID() {
        return R.id.gameview;
    }

    @Override
    public void onCreate(Bundle pSavedInstanceState) {
        super.onCreate(pSavedInstanceState);
    }

    @Override
    public Engine onCreateEngine(EngineOptions pEngineOptions) {
        return new Engine(pEngineOptions);
    }

    @Override
    public EngineOptions onCreateEngineOptions() {
        ScreenOrientation orientation = ScreenOrientation.PORTRAIT_FIXED;
        EngineOptions en = new EngineOptions(true, orientation,
                new FillResolutionPolicy(), new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT));
        return en;
    }

    @Override
    public synchronized void onPauseGame() {
        super.onPauseGame();
    }

    @Override
    public synchronized void onResumeGame() {
        super.onResumeGame();
    }

    @Override
    protected void onCreateResources() {
        BitmapTextureAtlasTextureRegionFactory.setAssetBasePath("gfx/");
        final BuildableBitmapTextureAtlas gameTextureAtlasBIPMA2 =
                new BuildableBitmapTextureAtlas(this.getTextureManager(),
                        136, 136, BitmapTextureFormat.RGBA_4444,
                        TextureOptions.BILINEAR_PREMULTIPLYALPHA);
        myTexture = BitmapTextureAtlasTextureRegionFactory.createFromAsset(
                gameTextureAtlasBIPMA2, this.getAssets(), "redbomb.png");
        try {
            gameTextureAtlasBIPMA2.build(
                    new BlackPawnTextureAtlasBuilder<IBitmapTextureAtlasSource, BitmapTextureAtlas>(2, 0, 2));
            gameTextureAtlasBIPMA2.load();
        } catch (Exception e) {
            throw new RuntimeException("Error loading BIPMA2", e);
        }
    }

    @Override
    protected Scene onCreateScene() {
        this.mScene = new Scene();
        // Add a sprite
        float diameter = CAMERA_WIDTH / 1.5f;
        this.mScene.attachChild(new Sprite(CAMERA_WIDTH / 2f, CAMERA_HEIGHT / 2f,
                diameter, myTexture.getHeight() * diameter / myTexture.getWidth(),
                myTexture, getVertexBufferObjectManager()));
        return this.mScene;
    }
}
UPDATE: I have isolated the cause to the use of RGBA_4444 as the texture pixel format; when changed to RGBA_8888 the graphics are normal. Still, if anyone knows how to fix it so that I can use RGBA_4444...
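For reference, the workaround is the one-line change in onCreateResources() above, swapping the pixel format:

// Workaround: RGBA_8888 instead of RGBA_4444 renders correctly on the
// Nougat emulator image (at the cost of twice the texture memory).
final BuildableBitmapTextureAtlas gameTextureAtlasBIPMA2 =
        new BuildableBitmapTextureAtlas(this.getTextureManager(),
                136, 136, BitmapTextureFormat.RGBA_8888,
                TextureOptions.BILINEAR_PREMULTIPLYALPHA);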
I have two questions.
The first is about updating the UI; the second is that when I try to connect to the camera to get an MJPEG stream and call getResponseCode(), the app locks up there. The MDS shows a lot of data transferring.
I have some classes like the following:
abstract class Http extends Thread {
    public abstract String getUrl();
    public abstract String getBase64Encode();
    public abstract void onReturn(int responseCode, InputStream is, int length);
    protected abstract void onError(Exception e);
}
CameraHttp extends Http and MjpegHttp extends CameraHttp.
Http connects to a URL, which is the JPEG or MJPEG camera address.
I have a Camera class. It starts a connection with the overridden method mjpegconnection.go();
I also have a static bitmap on the ViewCam screen, which extends MainScreen.
After it starts:
try {
    url = getUrl();
    queryString = encodeURL(queryString);
    byte postmsg[] = queryString.getBytes("UTF-8");
    httpConnection = (HttpConnection) Connector.open(url
            + ";deviceside=false", Connector.READ_WRITE);
    httpConnection.setRequestMethod(HttpConnection.GET);
    httpConnection.setRequestProperty("Authorization", getBase64Encode());
    // note: writing a request body on a GET is unusual and may be
    // part of why the gateway stalls on getResponseCode()
    os = httpConnection.openDataOutputStream();
    for (int i = 0; i < postmsg.length; i++) {
        os.write(postmsg[i]);
    }
    if (!cancel) {
        System.out.println(httpConnection.getURL()
                + " *****" + httpConnection.getPort());
        System.out.println("before onReturn "
                + httpConnection.getResponseCode());
        onReturn(httpConnection.getResponseCode(),
                httpConnection.openInputStream(),
                (int) httpConnection.getLength());
        System.out.println("after onReturn");
    }
    os.close();
    httpConnection.close();
} catch (Exception e) {
    System.out.println("error " + e.getMessage());
    try {
        httpConnection.close();
        Thread.sleep(60);
    } catch (Exception ie) {
    }
    onError(e);
}
After that, it does something like this:
// decides between mjpeg and jpeg stream:
// if it is mjpeg, direct to the parser;
// else it sets the image with setImage() and returns to the connection with go()
public void parse(InputStream is, int length) {
    try {
        if (!type.isMjpegStream()) {
            setImage(is, length);
            System.gc();
            StaticVar.ActiveCam.setConnected(true);
        } else {
            if (parser == null) {
                parser = new JpegParser(is, this);
            } else {
                parser.setInputSteam(is, this);
            }
            parser.parse();
            is.close();
        }
    } catch (Exception e) {
        // swallowing exceptions here hides failures; at least log them
        System.out.println("parse error: " + e);
    }
}
and
public void setImage(InputStream is, int length) {
    byte[] raw = new byte[length];
    try {
        // read() may return fewer bytes than requested;
        // readFully() blocks until the whole frame is read
        new DataInputStream(is).readFully(raw);
        currentImage = Bitmap.createBitmapFromBytes(raw, 0, raw.length, 1);
        ViewCam.ViewCam = currentImage; // static var.
    } catch (IOException e) {
        System.out.println("setImage failed");
        e.printStackTrace();
    }
}
How can I repaint the screen to show the bitmap?
And here is my ViewCam:
public class ViewCam extends MainScreen {
    Header header;
    String headerString;
    public static Bitmap ViewCam; // cam image shown
    private static Thread My; // runs connection

    void OnStart() {
        My = new Thread() {
            public void run() {
                System.out.println("ONSTART");
                StaticVar.ActiveCam.go();
            };
        };
        My.start();

        Bitmap bitmap = Bitmap.getBitmapResource("res/main.png");
        Bitmap bmp2 = ResizeImage.resizeBitmap(bitmap, Display.getWidth(),
                Display.getHeight());
        Background bg = BackgroundFactory.createBitmapBackground(bmp2);
        this.setBackground(bg);
        this.getMainManager().setBackground(bg);
    }

    public ViewCam() {
        StaticVar.ActiveCam.getIp();
        OnStart();
        headerString = "Cam View";
        header = new Header("res/bartop.png", headerString, 0);
        add(header);
        ViewCam = Bitmap.getBitmapResource("res/spexco_splash.png");
        ViewCam = ResizeImage.bestFit(ViewCam, Display.getWidth(),
                Display.getHeight());
        BitmapField bf = new BitmapField(ViewCam);
        add(bf);
    }
}
Try Screen.invalidate()
public void invalidate(int x, int y, int width, int height)
Invalidates a region of this screen.
This method marks a region of this screen as needing a repaint. The repainting is handled later by the main event dispatch thread.
Note: Any thread can safely invoke this method; it does not need to synchronize on the event lock.
Overrides:
invalidate in class Manager
Parameters:
x - Left edge of the region in ContentRect coordinates.
y - Top edge of the region in ContentRect coordinates.
width - Width (in pixels) of the region.
height - Height (in pixels) of the region.
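Applied to the code in the question, the frame update and the repaint can be funneled through one helper on ViewCam. A sketch only: the instance and bitmapField references are assumed fields that the posted code would still need to add:

// Hypothetical helper on ViewCam: swap in the new frame, then repaint.
public static void showFrame(final Bitmap frame) {
    UiApplication.getUiApplication().invokeLater(new Runnable() {
        public void run() {
            instance.bitmapField.setBitmap(frame); // replace the old frame
            instance.invalidate(); // schedule the repaint
        }
    });
}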
How do I create and display an image in j2me application?
And in which folder can I put that image in my application?
This link has exactly what you are looking for to get started.
Basically, to create the image, you call Image.createImage():
Image img = Image.createImage("/imageName.png");
If it is in a sub-folder in the Jar:
Image img = Image.createImage("/subDir/imageName.png");
To display the image, you need to paint it to a Canvas through a Graphics instance that is tied to the Canvas (better visualized in the link above).
public void paint(Graphics g) {
...
g.drawImage(img, 0, 0, Graphics.TOP | Graphics.LEFT);
....
}
You could also use the Graphics.drawRegion function, but here is a link to the JavaDocs for J2ME for you to look through to see what is best for your needs.
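For completeness, a minimal drawRegion call looks like this; a sketch where the 16x16 source region and the destination are made-up values:

// Draw the 16x16 top-left region of img at (0, 0), untransformed.
g.drawRegion(img, 0, 0, 16, 16, Sprite.TRANS_NONE,
        0, 0, Graphics.TOP | Graphics.LEFT);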
To draw an Image in a Java ME MIDlet you need a Canvas to paint it on. You can do it as follows:
First, you have to place the original image file inside your package (usually inside "res" or one of its subdirectories).
Secondly, you need to create a class extending Canvas and implement the paint method:
import java.io.IOException;
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;
import javax.microedition.lcdui.Image;

public class MyCanvas extends Canvas {

    private Image image;

    public MyCanvas() {
        try {
            image = Image.createImage("picture.png");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    protected void paint(Graphics g) {
        g.drawImage(image, 10, 10, Graphics.TOP | Graphics.LEFT);
    }
}
Now you need to create an instance of this class and tell the MIDlet to display it, for example:
import javax.microedition.lcdui.Display;
import javax.microedition.midlet.MIDlet;
import javax.microedition.midlet.MIDletStateChangeException;

public class MyMIDlet extends MIDlet {

    public MyMIDlet() {
    }

    protected void destroyApp(boolean unconditional)
            throws MIDletStateChangeException {
    }

    protected void pauseApp() {
    }

    protected void startApp() throws MIDletStateChangeException {
        Display.getDisplay(this).setCurrent(new MyCanvas());
    }
}
Remember that this way the Canvas is painted only once; if you change something, you need to call its repaint() method, as in the sketch below.
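For instance, a sketch where x and y are assumed coordinate fields added to MyCanvas (and used in paint() instead of the fixed 10, 10):

// Move the image and schedule a repaint of the whole canvas.
public void moveImage(int dx, int dy) {
    x += dx;
    y += dy;
    repaint(); // asks the system to call paint() again
}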
This source code builds on previously posted comments:
import java.io.*;
import javax.microedition.io.*;
import javax.microedition.io.file.FileConnection;
import javax.microedition.lcdui.*;
import javax.microedition.midlet.*;

public class ImageLoader extends MIDlet
        implements CommandListener, Runnable {

    private Display mDisplay;
    private Form mForm;

    public ImageLoader() {
        mForm = new Form("Connecting...");
        mForm.addCommand(new Command("Exit", Command.EXIT, 0));
        mForm.setCommandListener(this);
    }

    public void startApp() {
        if (mDisplay == null) mDisplay = Display.getDisplay(this);
        mDisplay.setCurrent(mForm);
        Thread t = new Thread(this);
        t.start();
    }

    public void pauseApp() {}

    public void destroyApp(boolean unconditional) {}

    public void commandAction(Command c, Displayable s) {
        if (c.getCommandType() == Command.EXIT)
            notifyDestroyed();
    }

    public void run() {
        FileConnection fc = null;
        DataInputStream in = null;
        DataOutputStream out = null;
        try {
            fc = (FileConnection) Connector.open("file:///root1/i.PNG");
            int length = (int) fc.fileSize(); // possible loss of precision on very large files
            byte[] data = null;
            if (length != -1) {
                data = new byte[length];
                in = new DataInputStream(fc.openInputStream());
                in.readFully(data);
            } else {
                // unknown size: read in chunks, growing the buffer as needed
                int chunkSize = 512;
                int index = 0;
                int readLength = 0;
                in = new DataInputStream(fc.openInputStream());
                data = new byte[chunkSize];
                do {
                    if (data.length < index + chunkSize) {
                        byte[] newData = new byte[index + chunkSize];
                        System.arraycopy(data, 0, newData, 0, data.length);
                        data = newData;
                    }
                    readLength = in.read(data, index, chunkSize);
                    if (readLength > 0) index += readLength; // read() returns -1 at EOF
                } while (readLength == chunkSize);
                length = index;
            }
            Image image = Image.createImage(data, 0, length);
            ImageItem imageItem = new ImageItem(null, image, 0, null);
            mForm.append(imageItem);
            mForm.setTitle("Done.");

            fc.close(); // close the source before opening the destination
            fc = (FileConnection) Connector.open("file:///root1/x.PNG");
            if (!fc.exists()) {
                try {
                    fc.create();
                } catch (Exception ce) {
                    System.out.print("Create Error: " + ce);
                }
            }
            out = new DataOutputStream(fc.openOutputStream());
            out.write(data);
        } catch (IOException ioe) {
            StringItem stringItem = new StringItem(null, ioe.toString());
            mForm.append(stringItem);
            mForm.setTitle("Done.");
        } finally {
            try {
                if (in != null) in.close();
                if (out != null) out.close();
                if (fc != null) fc.close();
            } catch (IOException ioe) {}
        }
    }
}
The code is modified from the link Fostah provided here. It opens an image, displays it, then saves it as x.PNG instead of i.PNG using FileConnection. The tricky thing to watch for is where the file is being saved to and loaded from. If you're using the J2ME Wireless Toolkit with NetBeans, the folder will be displayed in the output window when you run the mobile app; it will be something like temp.DefaultColorPhone/filesystem/root1. That is where you will have to put the image. I'm not sure how to have the temp environment created with the image by default, which means you have to start the mobile app, check where the temp root1/ is located in your IDE, drop the image into that folder, and then proceed with running the ImageLoader application. I'll try to find out how to automate this by posting a question. Also, start with a small image, around 50x50 (bigger images may cause problems).
It opens an image, displays it, then saves it as x.PNG instead of i.PNG using FileConnection. The tricky thing to watch for is where the file is being saved/loaded from. If your using J2meWTK with Netbeans, then the folder will be displayed in the output window when you run the mobile app. The folder will be something like temp.DefaultColorPhone/filesystem/root1 . That is where you will have to have an image. I'm not sure how to have the temp environment created with the image by default. That means you have to start the mobile app, check where the temp root1/ is located, in your IDE, then drop the image into the folder, then proceed with running the ImageLoader application. I'll try to find out how to automate this by posting a question. Also, Start with a small image, 50x50 (bigger images may cause problems).