Hi, I am building an app that will take incoming audio from the mic and compare it with a stored sound file. At the moment I am trying to get to grips with what the data from AudioRecord looks like when saved to an array of bytes. My problem is that the values returned are all zero. I don't know if I am perhaps not using or setting up AudioRecord properly. Here is my code:
import android.app.Activity;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.TextView;
public class SnoreAlarmActivity extends Activity implements OnClickListener {
/** Called when the activity is first created. */
Button start, stop;
TextView display;
Boolean rec = false;
AudioRecord snore;
byte[] arrayOfByte = new byte[16 * 1024 / 8];
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
start = (Button) findViewById(R.id.bStart);
stop = (Button) findViewById(R.id.bStop);
display = (TextView) findViewById(R.id.tAnswer);
start.setOnClickListener(this);
stop.setOnClickListener(this);
int i = AudioRecord.getMinBufferSize(44100, 16, 2);
snore = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100, 16, 2, i); // from MIC, sample rate 44100, CHANNEL_IN_MONO, 16-bit encoding, buffer size i
new Record().start();
}
@Override
public void onClick(View arg0) {
// TODO Auto-generated method stub
switch (arg0.getId()) {
case R.id.bStart:
rec = true;
break;
case R.id.bStop:
rec = false;
snore.stop();
snore.release();
for (int i = 0; i < 100; i++) {
int x = (int) arrayOfByte[i];
Log.w("Tag", "" + x);
}
break;
}
}
class Record extends Thread {
Record() {
}
@Override
public void run() {
// TODO Auto-generated method stub
while (rec) {
snore.startRecording();
snore.read(arrayOfByte, 0, (16 * 1024 / 8));
}
}
}
}
So my question is: why am I getting zeros?
*Also, I know I should store the data in a short because of the 16-bit encoding; I will do that later. Right now I am just trying to understand the values returned by AudioRecord.
Recording through the MIC gave me a headache :)
Hope this helps, because I lost a whole day playing with it.
Here are the steps that must be satisfied for the MIC to work:
The Manifest.xml MUST contain:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
The AudioRecord buffer should be sized with AudioRecord.getMinBufferSize (you did that right).
If the MIC wasn't released, you WON'T be able to bind to the MIC again!
Restarting your phone will reset the MIC.
Here is a short example of how I did it.
private android.media.AudioRecord aRecorder = null;
private Boolean breakLoop = false;
private byte[] buffer;
public Boolean StartRecording() {
int freq = 22050;
try
{
prepareWaveFile(MyRandomAccessFile); // opens the file and writes the WAV header
int bufferSize = android.media.AudioRecord.getMinBufferSize(freq, android.media.AudioFormat.CHANNEL_IN_MONO, android.media.AudioFormat.ENCODING_PCM_16BIT);
if(bufferSize == AudioRecord.ERROR_BAD_VALUE){
Log.e(LOG_TAG, "Min buff size error");
return false;
}
buffer = new byte[bufferSize]; // allocate only after the size has been validated
aRecorder = new android.media.AudioRecord(MediaRecorder.AudioSource.MIC, freq, android.media.AudioFormat.CHANNEL_IN_MONO, android.media.AudioFormat.ENCODING_PCM_16BIT, bufferSize);
aRecorder.startRecording();
breakLoop = false;
int TotalSize = 0; // how much data has been recorded so far
while (TotalSize < freq * 2 * 1 * 30) { //SampleRate * BytesPerSample * NumberOfChannels * seconds
int bufferReadResult = aRecorder.read(buffer, 0, bufferSize);
TotalSize+=bufferReadResult; // filesize failsafe
if (breakLoop) break; //if other thread stops recording
if (bufferReadResult>0) MyRandomAccessFile.write(buffer, 0, bufferReadResult); //write into some file ...
}
aRecorder.stop();
aRecorder.release(); //this is mandatory !
updateWaveFile(MyRandomAccessFile); //updates wave header
MyRandomAccessFile.close();
return true;
} catch (Exception e)
{
Log.e(LOG_TAG, "StartRecording: LOOP " + e);
try
{
aRecorder.stop();
} catch(Exception ex) {}
try
{
aRecorder.release();
} catch(Exception ex) {}
try
{
MyRandomAccessFile.close();
} catch(Exception ex) {}
return false;
}
}
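Since the question notes that 16-bit PCM really belongs in a short[], here is a minimal sketch of reading samples that way (my own addition, not part of the original answer; it assumes the same MIC source and mono/16-bit settings as the example above, and the variable names are placeholders):
int freq = 22050;
int minSize = AudioRecord.getMinBufferSize(freq, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, freq, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minSize);
short[] samples = new short[minSize / 2]; // two bytes per 16-bit sample
rec.startRecording(); // start once, before the read loop
int read = rec.read(samples, 0, samples.length); // each element is one signed 16-bit sample
rec.stop();
rec.release();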
Are you on the emulator? I've heard that the mic doesn't work on that.
I'm pretty new to Android Studio. I noticed that my program has a very severe performance hiccup, and I believe it slows down further every time I run the app. I think I have a runaway thread, and I will attach pictures at the end of my post. I could really use some help. The first picture shows an example of the thread, and the second picture shows the threads after five minutes or so of waiting. I attached two classes: CameraSurfaceView runs the code, while FaceDetectionThread creates the thread.
package com.example.phliip_vision;
import java.util.ArrayList;
import java.util.List;
import android.annotation.SuppressLint;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PointF;
import android.graphics.Rect;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.media.FaceDetector;
import android.media.FaceDetector.Face;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;
import com.example.phliip_vision.Point;
import com.example.phliip_vision.MeasurementStepMessage;
import com.example.phliip_vision.MessageHUB;
import com.example.phliip_vision.Util;
public class CameraSurfaceView extends SurfaceView implements Callback,
Camera.PreviewCallback {
public static final int CALIBRATION_DISTANCE_A4_MM = 294;
public static final int CALIBRATION_MEASUREMENTS = 10;
public static final int AVERAGE_THREASHHOLD = 5;
private static final String TAG = "CameraSurfaceView";
/**
* Measured distance at calibration point
*/
private float _distanceAtCalibrationPoint = -1;
private float _currentAvgEyeDistance = -1;
// private int _facesFoundInMeasurement = -1;
/**
* in cm
*/
private float _currentDistanceToFace = -1;
private final SurfaceHolder mHolder;
private Camera mCamera;
private Face _foundFace = null;
private int _threashold = CALIBRATION_MEASUREMENTS;
private FaceDetectionThread _currentFaceDetectionThread;
private List<Point> _points;
protected final Paint _middlePointColor = new Paint();
protected final Paint _eyeColor = new Paint();
private Size _previewSize;
// private boolean _measurementStartet = false;
private boolean _calibrated = false;
private boolean _calibrating = false;
private int _calibrationsLeft = -1;
public CameraSurfaceView(final Context context, final AttributeSet attrs) {
super(context, attrs);
_middlePointColor.setARGB(100, 200, 0, 0);
_middlePointColor.setStyle(Paint.Style.FILL);
_middlePointColor.setStrokeWidth(2);
_eyeColor.setColor(Color.GREEN);
mHolder = getHolder();
mHolder.addCallback(this);
}
public void setCamera(final Camera camera) {
mCamera = camera;
if (mCamera != null) {
requestLayout();
Log.d(TAG, "mCamera RANNNNNNN!!!!");
Camera.Parameters params = mCamera.getParameters();
camera.setDisplayOrientation(90);
List<String> focusModes = params.getSupportedFocusModes();
if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
Log.d(TAG, "FOCUS_MODE_AUTO RANNNNNNN!!!!");
// set the focus mode
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
// set Camera parameters
mCamera.setParameters(params);
}
}
}
/**
* Variables for the onDraw method, in order to prevent variable allocation
* to slow down the sometimes heavily called onDraw method
*/
private final PointF _middlePoint = new PointF();
private final Rect _trackingRectangle = new Rect();
private final static int RECTANGLE_SIZE = 20;
private boolean _showEyes = false;
private boolean _showTracking = true;
@SuppressLint("DrawAllocation")
@Override
protected void onDraw(final Canvas canvas) {
// super.onDraw(canvas);
if (_foundFace != null) {
_foundFace.getMidPoint(_middlePoint);
Log.d(TAG, "_middlePoint RANNNNNNN!!!!");
Log.i("Camera", _middlePoint.x + " : " + _middlePoint.y);
// portrait mode!
float heightRatio = getHeight() / (float) _previewSize.width;
float widthRatio = getWidth() / (float) _previewSize.height;
Log.i("Drawcall", _middlePoint.x + " : " + _middlePoint.y);
int realX = (int) (_middlePoint.x * widthRatio);
int realY = (int) (_middlePoint.y * heightRatio);
Log.i("Drawcall", "Real :" + realX + " : " + realY);
int halfEyeDist = (int) (widthRatio * _foundFace.eyesDistance() / 2);
if (_showTracking) {
// Middle point
Log.d(TAG, "_showTracking RANNNNNNN!!!!");
_trackingRectangle.left = realX - RECTANGLE_SIZE;
_trackingRectangle.top = realY - RECTANGLE_SIZE;
_trackingRectangle.right = realX + RECTANGLE_SIZE;
_trackingRectangle.bottom = realY + RECTANGLE_SIZE;
canvas.drawRect(_trackingRectangle, _middlePointColor);
}
if (_showEyes) {
// Left eye
Log.d(TAG, "_showEyes RANNNNNNN!!!!");
_trackingRectangle.left = realX - halfEyeDist - RECTANGLE_SIZE;
_trackingRectangle.top = realY - RECTANGLE_SIZE;
_trackingRectangle.right = realX - halfEyeDist + RECTANGLE_SIZE;
_trackingRectangle.bottom = realY + RECTANGLE_SIZE;
canvas.drawRect(_trackingRectangle, _eyeColor);
// Right eye
_trackingRectangle.left = realX + halfEyeDist - RECTANGLE_SIZE;
_trackingRectangle.top = realY - RECTANGLE_SIZE;
_trackingRectangle.right = realX + halfEyeDist + RECTANGLE_SIZE;
_trackingRectangle.bottom = realY + RECTANGLE_SIZE;
canvas.drawRect(_trackingRectangle, _eyeColor);
}
}
}
public void reset() {
Log.d(TAG, "reset RANNNNNNN!!!!");
_distanceAtCalibrationPoint = -1;
_currentAvgEyeDistance = -1;
_calibrated = false;
_calibrating = false;
_calibrationsLeft = -1;
}
/**
* Sets the current eye distance to be the distance to a piece of A4 paper,
* e.g. 29.7 cm
*/
public void calibrate() {
Log.d(TAG, "calibrate RANNNNNNN!!!!");
if (!_calibrating || !_calibrated) {
_points = new ArrayList<>();
_calibrating = true;
_calibrationsLeft = CALIBRATION_MEASUREMENTS;
_threashold = CALIBRATION_MEASUREMENTS;
}
}
private void doneCalibrating() {
Log.d(TAG, "doneCalibrating RANNNNNNN!!!!");
_calibrated = true;
_calibrating = false;
_currentFaceDetectionThread = null;
// _measurementStartet = false;
_threashold = AVERAGE_THREASHHOLD;
_distanceAtCalibrationPoint = _currentAvgEyeDistance;
MessageHUB.get().sendMessage(MessageHUB.DONE_CALIBRATION, null);
}
public boolean isCalibrated() {
Log.d(TAG, "isCalibrated RANNNNNNN!!!!");
return _calibrated || _calibrating;
}
public void showMiddleEye(final boolean on) {
Log.d(TAG, "showMiddleEye RANNNNNNN!!!!");
_showTracking = on;
}
public void showEyePoints(final boolean on) {
Log.d(TAG, "showEyePoints RANNNNNNN!!!!");
_showEyes = on;
}
private void updateMeasurement(final FaceDetector.Face currentFace) {
if (currentFace == null) {
Log.d(TAG, "updateMeasurement RANNNNNNN!!!!");
// _facesFoundInMeasurement--;
return;
}
_foundFace = _currentFaceDetectionThread.getCurrentFace();
_points.add(new Point(_foundFace.eyesDistance(),
CALIBRATION_DISTANCE_A4_MM
* (_distanceAtCalibrationPoint / _foundFace
.eyesDistance())));
while (_points.size() > _threashold) {
_points.remove(0);
Log.d(TAG, "Removing points RANNNNNNN!!!!");
}
float sum = 0;
for (Point p : _points) {
sum += p.getEyeDistance();
Log.d(TAG, "adding points RANNNNNNN!!!!");
}
_currentAvgEyeDistance = sum / _points.size();
_currentDistanceToFace = CALIBRATION_DISTANCE_A4_MM
* (_distanceAtCalibrationPoint / _currentAvgEyeDistance);
_currentDistanceToFace = Util.MM_TO_CM(_currentDistanceToFace);
MeasurementStepMessage message = new MeasurementStepMessage();
message.setConfidence(currentFace.confidence());
message.setCurrentAvgEyeDistance(_currentAvgEyeDistance);
message.setDistToFace(_currentDistanceToFace);
message.setEyesDistance(currentFace.eyesDistance());
message.setMeasurementsLeft(_calibrationsLeft);
message.setProcessTimeForLastFrame(_processTimeForLastFrame);
MessageHUB.get().sendMessage(MessageHUB.MEASUREMENT_STEP, message);
}
private long _lastFrameStart = System.currentTimeMillis();
private float _processTimeForLastFrame = -1;
@Override
public void onPreviewFrame(final byte[] data, final Camera camera) {
Log.d(TAG, "onPreviewFrame RANNNNNNN!!!!" + _calibrationsLeft);
if (_calibrationsLeft == -1)
return;
if (_calibrationsLeft > 0) {
// Doing calibration !
Log.d(TAG, "_calibrationLeft RANNNNNNN!!!!" + _calibrationsLeft);
if (_currentFaceDetectionThread != null
&& _currentFaceDetectionThread.isAlive()) {
Log.d(TAG, "_currentFaceDectectionThread RANNNNNNN!!!!" + _currentFaceDetectionThread);
// Drop Frame
return;
}
// No face detection started or already finished
_processTimeForLastFrame = System.currentTimeMillis()
- _lastFrameStart;
_lastFrameStart = System.currentTimeMillis();
if (_currentFaceDetectionThread != null) {
Log.d(TAG, "_calibrationLeft-- RANNNNNNN!!!!");
_calibrationsLeft--;
updateMeasurement(_currentFaceDetectionThread.getCurrentFace());
if (_calibrationsLeft == 0) {
Log.d(TAG, "Calibrating done RANNNNNNN!!!!");
doneCalibrating();
invalidate();
return;
}
}
_currentFaceDetectionThread = new FaceDetectionThread(data,
_previewSize);
_currentFaceDetectionThread.start();
invalidate();
} else {
// Simple Measurement
if (_currentFaceDetectionThread != null
&& _currentFaceDetectionThread.isAlive()) {
Log.d(TAG, "Dropping frames RANNNNNNN!!!!");
// Drop Frame
return;
}
// No face detection started or already finished
_processTimeForLastFrame = System.currentTimeMillis()
- _lastFrameStart;
_lastFrameStart = System.currentTimeMillis();
if (_currentFaceDetectionThread != null)
updateMeasurement(_currentFaceDetectionThread.getCurrentFace());
Log.d(TAG, "Updating measurements RANNNNNNN!!!!");
_currentFaceDetectionThread = new FaceDetectionThread(data,
_previewSize);
_currentFaceDetectionThread.start();
Log.d(TAG, "invalidate RANNNNNNN!!!!");
invalidate();
}
}
/*
* SURFACE METHODS, TO CREATE AND RELEASE SURFACE THE CORRECT WAY.
*
* @see
* android.view.SurfaceHolder.Callback#surfaceCreated(android.view.SurfaceHolder
* )
*/
@Override
public void surfaceCreated(final SurfaceHolder holder) {
synchronized (this) {
// This allows us to make our own drawBitmap
this.setWillNotDraw(false);
}
}
@Override
public void surfaceDestroyed(final SurfaceHolder holder) {
mCamera.release();
mCamera = null;
}
@Override
public void surfaceChanged(final SurfaceHolder holder, final int format,
final int width, final int height) {
if (mHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
Parameters parameters = mCamera.getParameters();
_previewSize = parameters.getPreviewSize();
// mCamera.setDisplayOrientation(90);
// mCamera.setParameters(parameters);
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
mCamera.setPreviewCallback(this);
} catch (Exception e) {
Log.d("This", "Error starting camera preview: " + e.getMessage());
}
}
}
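One observation of my own, not from the original post: the posted surfaceDestroyed() releases the camera without first clearing the preview callback, so onPreviewFrame() (and the FaceDetectionThread instances it spawns) can keep firing right up until the release. A hedged sketch of a more defensive teardown, using the same field names as the class above:
@Override
public void surfaceDestroyed(final SurfaceHolder holder) {
    if (mCamera != null) {
        mCamera.setPreviewCallback(null); // stop preview frames (and new detection threads) from arriving
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
}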
Here is the other code.
package com.example.phliip_vision;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera.Size;
import android.media.FaceDetector;
import android.media.FaceDetector.Face;
import android.util.Log;
public class FaceDetectionThread extends Thread {
public static final String FACEDETECTIONTHREAD_TAG = "FaceDetectionThread_Tag";
private static final String TAG = "FaceDetectionThread";
private Face _currentFace;
private final byte[] _data;
private final Size _previewSize;
private Bitmap _currentFrame;
public FaceDetectionThread(final byte[] data, final Size previewSize) {
Log.d(TAG, "What are we waiting on in FaceDetectionThread????");
_data = data;
_previewSize = previewSize;
}
public Face getCurrentFace() {
Log.d(TAG, "What are we waiting on in Current faces????");
return _currentFace;
}
public Bitmap getCurrentFrame() {
return _currentFrame;
}
/**
* bla bla bla
*/
@Override
public void run() {
long t = System.currentTimeMillis();
YuvImage yuvimage = new YuvImage(_data, ImageFormat.NV21,
_previewSize.width, _previewSize.height, null);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
if (!yuvimage.compressToJpeg(new Rect(0, 0, _previewSize.width,
_previewSize.height), 100, baos)) {
Log.e("Camera", "compressToJpeg failed");
}
Log.i("Timing", "Compression finished: "
+ (System.currentTimeMillis() - t));
t = System.currentTimeMillis();
BitmapFactory.Options bfo = new BitmapFactory.Options();
bfo.inPreferredConfig = Bitmap.Config.RGB_565;
_currentFrame = BitmapFactory.decodeStream(new ByteArrayInputStream(
baos.toByteArray()), null, bfo);
Log.i("Timing", "Decode Finished: " + (System.currentTimeMillis() - t));
t = System.currentTimeMillis();
// Rotate the bitmap so it suits our portrait mode
Matrix matrix = new Matrix();
matrix.postRotate(90);
matrix.preScale(-1, 1);
// We rotate the same Bitmap
_currentFrame = Bitmap.createBitmap(_currentFrame, 0, 0,
_previewSize.width, _previewSize.height, matrix, false);
Log.i("Timing",
"Rotate, Create finished: " + (System.currentTimeMillis() - t));
t = System.currentTimeMillis();
if (_currentFrame == null) {
Log.e(FACEDETECTIONTHREAD_TAG, "Could not decode Image");
return;
}
FaceDetector d = new FaceDetector(_currentFrame.getWidth(),
_currentFrame.getHeight(), 1);
Face[] faces = new Face[1];
d.findFaces(_currentFrame, faces);
Log.i("Timing",
"FaceDetection finished: " + (System.currentTimeMillis() - t));
t = System.currentTimeMillis();
_currentFace = faces[0];
Log.d(FACEDETECTIONTHREAD_TAG, "Found: " + faces[0] + " Faces");
}
}
(Screenshots of the thread list, initially and after roughly five minutes, were attached here.)
How to fix this code using multithreading?
It's working, but I need to know how to add a thread to this code; I think that is why the progress bar is not updating progressively!
public void copyfile(ActionEvent event){
try {
File fileIn = new File(filepath);
long length = fileIn.length();
long counter = 0;
double r;
double res=(counter/length);
filename=fieldname.getText();
FileInputStream from=new FileInputStream(filepath);
FileOutputStream to=new FileOutputStream("C:\\xampp\\htdocs\\videos\\"+filename+".mp4");
byte [] buffer = new byte[4096];
int bytesRead=0;
while( (r=bytesRead=from.read(buffer))!= 1){
progressbar.setProgress(counter/length);
counter += r*100;
to.write(buffer, 0, bytesRead);
System.out.println("File is loading!!"+(counter/length));
}
from.close();
to.close();
} catch (Exception e) {
progress.setText("upload is finished!!");
}
}
Can you please post a solution that would help me?
Thanks for all the advice.
There is an example of associating a progress bar with progress of a concurrent task in Oracle's JavaFX 8 concurrency documentation.
import javafx.concurrent.Task;
Task<Void> task = new Task<Void>() {
@Override public Void call() {
final int max = 1000000;
for (int i=1; i<=max; i++) {
if (isCancelled()) {
break;
}
updateProgress(i, max);
}
return null;
}
};
ProgressBar bar = new ProgressBar();
bar.progressProperty().bind(task.progressProperty());
new Thread(task).start();
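Applied to the question's copy loop, a sketch might look like the following (this is my adaptation, not code from the Oracle documentation; filepath, filename and progressbar are the names used in the question, everything else is a placeholder):
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import javafx.concurrent.Task;

Task<Void> copyTask = new Task<Void>() {
    @Override protected Void call() throws Exception {
        File fileIn = new File(filepath);
        long length = fileIn.length();
        long copied = 0;
        try (FileInputStream from = new FileInputStream(fileIn);
             FileOutputStream to = new FileOutputStream("C:\\xampp\\htdocs\\videos\\" + filename + ".mp4")) {
            byte[] buffer = new byte[4096];
            int bytesRead;
            while ((bytesRead = from.read(buffer)) != -1) {
                to.write(buffer, 0, bytesRead);
                copied += bytesRead;
                updateProgress(copied, length); // safe to call from the background thread
            }
        }
        return null;
    }
};
progressbar.progressProperty().bind(copyTask.progressProperty());
new Thread(copyTask).start(); // the copy now runs off the JavaFX Application Thread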
Can someone please help? I am trying to send data to a thermal printer using Bluetooth. I understand how to discover the devices, but I am not able to connect, and I don't know how to send the stream of data to be printed. What do I use here? There is OBEX and RFCOMM; which one is appropriate? And can you please share a sample of code showing how to do it? It would be much appreciated.
Below is some sample code that I found which uses OBEX to search for nearby devices; it is actually for image transfer. Can you please point out the parts that are important and how to change this in order to send a stream of data rather than a picture?
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Vector;
import javax.bluetooth.*;
import javax.microedition.io.Connection;
import javax.microedition.io.Connector;
import javax.microedition.io.file.FileConnection;
import javax.microedition.lcdui.*;
import javax.microedition.midlet.MIDlet;
import javax.obex.*;
public class BluetoothImageSender extends MIDlet implements CommandListener {
public Display display;
public Form discoveryForm;
public Form readyToConnectForm;
public Form dataViewForm;
public ImageItem mainImageItem;
public Image mainImage;
public Image bt_logo;
public TextField addressTextField;
public TextField subjectTextField;
public TextField messageTextField;
public Command selectCommand;
public Command exitCommand;
public Command connectCommand;
public List devicesList;
public Thread btUtility;
public String btConnectionURL;
public boolean readData = false;
public long startTime = 0;
public long endTime = 0;
public BluetoothImageSender() {
startTime = System.currentTimeMillis();
display = Display.getDisplay(this);
discoveryForm = new Form("Image Sender");
try{
mainImage = Image.createImage("/btlogo.png");
bt_logo = Image.createImage("/btlogo.png");
} catch (java.io.IOException e){
e.printStackTrace();
}
mainImageItem = new ImageItem("Bluetooth Image Sender", mainImage, Item.LAYOUT_CENTER, "");
discoveryForm.append(mainImageItem);
discoveryForm.append("\nThis application will scan the area for Bluetooth devices and determine if any are offering OBEX services.\n\n");
/// discoveryForm initialization
exitCommand = new Command("Exit", Command.EXIT, 1);
discoveryForm.addCommand(exitCommand);
discoveryForm.setCommandListener(this);
/// devicesList initialization
devicesList = new List("Select a Bluetooth Device", Choice.IMPLICIT, new String[0], new Image[0]);
selectCommand = new Command("Select", Command.ITEM, 1);
devicesList.addCommand(selectCommand);
devicesList.setCommandListener(this);
devicesList.setSelectedFlags(new boolean[0]);
/// readyToConnectForm initialization
readyToConnectForm = new Form("Ready to Connect");
readyToConnectForm.append("The selected Bluetooth device is currently offering a valid OPP service and is ready to connect. Please click on the 'Connect' button to connect and send the data.");
connectCommand = new Command("Connect", Command.ITEM, 1);
readyToConnectForm.addCommand(connectCommand);
readyToConnectForm.setCommandListener(this);
/// dataViewForm initialization
dataViewForm = new Form("File Sending Progress");
dataViewForm.append("Below is the status of the file sending process:\n\n");
dataViewForm.addCommand(exitCommand);
dataViewForm.setCommandListener(this);
}
public void commandAction(Command command, Displayable d) {
if(command == selectCommand) {
btUtility.start();
}
if(command == exitCommand ) {
readData = false;
destroyApp(true);
}
if(command == connectCommand ) {
Thread filePusherThread = new FilePusher();
filePusherThread.start();
display.setCurrent(dataViewForm);
}
}
public void startApp() {
display.setCurrent(discoveryForm);
btUtility = new BTUtility();
}
public void pauseApp() {
}
public void destroyApp(boolean b) {
notifyDestroyed();
}
////////////////
/**
* This is an inner class that is used for finding
* Bluetooth devices in the vicinity.
*/
class BTUtility extends Thread implements DiscoveryListener {
Vector remoteDevices = new Vector();
Vector deviceNames = new Vector();
DiscoveryAgent discoveryAgent;
// obviously, 0x1105 is the UUID for
// the Object Push Profile
UUID[] uuidSet = {new UUID(0x1105) };
// 0x0100 is the attribute for the service name element
// in the service record
int[] attrSet = {0x0100};
public BTUtility() {
try {
LocalDevice localDevice = LocalDevice.getLocalDevice();
discoveryAgent = localDevice.getDiscoveryAgent();
discoveryForm.append(" Searching for Bluetooth devices in the vicinity...\n");
discoveryAgent.startInquiry(DiscoveryAgent.GIAC, this);
} catch(Exception e) {
e.printStackTrace();
}
}
public void deviceDiscovered(RemoteDevice remoteDevice, DeviceClass cod) {
try{
discoveryForm.append("found: " + remoteDevice.getFriendlyName(true));
} catch(Exception e){
discoveryForm.append("found: " + remoteDevice.getBluetoothAddress());
} finally{
remoteDevices.addElement(remoteDevice);
}
}
public void inquiryCompleted(int discType) {
if (remoteDevices.size() > 0) {
// the discovery process was a success
// so out them in a List and display it to the user
for (int i=0; i<remoteDevices.size(); i++){
try{
devicesList.append(((RemoteDevice)remoteDevices.elementAt(i)).getFriendlyName(true), bt_logo);
} catch (Exception e){
devicesList.append(((RemoteDevice)remoteDevices.elementAt(i)).getBluetoothAddress(), bt_logo);
}
}
display.setCurrent(devicesList);
} else {
// handle this
}
}
public void run(){
try {
RemoteDevice remoteDevice = (RemoteDevice)remoteDevices.elementAt(devicesList.getSelectedIndex());
discoveryAgent.searchServices(attrSet, uuidSet, remoteDevice , this);
} catch(Exception e) {
e.printStackTrace();
}
}
public void servicesDiscovered(int transID, ServiceRecord[] servRecord){
for(int i = 0; i < servRecord.length; i++) {
DataElement serviceNameElement = servRecord[i].getAttributeValue(0x0100);
String _serviceName = (String)serviceNameElement.getValue();
String serviceName = _serviceName.trim();
btConnectionURL = servRecord[i].getConnectionURL(ServiceRecord.NOAUTHENTICATE_NOENCRYPT, false);
System.out.println(btConnectionURL);
}
display.setCurrent(readyToConnectForm);
readyToConnectForm.append("\n\nNote: the connection URL is: " + btConnectionURL);
}
public void serviceSearchCompleted(int transID, int respCode) {
if (respCode == DiscoveryListener.SERVICE_SEARCH_COMPLETED) {
// the service search process was successful
} else {
// the service search process has failed
}
}
}
////////////////
/**
* FilePusher is an inner class that
* now gets the byte[] named file
* to read the bytes of the file, and
* then opens a connection to a remote
* Bluetooth device to send the file.
*/
class FilePusher extends Thread{
FileConnection fileConn = null;
String file_url = "/loginscreen.png";
byte[] file = null;
String file_name = "loginscreen.png";
String mime_type = "image/png";
// this is the connection object to be used for
// bluetooth i/o
Connection connection = null;
public FilePusher(){
}
public void run(){
try{
InputStream is = this.getClass().getResourceAsStream(file_url);
ByteArrayOutputStream os = new ByteArrayOutputStream();
// now read the file in into the byte[]
int singleByte;
while((singleByte = is.read()) != -1){
os.write(singleByte);
}
System.out.println("file size: " + os.size());
file = new byte[os.size()];
file = os.toByteArray();
dataViewForm.append("File name: " + file_url);
dataViewForm.append("File size: " + file.length + " bytes");
is.close();
os.close();
} catch (Exception e){
e.printStackTrace();
System.out.println("Error processing the file");
}
try{
connection = Connector.open(btConnectionURL);
// connection obtained
// create a session and a headerset objects
ClientSession cs = (ClientSession)connection;
HeaderSet hs = cs.createHeaderSet();
// establish the session
cs.connect(hs);
hs.setHeader(HeaderSet.NAME, file_name);
hs.setHeader(HeaderSet.TYPE, mime_type); // be sure to note that this should be configurable
hs.setHeader(HeaderSet.LENGTH, new Long(file.length));
Operation putOperation = cs.put(hs);
OutputStream outputStream = putOperation.openOutputStream();
outputStream.write(file);
// file push complete
outputStream.close();
putOperation.close();
cs.disconnect(null);
connection.close();
dataViewForm.append("Operation complete. File transferred");
endTime = System.currentTimeMillis();
long diff = (endTime - startTime)/1000;
System.out.println("Time to transfer file: " + diff);
dataViewForm.append("Time to transfer file: " + diff);
} catch (Exception e){
System.out.println("Error sending the file");
System.out.println(e);
e.printStackTrace();
}
}
}
}
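The question above asks whether OBEX or RFCOMM is appropriate for a thermal printer. Many thermal printers expose the Serial Port Profile (SPP, UUID 0x1101), in which case RFCOMM is usually the simpler fit: open a StreamConnection on the btspp: URL obtained from searchServices() (exactly as the OBEX URL is obtained above) and write the raw bytes to print. The sketch below is my illustration of that approach, not part of the original code; the URL, payload and class name are placeholders.
// Needs javax.microedition.io.Connector, javax.microedition.io.StreamConnection,
// java.io.OutputStream and java.io.IOException.
class RfcommPrinterPusher extends Thread {
    private final String sppUrl;   // a btspp://... URL from ServiceRecord.getConnectionURL()
    private final byte[] payload;  // the raw bytes the printer expects

    RfcommPrinterPusher(String sppUrl, byte[] payload) {
        this.sppUrl = sppUrl;
        this.payload = payload;
    }

    public void run() {
        StreamConnection conn = null;
        OutputStream out = null;
        try {
            conn = (StreamConnection) Connector.open(sppUrl); // RFCOMM stream, not an OBEX session
            out = conn.openOutputStream();
            out.write(payload);
            out.flush();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try { if (out != null) out.close(); } catch (IOException ignored) {}
            try { if (conn != null) conn.close(); } catch (IOException ignored) {}
        }
    }
}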
I want to make a clock application where the user enters a number in a text box and clicks OK; the display then counts up by one every second.
For example, if the user enters 5, the timer starts and the display shows 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5, 0, 1, 2, 3, ... and so on.
So far I have a form with a text field for the user to enter the number, a timer that changes the number every second, and ten images for the digits (0-9), since I want to display the number at a very large size. I have implemented this logic as shown below:
import java.io.IOException;
import java.util.Timer;
import java.util.TimerTask;
import javax.microedition.lcdui.*;
import javax.microedition.midlet.MIDlet;
public class Clock extends MIDlet implements CommandListener {
public Command GO, Exit;
TextField TxtData;
protected Display display;
int number, counter;
Form form;
private Timer timer;
private TestTimerTask task;
boolean increment, time;
private StringItem s1 = new StringItem("", "");
Image image0;
Image image1;
Image image2;
Image image3;
Image image4;
Image image5;
Image image6;
Image image7;
Image image8;
Image image9;
Image[] secondAnimation;
protected void startApp() {
display = Display.getDisplay(this);
increment = true;
time = false;
form = new Form("Clock");
TxtData = new TextField("Number:-", "", 5, TextField.NUMERIC);
try {
image0 = Image.createImage("/images/0.png");
image1 = Image.createImage("/images/1.png");
image2 = Image.createImage("/images/2.png");
image3 = Image.createImage("/images/3.png");
image4 = Image.createImage("/images/4.png");
image5 = Image.createImage("/images/5.png");
image6 = Image.createImage("/images/6.png");
image7 = Image.createImage("/images/7.png");
image8 = Image.createImage("/images/8.png");
image9 = Image.createImage("/images/9.png");
secondAnimation = new Image[]{image0,image1,image2, image3, image4, image5, image6, image7, image8, image9};
} catch (IOException ex) {
System.out.println("exception");
}
GO = new Command("Go", Command.OK, 1);
Exit = new Command("Exit", Command.EXIT, 2);
form.append(TxtData);
form.append(s1);
form.addCommand(GO);
form.addCommand(Exit);
form.setCommandListener(this);
display.setCurrent(form);
}
protected void pauseApp() {
}
protected void destroyApp(boolean unconditional) {
timer.cancel();
notifyDestroyed();
}
public void commandAction(Command cmnd, Displayable dsplbl) {
String label = cmnd.getLabel();
if (label.equals("Go")) {
try {
System.out.println("txt==" + (TxtData.getString()));
if (!TxtData.getString().equalsIgnoreCase("")) {
counter = Integer.parseInt(TxtData.getString());
if (time) {
timer.cancel();
task.cancel();
}
number = 1;
timer = new Timer();
task = new TestTimerTask();
timer.schedule(task, 1000, 1000);
}
} catch (NumberFormatException ex) {
System.out.println("exception");
}
} else if (label.equals("Exit")) {
destroyApp(true);
}
}
private class TestTimerTask extends TimerTask {
public final void run() {
time = true;
s1.setText(""+ number);
if (counter < 10) {
form.append(secondAnimation[0]);
form.append(secondAnimation[0]);
form.append(secondAnimation[number]);
} else if (counter < 100) {
form.append(secondAnimation[0]);
form.append(secondAnimation[(number % 100) / 10]);
form.append(secondAnimation[(number % 10)]);
} else if (counter < 1000) {
form.append(secondAnimation[(number % 10)]);
form.append(secondAnimation[(number % 100) / 10]);
form.append(secondAnimation[(number / 100)]);
}
number++;
if (number == counter + 1) {
number = 0;
}
}
} }
But as the timer runs, the form keeps appending more and more images, so it does not show the desired output!
I tried to do it with LWUIT, but since I use ten .png files, adding LWUIT.jar makes the .jar file 557 KB, which is far too heavy.
So I want to do it with normal forms only.
I can't use Canvas because the keypad can vary (touch, QWERTY, etc.), so I need to use a normal Form or LWUIT only. Can anyone please help me with this?
I noticed you only append items but never remove - is that intended?
Also, did you try animating with two different forms instead of one? As a simple test, fill them in parallel and just call setCurrent for the one that is not displayed at the moment of the update:
//...
private void appendTwice(Image image) {
form1.append(image);
form2.append(image);
}
//...
public final void run() {
time = true;
s1.setText(""+ number);
if (counter < 10) {
appendTwice(secondAnimation[0]);
//...
}
display.setCurrent((number & 1) == 0 ? form1 : form2);
number++;
//...
}
//...
I have two cell phones and I want to exchange a file between them.
Device A invokes the Java app, which scans for available Bluetooth devices in range and shows them in a list; the user can then select one device and click Send.
I have written the code below, but it is not working.
package hello;
import java.io.*;
import java.util.Vector;
import javax.bluetooth.*;
import javax.microedition.io.*;
import javax.microedition.io.StreamConnection.*;
import javax.microedition.lcdui.*;
import javax.microedition.midlet.MIDlet;
import javax.obex.*;
import javax.obex.ResponseCodes;
public class MyMidlet extends MIDlet implements CommandListener, DiscoveryListener
{
public Command cmdSend;
public Command cmdScan;
public TextBox myText;
public List devList;
public Form myForm;
private LocalDevice localDev;
private DiscoveryAgent dAgent;
private ServiceRecord servRecord;
private Vector myVector;
private ClientSession connection = null;
private String url = null;
private Operation op = null;
private boolean cancelInvoked = false;
public MyMidlet()
{
cmdSend = new Command("Send", 2, 0);
cmdScan = new Command("Scan", 5, 0);
}
public void startApp()
{
if(myText == null)
{
myText = new TextBox("Dummy Text", "Hello", 10, 0);
myText.addCommand(cmdScan);
myText.setCommandListener(this);
Display.getDisplay(this).setCurrent(myText);
}
}
public void pauseApp(){}
public void destroyApp(boolean flag) { }
public void commandAction(Command command, Displayable displayable)
{
if(command == cmdScan)
{
if(myForm == null) { myForm = new Form("Scanning"); }
else {
for(int i = 0; i < myForm.size(); i++) myForm.delete(i);
}
myForm.append("Scanning for bluetooth devices..");
Display.getDisplay(this).setCurrent(myForm);
if(devList == null)
{
devList = new List("Devices", 3);
devList.addCommand(cmdSend);
devList.setCommandListener(this);
} else
{
for(int j = 0; j < devList.size(); j++) devList.delete(j);
}
if(myVector == null) myVector = new Vector();
else myVector.removeAllElements();
try
{
if(localDev == null)
{
localDev = LocalDevice.getLocalDevice();
localDev.setDiscoverable(0x9e8b33);
dAgent = localDev.getDiscoveryAgent();
}
dAgent.startInquiry(0x9e8b33, this);
}
catch(BluetoothStateException bluetoothstateexception)
{
myForm.append("Please check your bluetooth is turn-on");
}
}
if(command == cmdSend)
{
myForm.setTitle("Sending");
for(int k = 0; k < myForm.size(); k++) myForm.delete(k);
myForm.append("Sending application..");
Display.getDisplay(this).setCurrent(myForm);
try
{
RemoteDevice remotedevice = (RemoteDevice)myVector.elementAt(devList.getSelectedIndex());
dAgent.searchServices(null, new UUID[] {new UUID(4358L)}, remotedevice, this);
return;
}
catch(BluetoothStateException bluetoothstateexception1)
{
myForm.append("could not open bluetooth: " + bluetoothstateexception1.toString());
}
}
}
public void deviceDiscovered(RemoteDevice remotedevice, DeviceClass deviceclass)
{
try
{
devList.append(remotedevice.getFriendlyName(false), null);
}
catch(IOException _ex)
{
devList.append(remotedevice.getBluetoothAddress(), null);
}
myVector.addElement(remotedevice);
}
public void servicesDiscovered(int i, ServiceRecord aservicerecord[])
{
servRecord = aservicerecord[0];
}
public void serviceSearchCompleted(int i, int j)
{
if(j != 1) myForm.append("service search not completed: " + j);
try
{
byte[] fileContent = "Raxit Sheth -98922 38248".getBytes();
String s=servRecord.getConnectionURL(0, false);
myForm.append("Debug 0");
connection = (ClientSession) Connector.open(s);
myForm.append("Debug1");
HeaderSet headerSet = connection.connect(null);
myForm.append("Debug1.1");
headerSet.setHeader(HeaderSet.NAME, "a.txt");
headerSet.setHeader(HeaderSet.TYPE, "text/plain");
headerSet.setHeader(HeaderSet.LENGTH, new Long(fileContent.length));
myForm.append("Debug1.2");
//op = connection.put(headerSet); was throwing java.lang.IllegalArgumentException
op = connection.put(null);
myForm.append("Debug1.2.1");
op.sendHeaders(headerSet);
myForm.append("Debug1.3");
OutputStream out = op.openOutputStream();
myForm.append("Debug2");
//sending data
myForm.append("Debug3");
out.write(fileContent);
myForm.append("Debug4");
//int responseCode = op.getResponseCode();
//myForm.append("resp code="+responseCode);
out.close();
op.close();
connection.close();
myForm.append("Done");
//I was expecting this to send an a.txt file with the content Raxit Sheth -98922 38248
//to the remote device's inbox/gallery/bluetooth folder
}
catch(Exception ex) { myForm.append(ex.toString()); }
}
public void inquiryCompleted(int i)
{
Display.getDisplay(this).setCurrent(devList);
}
}
Your problem is almost certainly the fact that you're starting your Bluetooth scanning in the commandAction() method. This is a system lifecycle method and needs to return quickly. Attempting to perform a blocking operation (such as Bluetooth scanning) on this thread could tie up resources the handset needs to do other things, such as the actual scanning!
Refactor so that the scanning is performed in a new thread, then try again; a sketch of that refactor is below.
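A minimal sketch of what that refactor could look like (my illustration of the suggestion above, not tested code; the field names match the question's MyMidlet, and the list-clearing and error handling are left out for brevity):
public void commandAction(Command command, Displayable displayable) {
    if (command == cmdScan) {
        myForm.append("Scanning for bluetooth devices..");
        Display.getDisplay(this).setCurrent(myForm);
        new Thread() {
            public void run() {
                try {
                    if (localDev == null) {
                        localDev = LocalDevice.getLocalDevice();
                        dAgent = localDev.getDiscoveryAgent();
                    }
                    dAgent.startInquiry(DiscoveryAgent.GIAC, MyMidlet.this); // results still arrive via deviceDiscovered()
                } catch (BluetoothStateException e) {
                    myForm.append("Please check your bluetooth is turned on");
                }
            }
        }.start(); // commandAction() returns immediately
    }
    // cmdSend would be moved onto a worker thread the same way, since
    // searchServices() and the OBEX put are also long-running operations.
}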