I want to make an application that guesses the name of a selected food image. I used YOLOv5, and the output is correct on test images in Colab. I'm testing with the same image.
https://colab.research.google.com/github/ultralytics/yolov5/blob/master/tutorial.ipynb
I created the yolov5s-fp16.tflite file from the notebook above.
https://www.youtube.com/watch?v=tySgZ1rEbW4
I followed this video to get it working on Android. Here is my TF Lite code on Android:
try {
    Yolov5sFp16 model = Yolov5sFp16.newInstance(context);

    // Creates inputs for reference.
    TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 640, 640, 3}, DataType.FLOAT32);
    inputFeature0.loadBuffer(byteBuffer);

    // Runs model inference and gets result.
    Yolov5sFp16.Outputs outputs = model.process(inputFeature0);
    TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();

    // Releases model resources if no longer used.
    model.close();
} catch (IOException e) {
    // TODO Handle the exception
}
My predictBtn click handler:
predictBtn.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        try {
            Yolov5sFp16 model = Yolov5sFp16.newInstance(MainActivity.this);

            // Creates inputs for reference.
            TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{4, 640, 640, 3}, DataType.FLOAT32);
            Log.d("shape", inputFeature0.toString());
            bitmap = Bitmap.createScaledBitmap(bitmap, 640, 640, true);
            inputFeature0.loadBuffer(TensorImage.fromBitmap(bitmap).getBuffer());

            // Runs model inference and gets result.
            Yolov5sFp16.Outputs outputs = model.process(inputFeature0);
            TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();
            result.setText(labels[getMax(outputFeature0.getFloatArray())] + " ");

            // Releases model resources if no longer used.
            model.close();
        } catch (IOException e) {
            // TODO Handle the exception
        }
    }
});
}
ERROR
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.myapplicationdeploy, PID: 6744
java.lang.IllegalArgumentException: The size of byte buffer and the shape do not match.
at org.tensorflow.lite.support.common.SupportPreconditions.checkArgument(SupportPreconditions.java:104)
at org.tensorflow.lite.support.tensorbuffer.TensorBuffer.loadBuffer(TensorBuffer.java:309)
at org.tensorflow.lite.support.tensorbuffer.TensorBuffer.loadBuffer(TensorBuffer.java:328)
at com.example.myapplicationdeploy.MainActivity$3.onClick(MainActivity.java:108)
at android.view.View.performClick(View.java:6597)
at com.google.android.material.button.MaterialButton.performClick(MaterialButton.java:1194)
at android.view.View.performClickInternal(View.java:6574)
at android.view.View.access$3100(View.java:778)
at android.view.View$PerformClick.run(View.java:25885)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:193)
at android.app.ActivityThread.main(ActivityThread.java:6669)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)
When I debug, the error is thrown at the loadBuffer call. What should I do? What should the dimensions be?
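The exception comes from a size check: TensorBuffer.loadBuffer requires the incoming buffer's byte count to match the tensor's declared shape. A minimal sketch of the arithmetic (plain Java, no TFLite needed; the shapes are the ones from the code above, and the UINT8 note reflects what TensorImage.fromBitmap holds by default):

```java
public class TensorSizeCheck {
    // Bytes needed for a tensor of shape [batch, h, w, c]
    // at the given bytes-per-element (4 for FLOAT32, 1 for UINT8).
    static long tensorBytes(int batch, int h, int w, int c, int bytesPerElem) {
        return (long) batch * h * w * c * bytesPerElem;
    }

    public static void main(String[] args) {
        // Declared shape in the click handler: {4, 640, 640, 3}, FLOAT32
        long declared = tensorBytes(4, 640, 640, 3, 4);      // 19,660,800 bytes
        // A single 640x640 RGB image as FLOAT32: {1, 640, 640, 3}
        long oneFloatImage = tensorBytes(1, 640, 640, 3, 4); //  4,915,200 bytes
        // What TensorImage.fromBitmap() holds (UINT8, 1 byte per channel)
        long oneUint8Image = tensorBytes(1, 640, 640, 3, 1); //  1,228,800 bytes
        System.out.println(declared + " " + oneFloatImage + " " + oneUint8Image);
        // loadBuffer() throws because the supplied buffer matches none of the
        // declared sizes: the batch size, shape, and data type all must agree.
    }
}
```

For a single image, declaring the TensorBuffer as {1, 640, 640, 3} and converting the bitmap to a FLOAT32 buffer of the same total size would make the two counts match.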
I'm trying to make a handwritten text recognition Android application in Android Studio. I converted my model to TensorFlow Lite and then added that file to my application. Below I'm sharing the code; it has many log statements to check how far it gets.
public void classifyImage(Bitmap image) {
    try {
        Ocr mdl = Ocr.newInstance(getApplicationContext());

        TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 128, 32, 1}, DataType.FLOAT32);
        ByteBuffer byteBuffer = ByteBuffer.allocateDirect(4 * 128 * 32 * 1);

        int[] intValues = new int[128 * imageSize];
        image.getPixels(intValues, 0, image.getWidth(), 0, 0, image.getWidth(), image.getHeight());
        Log.d("TAG Model", " get pixels ");

        int pixel = 0;
        for (int i = 0; i < 4096; i++) {
            int val = intValues[pixel++];
            byteBuffer.putFloat(val);
        }
        Log.d("TAG Model", " bytebuffer putfloat com ");

        inputFeature0.loadBuffer(byteBuffer);
        Ocr.Outputs outputs = mdl.process(inputFeature0);
        Log.d("TAG Model", " Model process ");

        TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();
        Log.d("TAG Model", " Model output");
        float[] confidences = outputFeature0.getFloatArray();

        mdl.close();
    } catch (Exception e) {
        // TODO Handle the exception
        Log.d("TAG Model", "classifyImage: " + e.getMessage());
    }
}
On running my application I'm getting this error: D/TAG Model: classifyImage: Cannot copy from a TensorFlowLite tensor (StatefulPartitionedCall:0) with 10368 bytes to a Java Buffer with 324 bytes.
Can you please help me with this? I'm stuck on this error and cannot proceed further.
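Separately from the reported error (10368 vs. 324 bytes on the output side usually means the model's actual output shape disagrees with what the generated wrapper expects), the pixel loop above writes raw packed ARGB ints with putFloat. A common approach is to extract the channels and normalize first. A hedged sketch (the grayscale averaging and the 0..1 range are assumptions inferred from the [1, 128, 32, 1] input shape, not from your model's training pipeline):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class GrayscaleBuffer {
    // Convert packed ARGB pixels to a normalized FLOAT32 buffer
    // suitable for a single-channel (grayscale) model input.
    static ByteBuffer toGrayFloatBuffer(int[] argb, int width, int height) {
        ByteBuffer buf = ByteBuffer.allocateDirect(4 * width * height);
        buf.order(ByteOrder.nativeOrder()); // TFLite expects native byte order
        for (int px : argb) {
            int r = (px >> 16) & 0xFF;
            int g = (px >> 8) & 0xFF;
            int b = px & 0xFF;
            // simple average to grayscale, scaled to [0, 1]
            buf.putFloat(((r + g + b) / 3f) / 255f);
        }
        buf.rewind();
        return buf;
    }
}
```

The bitmap would also need to be scaled to exactly 128x32 before calling getPixels, so the loop covers every tensor element once.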
I already have a CRF model that I trained using SimpleTagger.
SimpleTagger.main(new String[] {
    "--train", "true",
    "--model-file", "/Desktop/crfmodel",
    "--threads", "8",
    "--training-proportion", "0.8",
    "--weights", "dense",
    "--test", "lab",
    // "--orders", "2",
    "/Desktop/annotations.txt"
});
I am planning to load this model and use it for tagging. I am using this code.
public static void main(String[] args) throws Exception {
    // DOCS http://mallet.cs.umass.edu/classifier-devel.php
    Instance instance = getMyInstance();
    Classifier classifier = loadClassifier(Paths.get("/Desktop/crfmodel").toFile());
    Labeling labeling = classifier.classify(instance).getLabeling();
    Label l = labeling.getBestLabel();
    System.out.print(instance);
    System.out.println(l);
}

private static Classifier loadClassifier(File serializedFile)
        throws FileNotFoundException, IOException, ClassNotFoundException {
    ObjectInputStream ois = new ObjectInputStream(new FileInputStream(serializedFile));
    Classifier crf = (Classifier) ois.readObject();
    ois.close();
    return crf;
}
When I try to do the above I get the following error:
Exception in thread "main" java.lang.ClassCastException: cc.mallet.fst.CRF cannot be cast to cc.mallet.classify.Classifier
at TagClassifier.loadClassifier(TagClassifier.java:77)
at TagClassifier.main(TagClassifier.java:64)
The error is happening in line
Classifier crf = (Classifier) ois.readObject();
May I know why this is happening? Also, if there is a documented way to label an input using a trained model, can you please share any links? Thank you very much in advance!
I think I figured it out by looking at SimpleTagger code.
crfModel = loadClassifier(Paths.get("/Desktop/crfmodel").toFile());
pipe = crfModel.getInputPipe();
pipe.setTargetProcessing(false);

String formatted = getFormattedQuery(q);
Instance instance = pipe.pipe(new Instance(formatted, null, null, null));
Sequence sequence = (Sequence) instance.getData();
Sequence[] tags = tag(sequence, 3);

private static Sequence[] tag(Sequence input, int bestK) {
    Sequence[] answers;
    if (bestK == 1) {
        answers = new Sequence[1];
        answers[0] = crfModel.transduce(input);
    } else {
        MaxLatticeDefault lattice = new MaxLatticeDefault(crfModel, input, null);
        answers = lattice.bestOutputSequences(bestK).toArray(new Sequence[0]);
    }
    return answers;
}
They're different things, so you can't cast one to the other. A CRF infers a class for each element in a sequence, so its output is a sequence of labels; a classifier takes one input and returns one label.
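A defensive way to load the serialized model is to check its runtime type before casting, rather than assuming it is a Classifier. A minimal sketch of the pattern with plain Java serialization (the Crf and Classifier classes here are stand-ins for the unrelated MALLET types cc.mallet.fst.CRF and cc.mallet.classify.Classifier, so this only illustrates the instanceof check):

```java
import java.io.*;

public class SafeLoad {
    // Stand-ins for the two unrelated MALLET types.
    static class Crf implements Serializable { }
    static class Classifier implements Serializable { }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Object model = deserialize(serialize(new Crf()));
        // Inspect the actual type instead of blind-casting:
        if (model instanceof Crf) {
            System.out.println("loaded a CRF; use sequence tagging, not classify()");
        } else if (model instanceof Classifier) {
            System.out.println("loaded a Classifier");
        }
    }
}
```

With the real MALLET types, the branch for CRF would hand the object to sequence-tagging code (as in the accepted self-answer above) instead of throwing a ClassCastException.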
I'm trying to create an app for Windows Phone 8.1 that uses the camera, built on the Windows RT/XAML development model.
When I call either of the capture methods on the MediaCapture class, I get an ArgumentException with the message "The parameter is incorrect." Here is my code:
private async Task Initialize()
{
    if (!DesignMode.DesignModeEnabled)
    {
        await _mediaCaptureMgr.InitializeAsync();
        ViewFinder.Source = _mediaCaptureMgr;
        await _mediaCaptureMgr.StartPreviewAsync();
    }
}
private async void ViewFinder_OnTapped(object sender, TappedRoutedEventArgs e)
{
    ImageEncodingProperties imageProperties = ImageEncodingProperties.CreateJpeg();
    var stream = new InMemoryRandomAccessStream();
    await _mediaCaptureMgr.CapturePhotoToStreamAsync(imageProperties, stream);

    _bitmap = new WriteableBitmap((int)ViewFinder.ActualWidth, (int)ViewFinder.ActualHeight);
    stream.Seek(0);
    await _bitmap.SetSourceAsync(stream);

    PreviewImage.Source = _bitmap;
    PreviewElements.Visibility = Visibility.Visible;
    ViewFinder.Visibility = Visibility.Collapsed;
    Buttons.Visibility = Visibility.Visible;
    Message.Visibility = Visibility.Collapsed;

    stream.Seek(0);
    var buffer = new global::Windows.Storage.Streams.Buffer((uint)stream.Size);
    stream.ReadAsync(buffer, (uint)stream.Size, InputStreamOptions.None);
    DataContext = buffer.ToArray();

    if (PhotoCaptured != null)
        PhotoCaptured(this, null);
}
The Initialize method is called on page load, and ViewFinder_OnTapped is called when the user taps the CaptureElement in the XAML. The error is thrown on
await _mediaCaptureMgr.CapturePhotoToStreamAsync(imageProperties, stream);
What's really bizarre is that I downloaded the latest source for the WinRT XAML Toolkit http://winrtxamltoolkit.codeplex.com/ and tried their sample camera app, which uses similar code. It throws the same error on MediaCapture.CapturePhotoToStorageFileAsync(). Can anyone help me identify why?
I am trying to load an image to be used as a texture and I keep getting input == null errors. This is the code where I am trying to load the image and where most of the errors point:
package com.mime.minefront.graphics;

import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;

public class Texture {
    public static Render floor = loadBitmap("/C:\\Users\\Danny\\Documents\\NetBeansProjects\\MyFirstgame\\src\\res\\textures\\floor.png");

    public static Render loadBitmap(String fileName) {
        try {
            BufferedImage image = ImageIO.read(Texture.class.getResource(fileName));
            int width = image.getWidth();
            int height = image.getHeight();
            Render result = new Render(width, height);
            image.getRGB(0, 0, width, height, result.pixels, 0, width);
            return result;
        } catch (Exception e) {
            System.out.println("CRASH!");
            throw new RuntimeException(e);
        }
    }
}
and the errors I am getting are as follows:
Exception in thread "Thread-2" java.lang.ExceptionInInitializerError
at com.mime.minefront.graphics.Render3D.floor(Render3D.java:42)
at com.mime.minefront.graphics.Screen.render(Screen.java:28)
at com.mime.minefront.Display.render(Display.java:144)
at com.mime.minefront.Display.run(Display.java:112)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: input == null!
at com.mime.minefront.graphics.Texture.loadBitmap(Texture.java:19)
at com.mime.minefront.graphics.Texture.<clinit>(Texture.java:8)
... 5 more
Caused by: java.lang.IllegalArgumentException: input == null!
at javax.imageio.ImageIO.read(ImageIO.java:1388)
at com.mime.minefront.graphics.Texture.loadBitmap(Texture.java:11)
... 6 more
BUILD SUCCESSFUL (total time: 4 seconds)
Please help.
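Class.getResource() looks names up on the classpath, so an absolute Windows filesystem path returns null, and ImageIO.read(null) then throws "input == null!". For a file on disk, read it through a File; for an image bundled with the app, pass a classpath-relative name such as "/res/textures/floor.png". A small self-contained sketch of the File route (it writes a throwaway 1x1 PNG to a temp file so it can run anywhere; the helper name is made up for illustration):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class LoadTexture {
    static BufferedImage loadFromDisk(String path) throws IOException {
        // ImageIO.read(File) handles absolute filesystem paths;
        // Class.getResource() is only for classpath entries.
        BufferedImage image = ImageIO.read(new File(path));
        if (image == null) {
            throw new IOException("Unreadable image: " + path);
        }
        return image;
    }

    public static void main(String[] args) throws IOException {
        // Create a throwaway 1x1 PNG so the example is self-contained.
        File tmp = File.createTempFile("floor", ".png");
        tmp.deleteOnExit();
        ImageIO.write(new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB), "png", tmp);

        BufferedImage img = loadFromDisk(tmp.getAbsolutePath());
        System.out.println(img.getWidth() + "x" + img.getHeight());
    }
}
```

If the PNG lives under src/res/textures in the project, keeping getResource but changing the argument to "/res/textures/floor.png" should also work once the resources are on the classpath.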
I am trying to download image from Internet using async call like this:
private void DoGetAlbumart(object sender, DoWorkEventArgs e)
{
    string req = (string)e.Argument;
    WebClient wc = new WebClient();
    wc.OpenReadCompleted += new OpenReadCompletedEventHandler(ReadWebRequestCallback);
    wc.OpenReadAsync(new Uri(req));
}

void ReadWebRequestCallback(object sender, OpenReadCompletedEventArgs e)
{
    if (e.Error == null && !e.Cancelled)
    {
        try
        {
            BitmapImage image = new BitmapImage();
            image.SetSource(e.Result);
            SecondTile.Source = image;
        }
        catch (Exception ex)
        {
        }
    }
    else
    {
    }
}
When the breakpoint hits at BitmapImage image = new BitmapImage(), I get the following exception:
ex = {System.UnauthorizedAccessException: Invalid cross-thread access.
at MS.Internal.XcpImports.CheckThread()
at System.Windows.DependencyObject..ctor(UInt32 nativeTypeIndex, IntPtr constructDO)
at System.Windows.Media.Imaging.BitmapImage..ctor()
What else can I try to get rid of this error?
Callback methods run in background threads, not the UI thread. Unfortunately, BitmapImages can only be instantiated in the UI thread. If you need to access the UI thread from a callback, try the following:
Deployment.Current.Dispatcher.BeginInvoke(() =>
{
    BitmapImage image = new BitmapImage();
    image.SetSource(e.Result);
    SecondTile.Source = image;
});