ONNX model with Sub operator does not bind - onnx

I am trying to create a session with a squeezenet ONNX model using:
session = winrt::Windows::AI::MachineLearning::LearningModelSession{ model, winrt::Windows::AI::MachineLearning::LearningModelDevice(deviceKind) };
I have two versions of squeezenet. One has a 'Sub' layer at the beginning, and the other does not. The one with 'Sub' throws an error when the above is executed.
Any ideas what is happening here?

What is the shape of the input for the squeezenet that is failing? If you post the failing model we can help you troubleshoot it.
2 key things to make it work for OS build 17763:
Make sure you are using ONNX version 1.2 (opset 7)
Make sure your input has the right shape that the model is expecting.
What error are you getting when you create the LearningModelSession?

The existence of a 'Sub' operator in an ONNX graph should not affect whether or not you can run that model on Windows. I think the more important question is the ONNX version (or the operator set version) and the target Windows version. Starting with the October 2018 Update, Windows Machine Learning is compatible with ONNX version 1.2.2 (https://github.com/onnx/onnx/releases/tag/v1.2.2). Double-check that your model is ONNX 1.2.2 and that you are using the October 2018 Update SDK (10.0.17763.x).
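If it helps to verify those two points, here is a minimal sketch using the onnx Python package (not part of the original question; the file name is hypothetical) that prints the model's opset version and its declared input shapes:

import onnx

# Hypothetical file name for the failing squeezenet model
model = onnx.load("squeezenet_with_sub.onnx")

# Windows ML on build 17763 expects opset 7 (ONNX 1.2.x)
for opset in model.opset_import:
    print("domain:", opset.domain or "ai.onnx", "opset:", opset.version)

# The declared input shapes show what the model expects you to bind
for graph_input in model.graph.input:
    dims = [d.dim_value or d.dim_param for d in graph_input.type.tensor_type.shape.dim]
    print("input:", graph_input.name, "shape:", dims)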

Related

Unity ECS on Linux: error when trying to start a new 2D project

I have installed Unity on Linux and everything seems to work as expected when developing games in the standard model. However, when I set up a project for ECS development by installing the following packages:
Burst 1.3.0
Entities 0.11.0
Hybrid renderer 0.5.1
Jobs 0.2.10
Mathematics 1.1.0
I get the following error when creating a blank 2D project:
Library/PackageCache/com.unity.2d.animation#3.2.2/Runtime/TransformAccessJob.cs(196,62): error CS1061: 'NativeHashMap<int, TransformAccessJob.TransformData>' does not contain a definition for 'Length' and no accessible extension method 'Length' accepting a first argument of type 'NativeHashMap<int, TransformAccessJob.TransformData>' could be found (are you missing a using directive or an assembly reference?)
Unity version: 2019.3.14f1 Personal
Does anyone have any idea what the issue could be?
Removing the 2D Animation package is not a valid option for me, since I am using the 2D skeleton animation functionality it provides.
This problem seems to result from the API change with Jobs 0.2.10.
Downgrading Jobs to 0.2.9 solved this problem for me.
A temporary fix, if you want to keep using the animation package, is to change m_TransformData.Length to m_TransformData.Count() on the line specified in the error (TransformAccessJob.cs, line 196).

Unable to work with either the old or the new version of TensorFlow

I have been trying to understand how TensorFlow works by executing some of the sample code from the online tutorials. However, when I execute the code, I run into one of the two problems below:
1. module 'tensorflow' has no attribute 'placeholder'
2. module 'tensorflow_core.compat.v1' has no attribute 'unpack'
I had TensorFlow version 1.14 installed earlier in Anaconda. I was not able to upgrade to TensorFlow 2.0, so I uninstalled Anaconda and reinstalled it along with the newer version of TensorFlow. I did this primarily because several of the function definitions had changed: the order of inputs and attribute names like unpack, for example. But with the newer version even placeholder is not recognised. I checked online and used tf.disable_v2_behavior() to get the previous behaviour even though I have the latest version installed. But then I get the naming error again. I am really confused about how to proceed. I have been at it for hours. Any suggestions?
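For what it's worth, here is a minimal sketch of the compat-mode approach described above, assuming TensorFlow 2.x is installed: placeholder still exists under tf.compat.v1, while unpack was renamed to unstack back in TF 1.0, which is why tf.compat.v1 has no unpack attribute:

import tensorflow as tf

# Run with the 1.x-style API that lives under tf.compat.v1
tf.compat.v1.disable_v2_behavior()

# 'placeholder' was removed from the top-level namespace in TF 2.x
x = tf.compat.v1.placeholder(tf.float32, shape=[None, 3])

# 'unpack' was renamed to 'unstack', so only the new name exists
rows = tf.unstack(x, axis=1)

with tf.compat.v1.Session() as sess:
    print(sess.run(rows, feed_dict={x: [[1.0, 2.0, 3.0]]}))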

gensim KeyedVectors dimensions

In gensim's latest version, loading trained vectors from a file is done using KeyedVectors and doesn't require instantiating a new Word2Vec object. But now my code is broken because I can't use the model.vector_size property. What is the alternative? I mean something better than just kv[kv.index2word[0]].size.
kv.vector_size still works; I'm using gensim 2.3.0, which is the latest as I write. (I am assuming kv is your KeyedVectors object.) The property does not appear to be documented on the API page, but auto-complete suggests it, and there is no deprecation warning or anything.
Your question helped me answer my own, which was how to get the number of words: len(kv.index2word)
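For reference, a minimal sketch of both lookups, assuming pre-trained vectors saved in word2vec text format (the file name is hypothetical) and a gensim 2.x/3.x-era API where index2word still exists:

from gensim.models import KeyedVectors

# Hypothetical file of pre-trained vectors in word2vec text format
kv = KeyedVectors.load_word2vec_format("vectors.txt")

print(kv.vector_size)      # dimensionality of the vectors
print(len(kv.index2word))  # number of words in the vocabulary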

How to create a GrammaticalRelation in Stanford CoreNLP

I have recently upgraded to the latest version of Stanford CoreNLP. The code I previously used to get the subject or object in a sentence was
System.out.println("subject: "+dependencies.getChildWithReln(dependencies.getFirstRoot(), EnglishGrammaticalRelations.NOMINAL_SUBJECT));
but this now returns null.
I have tried creating a relation with
GrammaticalRelation subjreln =
edu.stanford.nlp.trees.GrammaticalRelation.valueOf("nsubj");
without success. If I extract a relation using code like
GrammaticalRelation target = (dependencies.childRelns(dependencies.getFirstRoot())).iterator().next();
Then run the same request,
System.out.println("target: "+dependencies.getChildWithReln(dependencies.getFirstRoot(), target));
then I get the desired result, confirming that the parsing worked fine (I also know this from printing out the full dependencies).
I suspect my problem has to do with the switch to universal dependencies, but I don't know how to create the GrammaticalRelation from scratch in a way that will match what the dependency parser found.
Since version 3.5.2 the default dependency representation in CoreNLP is Universal Dependencies. This new representation is implemented in a different class (UniversalEnglishGrammaticalRelations), so relation constants such as NOMINAL_SUBJECT are now defined there instead.
All you have to do to use the new version is to replace EnglishGrammaticalRelations with UniversalEnglishGrammaticalRelations:
System.out.println("subject: "+dependencies.getChildWithReln(dependencies.getFirstRoot(), UniversalEnglishGrammaticalRelations.NOMINAL_SUBJECT));
Note, however, that some relations in the new representation are different and might no longer exist (nsubj still does). We are currently compiling migration guidelines from the old representation to the new Universal Dependencies relations. The guide is still incomplete, but it already contains all relation names and their class names in CoreNLP.

"Compilation failed for data model at path" when compiling Core Data model after upgrading to lion

After upgrading to Lion, the following error prevents successful compilation of a Core Data model:
core-data-model/MyModel.xcdatamodeld:0: error: Compilation
failed for data model at path
'resources/MyModel.momd/MyModel.mom'
This is the result of executing the following command:
/Developer/usr/bin/momc core-data-model/MyModel.xcdatamodeld resources/MyModel.momd
Note that this command is executed in a custom build script, independently of Xcode, and that it ran without problems before upgrading to Lion.
I've read of model compilation errors after upgrading to Lion (for instance see this question), but the solutions detailed there do not seem to apply.
Anyone else encounter problems manually invoking model compilation after upgrading to Lion? Any ideas? Thanks.
Figured it out - it seems like the object model compiler now expects the destination path to be absolute. This works:
/Developer/usr/bin/momc core-data-model/MyModel.xcdatamodeld
/Users/amos/projects/my-project/resources/MyModel.momd
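For anyone driving momc from a script, a minimal sketch of the same idea, assuming a Python build script (the paths are the ones from the question) that resolves the destination to an absolute path before invoking momc:

import os
import subprocess

src = "core-data-model/MyModel.xcdatamodeld"
# Resolve the destination to an absolute path, which momc now appears to require
dst = os.path.abspath("resources/MyModel.momd")

subprocess.check_call(["/Developer/usr/bin/momc", src, dst])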
I was also getting this error because I had a bad inverse relationship in my model:
I fixed it by splitting it into two inverse relationships:
