When I try to run a report in Cognos Report Studio, I get the error: ORA-00918: column ambiguously defined
There is no way to get the runtime SQL and test it out against the Oracle DB, so I am left groping around.
My question is: when we develop the model in Framework Manager, we do not write our own SQL; we just specify the tables, columns, and joins. So this error should never occur, because it only happens when a column name that exists in more than one joined table is referenced without a table-alias prefix (for example, selecting STATUS when both joined tables have a STATUS column).
I agree it is odd. Start removing data items from the query until it works, and try to narrow it down to a specific data item within a table. Then see what SQL is generated without the offending field; that should give you a hint as to what is going on.
Also posted on Super User.
I'm a Spotfire novice trying to create a parameterized information link. The ultimate goal is to create a default template that can be customized to return specific rows from a very large table. I've not been able to cobble together enough information from online searches to get me from point A to Z.
Spotfire version is 7.11 on an Oracle 11.2 SE DB.
Currently I've got a date/time prompt in the info link that will be global to all users. What I need is to filter further on one of two columns (one is a real, the other a string) in order to minimize loading times. There are 17 other on-demand tables related to the main one, so limiting the initial query will greatly speed up performance.
In Information Designer, if I edit the SQL of the information link's WHERE clause and explicitly define the value or string for the column, I get the rows I want. When I try to define it using an input parameter (?ParamName), I either get nothing when I reload or get asked to input a parameter "for testing".
Q1: In the document properties for the analysis, I've been adding properties that I assume are supposed to get picked up by the query.
- What part do scripts play in passing this variable to the SQL?
- Do I just need to define a value for a property name, or include an IronPython script?
- If a script is required, can I just define the parameter to pass?
Q2: In the info link SQL, what is the correct syntax for defining the parameter variable depending on its type (real vs. string)? If I use a string, I need to include LIKE in order to pick up the desired rows. If I use a real, is it possible to define it as a list of values?
Thanks in advance.
Though it's not exactly clear from your description, I think you should be able to accomplish your goals using the "Load on demand" dialog, which is accessed either when you add your data table to your analysis or later via Data Table Properties > Type of Data > Settings.
Spotfire uses this dialog to dynamically modify your SQL. Thus, you do not need to explicitly include the LIKE statement in your SQL. Spotfire will add it in based on what you define in the On-Demand settings. For example, you could have an Input Field where you type a constraint that will be stored as a Document Property and then refer to that Document Property in your On-Demand settings to control the table loading.
I'm trying to create a report in Report Designer. It uses three tables: SOShipment, SOShipLine, and Inventory Item. But when I try to run it, I get the following error:
System.Exception: The table SOShipLine does not exist.
But SOShipLine definitely exists. It's a core part of Acumatica, I got it from the instance in the Schema Builder, and I even double-checked the database to be absolutely sure that it exists. What's going on here?
I figured out the problem: I had an invalid filter. I had Value1 set to [#StartDate] when it should have been either #StartDate or =[#StartDate]. The error it gave me was very confusing, though.
I need to get the product description manually from the database, so please suggest which table contains it.
My Finding
The productslp table contains a p_description column, but it is a CLOB datatype and I am unable to get the data from it.
I would suggest you raise a support ticket with SAP Product Support, since it may be a matter of having the correct Oracle JDBC driver.
We use HANA DB and had the same issue; they provided me with an updated driver and that resolved the issue for me.
Alternatively, you can check the description in Backoffice/hMC if that is an acceptable solution.
You can get it using the SQL tab in hAC with this query:
SELECT ps.p_description, ps.p_summary FROM products AS p JOIN productslp AS ps ON p.pk = ps.itempk WHERE p.pk = 8796158722049
TL;DR: How do you add a full-text index using Entity Framework 5 code-based migrations?
I'm having issues adding a full-text index to a database using Entity Framework migrations. It needs to be there from the start, so I'm attempting to modify the automatically generated InitialCreate migration to add it.
As there isn't a way to do it via the DbMigration API, I've resorted to running inline SQL at the end of the Up method.
Sql("create fulltext catalog AppNameCatalog;");
Sql("create fulltext index on Document (Data type column Extension) key index [PK_dbo.Document] on AppNameCatalog;");
When this runs, everything gets created fine until it reaches this SQL, at which point it throws the SQL error 'CREATE FULLTEXT CATALOG statement cannot be used inside a user transaction.' That is expected and working as designed.
Thankfully, Sql() has an overload that allows you to run the SQL outside the migration transaction. Awesome, I thought!
Sql("create fulltext catalog AppNameCatalog;", true);
Sql("create fulltext index on Document (Data type column Extension) key index [PK_dbo.Document] on AppNameCatalog;", true);
But lo and behold, modifying the code to do this (see above) results in a new timeout error: 'Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.'
I've tried spitting out the SQL and running it manually, and it works fine. I've also diffed the generated SQL with and without running it outside a transaction, and the two are identical, so it must be something in the way the SQL is executed.
Thanks in advance for any help!
I had a similar problem. My InitialCreate migration was creating a table and then attempting to add a full-text index to that table, using the overloaded Sql() to indicate that it needs to execute outside the transaction. I was also getting a timeout error, and I suspect it was due to a thread deadlock.
I could get it to work in some scenarios by using Sql() calls instead of CreateTable() and by merging the CREATE FULLTEXT CATALOG and CREATE FULLTEXT INDEX statements into a single Sql() call. However, this wasn't very reliable: sometimes it would work and sometimes it would fail with the same timeout error.
The only reliable solution I found was to move the creation of the catalog and full-text index into a separate migration, along the lines of the sketch below.
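For reference, here is a minimal sketch of that approach under EF5 code-based migrations; the migration class name is hypothetical, and the SQL statements are the ones from the question.

using System.Data.Entity.Migrations;

// Hypothetical follow-up migration, added after InitialCreate has been applied.
public partial class AddFullTextIndex : DbMigration
{
    public override void Up()
    {
        // suppressTransaction: true runs each statement outside the migration's transaction,
        // which the full-text DDL statements require.
        Sql("CREATE FULLTEXT CATALOG AppNameCatalog;", suppressTransaction: true);
        Sql("CREATE FULLTEXT INDEX ON Document (Data TYPE COLUMN Extension) KEY INDEX [PK_dbo.Document] ON AppNameCatalog;", suppressTransaction: true);
    }

    public override void Down()
    {
        Sql("DROP FULLTEXT INDEX ON Document;", suppressTransaction: true);
        Sql("DROP FULLTEXT CATALOG AppNameCatalog;", suppressTransaction: true);
    }
}

Because these statements run in their own migration, the Document table created by InitialCreate has already been committed by the time the full-text commands execute.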
I've got a Subsonic query that isn't returning any values. I think the problem is in my where clause, although I'm not sure why.
What I'm really looking for is a way to debug the query and see the SQL that's actually being emitted by Subsonic. I know there was a way to do this with the Query object using Inspect(), but I'm using Select objects (or could probably use SqlQuerys) because I need joins. Is there an Inspect()-type option for a Subsonic Select?
Here's the code I'm using:
Dim qry As New [Select]("Contract_NO")
qry.From(<table1>.Schema)
qry.InnerJoin(<table2>.<table2columnA>, <table1>.<table1columnA>)
qry.Where(NonInfoleaseLessor.Columns.LessorCode).Like("mystring")
If I comment out the Where line, I get a full list of results. It doesn't like something about that clause, but I've manually run the query against the database with the same WHERE clause and it works. How can I see what the difference is?
The problem with your query is that you should be using Contains("mystring") instead of Like("mystring").
The best way to see the SQL is to use the BuildSqlStatement() method of the query.
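For example, here is a rough C# sketch of the query from the question (the original is VB.NET; Table1, Table2, and their ...Column properties are stand-ins for the generated classes the question hides behind placeholders, and the original Like() call is kept as-is):

// Rebuild the query from the question, then dump the SQL Subsonic will send to the database.
SqlQuery qry = new Select("Contract_NO");
qry.From(Table1.Schema);
qry.InnerJoin(Table2.ColumnAColumn, Table1.ColumnAColumn);
qry.Where(NonInfoleaseLessor.Columns.LessorCode).Like("%mystring%");

Console.WriteLine(qry.BuildSqlStatement());  // prints the generated SQL for inspection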
Use a profiler to see what SQL is actually being executed against the database.
As Adam spotted:
.Like("mystring")
should most probably be
.Like("%mystring%")
Please try using Like("%mystring%").
It might have something to do with your choice of clause, or with which column name form you are using. Subsonic has a couple of column name forms:
OBJECT.xyzColumn
OBJECT.xyzColumn.QualifiedName
OBJECT.Columns.xyz
I have had to play with these in the past to get the values I wanted.
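To make that concrete, here is a hedged C# fragment using the column from the question; the LessorCodeColumn property name follows Subsonic's usual code-generation pattern and is an assumption on my part:

// Alternative ways to reference the same column in the Where clause; use one at a time.
qry.Where(NonInfoleaseLessor.LessorCodeColumn).Like("%mystring%");                  // generated column object
// qry.Where(NonInfoleaseLessor.LessorCodeColumn.QualifiedName).Like("%mystring%"); // fully qualified column name
// qry.Where(NonInfoleaseLessor.Columns.LessorCode).Like("%mystring%");             // plain column name string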