Spotfire Data Table Write Back to Database

Can I update columns in my data table and write them back to the database? Basically, it is only one column that I have to update with a Y/N flag.

You can do this with an IronPython script; however, it is not supported by TIBCO and does not use the information model you create in the Information Designer.
More here: http://spotfired.blogspot.com/2014/04/write-back-to-database-from-spotfire.html
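As a rough illustration of the approach, here is a minimal IronPython sketch that issues an UPDATE through ADO.NET. It assumes SQL Server; the connection string, table, column names, and key value are placeholders to replace with your own:
# Minimal, unsupported sketch: update a Y/N flag directly in the database from an IronPython script.
import clr
clr.AddReference("System.Data")
from System.Data.SqlClient import SqlConnection, SqlCommand
conn = SqlConnection("Server=myserver;Database=mydb;Integrated Security=SSPI")  # placeholder connection string
conn.Open()
try:
    cmd = SqlCommand("UPDATE dbo.MyTable SET ApprovedFlag = @flag WHERE RecordId = @id", conn)  # placeholder table/columns
    cmd.Parameters.AddWithValue("@flag", "Y")
    cmd.Parameters.AddWithValue("@id", 12345)  # placeholder key value
    print "Rows updated:", cmd.ExecuteNonQuery()
finally:
    conn.Close()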

Three possible ways that I can think of:
1. Write an IronPython script and update the flags based on the marking (see the sketch after this list).
2. Use the standard HTML wrapper that holds your Spotfire analysis, and create a JavaScript function in the HTML that takes as input the key identifiers whose flags you want to update and in turn calls a stored procedure to switch the flags.
3. Leverage TSSS (TIBCO Spotfire Statistics Services) and write a data function that passes the SQL to sjdbc, which connects to the database and executes the statements.
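To flesh out the first option, here is a rough sketch that collects the key values of the marked rows and flips the flag for each; the table name, marking name, column names, and connection string are assumptions you would adapt:
# Sketch: read key values from the marked rows, then update the flag row by row (all names are placeholders).
import clr
clr.AddReference("System.Data")
from System.Data.SqlClient import SqlConnection, SqlCommand
from Spotfire.Dxp.Data import DataValueCursor
table = Document.Data.Tables["MainTable"]          # placeholder table name
marking = Document.Data.Markings["Marking"]        # default marking name
cursor = DataValueCursor.CreateFormatted(table.Columns["RecordId"])  # placeholder key column
keys = []
for row in table.GetRows(marking.GetSelection(table).AsIndexSet(), cursor):
    keys.append(cursor.CurrentValue)
conn = SqlConnection("Server=myserver;Database=mydb;Integrated Security=SSPI")  # placeholder
conn.Open()
try:
    for key in keys:
        cmd = SqlCommand("UPDATE dbo.MyTable SET ApprovedFlag = @flag WHERE RecordId = @id", conn)
        cmd.Parameters.AddWithValue("@flag", "Y")
        cmd.Parameters.AddWithValue("@id", key)
        cmd.ExecuteNonQuery()
finally:
    conn.Close()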

This is not possible out-of-the-box. You would have to customize the product to attain such functionality.

Related

Extension Library REST control in XPages

I am using the Extension Library REST control's viewJsonService to provide data from a Notes database. Is there an easy way, using the same control, to provide the data from two databases? I can put a similar view from dbA into dbB.
Short answer: no
Long answer:
I presume you want the data from two views available in a single endpoint, either one after the other (appended) or somehow merged.
You can do that using code. Check this article for basic info.
In a nutshell: use the ViewNavigator class in both databases to retrieve the results and append or merge the data before you return it.
Nice side effect: you can return all the data.

How do I correctly set up a parameterized information link in Spotfire?

Also posted on Super User:
I'm a Spotfire novice trying to create a parameterized information link. The ultimate goal is to create a default template that can be customized to return specific rows from a very large table. I've not been able to cobble together enough information from online searches to get me from point A to Z.
The Spotfire version is 7.11, running against an Oracle 11.2 SE database.
Currently I've got a date/time prompt in the information link that will be global to all users. What I need is to be able to filter further on one of two columns (one is a real, the other a string) in order to minimize loading times. There are 17 other on-demand tables related to the main one, so limiting the initial query will greatly speed up performance.
In Information Designer, if I edit the SQL of the information link and explicitly define the value or string for the column in the WHERE clause, I get the rows I want. When I try to define it using an input parameter (?ParamName), I either get nothing when I reload or get asked to input a parameter "for testing".
Q1: In the document properties for the analysis, I've been adding properties that I assume are supposed to get picked up by the query.
- What part do scripts play in passing this variable to the SQL?
- Do I just need to define a value for a property name, or do I need to include an IronPython script?
- If a script is required, can I just define the parameter to pass?
Q2: In the information link SQL, what is the correct syntax for defining the parameter variable depending on the type (real vs. string)? If I use a string, I need to include LIKE in order to pick up the desired rows. If I use a real, is it possible to define it as a list of values?
Thanks in advance.
Though it is not exactly clear from your description, I think you should be able to accomplish your goals using the "Load on demand" dialog, which is accessed either when you add your data table to the analysis or later via Data Table Properties > Type of Data > Settings.
Spotfire uses this dialog to dynamically modify your SQL, so you do not need to explicitly include the LIKE condition in your SQL; Spotfire will add it based on what you define in the on-demand settings. For example, you could have an input field where you type a constraint that is stored as a document property, and then refer to that document property in your on-demand settings to control what the table loads.
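No script is strictly required for on-demand loading, but if you do want to set the controlling value programmatically (for example from a button), a one-line IronPython snippet is enough; the property name below is an assumption for a document property you would define yourself and reference in the on-demand settings:
# Sketch: write the value that the on-demand settings read from a document property ("FilterValue" is a placeholder name).
Document.Properties["FilterValue"] = "SOME_VALUE"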

Is it possible to save to the database without a PXGraph or a Screen?

An entry screen is not needed; all the records are generated automatically. Or can it perhaps be done by using the DAC only?
The Graph/DAC logic is preferred as you get all of the framework freebies such as field defaulting and calculated formula fields.
You can, however, get around this using PXDatabase.Insert, PXDatabase.Update, or PXDatabase.Delete.
I use these for upgrade processes or bulk deletes of processing records. These calls do not require a graph to execute, but they ignore all DAC attributes, which may or may not default values, calculate values, etc.
If you search on PXDatabase in the Acumatica code browser you can find examples. Here is one from EmployeeMaint.Location_RowPersisted:
PXDatabase.Update<Location>(
    new PXDataFieldAssign("VAPAccountLocationID", _KeyToAbort),
    new PXDataFieldRestrict("LocationID", _KeyToAbort),
    PXDataFieldRestrict.OperationSwitchAllowed);
PXDataFieldAssign is setting column values.
PXDataFieldRestrict is your where condition.
It is best to find multiple examples of PXDatabase in Acumatica and confirm your query results using a tool such as SQL Profiler, to make sure it is executing the statement you intend to run.
You can't use a DAC without a graph; all BQL queries require a PXGraph instance. The only way to save data without using BQL is to use ODBC (or any other ORM) to connect directly to the database and make your changes. This is not recommended, however, because doing it that way bypasses all of the business logic.

Read a table from Kentico database which was not declared as 'custom table'

My question is pretty simple. I am working on Kentico 9 with its SQL Server database, which contains several tables that were added directly from SQL Server Management Studio by an external contractor. Those tables are used to store custom content that will be displayed on a site, but there is no code for querying them; that is, they don't have Info and Provider classes.
https://docs.kentico.com/display/K82/Retrieving+database+data+using+ObjectQuery+API
According to this, all tables in the Kentico database can be accessed by invoking methods on these classes, but I don't have them this time.
Something like this will not work if I use my table name:
var user = UserInfoProvider.GetUserInfo("administrator");
var items = CustomTableItemProvider.GetItems("MyTable")
    .TopN(10)
    .WhereEquals("ItemCreatedBy", user.UserID)
    .OrderBy("ItemCreatedWhen");
My question is: can I query any table by its name?
One last thing: I cannot declare those tables as "custom tables" because of what seems to be a bug in the CMS.
You can pull the data using your own SQL query:
var ds = ConnectionHelper.ExecuteQuery("select ....", null, QueryTypeEnum.SQLQuery);
Nevertheless, I would recommend creating a custom class inside a custom module (much more robust than custom tables) instead, and using the generated Info and InfoProvider classes to get and manipulate the data.
I think an object has to be registered within the system (created through the Kentico UI or API) in order to be pulled from the DB with an object query.
So I'd choose one of the following options:
- Use Entity Framework or something similar to work with that data.
- Create appropriate custom tables or even a custom module and push the data there. Not sure why you can't create a custom table... What is the error you're getting?
- If you need to present the data on the UI only (without processing on the back end), just use custom queries.
Hope this helps.
If you are accessing it in code, then you could do it the good old-fashioned way. If you want to pull data from the database to display on the website, you could also do so by creating a custom query and using a transformation to display the fields, then using a repeater on the page to display the transformed data. Alternatively, you can use a SQL datasource with a basic repeater, but you still have to create a transformation to display the data. Both methods allow you to access the data in the tables from within the CMS UI, with no need to touch any code-behind.
If your objective is to read data from these database tables and transform it on a web page, e.g. using the CMS Repeater webpart, you can simply create custom queries in Kentico itself and load the data using them. You can find the details here on how to create custom queries and load data with them.
On the other hand, you can also write your own custom classes and define custom methods that pull data using your own SQL query, like this:
var ds = ConnectionHelper.ExecuteQuery("select ....", null, QueryTypeEnum.SQLQuery);
Lastly, I don't think there should be any issue creating a custom table instead of those direct DB tables. The only thing to ensure is that the code name of the custom table is unique; don't try to use exactly the same name, because that will cause an exception since a table with that name already exists in the DB. Please share the exception you are getting while creating the custom table so that I can help you out further.

How to access a connector / database from the initial/instantiation form/page?

Hi everybody, any help will be appreciated.
I tried to access it using the REST API, but the methods need the activity/task ID or the process instance ID. This is because the connector stores its result in a process/local/business data model or in variables, but in the initial form I don't have an instance of the flow/task/activity yet, so I can't access the variable that stores the value.
I need to use a connector to access the database and LDAP in order to get some values to show in the initial form, before instantiating the process.
Is there any way to call a Groovy script from the initial form? If there is, I could access the database from that script and save the value into a form variable to show it in the form, I think.
P.S.: I use Bonita 7.2.
Thanks!
Sounds like you have a chicken-and-egg problem.
Can you instantiate the process with minimal data, then use a connector out to populate the BDM with the connector data, and then make the first step of your process the "initial" form? At that point you have the case, task ID, etc.
If the data is not task/case specific, you can access the BDM data via the REST API and a custom query, i.e. you're not limited to the APIs that require the case/task/instance, etc. However, you may need to get clever with how you isolate that record. For example, I have some global parameters that I keep in the BDM and access them within my form by requesting the first record in that table via the REST API:
I created a variable called "globals" of type "External API" with the following REST call that retrieves the record with persistenceId=1:
../API/bdm/businessData/com.company.model.GlobalParameters/1
In your case, you probably need to use a REST API extension. Basically, you can create a new REST endpoint using a Groovy script. Documentation is available here: http://documentation.bonitasoft.com/rest-api-extensions-808
Cheers
