I have a requirement in which I have to break a table into two parts, like...
Advised not to use a style class.
I have the option of the separateRows attribute on af:column. Please share sample code for separateRows on af:column.
Another option we are considering is using styleClass="#{row.dataRow?'':'AFTableCellSubtotal'}".
I have written an article in Google Docs.
I have included small tables, big tables and huge tables in different places in the file.
Now I need to modify some properties of all the tables at once.
But that does not seem to be possible?
Is there any way to modify the properties of all the tables at once in Google Docs?
P.S. More details to illustrate my issue:
1. Here is a doc file with one table.
2. Right-click on the table and choose Table properties.
3. Now suppose there are more tables in the doc file.
How can I deal with all the tables together? (All the modifications are the same.)
Method 1
When creating the tables, you can simply set all the properties on the first one, and for the next ones you can copy and paste the first one, since the formatting will be kept.
Method 2
If you want to modify more tables at the same time, you can make use of Apps Script.
Apps Script is a powerful development platform that can be used to build web apps and automate tasks. What makes it special is that it is easy to use and easy to create applications that integrate with G Suite.
Therefore, your task can be achieved by using a script such as the one below.
Snippet
function setTableProperties() {
  var doc = DocumentApp.openById("DOCUMENT_ID");
  var tables = doc.getBody().getTables();
  tables.forEach((table) => {
    // Any instruction run with the variable table will be executed for all tables.
  });
}
Explanation
The above script gathers all the tables from the desired document and then iterates over them, accessing each table in the document in turn.
To set the table properties as desired, you just have to call the appropriate method(s) inside the loop.
The getAttributes method can be used as well, in order to see exactly which properties a table possesses.
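For example, assuming you wanted to give every table the same border (the border width and colour below are just illustrative values), the loop body could look like this:
function setAllTableBorders() {
  var doc = DocumentApp.openById("DOCUMENT_ID"); // placeholder document ID
  var tables = doc.getBody().getTables();
  tables.forEach((table) => {
    // Log the current attributes to see which properties the table has
    Logger.log(table.getAttributes());
    // Apply the same border settings to every table (example values)
    table.setBorderWidth(2);
    table.setBorderColor("#ff0000");
  });
}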
Reference
Apps Script Document Service;
Apps Script Enum Attribute;
Apps Script Table Class;
Apps Script DocumentApp Class.
Also posted on Super User:
I'm a Spotfire novice trying to create a parameterized info link. The ultimate goal is to create a default template that can be customized to return specific rows from a very large table. I've not been able to cobble together enough information from online searches to get me from point A to Z.
Spotfire version is 7.11 on an Oracle 11.2 SE DB.
Currently I've got a date/time prompt in the info link that will be global to all users. What I need is to be able to further filter on one of two columns (one is a real, the other a string) in order to minimize loading times. There are 17 other on-demand tables that are related to the main one. Limiting the initial query will greatly speed up performance.
In Information Designer for the information link, if I edit the SQL in the WHERE clause and explicitly define the value or string for the column, I get the rows I want. When I try to define it using an input parameter (?ParamName), I either get nothing when I reload or get asked to input a parameter "for testing".
Q1: In the document properties for the analysis, I've been adding properties that I assume are supposed to be picked up by the query.
- What part do scripts play in passing this variable to the SQL?
- Do I just need to define a value for a property name, or do I need to include an IronPython script?
- If a script is required, can I just define the parameter to pass?
Q2: In the info link SQL, what is the correct syntax for defining the parameter variable depending on the type (real vs. string)? If I use a string, I need to include LIKE in order to pick up the desired rows. If I use a real, is it possible to define it as a list of values?
Thanks in advance.
Though not exactly clear from your description, I think you should be able to accomplish your goals using the "Load on demand" dialog that is accessed either when you add your data table to your analysis, or subsequently using the Data Table Properties>Type of Data>Settings dialog.
Spotfire uses this dialog to dynamically modify your SQL. Thus, you do not need to explicitly include the LIKE statement in your SQL. Spotfire will add it in based on what you define in the On-Demand settings. For example, you could have an Input Field where you type a constraint that will be stored as a Document Property and then refer to that Document Property in your On-Demand settings to control the table loading.
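If you do end up driving this from an IronPython script rather than typing into an Input Field directly, the script only needs to set the Document Property that your On-Demand settings reference; a minimal sketch, with MyConstraint as a placeholder property name:
# Set the document property that the On-Demand settings reference.
# "MyConstraint" is a placeholder; use your own property name and value.
Document.Properties["MyConstraint"] = "ABC%"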
I've created new Segmented Keys in Acumatica for use in a specific module. I would like to assign the Dimension name dynamically, but I noticed it works only with hard-coded names like [PXDimension("VENDOR")].
Also, I have run into a limitation when creating an IF conditional inside the customized field: it does not recognize the IF clause (see the image).
I would appreciate any suggestions on how to solve this issue.
I haven't seen how your original attempt at PXDimension looked, but I'm going to take a guess and assume you tried to reference a new custom field contained in a setup table, something like:
[PXDimension(typeof(XXMySetup.usrMyCustomField))]
If that's indeed what you tried to do, one very important thing to do is to ensure that you have a view for your table in your graph, otherwise the attribute will not find the table and record in your cache. For instance:
public PXSetup<XXMySetup> XXMySetup;
Without this view declared in the graph, the dimension attribute will not work as expected. It would be nice if a clear exception was thrown in this case - I made the same mistake recently and it would have been helpful.
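Putting both pieces together, a minimal sketch could look like this (XXMySetup and usrMyCustomField are the hypothetical setup table and field from above; the DAC field and graph names are likewise placeholders):
// DAC field whose dimension name is taken from the setup record
[PXDBString(30, IsUnicode = true)]
[PXDimension(typeof(XXMySetup.usrMyCustomField))]
public virtual string MySegmentedValue { get; set; }
public abstract class mySegmentedValue : PX.Data.IBqlField { }

// In the graph that uses the field, declare the setup view so the
// attribute can find the XXMySetup record in the cache.
public class XXMyEntryGraph : PXGraph<XXMyEntryGraph>
{
    public PXSetup<XXMySetup> XXMySetup;
}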
I do not know whether this question even makes sense; please bear with me if it sounds crazy. I am working on an XPages application where I need to do two things: add picklist functionality and bind dynamic data like field_1, field_2, field_3, ... up to n, depending on the customer's choice. I am using composite data for both custom controls. I could drop the picklist control's composite data and do it by passing scope variables instead, but that takes more time than using composite data.
I do not get any error, but the bound documents are not being saved.
Is it possible to include CCs that have composite data like this?
Code for the first CC:
<xc:viewpicklist datasrc="view1" dialogID="dialog1" dialogWidth="700px" dialogTitle="Pick this field value!!!">
  <xc:this.viewColumn>
    <xp:value>0</xp:value>
    <xp:value>1</xp:value>
    <xp:value>2</xp:value>
  </xc:this.viewColumn>
</xc:viewpicklist>
Code for the second CC:
<xc:BOM_Partinfo BOM_Partinfo="#{document1}"
TNUM="field#{index+1}" Desc="Desc#{index+1}" quan="Ea#{index+1}"
exp="exp#{index+1}" cap="cap#{index+1}" total="price#{index+1}"
RD="RD#{index+1}" m="manufact#{index+1}"
m_n="manufactnum#{index+1}">
</xc:BOM_Partinfo>
You can read information that is set in the properties of a custom control if it was set statically in the calling page:
var x = getComponent("yourcomponentid");
x.getPropertyMap().get("parametername");
but you want to propagate a data source from the outer control to the inner control...
You need to plan carefully. If you hand over the data source, then your custom control is dependent on a fixed set of fields in the data source (that would be a parameter of type com.ibm.xsp.model.DocumentDataSource). This would violate encapsulation principles. So I would recommend you actually hand over data bindings instead - the advantage: you are very flexible in what you bind to (not only data sources, but also beans and scope variables would work). The trick is that you provide the binding name as you would statically type it in (e.g. "document1.subject" or "requestScope.bla"). In your control you then do
${"#{compositeData.field1}"}
${"#{compositeData.field2}"}
You need one for each field.
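On the calling page, those properties then become plain strings naming the bindings, for example (field1 and field2 are hypothetical custom control properties matching the compositeData expressions above; the bindings reuse your document1 data source):
<xc:BOM_Partinfo field1="document1.TNUM" field2="document1.Desc"></xc:BOM_Partinfo>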
You cannot send a document data source to a custom control using composite data parameters.
You can try and use this script instead
http://openntf.org/XSnippets.nsf/snippet.xsp?id=access-datasources-of-custom-controls
Define the data source in the XPage/CC where you want those CCs. Define a parameter "dataSourceName" for both CCs. Inside each of them, use the EL expression "requestScope[compositeData.dataSourceName].fieldName" everywhere you want to bind to the data source.
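A rough sketch of that approach (the form name, field name, and control names are placeholders):
<!-- Calling XPage: define the data source and pass its variable name -->
<xp:this.data>
  <xp:dominoDocument var="document1" formName="BOM"></xp:dominoDocument>
</xp:this.data>
<xc:BOM_Partinfo dataSourceName="document1"></xc:BOM_Partinfo>

<!-- Inside the CC: resolve the data source through the passed-in name -->
<xp:inputText value="#{requestScope[compositeData.dataSourceName].TNUM}"></xp:inputText>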
I'm starting to do some tests on SubSonic 3 and I'm missing some stuff.
1st: Where are the table name constants? The place where we could ask for the name of a certain table using IntelliSense...
2nd: Same as the above but for the table columns... where are they?
This is very useful, mostly when you need to pass those names as strings... if you need to refactor your DB, you don't need to look through all the code to find where you were using that column!! Once you regenerate the code, the compiler tells you!
3rd: Now how can I perform an ExecuteReader on a certain table like I'm used to doing in 2.x through the Query object? I used this a lot for lists where I really don't need the business object (BO) overhead... When I needed a BO (for showing a grid row's details) I created it from the row itself...
I'm using ActiveRecord btw...
Thanks guys!
Alex
1st: Where are the table name constants? The place where we could ask for the name of a certain table using IntelliSense...
In Structs.tt find the following line of code at line 47:
<# foreach(var col in tbl.Columns){#>
Add the following code above it:
public static string TableName { get { return "<#=tbl.Name#>"; } }
Now you'll have a property that returns the name of the table.
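After regenerating, the constant shows up with IntelliSense on each generated table class, for example (ProductsTable is a hypothetical generated name; use whatever Structs.tt emits for your table):
// ProductsTable is a hypothetical generated class name
string tableName = ProductsTable.TableName;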
2nd: Same as the above but for the table columns... where are they?
In the generated Structs.cs file; this is included in the 3.0.0.3 version.
3rd: Now how can I perform an ExecuteReader on a certain table like I'm used to doing in 2.x through the Query object? I used this a lot for lists where I really don't need the business object (BO) overhead... When I needed a BO (for showing a grid row's details) I created it from the row itself...
If you're using the SqlQuery object you can call ExecuteReader on it. Alternatively, you can use LINQ syntax to return custom-shaped objects and they'll get mapped automatically.
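For the LINQ route with the ActiveRecord templates, a minimal sketch might look like this (Product, ProductID and ProductName are hypothetical generated names; the anonymous type keeps you from materializing full business objects):
// Requires System.Linq. Names are placeholders for your own schema.
var rows = Product.All()
                  .Select(p => new { p.ProductID, p.ProductName })
                  .ToList();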
1st and 2nd: It's not implemented in the default tt-files.
A similar question:
SubSonic 3 Simple Query Tool
The problem is that it's not a correct implementation if you want the 2.x way: the XColumn properties used to be column objects, not string constants; those were found under the Columns struct. So I hope that check-in will not be accepted and that someone will 2.x-ify it correctly.
Anyway, as you can see, it seems pretty easy to fix on your own.