I'm trying to create a new content type in a migration that has a ContainerPart attached with some settings already applied.
ContentDefinitionManager.AlterTypeDefinition("NewType",
cfg => cfg
.DisplayedAs("New Type")
.WithPart(typeof(TitlePart).Name)
.WithPart(typeof(ContainerPart).Name)
.WithSetting("ContainerPartSettings.ItemsShownDefault", "False")
.WithPart(typeof(CommonPart).Name)
.WithPart(typeof(IdentityPart).Name)
.Creatable()
.Listable()
);
ItemsShownDefault, after the migration, stays at its default value of True.
I've tried a few different variations of this:
Renaming "ContainerPartSettings" to other versions like ContainerSettings, ContainerTypePartSettings
Not using the typeof() function and specifying the part with a direct string
From what I can tell, ContainerSettings stores its values differently than other settings such as AutorouteSettings.
Looking at your code, you are chaining the setting onto your type instead of the part:
.WithPart(typeof(ContainerPart).Name) // the WithPart call closes here,
// which means the following line is chained onto the type
.WithSetting("ContainerPartSettings.ItemsShownDefault", "False")
To fix this, do it like this:
.WithPart(typeof(ContainerPart).Name, part => part
// inside the ContainerPart context
.WithSetting("ContainerPartSettings.ItemsShownDefault", "False")
) // Close ContainerPart context
.WithPart(..)
If you did mean to set the setting on the type instead of the part, use the ContainerTypePartSettings class:
.WithPart(typeof(ContainerPart).Name)
// Set settings on the type instead of the part
.WithSetting("ContainerTypePartSettings.ItemsShownDefault", false)
Suppose I have two tables, USER_GROUP and USER_GROUP_DATASOURCE. I have a classic relation where one userGroup can have multiple dataSources, and a dataSource is simply a String.
For various reasons I have a custom RecordMapper creating a Java UserGroup POJO (mainly compatibility with the rest of the codebase, and being explicit about what's happening). This mapper sometimes creates POJOs containing data only from the USER_GROUP table, and sometimes also the left-joined dataSources.
Currently, I am trying to write the Multiset query along with the custom record mapper. My query thus far looks like this:
List<UserGroup> userGroups = ctx
    .select(
        asterisk(),
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1))
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(new UserGroupMapper());
Now my question is: How to create the UserGroupMapper? I am stuck right here:
public class UserGroupMapper implements RecordMapper<Record, UserGroup> {

    @Override
    public UserGroup map(Record rec) {
        UserGroup grp = new UserGroup(
            rec.getValue(USER_GROUP.ID),
            rec.getValue(USER_GROUP.NAME),
            rec.getValue(USER_GROUP.DESCRIPTION),
            javaParseTags(rec.getValue(USER_GROUP.TAGS))
        );

        // Convention: if we have an additional field "datasources", we assume it to be a list of dataSources to be filled in
        if (rec.indexOf("datasources") >= 0) {
            // How to make `rec.getValue` return my List<String>????
            List<String> dataSources = ?????
            grp.dataSources.addAll(dataSources);
        }
        return grp;
    }
}
My guess is to have something like List<String> dataSources = rec.getValue(..), where I pass in a Field<List<String>>, but I have no clue how I could create such a Field<List<String>> with something like DSL.field().
How to get a type safe reference to your field from your RecordMapper
There are mostly two ways to do this:
Keep a reference to your multiset() field definition somewhere, and reuse that (see the sketch after this list). Keep in mind that every jOOQ query is a dynamic SQL query, so you can use this feature of jOOQ to assign arbitrary query fragments to local variables (or return them from methods), in order to improve code reuse.
You can just raw type cast the value, and not care about type safety. It's always an option, even if not the cleanest one.
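For illustration, here is a sketch of the first option, reusing one shared field definition in both the query and the question's mapper. It assumes static imports of the generated tables (USER_GROUP, USER_GROUP_DATASOURCE) and of org.jooq.impl.DSL.*, and it reuses the UserGroup POJO and javaParseTags() helper from the question:

public class UserGroupMapper implements RecordMapper<Record, UserGroup> {

    // Define the multiset field once, so the query and this mapper share the same typed reference.
    // Usage in the query: ctx.select(asterisk(), UserGroupMapper.DATASOURCES).from(USER_GROUP)...
    public static final Field<List<String>> DATASOURCES =
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1));

    @Override
    public UserGroup map(Record rec) {
        UserGroup grp = new UserGroup(
            rec.getValue(USER_GROUP.ID),
            rec.getValue(USER_GROUP.NAME),
            rec.getValue(USER_GROUP.DESCRIPTION),
            javaParseTags(rec.getValue(USER_GROUP.TAGS))
        );

        // Because the query projected the exact same field reference, the record
        // gives back a typed List<String> here, without any unchecked cast.
        if (rec.field(DATASOURCES) != null) {
            grp.dataSources.addAll(rec.get(DATASOURCES));
        }
        return grp;
    }
}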
How to improve your query
Unless you're re-using that RecordMapper several times for different types of queries, why not use Java's type inference instead? The main reason you're not getting type information in your output is your asterisk() usage. But what if you did this instead:
List<UserGroup> userGroups = ctx
    .select(
        USER_GROUP, // Instead of asterisk()
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1))
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(r -> {
        UserGroupRecord ug = r.value1();
        List<String> list = r.value2(); // Type information available now
        // ...
    });
The above uses jOOQ 3.17+'s support for Table as SelectField; there are other ways to achieve the same thing, e.g. in jOOQ 3.16+ you can use row(USER_GROUP.fields()) instead.
The important part is that you avoid the asterisk() expression, which removes type safety. You could even convert USER_GROUP to your UserGroup type using USER_GROUP.convertFrom(r -> ...) when you project it:
List<UserGroup> userGroups = ctx
.select(
USER_GROUP.convertFrom(r -> ...),
// ...
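A hedged completion of that sketch (following the USER_GROUP.convertFrom(...) suggestion above), assuming the UserGroup constructor and javaParseTags() helper from the question, and the usual getters on the generated UserGroupRecord (getId(), getName(), etc. are assumptions based on the column names shown):

List<UserGroup> userGroups = ctx
    .select(
        // Convert the table projection straight into the POJO
        USER_GROUP.convertFrom(r -> new UserGroup(
            r.getId(), r.getName(), r.getDescription(), javaParseTags(r.getTags()))),
        multiset(
            select(USER_GROUP_DATASOURCE.DATASOURCE_ID)
            .from(USER_GROUP_DATASOURCE)
            .where(USER_GROUP.ID.eq(USER_GROUP_DATASOURCE.USER_GROUP_ID))
        ).as("datasources").convertFrom(r -> r.map(Record1::value1))
    )
    .from(USER_GROUP)
    .where(condition)
    .fetch(r -> {
        UserGroup grp = r.value1();          // already a UserGroup
        grp.dataSources.addAll(r.value2());  // List<String> from the multiset
        return grp;
    });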
I originally had the following SQL function:
CREATE FUNCTION resolve_device(query JSONB) RETURNS JSONB...
and the following code calling the method generated by jOOQ:
final JsonArray jsonArray = jooqDWH.select(resolveDevice(queryJson)).fetchOne().value1().getAsJsonArray();
final JsonObject o = jsonArray.get(0).getAsJsonObject();
This worked fine. I needed to return a real device object rather than a JSON blob though, so I changed the SQL function to:
CREATE FUNCTION resolve_device(query JSONB) RETURNS SETOF device...
and the code to:
final ResolveDeviceRecord deviceRecord = jooqDWH.fetchOne(resolveDevice(queryJson));
but I am getting a runtime error:
org.jooq.exception.SQLDialectNotSupportedException: Type class com.google.gson.JsonElement is not supported in dialect DEFAULT
Many other parts of my code continue to work fine with the custom binding I have converting JsonElement to JSONB, but something about the change to this function's signature caused it to stop working.
I tried a few different variants of DSL.field() and DSL.val() to try to force it to be recognized but have not had any luck so far.
This could be a bug in jOOQ or a misconfiguration in your code generator. I'll update my answer once it is clear what went wrong.
Workaround:
Meanwhile, here's a workaround using plain SQL:
// Manually create a data type from your custom JSONB binding first:
final DataType<JsonObject> jsonb = SQLDataType.OTHER.asConvertedDataType(jsonBinding);
// Then, create an explicit bind variable using that data type:
final ResolveDeviceRecord deviceRecord =
jooqDWH.fetchOptional(table("resolve_device({0})", val(queryJson, jsonb)))
.map(r -> r.into(ResolveDeviceRecord.class))
.orElse(null);
In my Migrations.cs file I'm attaching an AutoroutePart to a Content Item.
ContentDefinitionManager.AlterTypeDefinition("Faq", type => type
.WithPart("AutoroutePart")
);
How can I tell the AutoroutePart to use some arbitrary string + the Content Item's slug as the custom path? I know I can change the settings of fields using .WithSetting("FieldSettings.Setting","Value") but this doesn't seem to be an option for Parts. I also wouldn't know how to refer to the SlugToken in code.
This should be possible:
ContentDefinitionManager.AlterTypeDefinition("Faq", type => type
.WithPart("AutoroutePart", part => part
.WithSetting("AutorouteSettings.AllowCustomPattern", "True")
.WithSetting("AutorouteSettings.AutomaticAdjustmentOnEdit", "False")
.WithSetting("AutorouteSettings.PatternDefinitions", "[{\"Name\":\"Title\",\"Pattern\":\"{Content.Slug}\",\"Description\":\"my-faq\"}]")
.WithSetting("AutorouteSettings.DefaultPatternIndex", "0"))));
I'm stuck trying to pass multiple arguments to a define.
The following is my code. I would like to pass two arrays to the define, but I'm only able to pass one, as shown below.
class test {
  $path     = [$path1, $path2]
  $filename = [$name1, $name2]
  testscript { $filename: } # Able to pass one value.
}

define testscript () {
  file { "/etc/init.d/${title}":  # Able to receive the file name.
    ensure  => file,
    content => template('test/test.conf.erb'),
  }
}
With the above code I can retrieve the filename inside the defined resource, but I also need the path to set a value in the template. I'm not able to pass or retrieve the second argument in the template.
Is there any way to improve my code to pass two values ($path and $filename) into the define resource?
Any help is much appreciated.
Puppet has good documentation, which covers this area well.
To begin, you need to appreciate that a defined type is a resource type, in almost every way analogous to any built-in or extension type. If your defined type accepts parameters, then you bind values to those parameters just as you would in any other resource declaration. For example:
class mymodule::test {
mymodule::testscript { $name1: path => $path1 }
mymodule::testscript { $name2: path => $path2 }
}
define mymodule::testscript ($path) {
file {"${path}/${title}":
ensure => 'file',
content => template('test/test.conf.erb')
}
}
Additionally, because defined types are resource types, you should discard the concept of "passing" values to them as if they were functions. That mental model is likely to betray you. In particular, it will certainly give you the wrong expectation about what would happen if you specified an array or a hash as your resource title.
Specifically, in any resource declaration, if you give the resource title as an array, that means a separate resource for each array member, with the array member as that resource's title. In that case, every one of those resources receives the same parameter values, as declared in the body of the declaration. Moreover, resource titles are always strings: except for the one level of arrays described above, anything else you give as a resource title will be converted to a string.
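For illustration, a declaration with an array title (the titles and path here are hypothetical) expands to one resource per array member, each receiving the same parameter values:

# Declares two mymodule::testscript resources, titled 'foo' and 'bar',
# both bound to the same $path value.
mymodule::testscript { ['foo', 'bar']:
  path => '/etc/init.d',
}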
I created an ActivityNode (an Entry), and I can add custom fields with the setFields(List<Field> newListField) function.
BUT
I am unable to modify these fields (in this case, I am trying to modify the value of the field named LIBENTITE).
FieldList list = myEntry.getTextFields();
List<Field> updatedList = new ArrayList<Field>();

// I add each old field in the new list, but I modify the field LIBENTITE
for (Field myField : list) {
    if (myField.getName().equals("LIBENTITE")) {
        ((TextField) myField).setTextSummary("New value");
    }
    updatedList.add(myField);
}

myEntry.setFields(updatedList);
activityService.updateActivityNode(myEntry);
This code should replace the old list of fields with the new one, but I can't see any change to the custom field LIBENTITE of myEntry in IBM Connections.
So I tried to create a new list of fields, not modifying my field but adding a new one:
for (Field myField : list) {
    if (!myField.getName().equals("LIBENTITE")) {
        updatedList.add(myField);
    }
}
Field newTextField = new TextField("New Value");
newTextField.setFieldName("LIBENTITE");
updatedList.add(newTextField);
And this code just adds the new field to myEntry. What I see is that the other custom fields did not change, and I now have two custom fields named LIBENTITE in myEntry, one with the old value and one with the new value.
So I thought that maybe if I cleared the old list of fields and then added the new one, it would work.
I tried the two functions
myEntry.clearFieldsMap();
and
myEntry.remove("LIBENTITE");
but neither of them seems to work; I still can't remove a custom field from myEntry using the SBT.
Any suggestions ?
I have two suggestions, as I had (or have) similar problems:
If you want to update an existing text field in an activity node, you have to call node.setField(fld) to update the field in the node object.
Code snippet from my working application, where I'm updating a text field containing a (computed) start time:
ActivityNode node = activityService.getActivityNode(id);
node.setTitle(formatTitle()); // add/update start and end time in title
boolean startFound = false;
// ...
FieldList textfields = node.getTextFields();
Iterator<Field> iterFields = textfields.iterator();
while (iterFields.hasNext()) {
    TextField fld = (TextField) iterFields.next();
    if (fld.getName().equals(Constants.FIELDNAME_STARTTIME)) {
        fld.setTextSummary(this.getStartTimeString()); // NOTE: .setFieldValue does *not* work
        node.setField(fld); // write updated field back. This seems to be the only way updating fields works
        startFound = true;
    }
}
If there is no field with that name, I create a new one (that's the reason I'm using the startFound boolean variable).
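A sketch of that "create if missing" branch, reusing the names from the snippet above (Constants.FIELDNAME_STARTTIME, node and activityService come from the author's code and are assumed here):

if (!startFound) {
    // No start time field yet: create one, name it, and attach it to the node
    TextField startField = new TextField(this.getStartTimeString());
    startField.setFieldName(Constants.FIELDNAME_STARTTIME);
    node.setField(startField);
}
activityService.updateActivityNode(node); // persist the changes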
I think that the node.setField(fld) should do the trick. If not, there might be a way to sidestep the problem:
You have access to the underlying DOM object which was parsed in. You can use this to tweak the DOM object, which finally will be written back to Connections.
I had to use this as there seems to be another nasty bug in the SBT SDK: If you read in a text field which has no value, and write it back, an error will be thrown. Looks like the DOM object misses some required nodes, so you have to create them yourself to avoid the error.
Some code to demonstrate this:
// ....
} else if (null == fld.getTextSummary()) { // a text field without any contents. Which is BAD!
    // there is a bug in the SBT API: if we read a field which has no value
    // and try to write the node back (even without touching the field) a NullPointerException
    // will be thrown. It seems that there is no value node set for the field. We
    // can't set a value with fld.setTextSummary(), the error will still be thrown.
    // therefore we have to remove the field, and - optionally - we set a defined "empty" value
    // to avoid the problem.
    // node.remove(fld.getName()); // remove the field -- this does *not* work! At least not for empty fields
    // so we have to do it the hard way: we delete the node of the field in the cached dom structure
    String fieldName = fld.getName();
    DeferredElementNSImpl fldData = (DeferredElementNSImpl) fld.getDataHandler().getData();
    fldData.getParentNode().removeChild(fldData); // remove the field from the cached dom structure, therefore delete it
    // and create it again, but with a substitute value
    Field newEmptyField = new TextField(Constants.FIELD_TEXTFIELD_EMPTY_VALUE); // create a field with a placeholder value
    newEmptyField.setFieldName(fieldName);
    node.setField(newEmptyField);
}
Hope that helps.
Just so this post does not stay unanswered, here is the answer that was given in a comment on the initial question:
"currently, there is no solution to this issue, the TextFields are read-only map. we have the issue recorded on github.com/OpenNTF/SocialSDK/issues/1657"