I have a datasource defined as a global element; is it possible to reference it from a Java component? I'm using Mule 3.4.
If you can, I would use setter injection:
<component>
<singleton-object class="SomeJavaComponent">
<property key="dataSource" value-ref="jdbcDataSource"/>
</singleton-object>
</component>
with a setter defined in your component similar to the following:
private DataSource dataSource;
public void setDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
Alternatively (not the nicest way), you can get it from the registry:
this.muleContext.getRegistry().lookupObject("jdbcDataSource");
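For context, a minimal sketch (my own, not from the answer above) of how that lookup could sit inside a component that implements MuleContextAware; the registry key jdbcDataSource must match the name of your global element, and javax.sql.DataSource is an assumption about its type:

import javax.sql.DataSource;
import org.mule.api.MuleContext;
import org.mule.api.context.MuleContextAware;

public class RegistryLookupComponent implements MuleContextAware {

    private MuleContext muleContext;

    @Override
    public void setMuleContext(MuleContext context) {
        // Mule injects the context when the component is initialised
        this.muleContext = context;
    }

    public DataSource lookupDataSource() {
        // "jdbcDataSource" must match the name of the global data source element
        return muleContext.getRegistry().lookupObject("jdbcDataSource");
    }
}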
Well, in reality that's not the correct answer: there is no such animal as DataSource here.
Here is a working (tested) example, Mule 3.5, standalone Mule CE 3.4.1:
<component doc:name="callDynamicQuery">
<singleton-object class="org.fiuze.trasnsformers.DynamicSql">
<property key="dataSource" value="null" value-ref="MySqlDataConnector"/>
</singleton-object>
</component>
Note that MySqlDataConnector is a jdbc:mysql-data-source (jdbc-ee:mysql-data-source) and NOT a jdbc:connector (jdbc-ee:connector).
The java class is like this:
package org.fiuze.trasnsformers;
import java.sql.SQLException;
import org.enhydra.jdbc.standard.StandardDataSource;
import java.sql.Connection;
public class DynamicSql {
private StandardDataSource dataSource;
public StandardDataSource getConn() {
return dataSource;
}
public void setConn(StandardDataSource conn) {
this.dataSource = conn;
}
public void callSql(String sql) throws SQLException {
Connection connection = this.dataSource.getConnection();
try {
connection.prepareStatement(sql).execute();
} finally {
connection.close();
}
}
}
This example will work as long as your configuration has a working dataSource.
When I tried the above answer, Mule didn't initialize the property since there was no fitting class.
I am expanding on the above answer in light of deploying this to production. The autowire functionality in Studio CE didn't seem to pick up the data source.
Instead of letting you spend hours like I did, here is the solution:
package org.fiuze.trasnsformers;
import java.sql.SQLException;
import org.enhydra.jdbc.standard.StandardDataSource;
import org.mule.api.MuleContext;
import org.mule.api.annotations.expressions.Lookup;
import org.mule.api.context.MuleContextAware;
import java.sql.Connection;
public class DynamicSql implements MuleContextAware {
@Lookup
private MuleContext muleContext;
private StandardDataSource dataSource;
public StandardDataSource getConn() {
return dataSource;
}
public void setConn(StandardDataSource conn) {
this.dataSource = conn;
}
public void callSql(String sql) throws SQLException {
if (this.dataSource == null) {
this.dataSource = this.muleContext.getRegistry().lookupObject("MySqlDataConnector");
}
Connection connection = this.dataSource.getConnection();
try {
connection.prepareStatement(sql).execute();
} finally {
connection.close();
}
}
@Override
public void setMuleContext(MuleContext context) {
this.muleContext = context;
}
}
The main idea is that we can use the MuleContext and grab the data source from the registry. Note the elegant way of pulling in the context with a simple annotation. This deployed and worked.
Finally, you could now simplify the code. If you have EE, you probably don't need to load the context and your DynamicSql class will be lightweight. If you count your coins and are running on CE, you can still do it, with a little sweat.
I am using Cassandra as a datasource in my Spring boot application and would like to initialize the database before the application starts.
Up to now, what I have done is define a class CassandraConfiguration extending the AbstractCassandraConfiguration class, as in the example you can see below, and I have a repository extending CassandraRepository. When I create the keyspace and the table myself, the application works fine.
However, I want to create the keyspace and tables automatically while the application is starting. In order to do that, I supplied a schema.cql file under the resources folder, but I could not make that script work.
Does anyone have any idea what I can do to create the keyspace(s) and tables automatically?
Thanks.
Edit: I am using Cassandra 2.0.9, spring-boot 1.3.2.RELEASE and the DataStax Cassandra driver 2.1.6.
CassandraConfiguration.java
@Configuration
@PropertySource(value = { "classpath:cassandra.properties" })
@EnableCassandraRepositories(basePackages = { "bla.bla.bla.repository" })
public class CassandraConfiguration extends AbstractCassandraConfiguration {
@Autowired
private Environment environment;
@Bean
public CassandraClusterFactoryBean cluster() {
CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
cluster.setContactPoints( environment.getProperty( "cassandra.contactpoints" ) );
cluster.setPort( Integer.parseInt( environment.getProperty( "cassandra.port" ) ) );
return cluster;
}
@Bean
public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
return new BasicCassandraMappingContext();
}
@Bean
public CassandraConverter converter() throws ClassNotFoundException {
return new MappingCassandraConverter(cassandraMapping());
}
@Override
protected String getKeyspaceName() {
return environment.getProperty( "cassandra.keyspace" );
}
@Bean
public CassandraSessionFactoryBean session() throws Exception {
CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
session.setCluster(cluster().getObject());
session.setKeyspaceName(environment.getProperty("cassandra.keyspace"));
session.setConverter(converter());
session.setSchemaAction(SchemaAction.NONE);
return session;
}
@Override
public SchemaAction getSchemaAction() {
return SchemaAction.RECREATE_DROP_UNUSED;
}
}
If you are still having problems with this: in Spring Boot 2 and SD Cassandra 2.0.3 you can use this straightforward Java configuration and set everything up out of the box.
@Configuration
@EnableCassandraRepositories(basePackages = "com.example.repository")
public class DbConfigAutoStart extends AbstractCassandraConfiguration {
/*
* Provide a contact point to the configuration.
*/
@Override
public String getContactPoints() {
return "exampleContactPointsUrl";
}
/*
* Provide a keyspace name to the configuration.
*/
@Override
public String getKeyspaceName() {
return "exampleKeyspace";
}
/*
* Automatically creates a Keyspace if it doesn't exist
*/
@Override
protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
CreateKeyspaceSpecification specification = CreateKeyspaceSpecification
.createKeyspace("exampleKeyspace").ifNotExists()
.with(KeyspaceOption.DURABLE_WRITES, true).withSimpleReplication();
return Arrays.asList(specification);
}
/*
* Automatically configure a table if doesn't exist
*/
@Override
public SchemaAction getSchemaAction() {
return SchemaAction.CREATE_IF_NOT_EXISTS;
}
/*
* Get the entity package (where the entity class has the #Table annotation)
*/
@Override
public String[] getEntityBasePackages() {
return new String[] { "com.example.entity" };
}
}
And you are good to go
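For reference, here is a hypothetical entity illustrating what would live in com.example.entity; with SchemaAction.CREATE_IF_NOT_EXISTS a matching table is generated from it (the class, table, and field names are placeholders, not from the answer):

package com.example.entity;

import org.springframework.data.cassandra.core.mapping.PrimaryKey;
import org.springframework.data.cassandra.core.mapping.Table;

// Hypothetical entity; it lives in the package returned by getEntityBasePackages()
@Table("example")
public class ExampleEntity {

    @PrimaryKey
    private String id;

    private String name;

    // getters and setters omitted
}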
Your return type BasicCassandraMappingContext might be deprecated. Use:
@Bean
public CassandraMappingContext mappingContext() throws ClassNotFoundException {
CassandraMappingContext mappingContext = new CassandraMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
return mappingContext;
}
@Override
public String[] getEntityBasePackages() {
return new String[]{"base package name of all your entities, annotated with @Table"};
}
@Override
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
return CassandraEntityClassScanner.scan(getEntityBasePackages());
}
Instead of:
@Bean
public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
return new BasicCassandraMappingContext();
}
also set:
session.setSchemaAction(SchemaAction.RECREATE_DROP_UNUSED);
and exclude:
@Override
public SchemaAction getSchemaAction() {
return SchemaAction.RECREATE_DROP_UNUSED;
}
See the reference here.
I'm working with spring-boot 1.5.10.RELEASE and Cassandra 3.0.16, but you can try downscaling the versions. To create the keyspace you can import the keyspace name from your application.yml or application.properties. Using the @Table annotation, your tables should be generated automatically, provided you have set the entity base package.
@Value("${cassandra.keyspace}")
private String keySpace;
@Override
public String[] getEntityBasePackages() {
return new String[]{"com.example.your.entities"};
}
@Override
protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
return Arrays.asList(
CreateKeyspaceSpecification.createKeyspace()
.name(keySpace)
.ifNotExists()
);
}
Finally I got it working by adding setKeyspaceCreations(getKeyspaceCreations()) to the CassandraClusterFactoryBean override; also make sure to enable @ComponentScan.
import com.datastax.driver.core.PlainTextAuthProvider;
import com.datastax.driver.core.policies.ConstantReconnectionPolicy;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.cassandra.config.*;
import org.springframework.data.cassandra.core.cql.keyspace.CreateKeyspaceSpecification;
import org.springframework.data.cassandra.core.cql.keyspace.DropKeyspaceSpecification;
import org.springframework.data.cassandra.core.cql.keyspace.KeyspaceOption;
import org.springframework.data.cassandra.repository.config.EnableReactiveCassandraRepositories;
import java.util.Arrays;
import java.util.List;
@Configuration
@EnableReactiveCassandraRepositories(basePackages = "com.company.domain.data")
public class CassandraConfig extends AbstractReactiveCassandraConfiguration {
@Value("${spring.data.cassandra.contactpoints}") private String contactPoints;
@Value("${spring.data.cassandra.port}") private int port;
@Value("${spring.data.cassandra.keyspace-name}") private String keyspace;
@Value("${spring.data.cassandra.username}") private String userName;
@Value("${spring.data.cassandra.password}") private String password;
@Value("${cassandra.basepackages}") private String basePackages;
@Override protected String getKeyspaceName() {
return keyspace;
}
@Override protected String getContactPoints() {
return contactPoints;
}
@Override protected int getPort() {
return port;
}
@Override public SchemaAction getSchemaAction() {
return SchemaAction.CREATE_IF_NOT_EXISTS;
}
@Override
public String[] getEntityBasePackages() {
return new String[]{"com.company.domain.data"};
}
@Override
public CassandraClusterFactoryBean cluster() {
PlainTextAuthProvider authProvider = new PlainTextAuthProvider(userName, password);
CassandraClusterFactoryBean cluster=new CassandraClusterFactoryBean();
cluster.setJmxReportingEnabled(false);
cluster.setContactPoints(contactPoints);
cluster.setPort(port);
cluster.setAuthProvider(authProvider);
cluster.setKeyspaceCreations(getKeyspaceCreations());
cluster.setReconnectionPolicy(new ConstantReconnectionPolicy(1000));
return cluster;
}
@Override
protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
CreateKeyspaceSpecification specification = CreateKeyspaceSpecification.createKeyspace(keyspace)
.ifNotExists()
.with(KeyspaceOption.DURABLE_WRITES, true);
return Arrays.asList(specification);
}
@Override
protected List<DropKeyspaceSpecification> getKeyspaceDrops() {
return Arrays.asList(DropKeyspaceSpecification.dropKeyspace(keyspace));
}
}
The previous answers are based on AbstractCassandraConfiguration from spring-data-cassandra. If you use spring-boot then it can auto-configure Cassandra for you and there's no need to extend AbstractCassandraConfiguration. However, even in this case you need to do some work to automatically create the keyspace. I've settled on an auto-configuration added to our company's spring-boot starter, but you can also define it as a regular configuration in your application.
/**
* create the configured keyspace before the first cqlSession is instantiated. This is guaranteed by running this
* autoconfiguration before the spring-boot one.
*/
@ConditionalOnClass(CqlSession.class)
@ConditionalOnProperty(name = "spring.data.cassandra.create-keyspace", havingValue = "true")
@AutoConfigureBefore(CassandraAutoConfiguration.class)
public class CassandraCreateKeyspaceAutoConfiguration {
private static final Logger logger = LoggerFactory.getLogger(CassandraCreateKeyspaceAutoConfiguration.class);
public CassandraCreateKeyspaceAutoConfiguration(CqlSessionBuilder cqlSessionBuilder, CassandraProperties properties) {
// It's OK to mutate cqlSessionBuilder because it has prototype scope.
try (CqlSession session = cqlSessionBuilder.withKeyspace((CqlIdentifier) null).build()) {
logger.info("Creating keyspace {} ...", properties.getKeyspaceName());
session.execute(CreateKeyspaceCqlGenerator.toCql(
CreateKeyspaceSpecification.createKeyspace(properties.getKeyspaceName()).ifNotExists()));
}
}
}
In my case I've also added a configuration property to control the creation, spring.data.cassandra.create-keyspace; you may leave it out if you don't need the flexibility.
Note that spring-boot auto-configuration depends on certain configuration properties, here's what I have in my dev environment:
spring:
  data:
    cassandra:
      keyspace-name: mykeyspace
      contact-points: 127.0.0.1
      port: 9042
      local-datacenter: datacenter1
      schema-action: CREATE_IF_NOT_EXISTS
      create-keyspace: true
More details: spring-boot and Cassandra
My Spring Data Cassandra configuration looks like this:
@Configuration
@EnableCassandraRepositories(basePackages = {
"mypackage.repository.cassandra",
})
public class DistributedRepositoryConfiguration {
// ...
@Bean
public CassandraSessionFactoryBean session() throws Exception {
CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
session.setCluster(cluster().getObject());
session.setKeyspaceName(configuration.get().getKeyspace());
session.setConverter(converter());
session.setSchemaAction(SchemaAction.CREATE);
return session;
}
}
Generally, Spring Data Cassandra works in my project. However, when I start my application no tables are created. Can anyone tell me what I'm doing wrong?
It is not spelled out well in the documentation, but if you want automatic table creation you should tell Spring Data Cassandra where to look for entity classes:
<cassandra:mapping entity-base-packages="your.package" />
If you want to do the same using annotation configuration, you have to explicitly tell CassandraTemplate where to look for them. So:
@Bean
public CassandraSessionFactoryBean session() throws Exception {
CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
session.setCluster(cluster().getObject());
session.setKeyspaceName(keyspaceName);
session.setConverter(converter());
session.setSchemaAction(SchemaAction.CREATE);
return session;
}
@Bean
public CassandraConverter converter() throws Exception {
return new MappingCassandraConverter(mappingContext());
}
@Bean
public CassandraMappingContext mappingContext() throws Exception {
BasicCassandraMappingContext bean = new BasicCassandraMappingContext();
bean.setInitialEntitySet(CassandraEntityClassScanner.scan("package.with.your.entities"));
return bean;
}
To do it with ease, I suggest using AbstractCassandraConfiguration and overriding the methods you need.
I checked the class AbstractCassandraConfiguration and found the following code:
public String[] getEntityBasePackages() {
return new String[] { getClass().getPackage().getName() };
}
Since my config class isn't in the main package, the component scan does not find my classes with the @Table annotation. So I overrode the method getEntityBasePackages() to use my StartUp class's package and everything worked fine.
This is my config class:
@Configuration
public class CassandraConfig extends AbstractCassandraConfiguration {
@Value("${spring.data.cassandra.keyspace-name}")
private String keyspaceName;
@Override
protected String getKeyspaceName() {
return keyspaceName;
}
@Override
public String[] getEntityBasePackages() {
return new String[]{AppStartup.class.getPackage().getName()};
}
@Override
protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
return Collections.singletonList(CreateKeyspaceSpecification
.createKeyspace(keyspaceName)
.ifNotExists(true)
.with(KeyspaceOption.DURABLE_WRITES, true)
.withSimpleReplication());
}
@Override
public SchemaAction getSchemaAction() {
return SchemaAction.CREATE_IF_NOT_EXISTS;
}
}
Using this class, your application should create the required keyspace and tables to run.
If you are using CassandraDataAutoConfiguration, just annotate your base application class with:
@EntityScan("mypackage.repository.cassandra")
The base package information will be used in the CassandraDataAutoConfiguration.cassandraMapping method to add this package to the Cassandra mapping.
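For illustration, a minimal sketch of where that annotation would sit; MyApplication is a placeholder for your Spring Boot application class, and the package string should point at wherever your @Table entities actually live:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;

@SpringBootApplication
@EntityScan("mypackage.repository.cassandra") // the package containing your @Table entity classes
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}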
This is a follow-up to my previous post at ToolTip Performance in XPages. I have the code written (not tested), but I can't seem to get my managed bean called properly. My config contains the following:
<managed-bean id="ToolTip">
<managed-bean-name>WFSToolTip</managed-bean-name>
<managed-bean-class>ca.workflo.wfsToolTip.ToolTipText</managed-bean-class>
<managed-bean-scope>session</managed-bean-scope>
</managed-bean>
and I have stripped my code down to the bare minimum:
package ca.workflo.wfsToolTip;
public class ToolTipText {
public String getToolTipText(String key){
return key;
}
}
My class is in the build path. I have a simple XPage with one field on it and a tooltip for that field. The code for the tooltip is:
<xe:tooltip id="tooltip1" for="inputText1">
<xe:this.label>
<![CDATA[#{javascript:WFSToolTip.getToolTipText("More Stuff");}]]>
</xe:this.label>
</xe:tooltip>
When I load the test XPage in the browser I get this error:
Error while executing JavaScript computed expression
Script interpreter error, line=1, col=12: Error calling method 'getToolTipText(string)' on java class 'ca.workflo.wfsToolTip.ToolTipText'
JavaScript code
1: WFSToolTip.getToolTipText("More Stuff");
I can't figure out why the call to getToolTipText would fail.
Can anyone see where I'm going wrong? This is my first managed bean and at the moment it is managing me rather than the other way around.
Thanks.
You need to:
- implement Serializable, which boils down to declaring it and providing a serialVersionUID
- implement Map ... a little more work
Then you use Expression Language instead of SSJS. It would look like #{WFSToolTip["More Stuff"]}
This is how such a class would look. You need to:
- adjust the view name to reflect the name you want
- the view needs to be flat, column 1 = tooltip name, column 2 = tooltip text
- somewhere (on an admin/config page) you need to call WFSToolTip.clear(); (in SSJS) after you update the values in the configuration.
The example doesn't lazy-load, since running through a view navigator once is really fast; there is no point in doing all these lookups.
Here you go:
package com.notessensei.xpages;
import java.io.Serializable;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.Vector;
import lotus.domino.Base;
import lotus.domino.Database;
import lotus.domino.NotesException;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;
import com.ibm.xsp.extlib.util.ExtLibUtil;
public class Parameters implements Serializable, Map<String, String> {
private final static String CONFIG_VIEW = "keywords";
private static final long serialVersionUID = 1L;
private final Map<String, String> internalMap = new HashMap<String, String>();
public Parameters() {
this.populateParameters(internalMap);
}
private void populateParameters(Map<String, String> theMap) {
Database d = ExtLibUtil.getCurrentDatabase();
try {
View v = d.getView(CONFIG_VIEW);
ViewEntryCollection vec = v.getAllEntries();
ViewEntry ve = vec.getFirstEntry();
ViewEntry nextVe = null;
while (ve != null) {
nextVe = vec.getNextEntry(ve);
// Load the parameters, column 0 is the key, column 1 the value
Vector colVal = ve.getColumnValues();
theMap.put(colVal.get(0).toString(), colVal.get(1).toString());
// Cleanup
this.shred(ve);
ve = nextVe;
}
// recycle, but not the current database!!!
this.shred(ve, nextVe, vec, v);
} catch (NotesException e) {
e.printStackTrace();
}
}
public void clear() {
this.internalMap.clear();
this.populateParameters(this.internalMap);
}
public boolean containsKey(Object key) {
return this.internalMap.containsKey(key);
}
public boolean containsValue(Object value) {
return this.internalMap.containsValue(value);
}
public Set<java.util.Map.Entry<String, String>> entrySet() {
return this.internalMap.entrySet();
}
public String get(Object key) {
return this.internalMap.get(key);
}
public boolean isEmpty() {
return this.internalMap.isEmpty();
}
public Set<String> keySet() {
return this.internalMap.keySet();
}
public String put(String key, String value) {
return this.internalMap.put(key, value);
}
public void putAll(Map<? extends String, ? extends String> m) {
this.internalMap.putAll(m);
}
public String remove(Object key) {
return this.internalMap.remove(key);
}
public int size() {
return this.internalMap.size();
}
public Collection<String> values() {
return this.internalMap.values();
}
private void shred(Base... morituri) {
for (Base obsoleteObject : morituri) {
if (obsoleteObject != null) {
try {
obsoleteObject.recycle();
} catch (NotesException e) {
// We don't care; we want to get rid of it anyway
} finally {
obsoleteObject = null;
}
}
}
}
}
The only difference from a regular HashMap is the constructor that populates it. Hope that clarifies it.
I've never seen that id property. My beans in faces-config look like this:
<managed-bean>
<managed-bean-name>CurrentJob</managed-bean-name>
<managed-bean-class>com.domain.inventory.Job</managed-bean-class>
<managed-bean-scope>session</managed-bean-scope>
</managed-bean>
Technically, managed beans should implement Serializable and have a blank constructor. So you should have something like this inside:
public ToolTipText() {
}
I think you can get away without Serializable for some things... I always implement it anyway, but I'm sure you need the no-argument constructor.
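Putting those two suggestions together, a minimal sketch of the bean shell (the package, class, and method names are taken from the question, the rest is my own):

package ca.workflo.wfsToolTip;

import java.io.Serializable;

public class ToolTipText implements Serializable {

    private static final long serialVersionUID = 1L;

    // explicit no-argument constructor for the managed bean facility
    public ToolTipText() {
    }

    public String getToolTipText(String key) {
        return key;
    }
}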
Thanks to all who have responded and helped out here, especially Stephan Wissel. I thought I would post my version of Stephan's code, which is pretty much the same. There are issues with making the bean applicationScope, because you need to shut down the HTTP task to refresh and reload the class. What I did was add a button to the custom control for the view of the tooltips where I do the CRUD stuff, and in the button I call WFSToolTip.clear(), which rebuilds the map. Pretty neat. My next step for this is to try to do the CRUD in Java and update the map directly, but at the moment I need to move on to my next task.
My next task revolves around a very similar class. I have a master database that contains all the basic design and code, and one or more applications that use that code and store their documents in their own database, which contains the forms and views for that specific application. In the master I have created one or more application documents. Each of these documents contains the AppName (the key value), and the Map value is an array (Vector) containing the ReplicaID of the application database and a few other pieces of information. My class then loads a Map entry for each application and collects a bunch of other information about the application from several places, storing that in the Map value. At that point I can do Database db = thisClass.getDatabase("App Name"), so a single custom control can be used for any or all of the applications. Pretty cool. I think I could get to like this.
Anyway, here is the code I'm using for the tooltips. By the way, it has taken an XPage with about 175 fields and 100+ tooltips from being painfully slow to being acceptable. The good thing is that the XPage creates a process profile document, and once created it is not frequently modified; updating it is an admin action, not an everyday user action.
Please feel free to point out errors, omissions, or suggestions for the code:
package ca.workflo.wfsToolTip;
import lotus.domino.Base;
import lotus.domino.Session;
import lotus.domino.Database;
import lotus.domino.View;
import lotus.domino.NotesException;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;
import java.io.Serializable;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.Vector;
import com.ibm.xsp.extlib.util.ExtLibUtil;
public class ToolTipText implements Serializable, Map<String, String> {
private static final long serialVersionUID = 1L;
private Session s;
private String repID;
private Database db;
private Database helpDB;
private View helpView;
private ViewEntry ve;
private ViewEntry tVE;
private ViewEntryCollection veCol;
private final Map<String, String> internalMap = new HashMap<String, String>();
public ToolTipText() {
this.populateMap(internalMap);
}
private void populateMap(Map<String, String> theMap) {
try {
s = ExtLibUtil.getCurrentSession();
db = s.getCurrentDatabase();
repID = db.getProfileDocument("frmConfigProfile", "").getItemValue(
"WFSHelpRepID").firstElement().toString();
helpDB = s.getDbDirectory(null).openDatabaseByReplicaID(repID);
helpView = helpDB.getView("vwWFSToolTipHelp");
veCol = helpView.getAllEntries();
ve = veCol.getFirstEntry();
tVE = null;
while (ve != null) {
tVE = veCol.getNextEntry(ve);
Vector colVal = ve.getColumnValues();
theMap.put(colVal.get(0).toString(), colVal.get(1).toString());
recycleObjects(ve);
ve = tVE;
}
} catch (NotesException e) {
System.out.println(e.toString());
}finally{
recycleObjects(ve, tVE, veCol, helpView, helpDB);
}
}
public void clear() {
this.internalMap.clear();
this.populateMap(this.internalMap);
}
public boolean containsKey(Object key) {
return this.internalMap.containsKey(key);
}
public boolean containsValue(Object value) {
return this.internalMap.containsValue(value);
}
public Set<java.util.Map.Entry<String, String>> entrySet() {
return this.internalMap.entrySet();
}
public String get(Object key) {
try {
if (this.internalMap.containsKey(key)) {
return this.internalMap.get(key);
} else {
return "There is no Tooltip Help for " + key;
}
} catch (Exception e) {
return "error in tooltip get Object ";
}
}
public boolean isEmpty() {
return this.internalMap.isEmpty();
}
public Set<String> keySet() {
return this.internalMap.keySet();
}
public String put(String key, String value) {
return this.internalMap.put(key, value);
}
public void putAll(Map<? extends String, ? extends String> m) {
this.internalMap.putAll(m);
}
public String remove(Object key) {
return this.internalMap.remove(key);
}
public int size() {
return this.internalMap.size();
}
public Collection<String> values() {
return this.internalMap.values();
}
public static void recycleObjects(Object... args) {
for (Object o : args) {
if (o != null) {
if (o instanceof Base) {
try {
((Base) o).recycle();
} catch (Throwable t) {
// who cares?
}
}
}
}
}
}
I use JDev 11.1.2.4
I have a custom Supplier class which loads some items by invoking an applicationScope bean method.
I am trying to transform my objects into the appropriate SelectItems. I can obtain the right object list, but I suddenly faced a ClassCastException. Unfortunately, I could not find any solution on the internet.
I know those classes are exactly the same (additionally, I can see at debug time that the package and classes show no difference).
Where is the problem? I read on the internet something about different classloaders, but I couldn't get to the root cause or a solution.
Please help me.
Best regards.
package com.accmee.mobile.supplier;
import com.accmee.mobile.pojo.ServiceCategory;
import com.acme.structure.util.datalist.SimpleListSupplier;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.el.MethodExpression;
import oracle.adfmf.framework.api.AdfmfJavaUtilities;
import oracle.adfmf.javax.faces.model.SelectItem;
public class ServiceCategorySupplier extends SimpleListSupplier
{
public ServiceCategorySupplier(boolean blankItemApplied)
{
super(blankItemApplied);
}
public ServiceCategorySupplier()
{
super();
}
public void loadList()
{
try
{
MethodExpression me = AdfmfJavaUtilities.getMethodExpression("#{applicationScope.loginBean.loadCategories}", List.class, new Class[] { }); /* this applicationScope bean method loads the categories from a web service via the Java API; it works properly and returns a list with elements */
List categories = (List)me.invoke(AdfmfJavaUtilities.getAdfELContext(), new Object[] { });
itemList.addAll(getConvertedToSelectItemList(categories, true)); // here we pass the list into the method that throws the exception
}
catch (Exception e)
{
e.printStackTrace();
}
}
public String getListName()
{
return "categories";
}
public static Collection getConvertedToSelectItemList(List list, boolean blankItemApplied)
{
Collection convertedCollection = new ArrayList();
SelectItem selectItem = null;
if (blankItemApplied)
{
selectItem = new SelectItem();
convertedCollection.add(selectItem);
}
for(int i=0;i<list.size();i++)
{
ServiceCategory superEntity = (ServiceCategory)list.get(i); // here is the ClassCastException; this line throws the exception
selectItem = getConvertedToSelectItem(superEntity);
convertedCollection.add(selectItem);
}
return convertedCollection;
}
public static SelectItem getConvertedToSelectItem(ServiceCategory superEntity)
{
SelectItem selectItem = new SelectItem();
selectItem.setLabel(superEntity.getName());
selectItem.setValue(superEntity);
return selectItem;
}
}
The same class loaded by two different classloaders is treated at runtime as two different classes. That's probably what's happening to you.
See this page: http://www.ibm.com/developerworks/java/library/j-dyn0429/
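To make the idea concrete, here is a small standalone sketch (the class name com.example.Foo and the classpath URL are hypothetical) showing that the same class loaded by two isolated loaders fails the instanceof/cast check even though the names match:

import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical location of the compiled com.example.Foo class
        URL[] urls = { new URL("file:/path/to/classes/") };

        // Two independent loaders with no shared parent for this class
        ClassLoader a = new URLClassLoader(urls, null);
        ClassLoader b = new URLClassLoader(urls, null);

        Class<?> fooFromA = a.loadClass("com.example.Foo");
        Class<?> fooFromB = b.loadClass("com.example.Foo");

        System.out.println(fooFromA.getName().equals(fooFromB.getName())); // true: same name
        System.out.println(fooFromA == fooFromB);                          // false: different runtime classes

        Object foo = fooFromA.getDeclaredConstructor().newInstance();
        // A cast to fooFromB's version of Foo would throw ClassCastException:
        System.out.println(fooFromB.isInstance(foo));                      // false
    }
}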
I had to change my approach, so I changed the return type of the loadCategories method to GenericType instead of my custom class. Then it worked, like this:
public class ServiceCategorySupplier extends SimpleListSupplier
{
public ServiceCategorySupplier(boolean blankItemApplied)
{
super(blankItemApplied);
}
public ServiceCategorySupplier()
{
super();
}
public void loadList()
{
try
{
MethodExpression me = AdfmfJavaUtilities.getMethodExpression("#{applicationScope.loginBean.loadCategories}", List.class, new Class[] { });
List categories = (List)me.invoke(AdfmfJavaUtilities.getAdfELContext(), new Object[] { });
list.addAll(categories);
loadItemList();
}
catch (Exception e)
{
e.printStackTrace();
throw new AdfException(e.getMessage(), AdfException.ERROR);
}
}
public void loadItemList()
{
SelectItem selectItem = null;
itemList=new SelectItem[list.size()];
ServiceCategory serviceCategory=null;
for(int i=0;i<list.size();i++)
{
GenericType serviceCategoryType = (GenericType)list.get(i);
serviceCategory = (ServiceCategory)GenericTypeBeanSerializationHelper.fromGenericType(ServiceCategory.class, serviceCategoryType);
selectItem = getConvertedToSelectItem(serviceCategory);
itemList[i]=selectItem;
}
}
public static SelectItem getConvertedToSelectItem(ServiceCategory superEntity)
{
SelectItem selectItem = new SelectItem();
selectItem.setLabel(superEntity.getName());
selectItem.setValue(superEntity.getId());
return selectItem;
}
public String getListName()
{
return "categories";
}
}
I've spent about the last 24 hours trying to learn JavaFX. I'm trying to build a GUI that will display values from a data source (for example, a database). My question is what the preferred way to do this is. So far I've come up with this code to build a simple GUI and get some data from a data source:
import javafx.application.Application;
import javafx.application.Platform;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.text.Text;
import javafx.stage.Stage;
public class AvcHmi extends Application {
public static void main(String[] args) {
launch(args);
}
@Override
public void start(Stage primaryStage) {
Text t = new Text(10, 50, "Replace/update this text periodically with data");
Group root = new Group();
root.getChildren().add(t);
primaryStage.setScene(new Scene(root, 400, 300));
primaryStage.show();
new Thread() {
private DataSource dataSource = new DataSource();
{ setDaemon(true); }
@Override
public void run() {
try {
for(;;) {
Thread.sleep(100);
Platform.runLater(new Runnable() {
@Override
public void run() {
System.out.println(dataSource.getDataMap().get("key1"));
}});
}
} catch(InterruptedException e) {
e.printStackTrace();
}
}
}.start();
}
}
Datasource:
import java.util.HashMap;
import java.util.Map;
import java.util.Random;
public class DataSource {
Map<String,String> dataMap = new HashMap<>();
public DataSource() {
dataMap.put("key1", "value1");
dataMap.put("key2", "value2");
dataMap.put("key3", "value3");
}
public Map<String, String> getDataMap() {
Random generator = new Random();
int randInt = generator.nextInt();
dataMap.put("key1", "value"+randInt);
return dataMap;
}
}
A 100 ms update interval is OK for this GUI as far as I'm concerned. But is this a viable solution?
The next step is to replace the text with a value from the data source. I've been looking at Collections and ObservableMap and wondering if that's the preferred way of doing the actual GUI updates. I'm having some problems with inner classes and final variables, but I might reason that out after some sleep.
Also, the target machine is not that powerful (somewhere between 350 and 512 MB of RAM). Could this be an issue? My simple tests so far seem to run fine.
Thank you for any feedback on this.
This Oracle example shows how to load data into a table concurrently, with source code; it might help you.
You could also read about the javafx.concurrent.Task<V> API.
The code from the Oracle example is as follows:
public class UpdateCustomerTask extends Task<Customer> {
private final Customer customer;
public UpdateCustomerTask(Customer customer) {
this.customer = customer;
}
@Override protected Customer call() throws Exception {
// pseudo-code:
// query the database
// read the values
// Now update the customer
Platform.runLater(new Runnable() {
@Override public void run() {
customer.setFirstName(rs.getString("FirstName"));
// etc
}
});
return customer;
}
}
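As a follow-up, here is a minimal sketch (my own, not from the Oracle page) of applying Task to the question's polling loop. The DataSource class is the one from the question; DataSourcePoller, the thread name, and the 100 ms interval are just illustrative choices:

import javafx.application.Platform;
import javafx.concurrent.Task;
import javafx.scene.text.Text;

public class DataSourcePoller {

    public static void startPolling(final DataSource dataSource, final Text target) {
        Task<Void> task = new Task<Void>() {
            @Override
            protected Void call() throws Exception {
                while (!isCancelled()) {
                    final String value = dataSource.getDataMap().get("key1");
                    // UI nodes must only be touched on the JavaFX Application Thread
                    Platform.runLater(new Runnable() {
                        @Override
                        public void run() {
                            target.setText(value);
                        }
                    });
                    Thread.sleep(100);
                }
                return null;
            }
        };
        // Task implements Runnable, so it can be handed straight to a daemon thread
        Thread thread = new Thread(task, "datasource-poller");
        thread.setDaemon(true);
        thread.start();
    }
}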