How do you set certain spring.datasource properties in broadleaf - broadleaf-commerce

We are using version 5.2 of Broadleaf with Spring Boot (using the Tomcat connection pool) and MySQL.
If there is no activity for a while, for example overnight, it loses the connection and can't reestablish it. I have been reading about this and was aiming to use the spring.datasource.tomcat.test-on-borrow and spring.datasource.validation-query properties, but they don't seem to stick.
We have set up our other database properties in common-shared.properties like:
database.user=...
database.password=...
database.driver=com.mysql.jdbc.Driver
database.url=jdbc:mysql://localhost:3306/broadleaf?useUnicode=true&characterEncoding=utf8
What is the correct way to handle this problem with broadleaf?

Sorry about that; the Broadleaf database properties are not those specific spring.datasource ones, but rather the database.user etc. properties you already have from the DatabaseStarter project, so setting spring.datasource.tomcat.test-on-borrow and spring.datasource.validation-query has no effect.
However, it looks like you are hitting the exact same issue that I resolved in this PR in DatabaseStarter (the one for MySQL, which you are using) and this PR in HSQLDatabaseStarter (which is the default you get). So, as long as you are using the latest GAs of Broadleaf (specifically at least 5.2.1-GA), you should be good to go.
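For comparison, in a plain Spring Boot application the Tomcat pool's validation settings would normally be configured with properties along these lines (shown only as a reference sketch of what those two keys do; per the answer above, Broadleaf's DatabaseStarter does not bind them, which is why they don't stick):

spring.datasource.tomcat.test-on-borrow=true
spring.datasource.tomcat.validation-query=SELECT 1

With Broadleaf you keep the database.* properties in common-shared.properties exactly as you have them and pick up the connection-validation fix by upgrading to at least 5.2.1-GA.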

Related

import issues from legacy issue tracker to Gitlab CE?

What's the dealio on importing data from a legacy issue tracker system into Gitlab CE?
Do tools exist for this? Schemas? Suggestions?
Please note that this is really a legacy issue tracker system. It predates Bugzilla, and runs on an old IIS server and SQL Server 2000.
(Say whatever you want about this setup, but it's nothing we haven't already heard.)
You should be using the REST APIs to create your migrations.
Recommendation questions are generally off topic, but I will mention that there is a Redmine issue importer, and there are issues on the GitLab CE issue tracker requesting this. It sounds like a good candidate for a community contribution if your tracker is a popular tool.
But if it's not, and you're the only person in the world using your tracker, you will probably want to study the Python-based Redmine issue importer; it may serve as an example for writing your own REST-API-based tool that reads your database and creates the GitLab issue tracker issues, as sketched below. You don't want and don't need to know the Postgres schema on the GitLab side; it will change over time anyway.
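If you do write your own tool, the shape of it is small: read rows out of your SQL Server 2000 database and POST them to GitLab's issues endpoint. A minimal sketch, assuming a reasonably recent GitLab with the v4 REST API; the host, project ID, token, and issue fields below are placeholders, and the legacy-database read is left out:

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class LegacyIssueImport {
    public static void main(String[] args) throws Exception {
        String gitlabHost = "https://gitlab.example.com";     // placeholder host
        String projectId = "42";                              // ID or URL-encoded path of the target project
        String token = System.getenv("GITLAB_TOKEN");         // personal access token with "api" scope

        // In a real migration these values come from a query against the legacy tracker's database.
        String title = "Imported: crash on login";
        String description = "Migrated from legacy tracker, original ID 1234";

        String body = "title=" + URLEncoder.encode(title, StandardCharsets.UTF_8)
                + "&description=" + URLEncoder.encode(description, StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(gitlabHost + "/api/v4/projects/" + projectId + "/issues"))
                .header("PRIVATE-TOKEN", token)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}

Loop that over the rows from the legacy database and you have the core of an importer; the Redmine importer mentioned above does essentially the same thing in Python.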

How to Use Apache Drill with Cassandra

I am trying to query Cassandra using Apache Drill. The only connector I could find is here:
http://www.confusedcoders.com/bigdata/apache-drill/sql-on-cassandra-querying-cassandra-via-apache-drill
However, this does not build; it fails with an artifact-not-found error. I also had another developer who is more versed in these tools take a stab at it, but he also had no luck.
I tried contacting the developer of the plugin I referenced, but the blog does not work and won't let me post comments. Has anyone got this plugin to work (if so, how?), or is there another plugin or method I can use to connect Apache Drill to Cassandra? If anyone could show me how to connect and execute a simple SQL query, that would be much appreciated.
I looked at the latest Cassandra storage plugin patch and the latest Apache Drill source. The Drill code has changed and the patch can no longer be applied.
I then manually took the patch apart (it is mostly diff output). Most of the patch was new classes, which I could easily add to the latest Drill source tree, and most of the other updates were easy to insert into the current source. There were two specific classes that required some minor code modifications/extensions. I rebuilt the distribution from the modified source and installed the Drill servers on a 3-node cluster. The Cassandra schema failed to initialize properly, throwing a null pointer exception in one of the new classes. This leads me to believe that the (latest) modified storage plugin is incompatible with the latest version of Cassandra. Since the author of the original storage plugin is unreachable and no one else is stepping up to support the code, this is a dead horse. Beat it if you must.
I was the author of the patch, written a year back. I could not get it merged into Drill then, and later got occupied with other stuff :(
With so many changes to Drill internals, I am not sure what amount of welding would be needed at this point to get it working. Please use the code just as a reference for writing a Drill storage plugin.
I have added this banner on top of the blog post to save fellow developers' hours.
I don't know if anyone is still interested in this topic but I've been experimenting with this plugin and got it to work with Drill 1.18-SNAPSHOT. Here is a link to my branch with this code: 1. My plan is to submit this as a PR for Drill, but it still needs some work. This code will successfully query Cassandra 3.11.5 (latest stable version).
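For anyone picking that branch up: once a Cassandra storage plugin is registered in Drill, you query it through Drill's normal front ends, e.g. the JDBC driver. A minimal sketch, assuming the plugin is registered under the name cassandra and exposes tables as cassandra.<keyspace>.<table>; the host, keyspace, and table names are placeholders, and you need drill-jdbc-all on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DrillCassandraQuery {
    public static void main(String[] args) throws Exception {
        // Connect directly to a Drillbit; "jdbc:drill:zk=..." works as well for a ZooKeeper-managed cluster.
        try (Connection conn = DriverManager.getConnection("jdbc:drill:drillbit=localhost");
             Statement stmt = conn.createStatement();
             // "cassandra", "mykeyspace", and "users" are assumed names for illustration only.
             ResultSet rs = stmt.executeQuery("SELECT * FROM cassandra.mykeyspace.users LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}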

getting jHipster project working with Atomikos JTA/XA transactions

I've been playing with the jHipster Yeoman generator for the past week and I'm trying to get my application working with Atomikos for JTA/XA transactions. I'm running into a number of problems, which is to be expected since I'm new to Spring Boot and a number of the other components in the jHipster stack.
I have been using the example found here as my starting point for configuring Atomikos. I've implemented everything described there, replacing HikariCP entirely.
At the moment I have eliminated Metrics and Liquibase from my configuration, as they were giving me problems and I wanted to get the basics working and then add them back in. However, I'm now hitting a Hibernate issue.
Hibernate is complaining that second-level cache is used but hibernate.cache.region.factory_class is not given. The factory_class setting is specified in the configuration and I'm not able to figure out what I'm missing.
Has anyone managed to get Atomikos (or maybe Bitronix) working with this stack?
I've managed to get this working. For some reason I had to explicitly set hibernate.cache.use_second_level_cache to false, as shown below. I'm not sure why it would require this, given that I am not setting any second-level cache flags anywhere that I can see.
Nevertheless, it's working now.
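For reference, that Hibernate flag can be passed through in a Spring Boot / jHipster setup via the JPA properties pass-through, e.g. as a properties entry (a sketch of the workaround described above, not jHipster's generated configuration):

spring.jpa.properties.hibernate.cache.use_second_level_cache=false

or as the equivalent nested entry under spring.jpa.properties in application.yml.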

XPages - Can't instantiate class: 'Cannot find class <<classname>> in NSF'

I have an XPages application that has been running quite happily for the last couple of years.
The application uses some Java classes as converters that are used to convert data entry on certain fields to upper case, proper case, etc. This functionality has been present and working since the app was first deployed.
Today I was asked to make a change to one of the pages, adding a new field. The change was made in a test copy of the database, the app rebuilt and tested and all was fine.
The same change was then promoted into the live database, but after being rebuilt the error
Can't instantiate class: 'Cannot find class uk.co.xxx.beans.UpperCase in NSF'.
is being returned.
I have tried rebuilding the app and cleaning the project numerous times without success. I have also tried amending the code in the Java class and rebuilding.
If I remove all references to the UpperCase class it then complains about the ProperCase class. It appears as if the app has lost its reference to the Java classes and rebuilding isn't fixing the problem.
As an interim solution I have removed all uses of these classes so that users can at least display the page - but this is obviously not a long term solution. As soon as I reinstate one instance and rebuild, the error returns.
The only change in the app has been the addition of the new field. Removing it makes no difference.
Can you suggest anything else I can try and what may be causing the problem?
Open the production db in Designer and, using the Navigator view, open WebContent/WEB-INF and delete the classes folder. Then rebuild the app or refresh it from the template again.
Sometimes the classes are not updated correctly. I've seen it a few times.
I have seen this too (Domino 8.5.3). I recompile until it eventually works, with a restart of the HTTP task thrown in for good measure. How are you deploying the classes: within the db, or as jar files in lib/ext?
I ran into similar issues with 8.5.3 FP2, with the same "can't instantiate" errors if the app was accessed by a Designer client. I even put a PRD in at IBM for it.
In order to fix this problem we had to update the server and clients to FP5. Before we could update to FP5 we had a workaround: we had to modify how the application was rolled out. Once the database was updated from a template, we had to do a clean and then a build while using the application ID we use for signing applications.
http://www-01.ibm.com/support/docview.wss?uid=swg21639571
JDAE8ZV2CX
XPage With Java Design Element Breaks When Domino Designer opens after applying 8.5.3 Fix Pack 2 interim fix of any 853 Fix Pack 2/Fix Pack 3 hotfix

CoreData without changes in underlying models

Good day,
I have an app with Core Data that is in the App Store. I have now coded some cosmetic changes in the interface without changing anything in the Core Data model. I did not add, delete, or change any entity or property. Now I am ready to upload version 2 of the app. I am unsure whether I have to do anything so that the old data of the users of the first version will not be deleted but will be kept by the new version (with exactly the same Core Data model). Please be tolerant with this noob.
Many thanks,
Romeo
When someone installs a new version of an app they already have, all the app's data stays where it is; it doesn't get deleted.
Whether this data is compatible with the new version is a different question. If you haven't made any changes to the data model, then it should be fine. But you should make sure. Testing the upgrade process is one of the most important parts of testing a new version of an app. Install the current version, generate some data, and then install your new version and make sure everything looks OK. As you describe it, there shouldn't be any problems, but you should never just trust that this is the case.
In Marcus Zarra's Core Data Migration Course on iDeveloper TV, he suggests manually locking your xcdatamodel file so that you can't accidentally make updates to it and force a migration when you didn't plan it. But I agree with Tom, test it first.
iDeveloper.TV Core Data Migration
