Quarkus: best way to provide beans from a library

I want to provide Quarkus beans from a separate codebase, brought in as a dependency. What is the best way to do this?
My first thought was to find the artifact that provides annotations such as @ApplicationScoped and make it part of my library's dependencies, but after some searching the correct dependency isn't obvious.
I have also looked at extensions, but writing an extension feels fairly heavyweight; I don't need to change how Quarkus runs, just define some beans in a library.
I wish I could provide more in this question, but I'm unsure, best-practice-wise, where to go from here.

Besides using a producer method, as suggested by @Turing75, you can enable bean discovery by generating a Jandex index for your library:
A dependency with a Jandex index is automatically scanned for beans.
To generate the index just add the following to your pom.xml:
<build>
  <plugins>
    <plugin>
      <groupId>org.jboss.jandex</groupId>
      <artifactId>jandex-maven-plugin</artifactId>
      <version>1.2.2</version>
      <executions>
        <execution>
          <id>make-index</id>
          <goals>
            <goal>jandex</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

Related

How to run Cucumber JUnit tests in parallel without sharing data between invoked threads

I'm running Cucumber tests in parallel using the Maven configuration below:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>3.0.0-M5</version>
    <executions>
      <execution>
        <goals>
          <goal>integration-test</goal>
          <goal>verify</goal>
        </goals>
      </execution>
    </executions>
    <configuration>
      <includes>
        <include>TestRunner.java</include>
      </includes>
      <testFailureIgnore>true</testFailureIgnore>
      <parallel>methods</parallel>
      <threadCount>${parallelCount}</threadCount>
      <forkCount>${parallelCount}</forkCount>
      <reuseForks>false</reuseForks>
      <perCoreThreadCount>false</perCoreThreadCount>
    </configuration>
  </plugin>
</plugins>
Versions:
<serenity.version>3.2.0</serenity.version>
<cucumber.version>7.2.3</cucumber.version>
<junit.version>4.13.2</junit.version>
Now the issue is that the code runs fine and the tests do run in parallel, but static variables are shared among threads even after setting reuseForks to false.
I have tried various combinations of the Failsafe configuration (parallel, perCoreThreadCount, useUnlimitedThreads, reuseForks) but had no luck.
Any idea what changes need to be made so that static data is not shared between threads? Thanks!
Fundamentally, a static field exists only once per JVM, so every thread in that JVM sees the same value. This means that you cannot have a static field that is not shared by all threads.
Instead, you may want to look at using dependency injection. This allows you to avoid static fields by injecting data into your step definition classes. That data will be scoped to a scenario and will not leak out (unless you use static fields, of course).
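For example, with Cucumber 7 a common way to get scenario-scoped state is to add Cucumber's PicoContainer module to the test classpath; PicoContainer then instantiates any plain class requested in a step-definition constructor once per scenario. A minimal sketch of the dependency, reusing the cucumber.version property shown above (double-check it fits your setup):
<dependency>
  <groupId>io.cucumber</groupId>
  <artifactId>cucumber-picocontainer</artifactId>
  <version>${cucumber.version}</version>
  <scope>test</scope>
</dependency>
Any state class you then take as a constructor argument in your step-definition classes is shared within one scenario and recreated for the next, so nothing needs to be static.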

Conditional VDM generation with odata-generator-maven-plugin parameters

I am using the following Maven plugin to generate the VDMs for OData consumption.
<plugin>
  <groupId>com.sap.cloud.sdk.datamodel</groupId>
  <artifactId>odata-generator-maven-plugin</artifactId>
  <version>3.13.0</version>
  <executions>
    <execution>
      <id>generate-consumption</id>
      <phase>process-resources</phase>
      <goals>
        <goal>generate</goal>
      </goals>
      <configuration>
        <overwriteFiles>true</overwriteFiles>
        <inputDirectory>/src/main/resources/connectedsystem/edmx</inputDirectory>
        <outputDirectory>${project.basedir}/src/gen/java</outputDirectory>
        <deleteOutputDirectory>false</deleteOutputDirectory>
        <packageName>com.sap.requisitioning.vdm</packageName>
      </configuration>
    </execution>
  </executions>
</plugin>
However, I do not want the VDMs to be generated in every Maven build. I would like to achieve the following behaviour:
VDMs are not generated by mvn clean install by default.
VDM classes are generated when we pass some explicit parameter: mvn clean install -D<>.
Could you please suggest how this can be achieved?
Regards,
atanu
You can use Maven profiles to achieve this. Declare the plugin under a specific profile that is only activated when a specific parameter is passed, as in the sketch below.
Additionally, you should take care that the generated sources are not deleted when running clean. This could happen if you generate them into the output directory (typically target).
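For illustration, such a profile could look roughly like this (a sketch only; the profile id and the generateVdm property name are placeholders, and the plugin body is the same as in the question):
<profiles>
  <profile>
    <id>generate-vdm</id>
    <activation>
      <property>
        <!-- the profile is activated whenever -DgenerateVdm is passed on the command line -->
        <name>generateVdm</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>com.sap.cloud.sdk.datamodel</groupId>
          <artifactId>odata-generator-maven-plugin</artifactId>
          <version>3.13.0</version>
          <!-- same <executions> and <configuration> as shown in the question -->
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
With this in place, a plain mvn clean install skips the generator, while mvn clean install -DgenerateVdm runs it.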

How to execute integration tests for our own OData service in the SAP Cloud SDK

We currently provide our own OData service in our Spring Boot application with the SAP Cloud Platform Provisioning SDK, which is part of the SAP Cloud SDK. We are creating integration tests in the respective Maven module, but when executing them via Maven they fail with the following stack trace:
[http-nio-auto-1-exec-1] ERROR com.sap.cloud.sdk.service.prov.v2.rt.cdx.CDXRuntimeDelegate - Error initializing the service <service-name>
java.lang.IllegalArgumentException: URI is not hierarchical
at java.io.File.<init>(File.java:418)
at com.sap.cloud.sdk.service.prov.v2.rt.cdx.CDXRuntimeDelegate.getFilefromFileName(CDXRuntimeDelegate.java:410)
at com.sap.cloud.sdk.service.prov.v2.rt.cdx.CDXRuntimeDelegate.getFileForService(CDXRuntimeDelegate.java:387)
at com.sap.cloud.sdk.service.prov.v2.rt.cdx.CDXRuntimeDelegate.initialize(CDXRuntimeDelegate.java:252)
at com.sap.cloud.sdk.service.prov.v2.rt.cdx.CDXRuntimeDelegate.getModelProvider(CDXRuntimeDelegate.java:204)
at com.sap.gateway.core.api.provider.delegate.ProviderFactory.createModelProvider(ProviderFactory.java:202)
at com.sap.gateway.core.api.provider.delegate.ProviderFactory.getEdmModelProvider(ProviderFactory.java:128)
at com.sap.gateway.core.odata4sap.ServiceFactory.createService(ServiceFactory.java:135)
Looking at the code this seems to be related to the following post:
Why is my URI not hierarchical?
In the SDK, the OData EDMX file is read as a file; however, during the Maven execution it sits inside a separate JAR file (of the application module) and cannot be accessed that way. Instead it would need to be read as a stream, which in turn seems to require some refactoring.
As a workaround I copied the EDMX file into src/test/resources/edmx of the integration-tests module.
I'm now wondering whether I am missing something here, or whether executing the integration tests the way it is usually done for the SAP Cloud SDK is simply not compatible with the provisioning framework.
Although I'm not too familiar with the use case you explained, I would recommend checking out the Maven documentation on additional resource folders. You can probably point your integration-tests module to the respective /resources folder of application modules, in addition to its own /resources folder. I think relative paths should be possible.
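A rough sketch of what that could look like in the integration-tests pom.xml (the relative path to the application module is an assumption and has to match your actual layout; note that declaring <testResources> replaces the defaults, so the module's own folder is listed as well):
<build>
  <testResources>
    <!-- keep the module's own test resources -->
    <testResource>
      <directory>src/test/resources</directory>
    </testResource>
    <!-- additionally pick up the EDMX files from the application module -->
    <testResource>
      <directory>${project.parent.basedir}/application/src/main/resources/edmx</directory>
    </testResource>
  </testResources>
</build>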
As an alternative to what Alexander already posted, you could also automate the copying of the files via maven, like in this snippet:
<plugin>
  <artifactId>maven-resources-plugin</artifactId>
  <version>2.6</version>
  <executions>
    <!-- Copying the edmx files to the integration-tests project -->
    <execution>
      <id>copy-resources</id>
      <phase>validate</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${basedir}/src/test/resources/edmx</outputDirectory>
        <resources>
          <resource>
            <directory>${project.parent.basedir}/srv/src/main/resources/edmx</directory>
            <filtering>true</filtering>
          </resource>
        </resources>
      </configuration>
    </execution>
    <execution>
      <id>default-testResources</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>testResources</goal>
      </goals>
    </execution>
    <execution>
      <id>default-resources</id>
      <phase>process-resources</phase>
      <goals>
        <goal>resources</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Static metamodel class is not generated

I just started learning and using JHipster. I have a question about JPA static metamodel generation. The following is what I have done according to the JHipster website, but the static metamodel class (class X_) is not generated:
I created an entity called SalesByDepartment. After this entity was generated, I changed its JSON file in the .jhipster folder under my project folder by setting service to serviceImpl instead of no, and jpaMetamodelFiltering to true. My understanding is that I need to re-run the entity sub-generator to regenerate the same entity and enable the filtering feature after making this change to the entity's JSON file. However, I can only find 'SalesByDepartmentCriteria' and 'SalesByDepartmentQueryService'. There is no class 'SalesByDepartment_' under the domain package. I also checked pom.xml and I can find the plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>${maven-compiler-plugin.version}</version>
  <configuration>
    <annotationProcessorPaths>
      <path>
        <groupId>org.mapstruct</groupId>
        <artifactId>mapstruct-processor</artifactId>
        <version>${mapstruct.version}</version>
      </path>
      <!-- For JPA static metamodel generation -->
      <path>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-jpamodelgen</artifactId>
        <version>${hibernate.version}</version>
      </path>
    </annotationProcessorPaths>
  </configuration>
</plugin>
May I know if there is anything else I have missed to generate 'SalesByDepartment_' under the domain package?
Thank you for the help.
By the way, it worked fine when I generated my first project. I did it the same way and the static metamodel classes were created automatically under the project folder 'com.xxx.domain'. I can also find them under the target folder after building with Maven. I guess something is wrong but I still have no idea what. Below is a screenshot of the two projects I have created using 'jhipster'. A is the previous project, in which I could generate the static metamodel, but B doesn't work:
[screenshot comparing project A and project B]
I had this problem too; the best way I found for myself was to add the dependency to Maven and to the annotation processor path.
<dependencies>
  ...
  <dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-jpamodelgen</artifactId>
    <version>${hibernate.version}</version>
  </dependency>
</dependencies>
And the annotation processor path:
<build>
  <plugins>
    ...
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>11</source>
        <target>11</target>
        <annotationProcessorPaths>
          <path>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-jpamodelgen</artifactId>
            <version>${hibernate.version}</version>
          </path>
          ...
        </annotationProcessorPaths>
      </configuration>
    </plugin>
  </plugins>
</build>
Hope it helps somebody )
The JPA static metamodel is generated by the build process (Maven or Gradle), as explained in the JHipster documentation, so you just have to build your app and you'll find SalesByDepartment_.java under target for Maven and under build for Gradle.
In my case the problem was an issue in the services that the compiler didn't notice. I had changed a service class from an implemented service to a plain service, so the implementation class still existed; I deleted that file and everything worked fine.

Spark: avoiding namespace conflicts when building a modified Spark

I am building a custom Spark into a jar file, and I want to use it alongside the default Spark build.
How do I change the namespace from org.apache.spark.allOfSpark to org.another.spark.allOfSpark without going through all the files?
I want to do this in order to avoid conflicts when importing modules. Thanks in advance.
Depending on the build tool you are using, you could use Maven's relocation feature to move your custom Spark into a new package at build time. There are similar features in sbt and other build tools.
If you specify what you are using to build your project, I can help further with your issue.
-- UPDATE
Here is sample code for your pom.xml that should help you get started:
<project>
  <!-- Your project definition here, with the groupId, artifactId, and its dependencies -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>org.apache.spark</pattern>
                  <shadedPattern>shaded.org.apache.spark</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
This will effectively move all of Spark into a new package called shaded.org.apache.spark when you package your application (when you ask Maven to produce a jar).
If you need to exclude certain packages, you can use the <exclude> tag as shown in the Maven relocation documentation.
If what you are trying to achieve is simply to customize some parts of Spark, I would advise you to either fork Spark's code and directly rewrite parts of MLlib, and then build it only for yourself (or contribute it to the community if it can be useful).
Or you could simply pull Spark in as a dependency from Maven and just overwrite the classes you are modifying; Maven should then use your own classes instead of the ones in the original Spark package.
