Hopefully someone can help me with this.
It is my understanding that using a ClassLoader is the most reliable way to load in content.
import java.io.InputStream;
import java.net.URL;

public class Pipeline {

    public static URL getResource(String filename) {
        return ClassLoader.getSystemResource(filename);
    }

    public static InputStream getResourceAsStream(String filename) {
        return ClassLoader.getSystemResourceAsStream(filename);
    }
}
If you had a file at "[jar bundle]/resources/abc.png", you would load it like this:
URL url = Pipeline.getResource("resources/abc.png");
Loading is simple.
Saving is what's getting me.
I have a program that collects data while running, saves that data on exit, and then loads the data back in next time and keeps adding to it.
The easiest solution, I think, would be to save back into the jar bundle so that the ClassLoader can get at the files. Is this even possible? Or recommended?
I don't mind having my resources outside of the jar, just as long as I don't have to resort to 'File' to get at them and save to them. (Unless it can be done cleanly)
folder/application.jar
folder/resources/abc.png
If you could go ../ back one level from where the ClassLoader is looking, it would be easy to cleanly get data from the directory that actually contains the jar file:
Pipeline.getResource("../resources/abc.png");
Any ideas?
This isn't really what class loaders are meant for. Loading resources from the class loader is meant so that you can bundle up your application as one package, and components can read each other without worrying about how the system you're deploying to is set up.
If the file in the JAR is meant to be changed by the app, then it isn't part of the app and thus probably shouldn't be in the JAR.
I don't have a lot of context on your app, but hopefully my suggestion will be valid for your situation.
I recommend setting a requirement in your app that it has a work area to which it is allowed to read and write, and accepting a configuration setting that specifies where this directory is. Typical ways to do this in Java are environment variables, system properties, or JNDI settings (for container deployments); a sketch of the pattern follows the examples below.
Examples:
Tomcat's startup scripts figure out where it is installed, set a system property called catalina.home, and allow you to override it with an environment variable called CATALINA_HOME.
JBoss looks for JBOSS_HOME
Java application servers typically look for JAVA_HOME to find the JDK.
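As a minimal sketch of that pattern (app.home, APP_HOME, and the .myapp default are made-up names for your application, not anything standard):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public final class WorkArea {

    // Resolve the work directory: system property first, then environment
    // variable, then a default under the user's home directory.
    public static Path resolve() {
        String dir = System.getProperty("app.home");
        if (dir == null) {
            dir = System.getenv("APP_HOME");
        }
        if (dir == null) {
            dir = System.getProperty("user.home") + "/.myapp";
        }
        return Paths.get(dir);
    }

    public static InputStream read(String name) throws IOException {
        return Files.newInputStream(resolve().resolve(name));
    }

    public static OutputStream write(String name) throws IOException {
        Path target = resolve().resolve(name);
        Files.createDirectories(target.getParent());
        return Files.newOutputStream(target);
    }
}

Loading stays almost as simple as the ClassLoader version (WorkArea.read("resources/abc.png")), and saving on exit becomes a plain WorkArea.write("resources/abc.png") without File handling scattered through the rest of the code.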
Related
Using hazelcast 5.2.1
We are moving from a Java-based config in a custom application to a stand-alone server with a YAML config, since we would like to use the public Docker image as a base for a Hazelcast member. We expect to just add some jar files in ${HZ_HOME}bin/user-lib and a config file in ${HZ_HOME}/hazelcast.yaml.
Our config gets picked up, and the server starts. But when the clients try to put objects, things go bad. The server logs the error:
com.hazelcast.nio.serialization.HazelcastSerializationException: Cannot write null portable without explicitly registering class definition
How can we add ClassDefinition objects to the config?
We have classes implementing VersionedPortable, and have static ClassDefinition members for them.
Until now we have just added the class definitions programmatically while configuring the member instance in our own applications, but we cannot find a hook to do this when using YAML config.
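For reference, the programmatic registration we do today looks roughly like this (the class, factory/class IDs, and fields below are placeholders, not our real definitions):

import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.nio.serialization.ClassDefinition;
import com.hazelcast.nio.serialization.ClassDefinitionBuilder;

public class MemberWithClassDefinitions {

    public static void main(String[] args) {
        // In our real code these ClassDefinitions are static members of the
        // VersionedPortable classes themselves.
        ClassDefinition myPortableDef =
                new ClassDefinitionBuilder(1 /* factoryId */, 100 /* classId */, 1 /* version */)
                        .addStringField("name")
                        .addLongField("createdAt")
                        .build();

        Config config = new Config();
        config.getSerializationConfig().addClassDefinition(myPortableDef);

        HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
    }
}

This is the step we cannot express in hazelcast.yaml when running the stock Docker image.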
I have a micronaut API like this:
#Get("/")
List<Club> listClubs()
#Get("/{id}")
Club show(Long id)
In my unit test, when I invoke the show method, the listClubs() method is actually getting invoked, instead.
Why is this happening?
Details:
Thinking that my URL mappings must be wrong, I started debugging into Netty to try to understand how the framework constructs URLs.
In HttpClientIntroductionAdvice, the context shows the API method like this:
Club show(Long param0)
The interceptor is setting param0 in the parameter map, which doesn't match the actual parameter name of my method. When the URI template is expanded, this causes the ID to get dropped (thus the URI becomes / instead of /1).
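To illustrate what I think is happening, here is a small sketch that expands the route template with Micronaut's UriTemplate directly (the template string comes from the @Get annotation; everything else is my reconstruction):

import io.micronaut.http.uri.UriTemplate;
import java.util.Collections;
import java.util.Map;

public class UriTemplateDemo {

    public static void main(String[] args) {
        UriTemplate template = new UriTemplate("/{id}");

        // When the map key matches the template variable, the ID is substituted.
        Map<String, Object> withId = Collections.singletonMap("id", 1L);
        System.out.println(template.expand(withId));     // /1

        // When only "param0" is present, {id} is undefined and simply drops out.
        Map<String, Object> withParam0 = Collections.singletonMap("param0", 1L);
        System.out.println(template.expand(withParam0)); // /
    }
}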
I am trying to follow this example:
https://github.com/alvarosanchez/micronaut-workshop/tree/master/ex02/solution/clubs
There is one important difference in my project, which is that the endpoint is set at "/club" instead of at "/":
#Controller("/club")
#Client("/club")
I am using a diff tool to compare my project to the sample, but I am struggling to find any other difference (besides package name changes).
Why is this happening? What should I be looking for?
Thanks
Update:
Tested the target endpoint with the browser - looks fine.
Gradle clean does not resolve the issue.
I switched from debugging the Application class with IntelliJ to using "gradlew run" and in the process, I made a change to build.gradle (adding JVM properties pass-through from the gradle CLI). I also played with enabling/disabling the annotation processor in the IDE.
(note: In the previous project, I enabled annotation processing as soon as I imported into the IDE. On this project, I didn't enable it until I started having issues.)
I think the build.gradle alteration made the problem go away, but since the issue shows up unreliably, it's hard to be certain that this is the change that fixed it.
So, currently I am working on a project in Reactjs that displays a customised modal.
The configuration of the modal is fetched through a configurationLoader.js file.
Since it is developed in React, my components are divided across different files.
Currently, what I am doing is, loading the full configuration file and extracting the relevant information when required.
What I find redundant is that I have to require the configuration file at the start of every .js file.
Is there a way to export my module once and have it be valid globally, i.e. so that I don't have to require it again and again?
Globals are registered in the window object for the browser and in the global object in node. So you could do:
window.myConfiguration = require('configurationLoader')
or
global.myConfiguration = require('configurationLoader')
depending on where your code will run. Then you should be able to access myConfiguration anywhere in your code without needing to require it.
I'm trying to use the node-config module to change some parameters of my configuration (basically logging level) during runtime.
In the official documentation says:
Environment variables can be used to override file configurations. Any environment variable that starts with $CONFIG_ is set into the CONFIG object.
I've checked that this is true when the server starts, but it does not seem to work once it's up. (The handler of the watch function is never called when an environment variable is changed, unlike a change to the runtime.json file or directly changing a config variable.)
I'm currently watching the whole CONFIG object like this:
var CONFIG = require('config');
CONFIG.watch(CONFIG, null, function (object, propertyName, priorValue, newValue) {
    console.log("Configuration change detected");
});
Does anyone know if this is possible?
The environment is captured when a process starts.
Once the process is running, you can no longer change the environment it is in from the outside.
The only options are to restart the process or to use some other mechanism to communicate with it, for example a REST or TCP listener inside the process through which you can pass the new value.
Best regards
Robert
As you may know, a React app is a single-page application: when it is compiled, it becomes a static bundle of vanilla JS and CSS files (packaged, say, in a tarball). That bundle is eventually deployed to a web server, which could be Apache, nginx, or anything else you are using, but the important point is that the static app runs in someone else's browser. When someone accesses the website, the CSS and JS are downloaded and executed in that browser's runtime environment, so technically you cannot set a runtime environment variable for someone else's browser. There is, however, a way to access such values at runtime.
SOLUTION
I have achieved this goal with the package called runtime-cra.
Follow the steps in this documentation: https://blog.risingstack.com/create-react-app-runtime-env-cra/
I'm having serious issues trying to share custom objects between portlets in Liferay. I have a Hook plugin with a servlet filter, which loads an object of type MyCustomClass and inserts it into the request object as a parameter.
When I try to read this object in a portlet's render() I get a ClassCastException, even though I am casting the object to the same class.
I understand that Liferay plugins have different contexts, and I already tried to change the classloader before loading the object in the bean and portlet like this:
ClassLoader portalcl = PortalClassLoaderUtil.getClassLoader();
ClassLoader currentcl = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(portalcl);
//do my stuff
Thread.currentThread().setContextClassLoader(currentcl);
However, it did not solve the problem, and the only way I found around it is to serialize the object into a JSON string and deserialize it whenever I need it.
Isn't this kind of lame? Does anyone know a better solution?
Regards, DS
It sounds like the main problem you're seeing is that two different class loaders are loading the class, which technically makes them different classes (as it seems you've already determined).
I haven't used Liferay much, but this has been a problem I've seen on other platforms as well. We were using WebSphere and solved it by putting the common MyCustomClass into a shared library on the server classpath. This way the server loads the class and makes it available to all applications on the server through the server's single classloader. If you let each application load the class itself, you'll keep seeing this exception.
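To make the class-identity point concrete, here is a small self-contained sketch (the jar path and class name are hypothetical):

import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderIdentityDemo {

    public static void main(String[] args) throws Exception {
        // Hypothetical jar containing com.example.MyCustomClass; adjust for your setup.
        URL jar = new URL("file:/path/to/shared-classes.jar");

        // Two independent class loaders, each loading the "same" class from the same jar.
        // The null parent keeps them from delegating to a common application class loader.
        ClassLoader cl1 = new URLClassLoader(new URL[]{jar}, null);
        ClassLoader cl2 = new URLClassLoader(new URL[]{jar}, null);

        Class<?> a = cl1.loadClass("com.example.MyCustomClass");
        Class<?> b = cl2.loadClass("com.example.MyCustomClass");

        System.out.println(a.getName().equals(b.getName())); // true  - same name
        System.out.println(a == b);                          // false - different runtime classes
        // Casting an instance of one to the other throws ClassCastException, which is
        // what happens when the hook and the portlet each load MyCustomClass themselves.
    }
}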