OpenCMS VFS to RFS not getting updated automatically

When I update any image from the OpenCMS front end VFS (virtual file system), the changes are not reflected, but when I update the same file in the same location in the backend RFS (real file system), the changes are reflected.
Moreover, if I put the file only in the backend RFS, the changes are not reflected until I publish it from the front end.
How can the VFS be updated to the RFS automatically in OpenCMS?

In a default OpenCMS installation, the export to the RFS happens on demand.
Look at the opencms-importexport.xml file, at the line:
...
<staticexport enabled="true">
<staticexporthandler>org.opencms.staticexport.CmsOnDemandStaticExportHandler</staticexporthandler>
...
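If the goal is to have the RFS copy refreshed automatically when content is published, rather than on the first request, OpenCMS also ships an after-publish export handler; switching the handler should look roughly like this (please verify the class name against your OpenCMS version before relying on it):
...
<staticexport enabled="true">
<staticexporthandler>org.opencms.staticexport.CmsAfterPublishStaticExportHandler</staticexporthandler>
...
Either way, a resource still has to be published from the VFS before any export handler writes it to the RFS; the handler only controls when the export happens.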

Related

Mule not honouring log4j2.component.properties or log4j2.system.properties

I have a Mule application which needs to load log4j2.xml from different locations depending on the environment, as shown below.
app1
dev --> /etc/dev/app1/log4j2.xml
sit --> /etc/sit/app1/log4j2.xml
. . .
prod --> /etc/prod/app1/log4j2.xml
I don't want to use Spring bean loading because, by the time that bean is loaded, Mule has already initialized the log context for app1 with the default configuration and written a few logs to it.
Log4j provides the log4j2.system.properties and log4j2.component.properties files. When either of them is added to the classpath (src/main/resources) with the log4j.configurationFile property in it, Log4j is supposed to pick up the referenced file during application startup itself.
Reference: Log4j System Properties
log4j.configurationFile=${config.path}/app1/log4j2.xml
config.path is defined in the wrapper as a system property and is available to app1, holding the environment path ("/etc/dev" for dev, "/etc/sit" for sit, etc.).
However, neither of these files is picked up by Mule, and logging falls back to the default configuration.
Can someone please assist in getting either of these files picked up by Mule during application startup itself?
After long research: we have to update mule_artifact.json with the "logConfig" key to define the location of the external log4j2.xml file on the server, relative to the mule_home path.
The same path may not work locally, but you can create an "mklink" link so that the local machine resembles the server path, as sketched below.
I've tested both successfully.
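For the "mklink" part, the idea is to create a directory symbolic link so that the server-style path also resolves on a local Windows machine. A rough sketch, run from an elevated command prompt (both paths below are purely illustrative):
rem Link the server-style config path to where the files actually live locally
mklink /D "C:\etc\dev\app1" "C:\Users\me\projects\app1\config\dev"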

Possible to edit web.config of cloud app deployed on windows Azure without redeploying app?

I would like to add URL rewrite code to an Azure web app's web.config without redeploying the whole app again. For this I am using the App Service Editor and the Kudu debug console to edit web.config. At first I can't save the file and I get an error.
After some searching I found that the value of the relevant app setting key should be 0 instead of 1.
I edited the value from 1 to 0 and saved the app setting; after that I was able to edit the config file. In order to test the code again I changed the value from 0 back to 1 and saved the setting, but when I refresh the file opened in the editor or Kudu, the pasted code has disappeared. The site is connected to an automatic Azure deployment pipeline.
How can I edit the web.config file without redeploying the app again?
Yes, it's possible to make changes without redeploying the app.
Some details:
Check the Run from package documentation and we can find:
1. The ZIP package won't be extracted to D:\home\site\wwwroot; instead it will be uploaded directly to D:\home\data\SitePackages.
2. A packagename.txt, which contains the name of the ZIP package to load at runtime, will be created in the same directory.
3. App Service mounts the uploaded package as the read-only wwwroot directory and runs the app directly from that mounted directory. (That's why we can't edit the read-only wwwroot directory directly.)
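In other words, the SitePackages folder looks roughly like this (the ZIP name is the one used in the example below; yours will differ):
D:\home\data\SitePackages\
    20200929072235.zip    <- the package App Service mounts as wwwroot
    packagename.txt       <- a single line naming the ZIP to load, e.g. 20200929072235.zip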
So my workaround is:
1. Navigate to D:\home\data\SitePackages via the Kudu debug console:
Download the ZIP (in my case it's 20200929072235.zip) which represents your deployed app, extract it, and make your changes to the web.config file.
2. Zip those files (select the files and right-click...) into childtest.zip. Please follow my steps carefully here; the folder structure of Run-from-package is a bit strange!
3. Then zip childtest.zip into parenttest.zip. (When uploading an xx.zip, Kudu always automatically extracts it, so we have to wrap childtest.zip in parenttest.zip first.)
4. Drag and drop the local parenttest.zip into the online SitePackages folder in the Kudu debug console, and we get a childtest.zip there now:
5. Modify packagename.txt, changing the content from 20200929072235.zip to childtest.zip, and save:
Done~
Check and test:
Now let's open App Service Editor to check the changes:
In addition: though this answers the original question, I recommend using other deployment methods (Web Deploy...) instead. It could be much easier~
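As a side note on the 0/1 app setting mentioned in the question: Run from package is controlled by the WEBSITE_RUN_FROM_PACKAGE app setting, so if you'd rather script that toggle than click through the portal, an Azure CLI call along these lines should work (resource group and app name are placeholders):
az webapp config appsettings set --resource-group <resource-group> --name <app-name> --settings WEBSITE_RUN_FROM_PACKAGE=0
Set it back to 1 the same way once you are done editing.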

Modifying tilestache.cfg file does not reflect in the API calls (suspected cache issue)

I have set up my TileStache server and am serving my TileMill XML files.
I followed this tutorial for serving my own TileMill files.
https://go.yuri.at/running-a-map-server-with-mapnik-and-tilestache-on-ubuntu-16-04/
This is the problem I am facing after installing TileStache on Ubuntu 16.04:
Any change in the tilestache.cfg file is not reflected when making an HTTP GET call. For example, if I alter the Mapnik file name in tilestache.cfg to some random file location like ("provider": {"name": "mapnik", "mapfile": "/home/Documts/sample.xml"},), the server still gives me the old cached PNG when accessing localhost:8080/layername/0/0/0.png.
Any help would be appreciated!!
Thanks.
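For context, a tilestache.cfg typically pairs a cache section with the layers, and rendered tiles keep being served from whatever cache is configured there until it is emptied, which matches the suspected cache issue. A rough illustrative shape (the cache path, layer name, and mapfile are placeholders):
{
  "cache": {"name": "Disk", "path": "/tmp/stache"},
  "layers": {
    "layername": {
      "provider": {"name": "mapnik", "mapfile": "/path/to/style.xml"}
    }
  }
}
With a "Disk" cache, clearing the contents of that path (and restarting the server) is the usual way to force tiles to be re-rendered from the updated configuration.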

requireJS not updating, still references a deleted file

I used to reference a file called OptionRendererController.js, but I deleted both the reference and the file. When I grep my app's directory for OptionRenderer, no results appear. I've made sure to include urlArgs: "buset=" + (new Date()).getTime() to prevent browser caching, and I restart the Python server on which I'm running my app, but I still get a 404 error for the deleted OptionRendererController.js file. How can I make sure my app no longer references that file?
It worked after I restarted Chrome.

Uploaded image only available after refreshing the page

When I upload a picture, the file is successfully saved and the path is successfully set, but the uploaded image is not displayed immediately after the form submit. Only when I reload the page is the uploaded image displayed.
I'm saving the uploaded file as below:
InputStream is;
try {
    // Target file inside the IDE project's web folder.
    File file = new File("C:\\****\\*****\\Documents\\NetBeansProjects\\EventsCalendary\\web\\resources\\images\\uploadPhoto.png");
    is = event.getFile().getInputstream();
    OutputStream os = new FileOutputStream(file);
    setUserPhoto("\\EventsCalendary\\resources\\images\\" + file.getName());
    // Copy the uploaded stream to the target file.
    byte[] buf = new byte[1024];
    int len;
    while ((len = is.read(buf)) > 0) {
        os.write(buf, 0, len);
    }
    os.close();
    is.close();
} catch (IOException ex) {
    ex.printStackTrace();
}
Why is the uploaded image only displayed after reloading the page and how can I solve this?
You're writing the file straight into the IDE's project folder, and your intent seems to be to save the file in the webapp's deploy folder. This is a bad idea, for the following three main reasons:
Changes in the IDE's project folder do not immediately get reflected in the server's work folder. There's a kind of background job in the IDE which takes care that the server's work folder gets synced with the latest updates (in IDE terms this is called "publishing"). This is the main cause of the problem you're seeing.
In real-world deployments there are circumstances where storing uploaded files in the webapp's deploy folder will not work at all. Some servers do not (either by default or by configuration) expand the deployed WAR file into the local disk file system, but instead keep it fully in memory. You can't create new files in memory without basically editing the deployed WAR file and redeploying it.
Even when the server expands the deployed WAR file into the local disk file system, all newly created files will get lost on a redeploy or even a simple restart, simply because those new files are not part of the original WAR file.
You need to write it to a fixed path outside the project/deploy folder instead. For example, /var/webapp/uploads. Then, to get it to be served by your webapp, just add it as a new web application context to the server.
Based on your previous question, I know that you're using Glassfish 3.1. In this server it's called a "virtual host". You can configure it at server level in the admin console at http://localhost:4848 > Configuration > HTTP Service > Virtual Servers, or at webapp level by adding the following line to /WEB-INF/glassfish-web.xml (your IDE should have autogenerated one; note that before Glassfish 3.1 this file was called sun-web.xml, so if you see manuals/blogs/tutorials referencing that name, it's exactly the same file):
<property name="alternatedocroot_1" value="from=/uploads/* dir=/var/webapp" />
Either way, you should then be able to use http://localhost:8080/contextname/uploads/* to serve those uploaded images from, via <img> the usual way.
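For completeness, a minimal sketch of what the saving side could look like under that setup, assuming the PrimeFaces upload event from the question (the upload folder and the fixed file name are only illustrative):
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import org.primefaces.event.FileUploadEvent;

public void handleFileUpload(FileUploadEvent event) {
    // Fixed folder outside the project/deploy folder, matching the alternate doc root above.
    Path uploadDir = Paths.get("/var/webapp/uploads");
    try (InputStream is = event.getFile().getInputstream()) {
        Files.createDirectories(uploadDir);
        // In real code, generate a unique file name instead of reusing a fixed one.
        Path target = uploadDir.resolve("uploadPhoto.png");
        Files.copy(is, target, StandardCopyOption.REPLACE_EXISTING);
        // Store the path as served by the webapp through the /uploads/* mapping above.
        setUserPhoto("/uploads/" + target.getFileName());
    } catch (IOException ex) {
        ex.printStackTrace();
    }
}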
See also:
How to upload files to server using JSP/Servlet?
Recommended way to save uploaded files in a servlet application (contains a Tomcat configuration example)
Reading/writing a text file in a servlet, where should this file be stored in JBoss? (contains JBoss configuration example)
Simplest way to serve static data from outside the application server in a Java web application
