I have the following in the theme, but I want to make it server independent so that it can be moved seamlessly from dev to test to prod.
How do I genericize http://www.devserver.com?
<resource>
  <content-type>text/css</content-type>
  <href>http://www.devserver.com/CommonElements.nsf/commonMAX.css</href>
</resource>
I assumed the following would work, but it doesn't, because it makes the reference relative to the current .nsf:
<resource>
  <content-type>text/css</content-type>
  <href>/CommonElements.nsf/commonMAX.css</href>
</resource>
From an article on the Notes & Domino App Dev wiki:
"When the XPages runtime emits a URL, it assumes that the root "/" is the root of the application (e.g. /mydb.nsf). But as this notion doesn't exist in the browser, this is added by the JSF runtime. To work around this add the syntax "/.ibmxspres/domino" to the beginning of the path to set the root as the Domino data directory."
In Chapter 14 of Mastering XPages there is a discussion of themes. Beginning on page 597, the section on "Resource Paths" covers the path aliases for three key paths:
/.ibmxspres/domino points to: /data/domino/html/
/.ibmxspres/global points to: /data/domino/java/xsp/
/.ibmxspres/dojoroot points to: /data/domino/js/dojo-1.4.3/ (or the current version of dojo)
So your block is:
<resource>
  <content-type>text/css</content-type>
  <href>/.ibmxspres/domino/CommonElements.nsf/commonMAX.css</href>
</resource>
Happy coding
/Newbs
Another option is to investigate the XSP Starter Kit on OpenNTF.org.
This will allow you to build an OSGi plugin library for XPages, and one of the starter examples that you can extend in the library shows you how to create your own resourceProvider service.
Depending on how you refactor the XSP Starter Kit to your own namespace, you would be able to add your CSS and other files (common icons, logos, etc.) and then access them using your own file path of /.ibmxspres/.yourNameSpace/file.ext, as sketched below. You can even build your own directory structure in the library to make managing the files easier.
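For example, once the plugin's resourceProvider is in place, a theme resource could reference it like this (a sketch only; ".mycompany" and the css path are hypothetical placeholders for whatever namespace and folder structure you choose):
<resource>
  <content-type>text/css</content-type>
  <href>/.ibmxspres/.mycompany/css/commonMAX.css</href>
</resource>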
One big advantage of this over storing them in an NSF is that they are cached by the end user's browser, whereas files served from an NSF are not always cached.
Also, if the library is set as a global library on the server, then you don't need to add a dependency to the calling application; the resource provider will be available for anything that needs it.
This works if your application is in the root:
./commonMAX.css
Remark: I have searched for similar questions but found no solution that worked, hence I dare to ask this again. Please bear with me!
I am using Eclipse (Version: 2022-12 (4.26.0) / Build id: 20221201-1913, which includes WTP) to work on a Maven project. I am trying to deploy from Eclipse directly to a local Tomcat 8.5 server.
What is currently blocking me is that some of my application's resource files (like application.properties and logback.xml) contain placeholders, and obviously these are not "filtered" when Eclipse/WTP deploys them to the wtpwebapps folder before starting Tomcat.
When I run the build in Maven on the command line, the resource files are all properly filtered (i.e. all properties defined in my settings.xml file or in the pom.xml get properly replaced in the copies of these resource files that are placed in the target folder and then wrapped up into a .war file).
So, deploying that .war file to Tomcat works but starting it directly from Eclipse would be so much easier and quicker.
However, when I build this project in Eclipse, the files that get "published" to
<workspace>/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/<project_name>/WEB-INF before it starts Tomcat still contain the original placeholders, which then causes the application to crash during startup. :-(
I do have the resource folder defined in the pom.xml (this was one suggestion that I had found) as:
<resources>
  <resource>
    <directory>src/main/resources</directory>
  </resource>
</resources>
but this had no effect on the filtering for Eclipse/WTP.
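A variant I have also seen suggested (just a sketch; I have not been able to verify whether Eclipse/WTP honours it) enables filtering explicitly on that resources block:
<resources>
  <resource>
    <directory>src/main/resources</directory>
    <!-- substitute property placeholders when copying these resources -->
    <filtering>true</filtering>
  </resource>
</resources>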
I also tried to enable the maven-war-plugin's exploded goal in M2E's lifecycle mapping, because the war plugin's configuration contains a corresponding "filtering" setting:
...
<resource>
  <directory>${basedir}/src/main/resources</directory>
  <filtering>true</filtering>
  <targetPath>WEB-INF/classes</targetPath>
</resource>
...
but with no effect, either.
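Concretely, by "enabling the exploded goal in M2E's lifecycle mapping" I mean a pluginManagement entry along these lines (a sketch of the standard org.eclipse.m2e:lifecycle-mapping syntax; the version range is just an assumption for my setup):
<!-- goes inside <build> in the pom.xml -->
<pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.eclipse.m2e</groupId>
      <artifactId>lifecycle-mapping</artifactId>
      <version>1.0.0</version>
      <configuration>
        <lifecycleMappingMetadata>
          <pluginExecutions>
            <pluginExecution>
              <pluginExecutionFilter>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <versionRange>[2.0,)</versionRange>
                <goals>
                  <goal>exploded</goal>
                </goals>
              </pluginExecutionFilter>
              <action>
                <!-- tell m2e to run this goal during workspace (incremental) builds -->
                <execute>
                  <runOnIncremental>true</runOnIncremental>
                </execute>
              </action>
            </pluginExecution>
          </pluginExecutions>
        </lifecycleMappingMetadata>
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>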
Any other idea, hint or suggestion to convince Eclipse/WTP to "filter" these resource files before starting Tomcat?
Background
I work in QA at a software company.
We have about half a dozen different web applications, each of which may require, at any given site, some customised settings added to its web.config file.
These can range from which Oracle database/schema(s) the app connects to, to how many search results to cache, to which hierarchy to use when sorting items on a web page.
We make use of Microsoft's Deploy package to get new releases installed/updated on client sites.
When we put out a new release, some of these customised settings may have been added to or removed from the given web app's web.config file, but using Deploy to import the new release over the top of the old one will clobber any customisations that may have been made.
Alternatives
There are ways of handling this manually, such as merging via a plain text comparison of the old and new web.config files, but these are cumbersome and prone to human error.
I was reading about transformations and thought they could be of some use.
There is also the capability to use external files (tip #8), which seems like a good way to go; see the sketch below.
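For example (a sketch with hypothetical file and key names), my understanding is that web.config can point a whole section at a site-specific file via configSource, so the per-site values live outside the file that a new release would overwrite:
<!-- web.config shipped with the release -->
<appSettings configSource="appSettings.site.config" />
<!-- appSettings.site.config, maintained at each client site -->
<appSettings>
  <add key="SearchResultCacheSize" value="200" />
</appSettings>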
Improvement?
Should our programmers be providing some sort of semi-automated merge facility for this web.config file? Does the Deploy package provide this somehow?
Should we be making use of the external config files, as a best practice?
Is the concept of customising this web.config file at each site so fundamentally flawed that the whole thing needs to be re-thought?
Microsoft provides web.config transformations as the de facto way to do this. You can create different deployment configurations within Visual Studio and the web projects. Then, when you (or your build server) build with that particular configuration, the web.config is transformed to contain the settings you want to see.
View more about web.config transforms here.
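For instance, a Web.Release.config transform might look roughly like this (a sketch; the connection string name and values are hypothetical):
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- replace the attributes of the matching entry when building the Release configuration -->
    <add name="OracleDb"
         connectionString="Data Source=PRODDB;User Id=appuser;Password=changeme"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>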
I had started my typical EE build (using a bootstrapped config) for a client when they announced they wanted an additional site using the MSM module (le sigh).
So I added the MSM module, commented out $config['site_url'] and $config['cp_url'], and set those in index.php instead using $assign_to_config.
That's when I discovered this bug where MSM config file settings are not recognized, which is a pain but I can work around it. However, I noticed that when I created the secondary site, it wouldn't recognise my custom location for add-ons, so I had to add that to index.php as well, via $assign_to_config['third_party_path'] = "../assets/third_party/";.
Then I discovered that when I create or modify a template file, it won't automatically sync and so I need to manually do that each time which is a real PITA.
Why would my templates not be syncing to the database? Is this related to the MSM config bug?
While I haven't tried bootstrapping the third party path yet, I've definitely been able to bootstrap the template path for MSM sites... What bootstrap method are you using?
Are your sites on subdomains or subfolders? I've only had experience with subfolders so perhaps that makes a difference (although it shouldn't).
Could you maybe walk through in a bit more detail what's happening? Your first site (site_id = 1) templates sync automatically from filesystem edits, but your second site does not? Yet if you go to CP > Design > Synchronize Templates, that works?
The $assign_to_config portion of MSM setup is definitely a weak spot when it comes to bootstrapping... I wonder if we need to work up an additional bootstrap for the MSM+CP environment, where it looks at the CP cookie ($_COOKIE['exp_cp_last_site_id']) and sets values based on that.
It may be helpful if you let us know which bootstrap you are using. For example, if you look at this bootstrap the site_url and cp_url are set using the HTTP_HOST server variable, so this shouldn't clash with your MSM install (and multiple domains) at all.
Perhaps you could try using that bootstrap file instead, and see if it fixes your issue with template syncing?
Finally, if you're going to use the EE template manager, you don't really need to store templates as files. Conversely, if you want to save templates as files, it's probably much easier editing them using Sublime Text or another editor, rather than the clunky built-in editor (which is really only useful for small/simple changes).
When I try to generate a javadoc, using the menu command Project\Generate Javadoc, the following warnings and error are produced for my custom classes in XPages:
javadoc: warning - No source files for package net.focul.utilties
javadoc: warning - No source files for package net.focul.workflow
javadoc: error - No public or protected classes found to document.
The packages are in the WebContent/WEB-INF/src folder which is configured in the build path and are selectable in the Generate Javadoc wizard. The classes are public with public methods.
Javadocs are generated for all of the XPage and Custom Control classes if I select these.
You're experiencing this behavior because javadoc doesn't understand the Designer VFS (Virtual File System). It assumes that your project consists of a bunch of separate files in some folder structure on your local hard drive, not self-contained inside a single NSF. On the whole, the Designer VFS successfully tricks Eclipse into believing it's interacting with local files by intercepting read/write requests for project resources and importing/exporting DXL or CD records, etc. But apparently they haven't applied this sleight of hand to javadoc as well.
The Java source files corresponding to each XPage and Custom Control are processed successfully because, ironically, they are never stored in the NSF. During every project build, Designer discards any of these it has already generated and re-creates them based on the current contents of the various .xsp files. It then compiles those Java files into .class files, which are stored as design notes inside the NSF. At runtime, it's these files that are extracted from the VFS and executed... the source code no longer matters at this point, so there's no reason to ever bother including the .java files in the NSF, so they're just kept on the hard drive. One indication of this behavior is that the folder is named "Local" when viewed in Package Explorer / Navigator.
If you're using the built in (as of 8.5.3) version control integration (see this article for a great explanation of how to use this feature), you can tweak the Build Path to include the copy of the src folder stored in the on-disk project as a "linked source folder". This causes javadoc to consider the duplicate copies valid source files, and therefore includes them in the generated documentation. On the downside, it also causes Designer to consider them valid source files, which causes compilation errors due to the duplication. So this approach is only viable if you only need to generate the documentation on an infrequent basis, and can therefore break the Build Path temporarily just to run javadoc, then revert to the usual settings.
An alternative is to actually maintain your custom Java code this way on an ongoing basis: instead of creating the folder in WEB-INF inside the NSF, just create a folder on your hard drive that stores the source, then include that location as a linked source folder indefinitely. That way Designer can still find the source, but so can javadoc. NOTE: if you go this route, then you definitely need to use SCM. Because your source code no longer lives inside the NSF, providing the convenient container we're used to for getting the source code to other developers and ensuring inclusion in whatever backup schedule you use, the only place your source code now lives is on your local hard drive. So make sure you're regularly committing those files to Git / Subversion / Mercurial, etc., or, at the very least, storing them on some file server that is backed up regularly and, if applicable, accessible to all other members of the project team.
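For what it's worth, a linked source folder is normally set up through the Build Path dialog, but under the covers it ends up as a linked resource in the project's .project file plus a source entry in .classpath, roughly like this (a sketch; the folder name and local path are hypothetical):
<!-- .project -->
<linkedResources>
  <link>
    <name>external-src</name>
    <type>2</type> <!-- 2 = folder -->
    <location>C:/dev/MyApp/src</location>
  </link>
</linkedResources>
<!-- .classpath -->
<classpathentry kind="src" path="external-src"/>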
When you expand the net.focul.utilties package in Designer, you will see all the methods and properties. But when you click on one of the methods, you will see no source code.
So this is where javadoc fails to generate the documentation. I guess that the author of the application has not provided you with the source code. If you have the source somewhere, you can attach it and then javadoc will be able to generate the documentation.
I ran into the same situation and I have found the most straightforward method is to export the source to an external folder and then use regular Eclipse to generate the Javadoc. Not sure my process is any less of a hassle than Tim's suggestions, but for me it just feels less risky than trying to deal with the VFS vagaries.
I've got a working hello-world-like webpart for my SPS 3.0.
I can compile, pack and deploy it using VS2008, makecab.exe and stsadm, so I know the theory of deploying SharePoint webparts.
My problem:
After I added an additional .webpart file, an elements.xml and a feature.xml to deploy the .webpart file and learn about adding features to my webpart, the deployed webpart is missing its safe control entry in the web.config.
But the DLL can be found in the GAC, and my features are also deployed to the right folders.
I didn't change anything in my manifest.xml, especially not its SafeControl entries, because it definitely worked before I added the additional feature files.
Can anybody help me? Should I provide some code snippets?
Thanks, Stefan
You can try WSPBuilder; it will automate and ease your deployment process.
As far as I can tell, you are trying to find out how to register your web part as a safe control without using any tools, etc. and also without admin rights. I think you will find this impossible since the safe control registration needs to happen in the web.config file and one way or another (WSP Builder, manually, script) this file needs to be modified. Only admins can do this as far as I know.
If you are deploying your solution package using stsadm -o deploysolution, be sure that you are either using the allcontenturls parameter or that the url parameter is pointing to the correct web application. Which parameter you use (and how) will determine which web.config file(s) will have the safe control settings from manifest.xml applied to them.
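For reference, the SafeControl entries that get applied to web.config during deployment come from the solution's manifest.xml, in a block roughly like this (a sketch; the assembly name, public key token and namespace are hypothetical placeholders for your own values):
<!-- fragment of manifest.xml -->
<Assemblies>
  <Assembly Location="MyWebParts.dll" DeploymentTarget="GlobalAssemblyCache">
    <SafeControls>
      <SafeControl Assembly="MyWebParts, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef"
                   Namespace="MyWebParts"
                   TypeName="*"
                   Safe="True" />
    </SafeControls>
  </Assembly>
</Assemblies>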