So far I have:
installed and started sbt.sample-1.0.0.20140125-1133.ear on my WebSphere Application Server,
added a URL resource for the SBT properties file.
The Social Business Toolkit Samples app runs fine and I'm able to connect to my IBM Connections and retrieve some ActivityStream entries.
When I first loaded the application, I noticed this error:
Exception stack trace: com.ibm.websphere.naming.CannotInstantiateObjectException: A NameNotFoundException occurred on an indirect lookup on the name java:comp/env/url/ibmsbt-managedbeansxml. The name java:comp/env/url/ibmsbt-managedbeansxml maps to a JNDI name in deployment descriptor bindings for the application performing the JNDI lookup. Make sure that the JNDI name mapping in the deployment descriptor binding is correct. If the JNDI name mapping is correct, make sure the target resource can be resolved with the specified name relative to the default initial context.
In the Samples application's ibm-web-bnd.xml file I found this line:
<resource-ref name="url/ibmsbt-managedbeansxml" binding-name="url/ibmsbt-managedbeansxml" />
And in the web.xml:
<resource-ref>
    <description>Reference to a URL resource which points to the managed bean configuration for the Social Business Toolkit.</description>
    <res-ref-name>url/ibmsbt-managedbeansxml</res-ref-name>
    <res-type>java.net.URL</res-type>
    <res-auth>Container</res-auth>
    <res-sharing-scope>Shareable</res-sharing-scope>
</resource-ref>
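For reference, the "indirect lookup" the exception refers to is the standard JNDI pattern sketched below. This is only an illustration of what the container resolves through the resource-ref above, not code you need to add yourself; the class name is made up.

import java.net.URL;
import javax.naming.InitialContext;
import javax.naming.NamingException;

// Illustrative sketch: resolving the URL resource bound to url/ibmsbt-managedbeansxml.
public class ManagedBeansLookupSketch {
    public static URL lookupManagedBeansUrl() throws NamingException {
        InitialContext ctx = new InitialContext();
        // java:comp/env/... is the component-local namespace; the deployment descriptor
        // binding maps it to the URL resource defined on the application server.
        return (URL) ctx.lookup("java:comp/env/url/ibmsbt-managedbeansxml");
    }
}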
I'm wondering, why should there be a URL resource pointing to the JSF Application Configuration Resource File (managed-beans.xml) in the first place? According to the Java EE documentation, the JavaServer Faces implementation will look for it in the /WEB-INF/ folder.
Does the SBT use JavaServer Faces technology somewhere? Or can I choose not to use the managed-beans.xml file in my own applications that use the SBT?
I wouldn't recommend considering them related: managed-beans.xml had a prior name, and it's just a set of configuration objects. The project itself does not use JavaServer Faces.
I just read the documentation again, more carefully than the first time, and I think I now have a better understanding of what I asked in my second question. From the documentation:
In a web application SBTFilter (HTTP servlet filter) is responsible for initializing the application using servlet context. Application does the initialization like loading the managed beans and properties factories.
The sample app is a web application. I think that in my own application I can choose to use com.ibm.commons.runtime.impl.app.ApplicationStandalone instead of com.ibm.commons.runtime.impl.servlet.ApplicationServlet and then configure an endpoint programmatically. Or, alternatively, not use an Application at all, like so:
RuntimeFactory runtimeFactory = new RuntimeFactoryStandalone();
Application application = runtimeFactory.initApplication(null);
Context.init(application, null, null);
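For completeness, here is a minimal standalone sketch built around that snippet, including a programmatically configured endpoint. The init lines are taken from above; the BasicEndpoint class and its setUrl/setUser/setPassword setters are assumptions based on the toolkit's samples, and the URL and credentials are placeholders.

import com.ibm.commons.runtime.Application;
import com.ibm.commons.runtime.Context;
import com.ibm.commons.runtime.RuntimeFactory;
import com.ibm.commons.runtime.impl.app.RuntimeFactoryStandalone;
import com.ibm.sbt.services.endpoints.BasicEndpoint;

public class StandaloneSbtSketch {
    public static void main(String[] args) {
        // Initialize the SBT runtime without a servlet container (no SBTFilter involved).
        RuntimeFactory runtimeFactory = new RuntimeFactoryStandalone();
        Application application = runtimeFactory.initApplication(null);
        Context.init(application, null, null);

        // Configure an endpoint programmatically instead of declaring it in managed-beans.xml.
        // Class and setter names are assumptions; values are placeholders.
        BasicEndpoint endpoint = new BasicEndpoint();
        endpoint.setUrl("https://connections.example.com");
        endpoint.setUser("myUser");
        endpoint.setPassword("myPassword");

        // Hand the endpoint to a service (for example new ActivityStreamService(endpoint))
        // and make your calls here, before the runtime is torn down.
    }
}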
Related
I have a Mule application which needs to load log4j2.xml from a different location per environment, as shown below.
app1
dev --> /etc/dev/app1/log4j2.xml
sit --> /etc/sit/app1/log4j2.xml
. . .
prod --> /etc/prod/app1/log4j2.xml
I don't want to use Spring bean loading because, by the time that bean is loaded, Mule has already initialized the log context for app1 with the default configuration and written a few logs to it.
Log4j provides log4j2.system.properties and log4j2.component.properties files. When either of them is added to the classpath (src/main/resources) with the log4j.configurationFile property in it, that configuration file is supposed to be picked up during application startup itself.
Reference: Log4j System Properties
log4j.configurationFile=${config.path}/app1/log4j2.xml
config.path is defined in the wrapper as a system property and is available to app1, holding the environment path ("/etc/dev" for dev, "/etc/sit" for sit, etc.).
However, neither of these files is picked up by Mule, and logging falls back to the default configuration.
Can someone please assist in getting either of these files picked up by Mule during application startup itself?
After long research: we had to update mule_artifact.json with a "logConfig" key to define the location of the external log4j2.xml file on the server, relative to the mule_home path.
The same path may not work locally, but you can create a symbolic link with "mklink" to mimic the server path on your local machine.
I've tested both successfully.
I'm writing a precompiled Azure function that will perform a SOAP call to ServiceNow. The code works as a standalone exe, but I can't seem to get it converted to a precompiled function. I know it's because my DLL can't find the app.config file, but what's the best way to get around it? Error message below. ServiceNow requires that I set certain bindings and endpoint configuration. The other constructors for their ServiceNowSoapClient class allow me to specify a URL directly but don't seem to allow me to get to the binding settings.
Exception while executing function: Functions.TimerTriggerCSharp. System.ServiceModel: Could not find endpoint element with name 'ServiceNowSoapDev' and contract 'ServiceNowReference.ServiceNowSoap' in the ServiceModel client configuration section. This might be because no configuration file was found for your application, or because no endpoint element matching this name could be found in the client element.
In WCF you can define your client binding and endpoint programmatically instead of using app.config. Use the constructor of the generated client with two parameters:
new ServiceNowSoapClient(binding, remoteAddress);
See more code here.
Despite adding OWIN authentication to my MVC site, I keep getting redirected to /Account/Login even though I have set authentication to none in the web.config and changed the OWIN LoginPath to /Login/.
I have also noticed that the Startup.cs ConfigureAuth(IAppBuilder app) never gets hit.....
I have added the following packages
Microsoft.Owin
Microsoft.AspNet.Identity.Owin
Microsoft.Owin.Security
Microsoft.Owin.Security.Cookies
Microsoft.Owin.Security.OAuth
Owin
Do I need to configure OWIN to use my Startup.cs class or should it just work?
You should take a look at OWIN Startup Class Detection
Every OWIN Application has a startup class where you specify components for the application pipeline. There are different ways you can connect your startup class with the runtime, depending on the hosting model you choose (OwinHost, IIS, and IIS-Express). The startup class shown in this tutorial can be used in every hosting application. You connect the startup class with the hosting runtime using one of these approaches:
Naming Convention: Katana looks for a class named Startup in namespace matching the assembly name or the global namespace.
OwinStartup Attribute: This is the approach most developers will take to specify the startup class. The following attribute will set the startup class to the TestStartup class in the StartupDemo namespace.
[assembly: OwinStartup(typeof(StartupDemo.TestStartup))]
The OwinStartup attribute overrides the naming convention. You can also specify a friendly name with this attribute; however, using a friendly name requires you to also use the appSetting element in the configuration file.
There are a few other things in addition to the above, which you will find at the link provided.
I'm trying to implement the OpenNTF Domino API as a replacement in our project, but it fails with this message:
"OpenNTF Domino API: org.openntf.domino.utils.Factory is not initialized for this thread!"
Code snippet:
boolean init = Factory.isInitialized(); // false
Database db = Factory.getSession().getCurrentDatabase(); // This fails of course because no Session
I'm implementing the call in a Java DAO behind an ExtLib servlet in XPages.
So it's not called by an XPage but via a REST API call.
The Domino API demo DB is working, so the server install seems to be OK.
Is there a setup or property I'm missing to initialize it?
Yes, there is specific setup required for non-XPages access, as done in OsgiWorlds on OpenNTF. Nathan has added a DAS extension specifically for REST access from the Graph database. You basically need to initialise the session for the Factory before trying to access it, generally done in the servlet when it initiates the HTTP connection. Please contact me on Twitter (Paulswithers) so the team can work with you. It's also worth having a look at the OsgiWorlds source code. Although that's for a Vaadin servlet and allows defining a development user to run as, in production mode it also uses the logged-on user name, and the configuration class and the calls to it from the servlet are effectively what you need from the REST servlet.
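Reduced to a sketch, the pattern described above looks roughly like this: initialize the Factory for the request thread before asking it for a session, and terminate it afterwards. Method names such as initThread() and termThread() come from the OpenNTF Domino API's Factory as I recall them and have changed between releases, so treat this as an outline of the flow rather than verified code.

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.openntf.domino.Database;
import org.openntf.domino.utils.Factory;

// Sketch of a REST-style servlet that initializes the ODA Factory per request thread.
public class OdaRestServletSketch extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException {
        Factory.initThread(); // per-thread initialization; exact signature varies by ODA release
        try {
            Database db = Factory.getSession().getCurrentDatabase();
            // ... build the REST response from db here ...
        } finally {
            Factory.termThread(); // always release the thread's Domino resources
        }
    }
}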
Is there a way to expose a Java REST web service in Liferay, but not in a portlet, that can receive a JSON request and store the data in a Journal Article?
Then, when a user logs into Liferay, they will see the web content.
Yes, there is: JSONWebServiceActionsManagerUtil.registerJSONWebServiceAction
For instance:
Class<?> serviceImplClass;   // the class that implements the service you want to expose
Method serviceMethod;        // the java.lang.reflect.Method to expose
Object serviceImpl;          // an instance of serviceImplClass
String path = jsonWebServiceMappingResolver.resolvePath(serviceImplClass, serviceMethod);
String method = jsonWebServiceMappingResolver.resolveHttpMethod(serviceMethod);
JSONWebServiceActionsManagerUtil.registerJSONWebServiceAction("/yourwspath", serviceImpl, serviceImplClass, serviceMethod, path, method);
You should then be able to see the new web service in http://SERVER/api/jsonws
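To make the snippet concrete, here is a hedged sketch of how the three undeclared variables could be filled in for a small hand-written service class. The class name, method and behaviour are hypothetical, and the actual JournalArticleLocalServiceUtil call is left as a comment because its parameter list differs between Liferay versions.

import java.lang.reflect.Method;

// Hypothetical service class to expose through the JSON web service registry.
public class WebContentJsonService {

    // Would store the incoming fields as a Journal Article; the concrete
    // JournalArticleLocalServiceUtil.addArticle(...) call is omitted because
    // its signature depends on the Liferay version.
    public String addWebContent(String title, String body) {
        return "stored: " + title;
    }

    // Filling in the variables used by the registration snippet above.
    public static void register() throws NoSuchMethodException {
        Class<?> serviceImplClass = WebContentJsonService.class;
        Object serviceImpl = new WebContentJsonService();
        Method serviceMethod = serviceImplClass.getMethod("addWebContent", String.class, String.class);
        // ...then resolve the path and HTTP method and call
        // JSONWebServiceActionsManagerUtil.registerJSONWebServiceAction(...) as shown above.
    }
}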
Well yes, Liferay has a full API (even JSON-based, SOAP optional, no classic REST though) that you can use. A simple Stack Overflow answer is not the right place to give a full introduction on how to work with Liferay's API, but you might want to look up Service Builder (which is used to create Liferay's API) and then look at JournalArticleService and related services: the Web Content Management API is called "Journal" in Liferay (for historical reasons).