I have a page which, on every "preRenderView" event, fills some lists with values from the DB:
//preRenderView Method
public void init(){
loadChapterStructure();
loadCategoryStructure();
}
Since the chapters and categories don't change very often (e.g. just once a day), they should only be loaded once per user (on the first page load).
When the user then performs further GET requests on the same view (to keep the page bookmarkable), it would be good not to load these "static" values again.
Is there a way to, for example, load the chapters and categories only once every hour? Is there any best practice for this?
Thanks for any help!
You can implement an @ApplicationScoped managed bean which caches the DB values. Just access the data through it instead of using the DAO directly from your view beans:
@ManagedBean
@ApplicationScoped
public class CacheManager {

    private Date lastChapterAccess;
    private Date lastCategoryAccess;
    private List<Chapter> cachedChapters;
    private List<Category> cachedCategories;

    private Dao dao; // inject or obtain your DAO here

    // Refresh the list if the last DB access happened to occur
    // more than one hour before. The methods are synchronized
    // because this application scoped bean is shared by all sessions.
    public synchronized List<Chapter> loadChapterStructure() {
        if (lastChapterAccess == null
                || new Date().getTime() - lastChapterAccess.getTime() > 3600000) {
            cachedChapters = dao.loadChapterStructure();
            lastChapterAccess = new Date();
        }
        return cachedChapters;
    }

    public synchronized List<Category> loadCategoryStructure() {
        if (lastCategoryAccess == null
                || new Date().getTime() - lastCategoryAccess.getTime() > 3600000) {
            cachedCategories = dao.loadCategoryStructure();
            lastCategoryAccess = new Date();
        }
        return cachedCategories;
    }
}
Then inject the bean wherever you want using the @ManagedProperty annotation:
@ManagedBean
@ViewScoped
public class ViewBean {

    @ManagedProperty(value = "#{cacheManager}")
    private CacheManager cacheManager; // setter required for injection

    private List<Chapter> chapters;
    private List<Category> categories;

    // preRenderView method
    public void init() {
        chapters = cacheManager.loadChapterStructure();
        categories = cacheManager.loadCategoryStructure();
    }

    // getters and setters ...
}
Related
I could not find a definitive answer to whether it is safe to spawn threads within session-scoped JSF managed beans. The thread needs to call methods on the stateless EJB instance (that was dependency-injected to the managed bean).
The background is that we have a report that takes a long time to generate. This caused the HTTP request to time out due to server settings we can't change. So the idea is to start a new thread, let it generate the report and temporarily store it. In the meantime the JSF page shows a progress bar, polls the managed bean until the generation is complete and then makes a second request to download the stored report. This seems to work, but I would like to be sure that what I'm doing is not a hack.
Check out EJB 3.1 @Asynchronous methods. This is exactly what they are for.
Small example that uses OpenEJB 4.0.0-SNAPSHOTs. Here we have a @Singleton bean with one method marked @Asynchronous. Every time that method is invoked by anyone, in this case your JSF managed bean, it will immediately return regardless of how long the method actually takes.
import static java.util.concurrent.TimeUnit.SECONDS;
import static javax.ejb.LockType.READ;

import java.util.concurrent.Future;

import javax.ejb.AccessTimeout;
import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.Lock;
import javax.ejb.Singleton;

@Singleton
public class JobProcessor {

    @Asynchronous
    @Lock(READ)
    @AccessTimeout(-1)
    public Future<String> addJob(String jobName) {

        // Pretend this job takes a while
        doSomeHeavyLifting();

        // Return our result
        return new AsyncResult<String>(jobName);
    }

    private void doSomeHeavyLifting() {
        try {
            Thread.sleep(SECONDS.toMillis(10));
        } catch (InterruptedException e) {
            Thread.interrupted();
            throw new IllegalStateException(e);
        }
    }
}
Here's a little test case that invokes that @Asynchronous method several times in a row.
Each invocation returns a Future object that essentially starts out empty and will later have its value filled in by the container when the related method call actually completes.
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

import javax.ejb.embeddable.EJBContainer;
import javax.naming.Context;

import junit.framework.TestCase;

public class JobProcessorTest extends TestCase {

    public void test() throws Exception {

        final Context context = EJBContainer.createEJBContainer().getContext();

        final JobProcessor processor = (JobProcessor) context.lookup("java:global/async-methods/JobProcessor");

        final long start = System.nanoTime();

        // Queue up a bunch of work
        final Future<String> red = processor.addJob("red");
        final Future<String> orange = processor.addJob("orange");
        final Future<String> yellow = processor.addJob("yellow");
        final Future<String> green = processor.addJob("green");
        final Future<String> blue = processor.addJob("blue");
        final Future<String> violet = processor.addJob("violet");

        // Wait for the results -- 1 minute worth of work
        assertEquals("blue", blue.get());
        assertEquals("orange", orange.get());
        assertEquals("green", green.get());
        assertEquals("red", red.get());
        assertEquals("yellow", yellow.get());
        assertEquals("violet", violet.get());

        // How long did it take?
        final long total = TimeUnit.NANOSECONDS.toSeconds(System.nanoTime() - start);

        // Execution should be around 9 - 21 seconds
        assertTrue("" + total, total > 9);
        assertTrue("" + total, total < 21);
    }
}
Example source code
Under the covers what makes this work is:
The JobProcessor the caller sees is not actually an instance of JobProcessor. Rather it's a subclass or proxy that has all the methods overridden. Methods that are supposed to be asynchronous are handled differently.
Calls to an asynchronous method simply result in a Runnable being created that wraps the method and parameters you gave. This runnable is given to an Executor which is simply a work queue attached to a thread pool.
After adding the work to the queue, the proxied version of the method returns an implementation of Future that is linked to the Runnable which is now waiting on the queue.
When the Runnable finally executes the method on the real JobProcessor instance, it will take the return value and set it into the Future making it available to the caller.
It's important to note that the AsyncResult object the JobProcessor returns is not the same Future object the caller is holding. It would have been neat if the real JobProcessor could just return String and the caller's version of JobProcessor could return Future<String>, but we didn't see any way to do that without adding more complexity. So the AsyncResult is a simple wrapper object. The container will pull the String out, throw the AsyncResult away, then put the String in the real Future that the caller is holding.
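As a rough illustration of that mechanism (this is not the container's actual code; the proxy class, pool size and unwrapping below are assumptions made purely for the sketch), the generated proxy behaves somewhat like this:
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical stand-in for the container-generated proxy the caller actually sees.
public class JobProcessorProxy {

    private final JobProcessor delegate = new JobProcessor();
    private final ExecutorService executor = Executors.newFixedThreadPool(4); // the container's pool

    public Future<String> addJob(final String jobName) {
        // Wrap the real call and hand it to the work queue. The Future returned here
        // is completed by the pool thread once the real method has run.
        return executor.submit(new Callable<String>() {
            @Override
            public String call() throws Exception {
                return delegate.addJob(jobName).get(); // unwrap the AsyncResult
            }
        });
    }
}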
To get progress along the way, simply pass a thread-safe object like AtomicInteger to the @Asynchronous method and have the bean code periodically update it with the percent complete.
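For example (a sketch with made-up names; ReportGenerator and its percent loop are not from the original answer, they only illustrate the AtomicInteger idea):
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicInteger;

import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.Singleton;

@Singleton
public class ReportGenerator {

    @Asynchronous
    public Future<byte[]> generate(AtomicInteger progress) {
        byte[] report = new byte[0];
        for (int percent = 1; percent <= 100; percent++) {
            // ... generate the next chunk of the report ...
            progress.set(percent); // visible to the polling JSF bean
        }
        return new AsyncResult<byte[]>(report);
    }
}
The JSF managed bean keeps the AtomicInteger as a field, passes it in when starting the job, and exposes its value to the progress bar via a getter.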
Introduction
Spawning threads from within a session scoped managed bean is not necessarily a hack as long as it does the job you want. But spawning threads on its own needs to be done with extreme care. The code should not be written in such a way that a single user can, for example, spawn an unlimited amount of threads per session, or that the threads keep running even after the session gets destroyed. That would blow up your application sooner or later.
The code needs to be written in such a way that you can ensure that a user can, for example, never spawn more than one background thread per session, and that the thread is guaranteed to get interrupted whenever the session gets destroyed. For multiple tasks within a session you need to queue them.
Also, all those threads should preferably be served by a common thread pool so that you can put a limit on the total amount of spawned threads at the application level.
Managing threads is thus a very delicate task. That's why you'd better use the built-in facilities rather than homegrowing your own with new Thread() and friends. The average Java EE application server offers a container managed thread pool which you can utilize via, among others, EJB's @Asynchronous and @Schedule. To be container independent (read: Tomcat-friendly), you can also use java.util.concurrent's ExecutorService and ScheduledExecutorService for this, as sketched below.
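To give an idea of what that can look like on a plain servlet container (no EJB), here is a minimal sketch using plain JSF managed beans and java.util.concurrent; the class names TaskExecutor and SessionTaskHolder are made up for illustration and the pool size is arbitrary:
import java.io.Serializable;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import javax.annotation.PreDestroy;
import javax.faces.bean.ApplicationScoped;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ManagedProperty;
import javax.faces.bean.SessionScoped;

@ManagedBean
@ApplicationScoped
public class TaskExecutor {

    // Application wide limit on the amount of background threads.
    private final ExecutorService executor = Executors.newFixedThreadPool(10);

    public <T> Future<T> submit(Callable<T> task) {
        return executor.submit(task);
    }

    @PreDestroy
    public void shutdown() {
        executor.shutdownNow();
    }
}

@ManagedBean
@SessionScoped
public class SessionTaskHolder implements Serializable {

    @ManagedProperty("#{taskExecutor}")
    private transient TaskExecutor taskExecutor;

    private transient Future<?> currentTask;

    public synchronized void start(Callable<?> task) {
        // At most one background task per session.
        if (currentTask == null || currentTask.isDone()) {
            currentTask = taskExecutor.submit(task);
        }
    }

    @PreDestroy
    public void cancel() {
        // Interrupt the task when the session gets destroyed.
        if (currentTask != null) {
            currentTask.cancel(true);
        }
    }

    public void setTaskExecutor(TaskExecutor taskExecutor) {
        this.taskExecutor = taskExecutor;
    }
}
The application scoped bean caps the total amount of threads, and the session scoped holder guarantees at most one running task per session and cancels it when the session is destroyed, which covers the two rules described above.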
The examples below assume Java EE 6+ with EJB.
Fire and forget a task on form submit
@Named
@RequestScoped // Or @ViewScoped
public class Bean {

    @EJB
    private SomeService someService;

    public void submit() {
        someService.asyncTask();
        // ... (this code will immediately continue without waiting)
    }

}

@Stateless
public class SomeService {

    @Asynchronous
    public void asyncTask() {
        // ...
    }

}
Asynchronously fetch the model on page load
@Named
@RequestScoped // Or @ViewScoped
public class Bean {

    private Future<List<Entity>> asyncEntities;

    @EJB
    private EntityService entityService;

    @PostConstruct
    public void init() {
        asyncEntities = entityService.asyncList();
        // ... (this code will immediately continue without waiting)
    }

    public List<Entity> getEntities() {
        try {
            return asyncEntities.get();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new FacesException(e);
        } catch (ExecutionException e) {
            throw new FacesException(e);
        }
    }
}

@Stateless
public class EntityService {

    @PersistenceContext
    private EntityManager entityManager;

    @Asynchronous
    public Future<List<Entity>> asyncList() {
        List<Entity> entities = entityManager
            .createQuery("SELECT e FROM Entity e", Entity.class)
            .getResultList();
        return new AsyncResult<>(entities);
    }
}
In case you're using the JSF utility library OmniFaces, this could be done even faster if you annotate the managed bean with @Eager.
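For instance (a sketch; this assumes OmniFaces' org.omnifaces.cdi.Eager annotation, and the requestURI attribute and view path are placeholders to adapt to your own page):
import org.omnifaces.cdi.Eager;

@Named
@Eager(requestURI = "/entities.xhtml") // bean is constructed before this view starts rendering
@RequestScoped
public class Bean {

    private Future<List<Entity>> asyncEntities;

    @EJB
    private EntityService entityService;

    @PostConstruct
    public void init() {
        asyncEntities = entityService.asyncList(); // already running while the view is built
    }

    // getEntities() identical to the example above
}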
Schedule background jobs on application start
@Singleton
public class BackgroundJobManager {

    @Schedule(hour="0", minute="0", second="0", persistent=false)
    public void someDailyJob() {
        // ... (runs every start of day)
    }

    @Schedule(hour="*/1", minute="0", second="0", persistent=false)
    public void someHourlyJob() {
        // ... (runs every hour of day)
    }

    @Schedule(hour="*", minute="*/15", second="0", persistent=false)
    public void someQuarterlyJob() {
        // ... (runs every 15th minute of the hour)
    }

    @Schedule(hour="*", minute="*", second="*/30", persistent=false)
    public void someHalfminutelyJob() {
        // ... (runs every 30th second of the minute)
    }
}
Continuously update application wide model in background
@Named
@RequestScoped // Or @ViewScoped
public class Bean {

    @EJB
    private SomeTop100Manager someTop100Manager;

    public List<Some> getSomeTop100() {
        return someTop100Manager.list();
    }

}

@Singleton
@ConcurrencyManagement(BEAN)
public class SomeTop100Manager {

    @PersistenceContext
    private EntityManager entityManager;

    private List<Some> top100;

    @PostConstruct
    @Schedule(hour="*", minute="*/1", second="0", persistent=false)
    public void load() {
        top100 = entityManager
            .createNamedQuery("Some.top100", Some.class)
            .getResultList();
    }

    public List<Some> list() {
        return top100;
    }

}
See also:
Spawning threads in a JSF managed bean for scheduled tasks using a timer
I tried this and it works great from my JSF managed bean:
ExecutorService executor = Executors.newFixedThreadPool(1);

@EJB
private IMaterialSvc materialSvc;

private void updateMaterial(Material material, String status, Location position) {
    executor.execute(new Runnable() {
        public void run() {
            synchronized (position) {
                // TODO update material in audit? do we need materials in audit?
                int index = position.getMaterials().indexOf(material);
                Material m = materialSvc.getById(material.getId());
                m.setStatus(status);
                m = materialSvc.update(m);
                if (index != -1) {
                    position.getMaterials().set(index, m);
                }
            }
        }
    });
}

@PreDestroy
public void destroy() {
    executor.shutdown();
}
I have this problem: I'm working through this wonderful tutorial, The NetBeans E-commerce Tutorial. But instead of making it in JSP as presented, I'm building a JSF version, just to understand the logic of constructing an application like that.
At a certain point the ControllerServlet.java has this code:
int orderId = orderManager.placeOrder(name, email, phone, address, cityRegion, ccNumber, cart);
// if order processed successfully send user to confirmation page
if (orderId != 0) {
// dissociate shopping cart from session
cart = null;
// end session
session.invalidate();
// get order details
Map orderMap = orderManager.getOrderDetails(orderId);
// place order details in request scope
request.setAttribute("customer", orderMap.get("customer"));
request.setAttribute("products", orderMap.get("products"));
request.setAttribute("orderRecord", orderMap.get("orderRecord"));
request.setAttribute("orderedProducts", orderMap.get("orderedProducts"));
userPath = "/confirmation";
// otherwise, send back to checkout page and display error
As you can see, the author invalidates the session in order to permit another purchase order. I made a managed bean with session scope in order to keep the data available throughout the whole session. But when I try to clean up the session, as the author does in the tutorial, I can't retrieve the data for the confirmation.
So I made a second managed bean: one to process the order (CartManagerBean) and another one to present the confirmation (ConfirmationMBean). I just injected the confirmationBean into the cartBean to pass the orderId, which is necessary to present the data. In the confirmationBean, I made a cleanUp() method that invalidates the session.
But the data is never presented. So if anyone can tell me what to do, I'll appreciate it.
Here is the part of my cartBean's code that passes the data to the confirmation bean:
...
@ManagedProperty(value = "#{confirmationBean}")
private ConfirmationMBean confirmationBean;
...

public String makeConfirmation() {
    FacesContext fc = FacesContext.getCurrentInstance();
    if (!cartMap.isEmpty()) {
        int orderId = orderManager.placeOrder(name, email, phone, address, credicard, cartMap);
        // if order processed successfully send user to confirmation page
        if (orderId != 0) {
            // get order details
            confirmationBean.setOrderId(orderId);
            // dissociate shopping cart from session
            cartMap.clear();
            // end session
            //fc.getExternalContext().invalidateSession();
        }
    }
    return "confirmation";
}
As you can see, I commented out the part that invalidates the session. Here is the code that I implemented for the ConfirmationMBean:
@ManagedBean(name = "confirmationBean")
@SessionScoped
public class ConfirmationMBean implements Serializable {
private Customer customer;
private List<OrderedProduct> orderedProducts;
private CustomerOrder orderRecord;
private List<Product> products;
private int orderId;
@EJB
private OrderManager orderManager;
public void cleanUp(){
FacesContext fc = FacesContext.getCurrentInstance();
fc.getExternalContext().invalidateSession();
}
private void init(){
Map<String, Object> orderMap = orderManager.getOrderDetails(orderId);
customer = (Customer) orderMap.get("customer");
orderRecord = (CustomerOrder) orderMap.get("orderRecord");
orderedProducts = (List<OrderedProduct>) orderMap.get("orderedProducts");
products = (List<Product>) orderMap.get("products");
}
public Customer getCustomer() {
return customer;
}
public void setCustomer(Customer customer) {
this.customer = customer;
}
public List<OrderedProduct> getOrderedProducts() {
return orderedProducts;
}
public void setOrderedProducts(List<OrderedProduct> orderedProducts) {
this.orderedProducts = orderedProducts;
}
public CustomerOrder getOrderRecord() {
return orderRecord;
}
public void setOrderRecord(CustomerOrder orderRecord) {
this.orderRecord = orderRecord;
}
public List<Product> getProducts() {
return products;
}
public void setProducts(List<Product> products) {
this.products = products;
}
public int getOrderId() {
return orderId;
}
public void setOrderId(int orderId) {
this.orderId = orderId;
init();
cleanUp();
}
}
As you can see, when the orderId is set by the preceding bean, the data is requested from the database and populates the variables to be presented in the facelet. Where or how do I have to use the cleanUp() method in order to obtain the same result as in the tutorial?
Thanks in advance.
Put the bean where you're invoking the action in the request scope instead of the session scope, and get hold of the desired session scoped bean as a (managed) property.
@ManagedBean
@RequestScoped
public class SubmitConfirmationBean {

    @ManagedProperty("#{cartBean}")
    private CartBean cartBean;

    // ...
}
And reference it by #{submitConfirmationBean.cartBean...} instead of #{cartBean...}.
Alternatively, explicitly put the desired session scoped bean in the request scope in the same action method as where you're invalidating the session:
externalContext.getRequestMap().put("cartBean", cartBean);
This way the #{cartBean...} will refer to the request scoped one instead of the session scoped one, which is newly recreated at that point because you destroyed the session. The request scoped one is lost by the next request anyway.
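Sketched out inside an action method (hypothetical method name; it merely combines the put shown above with the session invalidation from the question):
public String submitOrder() {
    ExternalContext externalContext = FacesContext.getCurrentInstance().getExternalContext();

    // Keep the session scoped bean available for this last request/response cycle ...
    externalContext.getRequestMap().put("cartBean", cartBean);

    // ... so that invalidating the session doesn't break the confirmation page.
    externalContext.invalidateSession();

    return "confirmation";
}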
While inside a non-transient conversation, I need to start a new conversation for the bean.
The case is the following: I have a JSF page with a CDI bean to handle creation and altering of an order. On the menu of the page there is an item "new Order". So, when altering an order, I need to click on "new Order" and the page must be refreshed with a new cid and a new conversation scope. But if I try to do this, conversation.getConversationId() always returns the same value, even if I call conversation.end() and conversation.begin() first.
EDIT:
I have a page to edit an order. When clicking a new button (in the menu), I want it to refresh and start a new conversation, to add a new order. So this button calls the method redirectToNewOrderPage(). But it has the problem described in the code and above.
@Named
@ConversationScoped
public class OrderEditBean implements Serializable {

    private static final long serialVersionUID = 1L;

    @Inject
    private Conversation conversation;

    [...]

    public void redirectToNewOrderPage() {
        String cid = createNewConversationId();
        setOrder(null);
        try {
            FacesContext.getCurrentInstance().getExternalContext()
                .redirect("/OrdersManager/restricted/orders/edit.xhtml?cid=" + cid);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private String createNewConversationId() {
        String oldConversationId = null;
        String newConversationId = null;

        oldConversationId = conversation.getId();
        if (!conversation.isTransient() && conversation.getId() != null) {
            conversation.end();
        }
        conversation.begin();
        newConversationId = conversation.getId();

        // **************
        // at this point newConversationId is equal to
        // oldConversationId if the conversation was NOT transient.
        // **************

        return newConversationId;
    }
}
What you are trying to do does not work. The conversation scope in CDI is not as powerful as the one from Seam 2 (if that's where you're coming from).
I am attempting to create functionality in a JSF1.2/ADF web app that will periodically & dynamically generate a sitemap for a website that will have hundreds of pages whose content will change daily. The catch is that I need to read some config from the application to use as the basis of the sitemap and to do so, I need FacesContext.
Here is what I have attempted to do: I created a class that implements a ServletContextListener and instantiates an application scoped bean. This bean does the heavy lifting to create sitemap.xml using FacesContext. I created a class that extends TimerTask that accesses the bean from application scope, calls the sitemap method and schedules future occurrences. When I run the application, the class that implements ServletContextListener fires and the bean appears to be created, but the class that extends TimerTask is never fired. Any help would be appreciated. If I can answer any questions or if I left anything out, please let me know.
Here are my code samples:
public class WebhomesApplicationContextListener implements ServletContextListener {
private static final String attribute = "SiteMapGenerator";
public void contextInitialized(ServletContextEvent event) {
SiteMapGenerator myObject = new SiteMapGenerator();
event.getServletContext().setAttribute(attribute, myObject);
}
public void contextDestroyed(ServletContextEvent event) {
SiteMapGenerator myObject = (SiteMapGenerator) event.getServletContext().getAttribute(attribute);
event.getServletContext().removeAttribute(attribute);
}
}
public class SiteMapGenerator {
public void generateSitemap() {
// code to generate map...
}
}
public class Scheduler extends TimerTask {
public void run() {
SiteMapGenerator sitemap = (SiteMapGenerator)FacesContext.getCurrentInstance().getExternalContext().getApplicationMap().get("SiteMapGenerator");
sitemap.generateSitemap();
}
}
class MainApplication {
public static void main(String[] args) {
Timer timer = new Timer();
timer.schedule(
new Scheduler(),
1000 * 60);
}
}
No, you can't. The FacesContext is only available in the thread associated with the HTTP servlet request whose URL matched the URL pattern of the FacesServlet and has invoked it. Instead, just pass the SiteMapGenerator to the Scheduler on its construction.
public class Scheduler {
private SiteMapGenerator sitemap;
public Scheduler(SiteMapGenerator sitemap) {
this.sitemap = sitemap;
}
// ...
}
The SiteMapGenerator is surely available at the point you're constructing the Scheduler.
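For example, the context listener from the question could do the wiring itself. This is a sketch which assumes Scheduler still extends TimerTask and now takes the generator via the constructor shown above; the hourly period is arbitrary, and see the note below about TimerTask being discouraged:
import java.util.Timer;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class WebhomesApplicationContextListener implements ServletContextListener {

    private Timer timer;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        SiteMapGenerator siteMapGenerator = new SiteMapGenerator();
        event.getServletContext().setAttribute("SiteMapGenerator", siteMapGenerator);

        // Hand the generator to the task directly; no FacesContext needed in the timer thread.
        timer = new Timer(true);
        timer.schedule(new Scheduler(siteMapGenerator), 0, 1000L * 60 * 60);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        timer.cancel();
    }
}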
Unrelated to the concrete problem, it's strongly discouraged to use TimerTask in a Java EE application. See also Spawning threads in a JSF managed bean for scheduled tasks using a timer.
Good evening,
In a test JSF 2.0 web app, I am trying to get the number of active sessions but there is a problem in the sessionDestroyed method of the HttpSessionListener.
Indeed, when a user logs in, the number of active sessions increases by 1, but when a user logs off, the number stays as it is (no decrement happens). Worse, when the same user logs in again (even though he invalidated the session), the number is incremented once more.
To put that in different words:
1- I log in, the active sessions number is incremented by 1.
2- I log out (the session gets invalidated).
3- I log in again, the sessions number is incremented by 1. The display now shows 2.
4- I repeat the operation, and the sessions number keeps being incremented, while there is only one user logged in.
So I thought that the sessionDestroyed method is not properly called, or maybe only called after the session timeout, which is a parameter in web.xml (mine is 60 minutes).
That is weird, as this is a session listener and there is nothing wrong with my class.
Does someone please have a clue?
package mybeans;
import entities.Users;
import java.io.*;
import java.util.Date;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.faces.bean.ManagedBean;
import javax.faces.context.FacesContext;
import javax.servlet.http.HttpSessionEvent;
import javax.servlet.http.HttpSessionListener;
import jsf.util.JsfUtil;
/**
 * Session Listener.
 * @author TOTO
 */
@ManagedBean
public class SessionEar implements HttpSessionListener {
public String ctext;
File file = new File("sessionlog.csv");
BufferedWriter output = null;
public static int activesessions = 0;
public static long creationTime = 0;
public static int remTime = 0;
String separator = ",";
String headtext = "Session Creation Time" + separator + "Session Destruction Time" + separator + "User";
/**
 *
 * @return Remnant session time
 */
public static int getRemTime() {
return remTime;
}
/**
 *
 * @return Session creation time
 */
public static long getCreationTime() {
return creationTime;
}
/**
 *
 * @return System time
 */
private String getTime() {
return new Date(System.currentTimeMillis()).toString();
}
/**
 *
 * @return active sessions number
 */
public static int getActivesessions() {
return activesessions;
}
@Override
public void sessionCreated(HttpSessionEvent hse) {
// Insert value of remnant session time
remTime = hse.getSession().getMaxInactiveInterval();
// Insert value of Session creation time (in seconds)
creationTime = new Date(hse.getSession().getCreationTime()).getTime() / 1000;
if (hse.getSession().isNew()) {
activesessions++;
} // Increment the session number
System.out.println("Session Created at: " + getTime());
// We write into a file information about the session created
ctext = String.valueOf(new Date(hse.getSession().getCreationTime()) + separator);
String userstring = FacesContext.getCurrentInstance().getExternalContext().getRemoteUser();
// If the file does not exist, create it
try {
if (!file.exists()) {
file.createNewFile();
output = new BufferedWriter(new FileWriter(file.getName(), true));
// output.newLine();
output.write(headtext);
output.flush();
output.close();
}
output = new BufferedWriter(new FileWriter(file.getName(), true));
//output.newLine();
output.write(ctext + userstring);
output.flush();
output.close();
} catch (IOException ex) {
Logger.getLogger(SessionEar.class.getName()).log(Level.SEVERE, null, ex);
JsfUtil.addErrorMessage(ex, "Cannot append session Info to File");
}
System.out.println("Session File has been written to sessionlog.txt");
}
@Override
public void sessionDestroyed(HttpSessionEvent se) {
// Decrement the active sessions number
activesessions--;
// Append info about the session destruction to the CSV file
String stext = "\n" + new Date(se.getSession().getCreationTime()) + separator;
try {
if (!file.exists()) {
file.createNewFile();
output = new BufferedWriter(new FileWriter(file.getName(), true));
// output.newLine();
output.write(headtext);
output.flush();
output.close();
}
output = new BufferedWriter(new FileWriter(file.getName(), true));
// output.newLine();
output.write(stext);
output.flush();
output.close();
} catch (IOException ex) {
Logger.getLogger(SessionEar.class.getName()).log(Level.SEVERE, null, ex);
JsfUtil.addErrorMessage(ex, "Cannot append session Info to File");
}
}
} // END OF CLASS
I am retrieving the active sessions number this way:
<h:outputText id="sessionsfacet" value="#{UserBean.activeSessionsNumber}"/>
from another managed bean:
public String getActiveSessionsNumber() {
return String.valueOf(SessionEar.getActivesessions());
}
My logout method is as follows:
public String logout() {
HttpSession lsession = (HttpSession) FacesContext.getCurrentInstance().getExternalContext().getSession(false);
if (lsession != null) {
lsession.invalidate();
}
JsfUtil.addSuccessMessage("You are now logged out.");
return "Logout";
}
// end of logout
I'm not sure. This seems to work fine for a single visitor. But some things definitely don't look right in your HttpSessionListener.
@ManagedBean
public class SessionEar implements HttpSessionListener {
Why is it a @ManagedBean? It makes no sense, remove it. In Java EE 6 you'd use @WebListener instead.
BufferedWriter output = null;
This should definitely not be an instance variable. It's not threadsafe. Declare it method-local. For every HttpSessionListener implementation there's only one instance throughout the application's lifetime. When there are simultaneous session creations/destroys, your output would get overridden by another one while busy and your file would get corrupted.
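A sketch of what that could look like as a small helper method inside the same listener (method-local writer with try-with-resources, requires Java 7+; the method name appendToLog is made up):
private void appendToLog(String line) {
    try (BufferedWriter output = new BufferedWriter(new FileWriter(file, true))) {
        output.write(line);
        output.newLine();
    } catch (IOException ex) {
        Logger.getLogger(SessionEar.class.getName()).log(Level.SEVERE, null, ex);
    }
}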
public static long creationTime = 0;
public static int remTime = 0;
Those should not be static variables either. Every new session creation would override them and that would get reflected in the presentation of all other users, i.e. it is not threadsafe. Get rid of them and make use of #{session.creationTime} and #{session.maxInactiveInterval} in EL if you need them over there for some reason. Or just get them straight from the HttpSession instance within an HTTP request.
if (hse.getSession().isNew()) {
This is always true inside the sessionCreated() method. It makes no sense, remove it.
JsfUtil.addErrorMessage(ex, "Cannot append session Info to File");
I don't know what that method is doing exactly, but I just want to warn that there is no guarantee that the FacesContext is present in the thread when the session is about to be created or destroyed. It may take place in a non-JSF request, or there may be no HTTP request involved at all. So you risk NPEs because the FacesContext is null then.
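So if you keep that logging, at least guard against a missing FacesContext, along these lines:
FacesContext facesContext = FacesContext.getCurrentInstance();
String userstring = (facesContext != null)
        ? facesContext.getExternalContext().getRemoteUser()
        : "(no JSF request)"; // session may be created/destroyed outside a JSF request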
Nonetheless, I created the following test snippet and it works fine for me. The @SessionScoped bean implicitly creates the session. The command button invalidates the session. All methods are called as expected. However many times you press the button in the same browser tab, the count is always 1.
<h:form>
<h:commandButton value="logout" action="#{bean.logout}" />
<h:outputText value="#{bean.sessionCount}" />
</h:form>
with
@ManagedBean
@SessionScoped
public class Bean implements Serializable {

    public void logout() {
        System.out.println("logout action invoked");
        FacesContext.getCurrentInstance().getExternalContext().invalidateSession();
    }

    public int getSessionCount() {
        System.out.println("session count getter invoked");
        return SessionCounter.getCount();
    }

}
and
@WebListener
public class SessionCounter implements HttpSessionListener {

    private static int count;

    @Override
    public void sessionCreated(HttpSessionEvent event) {
        System.out.println("session created: " + event.getSession().getId());
        count++;
    }

    @Override
    public void sessionDestroyed(HttpSessionEvent event) {
        System.out.println("session destroyed: " + event.getSession().getId());
        count--;
    }

    public static int getCount() {
        return count;
    }

}
(note: on Java EE 5 you need to register it as a <listener> in web.xml the usual way)
<listener>
<listener-class>com.example.SessionCounter</listener-class>
</listener>
If the above example works for you, then your problem likely lies somewhere else. Perhaps you didn't register it as a <listener> in web.xml at all and you're simply manually creating a new instance of the listener every time inside some login method. Regardless, now you at least have a minimum kickoff example to build further on.
Something in a completely different direction: Tomcat supports JMX. There is a JMX MBean that will tell you the number of active sessions. (If your container is not Tomcat, it should still support JMX and provide some way to track that.)
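For example, something along these lines should work with Tomcat; treat the ObjectName pattern and context path as assumptions, since they depend on the Tomcat version and your deployment:
import java.lang.management.ManagementFactory;

import javax.management.MBeanServer;
import javax.management.ObjectName;

public class ActiveSessionProbe {

    public static int activeSessions() throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        // Tomcat's Manager MBean for a webapp deployed under /myapp (name pattern is an assumption).
        ObjectName manager = new ObjectName("Catalina:type=Manager,context=/myapp,host=localhost");
        return (Integer) server.getAttribute(manager, "activeSessions");
    }
}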
Is your public void sessionDestroyed(HttpSessionEvent se) method called at all? If it isn't, I don't see why the count wouldn't keep incrementing. After the user calls session.invalidate() through logout, the session is destroyed, and for the next request a new one is created. This is normal behavior.