In Liferay I have a requirement: when I am updating the roles of multiple users, if updating the role of any one of them fails, I want to roll back all of the users' role updates. I have applied the following.
@Transactional(isolation = Isolation.SERIALIZABLE,
    propagation = Propagation.REQUIRES_NEW)
public int updateUserRole(long userId, long groupId, long roleId) throws SystemException {
    try {
        return UserTokenFinderUtil.updateUserRole(userId, groupId, roleId);
    }
    catch (Exception e) {
        System.out.println("Exception occurred in UserTokenServiceImpl");
        e.printStackTrace();
        return -1;
    }
}
Can anyone help me out with fresh eyes?
The best way to do that is to do it in a custom service (i.e. Service Builder) method, something like MyCustomServiceUtil.addRoles(). Transactions are managed by Liferay in that case, and you'll get the expected result.
This should be handled by default by Service Builder.
For that you should be using the *LocalServiceImpl class rather than the *Util class.
The entry point to transactions in Liferay is the *LocalServiceImpl class.
For DML operations (update, insert and delete) on one entity from another, do not use calls to *LocalServiceUtil or *LocalService, as that will result in two transaction boundaries.
You can refer to the link below for more information.
Transaction Management with liferay service
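To illustrate, here is a minimal sketch of such a custom Service Builder method. The entity name, the *LocalServiceBaseImpl base class and the userTokenFinder reference are assumptions based on the snippet in the question, not actual Liferay API; adapt them to your own entity.

public class UserTokenLocalServiceImpl extends UserTokenLocalServiceBaseImpl {

    // Called once through UserTokenLocalServiceUtil, so Liferay wraps the whole
    // loop in a single transaction; any exception thrown here rolls back every
    // role update made so far.
    public void updateUserRoles(long[] userIds, long groupId, long roleId)
        throws SystemException {

        for (long userId : userIds) {
            // use the injected finder directly instead of going through a *Util
            // class, which would open a second transaction boundary
            int updated = userTokenFinder.updateUserRole(userId, groupId, roleId);

            if (updated <= 0) {
                // propagating the exception (instead of swallowing it) is what
                // triggers the rollback
                throw new SystemException("Failed to update role for user " + userId);
            }
        }
    }
}

The key difference from the snippet in the question is that the exception is propagated rather than caught and converted to -1, so Liferay's transaction handling actually sees the failure.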
I'm exploring ServiceStack and I'm not sure what is the best way to implement some business logic.
Using the "Bookings CRUD" example I would like to enforce the following rule:
a given Booking can only be saved (either created or updated) if the hotel has enough free rooms for the particular dates of that booking
Please note that I'm not asking how to calculate "free rooms".
What I'm asking is, from the architectural point of view, how should this be done.
For example, one way would be:
create a request DTO to query the number of configured rooms (let's call it "QueryRooms")
use the existing "QueryBookings" to query current bookings present in database
create a class deriving from Service ("... : Service") to customize the Booking Service, in order to intercept the "CreateBooking" and "UpdateBooking" requests
inside the custom methods for "CreateBooking" and "UpdateBooking", somehow get the results of "QueryRooms" and "QueryBookings", check if there are enough free rooms for the current request, and proceed only if so
This doesn't look very clean, because the "CreateBooking" and "UpdateBooking" services would depend on "QueryRooms" and "QueryBookings".
What would be an elegant and efficient solution, using ServiceStack?
You can override AutoQuery CRUD operations with your own Service implementation using the AutoQuery DTO.
There you can use the Service Gateway to call existing Services, letting you perform any additional validation and modify the request DTO before executing the AutoQuery operation that implements the API, e.g.:
public class MyCrudServices : Service
{
    public IAutoQueryDb AutoQuery { get; set; }

    public object Post(CreateBooking request)
    {
        var response = Gateway.Send(new QueryRooms
        {
            From = request.BookingStartDate,
            To = request.BookingEndDate,
        });

        if (response.Results.Count == 0)
            throw new Exception("No rooms available during those dates");

        request.RoomNumber = response.Results[0].Id;

        return AutoQuery.Create(request, base.Request);
    }
}
Note: calling in-process Services with the Service Gateway is efficient as it calls the C# method implementation directly, i.e. without incurring any HTTP overhead.
My plugin encrypts/decrypts a field. It works on the field within a CRM form.
From my console application, a retrieve bypasses my plugin, i.e. it retrieves the encrypted value directly from the database without running the plugin. When debugging, breakpoints in the plugin are hit when the field is accessed from a form, but they are not hit when accessed from my console program.
I'm surprised that my plugin isn't invoked from a program. It bypasses my business rules.
Here is how I'm accessing the entity and the field from a program:
private static OrganizationServiceProxy service = null;
private static OrganizationServiceContext orgSvcContext = null;

public static void RetrieveSSNs()
{
    var query = orgSvcContext.CreateQuery("bpa_consumer");

    foreach (Entity consumer in query)
    {
        if (consumer.Attributes.Contains("bpa_ssn"))
        {
            string ssn = consumer["bpa_ssn"].ToString();
            Console.WriteLine(string.Format("Consumer \"{0}\" has SSN {1}", consumer.Attributes["bpa_name"], ssn));
        }
        else
        {
            Console.WriteLine(string.Format("Consumer \"{0}\" doesn't have a SSN", consumer.Attributes["bpa_name"]));
        }
    }
}
I'm guessing you have the plugin registered on the Retrieve method? If so, add another identical registration on the RetrieveMultiple. This should get your plugin to execute on your foreach. I should warn you that this is an extremely dangerous thing to do from a performance standpoint though...
If you are concerned about performance my recommendation is to put the encrypted data into a separate entity with a lookup back. Using this method CRM only has to execute the Retrieve/RetrieveMultiple plug-in when a user needs to access the encrypted data, not every time a user accesses the primary record. This will also make it easier to secure the encrypted data.
Turns out that you must register your plugin for the RetrieveMultiple event when you query for a collection of entities.
I'm working on a Microsoft Dynamics CRM 2011 plugin attached to the SalesOrder entity on the Create event. I need to get the order's parent Account, to access some of its properties. I'm trying the following code inside the Execute method, but the key "accountid" is not present at the time of execution.
Entity entity = (Entity)context.InputParameters["Target"]; // A salesorder entity
EntityReference accountRef = (EntityReference)entity.Attributes["accountid"];
The plugin is registered at the Post-operation stage to execute in synchronous mode.
Is there another way to get the parent Account for the SalesOrder entity?
It seems to be an error in the SDK documentation, because the accountid attribute is never available for the salesorder entity, even if I configure the plugin to run in asynchronous mode. I ended up changing the accountid attribute to customerid, which in fact can be an account (the default behavior). That solved my problem and I could get a reference to the Account to which the Order belongs.
Entity entity = (Entity)context.InputParameters["Target"]; // A salesorder entity
EntityReference accountRef = (EntityReference)entity.Attributes["customerid"];
if (accountRef.LogicalName != "account") return;
There are two possible issues here. First, is your plugin registered in synchronous execution mode with the Pre-Operation eventing pipeline stage of execution?
Check those settings; the problem is probably there.
Second, if you registered the plugin correctly, maybe you didn't set the Parent Account on the SalesOrder form, which is probably not the issue :)
I've spent all day Googling and looking at various questions on here, trying to come up with the best solution for implementing authentication and authorization. I've come up with part of the solution now, but am hoping someone can fill in the gaps. I realise there is a lot of text below, but please bear with me :O)
Background
I have inherited a part-completed CRM application which currently uses JSF 2.0, Java EE 6, JPA and a PostgreSQL database. Unfortunately, the guys who originally started building this web app decided, in their infinite wisdom, that it would be best to leave authentication/authorization until the end; I've now got to put it in.
The application is essentially split into three layers: the views, the managed beans and the DAOs. This means that the managed beans are particularly 'fat', since they contain all of the business logic, validation and navigation logic.
Authentication/Authorization requirements
Forms based authentication, validating against credentials stored in the PostgreSQL database.
The only page that will be publicly accessible (by anonymous users) will be the login page.
I need to prevent access to certain areas of the application based on a user's role. For example, only users with the 'Admin' role should be able to access the create/edit user page.
I also need to be able to restrict access to certain areas of a page. For example, a user with the 'Sales Rep' role should be able to view a customer's details, but the save/edit button should only be displayed if the user has the 'Customer Service' role.
Where I'm at
The first thing I plan on doing is to follow this User Authentication and Authorization using JAAS and Servlet 3.0 Login example. This I believe will fulfil my first 3 requirements.
In order to show/hide save buttons etc. on pages, I can use the technique described in this SO answer. This will partly solve requirement 4; however, I think that I still need to secure the action methods and/or the managed beans themselves. For example, I would like to be able to add an annotation or something to the save() method on the customer bean to ensure that only users with the 'Customer Service' role can call it; this is where I begin to run into issues.
I guess one option would be to do something similar to what I am proposing to do in the view and use FacesContext to check whether the current user "is in role". I'm not keen on this, as it will just clutter up my code, and I would rather use annotations instead. If I did go down this route, however, how would I return an HTTP 403 status?
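To show what I mean, the in-code check would presumably end up looking something like this in every action method (a rough, untested sketch; the role name and outcome are made up):

// uses javax.faces.context.FacesContext / ExternalContext
public String save() {
    ExternalContext ctx = FacesContext.getCurrentInstance().getExternalContext();

    // programmatic role check cluttering up the action method
    if (!ctx.isUserInRole("Customer Service")) {
        // this is the part I'm unsure about: how do I turn this into an HTTP 403?
        return null;
    }

    // ... the actual save logic ...
    return "customerSaved";
}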
The javax.annotation.security.* annotations seem to be a good fit for declaratively defining access to areas of the application; however, as far as I understand, they can only be added to EJBs. This would mean that I would need to move all of my business logic out of the managed beans, where it currently resides, into new EJBs. I think this would have the added benefit of separating the business logic out into its own set of classes (delegates, services or whatever you choose to call them). This would be quite a large refactor, however, which isn't going to be aided by a lack of unit tests or integration tests. I'm not sure whether the responsibility for access control should be at this new service level either; I think it should be on the managed beans.
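For illustration, this is roughly what I imagine such an EJB would end up looking like (just a sketch; the class name, role names and the Customer type are placeholders from my own domain, not anything prescribed by the spec):

import javax.annotation.security.RolesAllowed;
import javax.ejb.Stateless;

@Stateless
public class CustomerService {

    // only callers in the 'Customer Service' role may invoke this;
    // the container throws an EJBAccessException otherwise
    @RolesAllowed("Customer Service")
    public void save(Customer customer) {
        // business logic moved here out of the managed bean
    }

    // read access for both roles
    @RolesAllowed({"Sales Rep", "Customer Service"})
    public Customer find(long customerId) {
        return null; // lookup omitted in this sketch
    }
}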
Other alternatives
During my research I have found lots of people mentioning frameworks such as Spring and Seam. I have some limited experience with Seam, and I think it would have been a good fit for this project; from what I recall, I believe it solves the authorization issues I am having, but I think it is too late in the day to introduce it now.
I have also seen Shiro mentioned in various places. Having looked at the 10 minute tutorial, this seemed like a good fit, especially in conjunction with Deluan Quintao's taglib, but I have been unable to find any tutorials or examples of how to integrate it with a JSF web app.
The other alternative I have come across surprisingly regularly is implementing a custom solution; this seems crazy to me!
Summary
In summary, then, I'd really like some guidance on whether I'm heading down the right path in terms of implementing authentication and authorization, how I fill in the missing piece of securing individual methods and/or managed beans (or at least the code they delegate to), and/or how I can manually return an HTTP 403 status.
Have you tried anything with Spring Security? The latest is version 3.
http://janistoolbox.typepad.com/blog/2010/03/j2ee-security-java-serverfaces-jsf-spring-security.html
http://ocpsoft.org/java/jsf-java/spring-security-what-happens-after-you-log-in/
Rather than using a request filter or JAAS, Spring Security is a comprehensive security framework that will resolve most of your security concerns.
You can use it to authenticate a user against a DB realm, authorize him, and redirect as necessary based on the provided authentication information.
You can also secure the methods that you have written:
http://blog.solidcraft.eu/2011/03/spring-security-by-example-securing.html
@PreAuthorize("hasRole('ROLE_XXX')") is the way to do it.
To make certain elements of a page secure, Spring Security also provides a JSP taglib that lets you wrap the protected content in an authorization check.
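As a rough sketch, securing a method with @PreAuthorize looks like this (the service class, the Customer type and the role name are only placeholders):

import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.stereotype.Service;

@Service
public class CustomerService {

    // only users carrying ROLE_CUSTOMER_SERVICE may call this method;
    // Spring Security throws an AccessDeniedException otherwise
    @PreAuthorize("hasRole('ROLE_CUSTOMER_SERVICE')")
    public void save(Customer customer) {
        // business logic
    }
}

Note that method security has to be switched on in the configuration (e.g. <global-method-security pre-post-annotations="enabled"/> in the Spring Security XML) for the annotation to be enforced.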
More reading and examples:
http://static.springsource.org/spring-security/site/petclinic-tutorial.html
After carrying out a lot of research I came to the conclusion that, firstly, my application would benefit from being deployed to an application server that fully implements the Java EE specification, rather than a servlet container like Tomcat. As the project I am working on uses Maven, the key thing here was getting the dependencies set up correctly; this wasn't easy and took a fair bit of googling and trial and error. I'm sure there is a more scientific approach that could be taken.
I then had to create a mysql module to get my application to talk to the database properly, and then remove the factory that had been implemented to create DAOs and convert them to EJBs instead. I also had to update hibernate.cfg.xml to reference the datasource I added, and persistence.xml to set the transaction type to JTA and reference the JTA data source. The only other complication was that the Open Session In View pattern was being used, which meant I ended up with Hibernate lazy initialization errors when entities were accessed in the views. I reimplemented the filter as shown at the bottom of this answer to get around this. I see this as a temporary measure to get things working again before I can hopefully refactor this area and remove the need for the filter.
Moving to JBoss took just over a day, and I'm sure it could have been done much quicker if I were more experienced with Java EE and Maven. Now that I'm at that point, I'm in a good position to drop Seam 3 Security into the project and utilise that, rather than trying to hack together a solution, which is essentially the direction I was going to take. The nice thing about Seam 3 is that you can, to a certain extent, pick and choose which modules you use rather than having to add the entire framework (like Seam 2). I think a number of the other modules are going to be helpful as well and will help me, amongst other things, get rid of the Open Session In View pattern.
One thing that did concern me about using Seam was being told about DeltaSpike, which seems likely to replace Seam; there are no plans for any more versions of Seam. I have decided that, since Seam is still being supported, and if DeltaSpike takes as long to come to fruition as Seam 3 did, it is pretty safe to use Seam 3.
I will hopefully get round to writing a proper blog post describing this migration in proper detail.
import java.io.IOException;

import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.rmi.PortableRemoteObject;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.transaction.Status;
import javax.transaction.UserTransaction;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OSVRequestFilter implements Filter {

    private static final String USER_TRANSACTION_JNDI = "java:comp/UserTransaction";

    private static Logger logger = LoggerFactory.getLogger(OSVRequestFilter.class);

    public void init(FilterConfig config) throws ServletException {
    }

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException {
        if (request instanceof HttpServletRequest) {
            doFilter(request, response, chain, getUserTransaction());
        }
        else {
            // non-HTTP requests are passed straight through
            chain.doFilter(request, response);
        }
    }

    private UserTransaction getUserTransaction() throws ServletException {
        try {
            Context ctx = new InitialContext();
            return (UserTransaction) PortableRemoteObject.narrow(ctx.lookup(USER_TRANSACTION_JNDI), UserTransaction.class);
        }
        catch (NamingException ex) {
            logger.error("Failed to get " + USER_TRANSACTION_JNDI, ex);
            throw new ServletException(ex);
        }
    }

    private void doFilter(ServletRequest request, ServletResponse response, FilterChain chain, UserTransaction utx) throws IOException, ServletException {
        try {
            // wrap the whole request in a single JTA transaction so that
            // lazily loaded entities can still be accessed from the views
            utx.begin();
            chain.doFilter(request, response);
            if (utx.getStatus() == Status.STATUS_ACTIVE)
                utx.commit();
            else
                utx.rollback();
        }
        catch (ServletException ex) {
            onError(utx);
            throw ex;
        }
        catch (IOException ex) {
            onError(utx);
            throw ex;
        }
        catch (RuntimeException ex) {
            onError(utx);
            throw ex;
        }
        catch (Throwable ex) {
            onError(utx);
            throw new ServletException(ex);
        }
    }

    private void onError(UserTransaction utx) throws IOException, ServletException {
        try {
            if ((utx != null) && (utx.getStatus() == Status.STATUS_ACTIVE))
                utx.rollback();
        }
        catch (Throwable e1) {
            logger.error("Cannot rollback transaction", e1);
        }
    }

    public void destroy() {
    }
}
I am adding a new method into CalEventLocalServiceImpl using a hook.
My code is:
public class MyCalendarLocalServiceImpl extends CalEventLocalServiceWrapper {

    public MyCalendarLocalServiceImpl(CalEventLocalService calEventLocalService) {
        super(calEventLocalService);
    }

    public List getUserData(long userId) throws SystemException {
        DynamicQuery query = DynamicQueryFactoryUtil.forClass(CalEvent.class)
            .add(PropertyFactoryUtil.forName("userId").eq(userId));

        List details = CalEventLocalServiceUtil.dynamicQuery(query);

        return details;
    }
}
liferay-hook.xml:
<service>
    <service-type>
        com.liferay.portlet.calendar.service.CalEventLocalService
    </service-type>
    <service-impl>
        com.liferay.portlet.calendar.service.impl.MyCalendarLocalServiceImpl
    </service-impl>
</service>
My question is: how do I call getUserData from a JSP file?
Can anybody help me out?
I think you didn't get my question. I want a list of events, based on the user ID, from the Calendar. What do I need to do to achieve this?
I assume getUserData() is not an override but a new method (I can't look it up currently). That is not something you can do when overriding a service. Instead you'd have to add a new service and make it available to the portal.
Remember that a customized ("hooked") JSP is running in the portal classloader, while your overloaded service is running in the hook's classloader. Thus, if you create a new service and make the service.jar available to Liferay (e.g. on the global classpath) you can call it from JSPs. The interface of Liferay services cannot be extended through an overloaded service.
In case getUserData() is already in the interface (as I said, I can't look it up currently), you just need to call CalEventLocalServiceUtil from your JSP and it will be delegated to your wrapper.
Just to add to Olaf's answer and comments...
If you want to extend the CalEventLocalService service with just "getUserData" and use it in one JSP, then building your own service might be overkill. Simply put your code from "getUserData" in the JSP, as in the sketch below. Otherwise follow Olaf's suggestions.
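Roughly, that would mean something like this inside the customized JSP (a sketch only; shown as plain Java that would sit inside a <% ... %> scriptlet, assuming the JSP runs in the portal classloader as Olaf describes):

// the classes used here (DynamicQueryFactoryUtil, PropertyFactoryUtil,
// CalEvent, CalEventLocalServiceUtil, List) would be added via a page import directive
DynamicQuery query = DynamicQueryFactoryUtil.forClass(CalEvent.class)
    .add(PropertyFactoryUtil.forName("userId").eq(userId));

List events = CalEventLocalServiceUtil.dynamicQuery(query);

for (Object obj : events) {
    CalEvent event = (CalEvent) obj;
    // 'out' is the implicit JSP writer; render the events however you need
    out.println(event.getTitle());
}

Here userId could come, for example, from themeDisplay.getUserId() if the theme objects are defined in the JSP.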