I am looking to migrate data from a legacy database into Liferay's tables. I can write the migration script, but I was wondering whether there will be some issue with Liferay's "Counter" service.
For example, I have a legacy custom user table. I need to move the users in this table to Liferay's User_ table. I can use a SQL script to move the data, but I am wondering what happens with the primary key. As far as I know, Liferay has a counter service to create primary keys and keep track of the current id.
So while migrating, is there anything I need to do so that the counter is not messed up after the migration?
There are more issues than just the counter. You should strictly use Liferay's API to import the content.
I could name a few potential issues to pay attention to, but the list would probably miss a few more - and it would make you (or others reading this answer) confident that you can now cope with all the issues. You can't. Don't go there. Simply use the API and it will take care of updating all the required dependencies. Plus, the API makes it obvious what other data you still need in conjunction with your imported data.
Anybody who describes this in more detail will set you up for disaster. And we've seen people discover their disaster more than 6 months after manually writing to the database (in one particular case by running into duplicate primary keys, i.e. specifically the counter that you mention). Any of the possible failures might bring down your portal from one second to the next, when you've long forgotten that you (or someone else) manually wrote to the database.
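For illustration, importing a user through the API rather than SQL can look roughly like this. This is a hedged sketch against one of the UserLocalServiceUtil.addUser() overloads as it exists in Liferay 6.x; all argument values are made-up placeholders, creatorUserId and companyId are assumed to be resolved already, and exception handling is omitted:
import java.util.Locale;
import com.liferay.portal.model.User;
import com.liferay.portal.service.ServiceContext;
import com.liferay.portal.service.UserLocalServiceUtil;
// a minimal sketch, assuming Liferay 6.x; all values are illustrative
User user = UserLocalServiceUtil.addUser(
    creatorUserId, companyId,
    false, "secret", "secret",       // autoPassword, password1, password2
    false, "jdoe",                   // autoScreenName, screenName
    "jdoe@example.com",
    0L, "",                          // facebookId, openId
    Locale.US,
    "John", "", "Doe",               // firstName, middleName, lastName
    0, 0, true,                      // prefixId, suffixId, male
    1, 1, 1980,                      // birthday month, day, year
    "",                              // jobTitle
    null, null, null, null,          // groupIds, organizationIds, roleIds, userGroupIds
    false,                           // sendEmail
    new ServiceContext());
The API call above creates the Contact, Group, LayoutSet, AssetEntry and role entries for you - exactly the dependencies a manual SQL import would have to reproduce by hand.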
You could migrate data from a legacy database with Pentaho Data Integration by calling a REST endpoint (defined through {MODEL_NAME}ServiceImpl.java) that returns a counter increment for the entity you want to insert the data into. You just have to make sure you use the same counter for every entity that is getting the new ids.
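If you go that route, the endpoint body can be as small as a single counter call. A sketch, where MyEntity and the method name are placeholders for your Service Builder entity:
import com.liferay.counter.service.CounterLocalServiceUtil;
// hypothetical method inside {MODEL_NAME}ServiceImpl, exposed as a JSON web service
public long getNextMyEntityId() {
    // always use the same counter name for every id generated for this entity
    return CounterLocalServiceUtil.increment(MyEntity.class.getName());
}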
You can loop over your list of legacy users and create Liferay users by using this class:
package com.yourcompany.yourapp.util;
import java.util.Date;
import java.util.List;
import org.apache.log4j.Logger;
import com.liferay.counter.service.CounterLocalServiceUtil;
import com.liferay.portal.kernel.dao.orm.DynamicQuery;
import com.liferay.portal.kernel.dao.orm.DynamicQueryFactoryUtil;
import com.liferay.portal.kernel.dao.orm.OrderFactoryUtil;
import com.liferay.portal.kernel.dao.orm.PropertyFactoryUtil;
import com.liferay.portal.kernel.exception.PortalException;
import com.liferay.portal.kernel.exception.SystemException;
import com.liferay.portal.kernel.uuid.PortalUUIDUtil;
import com.liferay.portal.kernel.workflow.WorkflowConstants;
import com.liferay.portal.model.Account;
import com.liferay.portal.model.ClassName;
import com.liferay.portal.model.Company;
import com.liferay.portal.model.Contact;
import com.liferay.portal.model.Group;
import com.liferay.portal.model.LayoutSet;
import com.liferay.portal.model.User;
import com.liferay.portal.security.permission.PermissionChecker;
import com.liferay.portal.security.permission.PermissionCheckerFactoryUtil;
import com.liferay.portal.security.permission.PermissionThreadLocal;
import com.liferay.portal.service.AccountLocalServiceUtil;
import com.liferay.portal.service.ClassNameLocalServiceUtil;
import com.liferay.portal.service.CompanyLocalServiceUtil;
import com.liferay.portal.service.ContactLocalServiceUtil;
import com.liferay.portal.service.GroupLocalServiceUtil;
import com.liferay.portal.service.LayoutSetLocalServiceUtil;
import com.liferay.portal.service.RoleLocalServiceUtil;
import com.liferay.portal.service.UserLocalServiceUtil;
import com.liferay.portlet.asset.model.AssetEntry;
import com.liferay.portlet.asset.service.AssetEntryLocalServiceUtil;
public class UserUtil {
private static final Logger logger = Logger.getLogger(UserUtil.class);
private long companyId;
private long creatorUserId;
private long accountId;
private Date date;
public UserUtil() {
try {
DynamicQuery queryCompany = DynamicQueryFactoryUtil.forClass(Company.class)
.addOrder(OrderFactoryUtil.asc("companyId"));
List<Company> listCompany = (List<Company>) CompanyLocalServiceUtil.dynamicQuery(queryCompany, 0, 1);
companyId = listCompany.get(0).getCompanyId();
//-----------
DynamicQuery queryAccount = DynamicQueryFactoryUtil.forClass(Account.class)
.addOrder(OrderFactoryUtil.asc("accountId"));
List<Account> listAccount = (List<Account>) AccountLocalServiceUtil.dynamicQuery(queryAccount, 0, 1);
accountId = listAccount.get(0).getAccountId();
//-----------
DynamicQuery queryUser = DynamicQueryFactoryUtil.forClass(User.class)
.add(PropertyFactoryUtil.forName("defaultUser").eq(false))
.addOrder(OrderFactoryUtil.asc("createDate"));
List<User> listUser = (List<User>) UserLocalServiceUtil.dynamicQuery(queryUser, 0, 1);
creatorUserId = listUser.get(0).getUserId();
date = new Date();
} catch (SystemException ex) {
logger.error(ex.getMessage());
}
}
public void create(String screenName, String emailAddress, String hashedPassword, String fullName) {
try {
long contactId = CounterLocalServiceUtil.increment();//or use Contact.class.getName() as param
Contact contact = ContactLocalServiceUtil.createContact(contactId);
contact.setAccountId(accountId);
//contact.setBirthday(DateUtil.getDate("dd MM yyyy", "01 11 1986"));
contact.setCachedModel(true);
contact.setCompanyId(companyId);
contact.setCreateDate(date);
contact.setEmailAddress(emailAddress);
//contact.setEmployeeNumber(employeeNumber);
//contact.setEmployeeStatusId(employeeStatusId);
contact.setFirstName(fullName);
contact.setMale(true);
contact.setNew(true);
//contact.setUserId(creatorUserId);
User creatorUser = UserLocalServiceUtil.getUserById(creatorUserId);
contact.setUserName(creatorUser.getFullName());
contact.setUserUuid(creatorUser.getUuid());
ContactLocalServiceUtil.addContact(contact);
//----------------------
long userId = CounterLocalServiceUtil.increment();//or use User.class.getName() as param
//----------------------
User user = UserLocalServiceUtil.createUser(userId);
user.setAgreedToTermsOfUse(true);
user.setCachedModel(true);
user.setCompanyId(companyId);
user.setContactId(contactId);
user.setCreateDate(date);
user.setDefaultUser(false);
user.setDigest(null);
user.setEmailAddress(emailAddress);
user.setEmailAddressVerified(true);
user.setFirstName(fullName);
user.setGreeting("Hi " + user.getFirstName());
user.setLanguageId("en_US");
user.setModifiedDate(date);
user.setNew(true);
user.setPassword(hashedPassword);
user.setPasswordEncrypted(true);
user.setPasswordReset(false);
//user.setPasswordUnencrypted();
user.setScreenName(screenName);
user.setStatus(WorkflowConstants.STATUS_APPROVED);
user.setTimeZoneId("UTC+7");
user.setUserUuid(creatorUser.getUuid());
user.setUuid(PortalUUIDUtil.generate());
UserLocalServiceUtil.addUser(user);
//----------------------
try {
// to avoid "PermissionChecker not Initialized"
PermissionChecker checker = PermissionCheckerFactoryUtil.create(creatorUser);
PermissionThreadLocal.setPermissionChecker(checker);
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
//----------------------
ClassName clsNameUser = ClassNameLocalServiceUtil.getClassName(Constants.USER_CLASS);
long classNameId = clsNameUser.getClassNameId();
long groupId = CounterLocalServiceUtil.increment();// or use Group.class.getName() as param
Group group = GroupLocalServiceUtil.createGroup(groupId);
group.setActive(true);
group.setCachedModel(true);
group.setClassNameId(classNameId);
group.setClassPK(userId);
group.setCompanyId(companyId);
group.setCreatorUserId(creatorUser.getUserId());
group.setCreatorUserUuid(creatorUser.getUuid());
group.setFriendlyURL(String.valueOf(userId));
group.setName(String.valueOf(userId));
group.setNew(true);
group.setSite(false);
group.setTreePath("/" + groupId + "/");
group.setType(0);
group.setUuid(PortalUUIDUtil.generate());
GroupLocalServiceUtil.addGroup(group);
//-----------------------------
long layoutSetId1 = CounterLocalServiceUtil.increment();//or use LayoutSet.class.getName() as param
LayoutSet layoutSet1 = LayoutSetLocalServiceUtil.createLayoutSet(layoutSetId1);
layoutSet1.setCachedModel(true);
//layoutSet.setColorSchemeId(colorSchemeId);
layoutSet1.setCompanyId(companyId);
layoutSet1.setCreateDate(date);
//layoutSet.setCss(css);
layoutSet1.setGroupId(groupId);
//layoutSet.setLogo(logo);
//layoutSet.setLogoId(logoId);
layoutSet1.setModifiedDate(date);
layoutSet1.setNew(true);
layoutSet1.setPrivateLayout(true);
//layoutSet.setThemeId(themeId);
LayoutSetLocalServiceUtil.addLayoutSet(layoutSet1);
//-----------------------------
long layoutSetId2 = CounterLocalServiceUtil.increment();// or use LayoutSet.class.getName() as param
LayoutSet layoutSet2 = LayoutSetLocalServiceUtil.getLayoutSet(layoutSetId1);
layoutSet2.setLayoutSetId(layoutSetId2);
layoutSet2.setPrivateLayout(false);
LayoutSetLocalServiceUtil.addLayoutSet(layoutSet2);
//-----------------------------
long assetEntryId = CounterLocalServiceUtil.increment();//or use AssetEntry.class.getName() as param
AssetEntry assetEntry = AssetEntryLocalServiceUtil.createAssetEntry(assetEntryId);
assetEntry.setCompanyId(companyId);
assetEntry.setClassPK(userId);
assetEntry.setGroupId(groupId);
assetEntry.setClassNameId(classNameId);
//ae.setTitle(title);
assetEntry.setUserId(userId);
AssetEntryLocalServiceUtil.addAssetEntry(assetEntry);
//--------------------------------------------------
//long orgAdminRoleId = RoleLocalServiceUtil.getRole(companyId, Constants.ORG_ADMIN_ROLE_NAME).getRoleId();
//UserGroupRoleLocalServiceUtil.addUserGroupRoles(userId, groupId, new long[] { orgAdminRoleId });
long orgUserRoleId = RoleLocalServiceUtil.getRole(companyId, Constants.ORG_USER_ROLE_NAME).getRoleId();
RoleLocalServiceUtil.addUserRole(userId, orgUserRoleId);
long siteMemberRoleId = RoleLocalServiceUtil.getRole(companyId, Constants.SITE_MEMBER_ROLE_NAME).getRoleId();
RoleLocalServiceUtil.addUserRole(userId, siteMemberRoleId);
//-----------------------------------------------------------
} catch (SystemException | PortalException ex) {
logger.error(ex.getMessage(), ex);
}
}
}
Then you can create a new instance of UserUtil and call its create() method inside your loop over the legacy user list, something like this:
UserUtil userUtil = new UserUtil();
for (LegacyUser user : listOfLegacyUser) {
userUtil.create(........);
}
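The four arguments map onto the create(screenName, emailAddress, hashedPassword, fullName) signature above; with a hypothetical legacy user bean, the call could look like this (the getters are assumptions about your legacy model):
// illustrative only - the LegacyUser getters are hypothetical
userUtil.create(user.getScreenName(), user.getEmail(),
    user.getPasswordHash(), user.getFullName());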
note that hashedPassword depends on what your hash method is, as defined in portal-ext.properties. The default value is:
passwords.encryption.algorithm=PBKDF2WithHmacSHA1/160/128000
but you can use one of the values below:
passwords.encryption.algorithm=BCRYPT/10
passwords.encryption.algorithm=MD2
passwords.encryption.algorithm=MD5
passwords.encryption.algorithm=NONE
passwords.encryption.algorithm=PBKDF2WithHmacSHA1/160/128000
passwords.encryption.algorithm=SHA
passwords.encryption.algorithm=SHA-256
passwords.encryption.algorithm=SHA-384
passwords.encryption.algorithm=SSHA
passwords.encryption.algorithm=UFC-CRYPT
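If your legacy passwords are stored in plain text, you can let the portal hash them to match whatever algorithm is configured. A minimal sketch, assuming Liferay 6.2's PasswordEncryptorUtil (on older releases the equivalent lived in the portal-internal PwdEncryptor class, so treat the class name as an assumption for your version):
import com.liferay.portal.security.pwd.PasswordEncryptorUtil;
// hashes plainTextPassword with the algorithm configured in
// passwords.encryption.algorithm, before passing it to create(...)
String hashedPassword = PasswordEncryptorUtil.encrypt(plainTextPassword);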
I am new to JHipster.
Is there a guide of steps to follow to implement search criteria?
I want to add to an entity the possibility of filtering by any of its fields.
I have enabled filtering, but I can't find how to make it show up on the entity's CRUD page.
I understand that I have to code in different places in the application, but I did not find any examples or guides.
Thank you very much in advance for giving me some help.
I don't know of any guide for that, but I also use JHipster and developed search criteria for my app, and it works fine. I hope this is what you're looking for.
I am going to use a comment table as an example, where I want to filter by hidden, flagged, or both, with the possibility of pagination on the front end. So I created a CommentRepositoryCustom.java:
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
public interface CommentRepositoryCustom {
Page<Comment> findCommentByCommentHiddenAndFlagged(Boolean hidden, Boolean flagged, Pageable pageable);
}
After that I created the implementation for my findCommentByCommentHiddenAndFlagged in another custom repository I created called CommentRepositoryCustomImpl.java:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;
import javax.persistence.EntityManager;
import javax.persistence.TypedQuery;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Predicate;
import javax.persistence.criteria.Root;
@Repository
@Transactional
public class CommentRepositoryCustomImpl implements CommentRepositoryCustom {
private static final String HIDDEN_LABEL = "hidden";
private static final String FLAGGED_LABEL = "flagged";
@Autowired
private EntityManager em;
@Override
public Page<Comment> findCommentByCommentHiddenAndFlagged(Boolean hidden, Boolean flagged, Pageable pageable) {
CriteriaBuilder criteriaBuilder = em.getCriteriaBuilder();
CriteriaQuery<Comment> commentCriteriaQuery = criteriaBuilder.createQuery(Comment.class);
Root<Comment> commentRoot = commentCriteriaQuery.from(Comment.class);
Predicate predicateHidden = criteriaBuilder.equal(commentRoot.get(HIDDEN_LABEL), hidden);
Predicate predicateFlagged = criteriaBuilder.equal(commentRoot.get(FLAGGED_LABEL), flagged);
Predicate concatenate;
TypedQuery<Comment> typedQuery;
CriteriaQuery<Long> countQuery = criteriaBuilder.createQuery(Long.class);
Root<Comment> countCommentRoot = countQuery.from(Comment.class);
Long count;
if (hidden != null && flagged != null) {
concatenate = criteriaBuilder.and(predicateHidden);
concatenate = criteriaBuilder.and(concatenate, predicateFlagged);
countQuery.select(criteriaBuilder.count(countCommentRoot)).where(concatenate);
typedQuery = em.createQuery(commentCriteriaQuery.select(commentRoot).where(concatenate));
} else if(hidden != null) {
concatenate = criteriaBuilder.and(predicateHidden);
countQuery.select(criteriaBuilder.count(countCommentRoot)).where(concatenate);
typedQuery = em.createQuery(commentCriteriaQuery.select(commentRoot).where(concatenate));
} else if (flagged != null) {
concatenate = criteriaBuilder.and(predicateFlagged);
countQuery.select(criteriaBuilder.count(countCommentRoot)).where(concatenate);
typedQuery = em.createQuery(commentCriteriaQuery.select(commentRoot).where(concatenate));
} else {
countQuery.select(criteriaBuilder.count(countCommentRoot));
typedQuery = em.createQuery(commentCriteriaQuery.select(commentRoot));
}
typedQuery.setFirstResult((int) pageable.getOffset()).setMaxResults(pageable.getPageSize());
count = em.createQuery(countQuery).getSingleResult();
return new PageImpl<>(typedQuery.getResultList(), pageable, count);
}
}
Now you just need to call findCommentByCommentHiddenAndFlagged with the right params in your service.
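If it helps, here is a minimal sketch of such a service wrapper; the class name and constructor wiring are assumptions, not JHipster-generated code:
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@Transactional(readOnly = true)
public class CommentQueryService {
    private final CommentRepositoryCustom commentRepositoryCustom;
    public CommentQueryService(CommentRepositoryCustom commentRepositoryCustom) {
        this.commentRepositoryCustom = commentRepositoryCustom;
    }
    // null for hidden or flagged means "do not filter on that field"
    public Page<Comment> find(Boolean hidden, Boolean flagged, Pageable pageable) {
        return commentRepositoryCustom.findCommentByCommentHiddenAndFlagged(hidden, flagged, pageable);
    }
}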
Let me know if this is what you're looking for.
Do you have a GET endpoint in package web.rest that looks like this? @GetMapping("/clcomments") public ResponseEntity<List<Entity>> getAllComments(CommentsCriteria criteria) {... }
If you do, search by criteria works!
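For reference, with JHipster's generated criteria filtering the filters are passed as query parameters on that endpoint; a request might look like this (field names are illustrative):
GET /api/clcomments?hidden.equals=true&flagged.equals=false&page=0&size=20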
package org.apache.spark.examples.kafkaToflink;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintStream;
import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import com.microsoft.azure.datalake.store.ADLException;
import com.microsoft.azure.datalake.store.ADLFileOutputStream;
import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.IfExists;
import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;
public class App {
public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "192.168.1.72:9092");
properties.setProperty("group.id", "test");
DataStream<String> stream = env.addSource(
new FlinkKafkaConsumer010<String>("tenant", new SimpleStringSchema(), properties), "Kafka_Source");
stream.addSink(new ADLSink()).name("Custom_Sink").setParallelism(128);
env.execute("App");
}
}
class ADLSink extends RichSinkFunction<String> {
private String clientId = "***********";
private String authTokenEndpoint = "***************";
private String clientKey = "*****************";
private String accountFQDN = "****************";
private String filename = "/Bitfinex/ETHBTC/ORDERBOOK/ORDERBOOK.json";
@Override
public void invoke(String value) {
AccessTokenProvider provider = new ClientCredsTokenProvider(authTokenEndpoint, clientId, clientKey);
ADLStoreClient client = ADLStoreClient.createClient(accountFQDN, provider);
try {
client.setPermission(filename, "744");
ADLFileOutputStream stream = client.getAppendStream(filename);
System.out.println(value);
stream.write(value.toString().getBytes());
stream.close();
} catch (ADLException e) {
System.out.println(e.requestId);
} catch (Exception e) {
System.out.println(e.getMessage());
System.out.println(e.getCause());
}
}
}
I am continuously trying to append to a file in Azure Data Lake Store in a loop, but sometimes it fails with "Operation APPEND failed with HTTP500", either right at the start or sometimes after 10 minutes. I am using Java.
Anubhav, Azure Data Lake streams are single-writer streams - i.e., you cannot write to the same stream from multiple threads, unless you do some form of synchronization between these threads. This is because each write specifies the offset it is writing to, and with multiple threads, the offsets are not consistent.
You seem to be writing from multiple threads (the .setParallelism(128) call in your code).
In your case, you have two choices:
Write to a different file in each thread. I do not know your use-case, but we have found that for a lot of cases that is the natural use of different threads - to write to different files.
If it is important to have all the threads write to the same file, then you will need to refactor the sink a little bit so that all the instances have a reference to the same ADLFileOutputStream, and you will need to make sure the calls to write() and close() are synchronized. (A sketch of the first option follows below.)
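Here is a minimal sketch of option 1, where each parallel subtask appends to its own file so no two writers ever share a stream. The credential fields and the path pattern are placeholders, not values from the question:
import java.nio.charset.StandardCharsets;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import com.microsoft.azure.datalake.store.ADLFileOutputStream;
import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;
class PartitionedADLSink extends RichSinkFunction<String> {
    private final String clientId = "***";
    private final String authTokenEndpoint = "***";
    private final String clientKey = "***";
    private final String accountFQDN = "***";
    private transient ADLStoreClient client;
    private transient String fileName;
    @Override
    public void open(Configuration parameters) {
        client = ADLStoreClient.createClient(accountFQDN,
                new ClientCredsTokenProvider(authTokenEndpoint, clientId, clientKey));
        // one file per parallel subtask, so no two subtasks write to the same stream
        fileName = "/Bitfinex/ETHBTC/ORDERBOOK/ORDERBOOK-"
                + getRuntimeContext().getIndexOfThisSubtask() + ".json";
    }
    @Override
    public void invoke(String value) throws Exception {
        try (ADLFileOutputStream stream = client.getAppendStream(fileName)) {
            stream.write(value.getBytes(StandardCharsets.UTF_8));
        }
    }
}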
Now, there is one more issue here - the error you got should have been an HTTP 4xx error (indicating a lease conflict, since ADLFileOutputStream acquires a lease), rather than HTTP 500, which says there was a server-side problem. To troubleshoot that, I will need to know your account name and time of access. That info is not safe to share on StackOverflow, so please open a support ticket for that and reference this SO question, so the issue eventually gets routed to me.
I am sorry this isn't a 'coding' question, but after considerable time on the learning path with XPages and Java I still struggle to find definitive information on the correct way to carry out the basics.
When developing an XPages application using Java, which is the best or most efficient method for accessing data?
1) Setting up and maintaining a view with a column for every field in the document, and retrieving data via ViewEntry.getColumnValues().get(int), i.e. not accessing the document but retrieving the data from the view. This is what I have been doing, but my view columns keep increasing, along with the hassle of maintaining column sequences. My understanding is that this is the faster method.
or
2) Just dropping everything into a document and using a view only when necessary, but in the main using Database.getDocumentByUNID().getItemValueAsString("field") and not worrying about adding lots of columns. Far easier to maintain, but is accessing the document slowing things down?
Neither 1) nor 2).
Take this approach to manage a document's data in a Java class:
Read the document's items all at once in your Java class into class fields
Remember the UNID in a class field too
Recycle the document after reading the items
Get the document again with the help of the UNID for every read/write, and recycle it afterwards
Database.getDocumentByUNID() is quite fast, but call it only once per document and not for every item.
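Here is a minimal sketch of that pattern with the plain lotus.domino API; the class name, the single "subject" item, and the field layout are illustrative assumptions:
import java.io.Serializable;
import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.NotesException;
public class TicketModel implements Serializable {
    private static final long serialVersionUID = 1L;
    private String unid;
    private String subject;
    // read all items at once, remember the UNID, then recycle the document
    public void load(Document doc) throws NotesException {
        try {
            this.unid = doc.getUniversalID();
            this.subject = doc.getItemValueString("subject");
        } finally {
            doc.recycle();
        }
    }
    // fetch the document again by UNID for every write, recycle it afterwards
    public void save(Database db) throws NotesException {
        Document doc = db.getDocumentByUNID(this.unid);
        try {
            doc.replaceItemValue("subject", this.subject);
            doc.save();
        } finally {
            doc.recycle();
        }
    }
}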
Update
As you mentioned in a comment, you have a database with 50,000 documents of various types.
I'd try to read only those documents you really need. If you want to read e.g. all support ticket documents of a customer, you would use a view containing support tickets sorted by customer (without additional columns). You would get the documents with getAllDocumentsByKey(customer, true) and put the Java objects (each based on a document) into a Map.
You can maintain a cache in addition. Create a Map of models (model = Java object for a document type) and use the UNID as key. This way you avoid reading/storing documents twice.
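A sketch of that cache, reusing the hypothetical TicketModel from the sketch above and keying by UNID; db, customer, and the view name are assumed to be in scope and illustrative:
import java.util.HashMap;
import java.util.Map;
import lotus.domino.Document;
import lotus.domino.DocumentCollection;
Map<String, TicketModel> cache = new HashMap<String, TicketModel>();
DocumentCollection col = db.getView("ticketsByCustomer").getAllDocumentsByKey(customer, true);
Document doc = col.getFirstDocument();
while (doc != null) {
    Document next = col.getNextDocument(doc); // fetch next before load() recycles doc
    String unid = doc.getUniversalID();
    if (!cache.containsKey(unid)) {
        TicketModel model = new TicketModel();
        model.load(doc); // load() recycles doc
        cache.put(unid, model);
    } else {
        doc.recycle();
    }
    doc = next;
}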
This is a really great question. I would first say that I agree 100% with Knut and that's how I code my objects that represent documents.
Below I've pasted a code example of what I typically do. Note this code uses the OpenNTF Domino API which, among other things, takes care of recycling for me, so you won't see any recycle calls.
But as Knut says - grab the document, get what you need from it, then it can be discarded again. When you want to save something, just get the document again. At this stage you could even check lastModified or something to see if another save took place since the time you loaded it.
Sometimes, for convenience, I overload the load() method and add a NotesViewEntry:
public boolean load(ViewEntry entry) {}
Then in there I can just get the document, or if it's a specific situation, use view columns.
Now this works great when dealing with a single document at a time. It works really well if I want to loop through many documents for a collection. But if you get too many, you might see some of the overhead start to slow things down. One app I have can get a little slow if I "ingest" 30,000 docs like this into a collection.
I don't have a great answer for this yet. I've tried the big view with many columns, like it sounds like you did. I've tried creating a lower-level, basic version of the object with just the needed fields, designed more to work with ViewEntrys and their columns. I don't have a great answer for that yet. Making sure you lazy load what you can is pretty important, I think.
Anyway here's a code example that shows how I build most of my document driven objects.
package com.notesIn9.video;
import java.io.Serializable;
import java.util.Date;
import org.openntf.domino.Database;
import org.openntf.domino.Document;
import org.openntf.domino.Session;
import org.openntf.domino.View;
import org.openntf.domino.utils.Factory;
public class Episode implements Serializable {
/**
*
*/
private static final long serialVersionUID = 1L;
private String title;
private Double number;
private String authorId;
private String contributorId;
private String summary;
private String subTitle;
private String youTube;
private String libsyn;
private Date publishedDate;
private Double minutes;
private Double seconds;
private String blogLink;
private boolean valid;
private String unid;
private String unique;
private String creator;
public Episode() {
this.unid = "";
}
public void create() {
Session session = Factory.getSession(); // this will be slightly
// different if not using the
// OpenNTF Domino API
this.setUnique(session.getUnique());
this.setCreator(session.getEffectiveUserName());
this.valid = true;
}
public Episode load(Document doc) {
this.loadValues(doc);
return this;
}
public boolean load(String key) {
// this key is the unique key of the document. UNID would be
// faster/easier.. I just kinda hate using them and seeing them in URLS
Session session = Factory.getSession();
Database currentDb = session.getCurrentDatabase();
Database db = session.getDatabase(currentDb.getServer(), "episodes.nsf");
View view = db.getView("lkup_episodes");
Document doc = view.getDocumentByKey(key); // This is deprecated because
// the API prefers to use
// getFirstDocumentByKey
if (null == doc) {
// document not found. DANGER
this.valid = false;
} else {
this.loadValues(doc);
}
return this.valid;
}
private void loadValues(Document doc) {
this.title = doc.getItemValueString("title");
this.number = doc.getItemValueDouble("number");
this.authorId = doc.getItemValueString("authorId");
this.contributorId = doc.getItemValueString("contributorId");
this.summary = doc.getItemValueString("summary");
this.subTitle = doc.getItemValueString("subtitle");
this.youTube = doc.getItemValueString("youTube");
this.libsyn = doc.getItemValueString("libsyn");
this.publishedDate = doc.getItemValue("publishedDate", Date.class);
this.minutes = doc.getItemValueDouble("minutes");
this.seconds = doc.getItemValueDouble("seconds");
this.blogLink = doc.getItemValueString("blogLink");
this.unique = doc.getItemValueString("unique");
this.creator = doc.getItemValueString("creator");
this.unid = doc.getUniversalID();
this.valid = true;
}
public boolean save() {
Session session = Factory.getSession();
Database currentDb = session.getCurrentDatabase();
Database db = session.getDatabase(currentDb.getServer(), "episodes.nsf");
Document doc = null;
if (this.unid.isEmpty()) {
doc = db.createDocument();
doc.replaceItemValue("form", "episode");
this.unid = doc.getUniversalID();
} else {
doc = db.getDocumentByUNID(this.unid);
}
this.saveValues(doc);
return doc.save();
}
private void saveValues(Document doc) {
doc.replaceItemValue("title", this.title);
doc.replaceItemValue("number", this.number);
doc.replaceItemValue("authorId", this.authorId);
doc.replaceItemValue("contributorId", this.contributorId);
doc.replaceItemValue("summary", this.summary);
doc.replaceItemValue("subtitle", this.subTitle);
doc.replaceItemValue("youTube", this.youTube);
doc.replaceItemValue("libsyn", this.libsyn);
doc.replaceItemValue("publishedData", this.publishedDate);
doc.replaceItemValue("minutes", this.minutes);
doc.replaceItemValue("seconds", this.seconds);
doc.replaceItemValue("blogLink", this.blogLink);
doc.replaceItemValue("unique", this.unique);
doc.replaceItemValue("creator", this.creator);
}
// getters and setters removed to condense code.
public boolean remove() {
Session session = Factory.getSession();
Database currentDb = session.getCurrentDatabase();
Database db = session.getDatabase(currentDb.getServer(), "episodes.nsf");
if (this.unid.isEmpty()) {
// this is a new Doc
return false;
} else {
Document doc = db.getDocumentByUNID(this.getUnid());
return doc.remove(true);
}
}
}
It's all about balance. Everything has its price. Big views (case 1) slow down indexing. Opening documents every time (case 2) slows down your code.
Find something in between.
I'm trying to integrate Liferay Dynamic Data Lists into a Kaleo workflow (Liferay 6.1 CE GA2), but how do I get the ddlRecordId in the workflow? I did some homework and checked all the attributes in the serviceContext, but there is no "ddlRecordId" attribute, only a key named "recordId" whose value is always 0. I can also get some field values from the serviceContext attributes, such as select and textarea fields. But what I want is the upload file field. Thanks.
long ddlRecordId = GetterUtil.getLong(serviceContext.getAttribute("ddlRecordId"));
DDLRecord ddlRecord = DDLRecordLocalServiceUtil.getRecord(ddlRecordId);
In Liferay 6.1 the ddlRecordId is equivalent to the entryClassPK in the workflow context variables. The documentation could be helpful here (read the section about workflow context variables).
So you can get the upload file field this way:
import com.liferay.portlet.documentlibrary.store.DLStoreUtil;
import com.liferay.portlet.dynamicdatalists.model.DDLRecord;
import com.liferay.portlet.dynamicdatalists.service.DDLRecordLocalServiceUtil;
import com.liferay.portlet.dynamicdatamapping.storage.Field;
import com.liferay.portlet.dynamicdatamapping.model.DDMStructure;
import com.liferay.portal.kernel.json.JSONFactoryUtil;
import com.liferay.portal.kernel.json.JSONObject;
import com.liferay.portal.kernel.util.GetterUtil;
import java.io.File;
import java.io.Serializable;
DDLRecord ddlRecord = DDLRecordLocalServiceUtil.getDDLRecord(GetterUtil.getLong(entryClassPK));
// get the upload field
Field field = ddlRecord.getField("field_attachment");
if (field != null){
DDMStructure structure = field.getDDMStructure();
Serializable fieldValue = field.getValue();
String value = String.valueOf(fieldValue);
if (!value.isEmpty()){
JSONObject fileJSONObject = JSONFactoryUtil.createJSONObject(value);
String fileName = fileJSONObject.getString("name");
String filePath = fileJSONObject.getString("path");
File file = DLStoreUtil.getFile(structure.getCompanyId(), 0L, filePath);
}
}
I hope this will help more than one person...
I had the same problem. I spent about a week trying to solve it and finally got it.
I hope it will solve yours too.
I had to retrieve all the DDLRecords in a list and find the one that is using my workflow, by comparing the "recordSetId" attribute from the service context with the "recordSetId" of each DDLRecord.
The final code is like this:
import java.util.List;
import com.liferay.portal.kernel.util.GetterUtil;
import com.liferay.portal.kernel.workflow.WorkflowConstants;
import com.liferay.portal.service.ServiceContext;
import com.liferay.portlet.dynamicdatamapping.storage.Field;
import com.liferay.portlet.dynamicdatalists.model.DDLRecord;
import com.liferay.portlet.dynamicdatalists.service.DDLRecordLocalServiceUtil;
long companyId = GetterUtil.getLong((String) workflowContext.get(WorkflowConstants.CONTEXT_COMPANY_ID));
String userId = (String) workflowContext.get(WorkflowConstants.CONTEXT_USER_ID);
ServiceContext serviceContext = (ServiceContext) workflowContext.get(WorkflowConstants.CONTEXT_SERVICE_CONTEXT);
long recordSetId = GetterUtil.getLong(serviceContext.getAttribute("recordSetId"));
List<DDLRecord> ddlRecordList = DDLRecordLocalServiceUtil.getDDLRecords(0, DDLRecordLocalServiceUtil.getDDLRecordsCount());
for (DDLRecord o : ddlRecordList) {
if (o.getRecordSetId() == recordSetId) {
Field field = o.getField("status");
String status = GetterUtil.getString(field.getValue());
if (status.contains("not")) {
returnValue = "No";
} else {
returnValue = "Yes";
}
}
}
I use an xe:objectData as a datasource for an xp:dataTable. objectData1 uses some Java code to retrieve all documents from a view that match a key (username). The Java code looks like this:
package com.isatweb.cois;
import static com.ibm.xsp.extlib.util.ExtLibUtil.getCurrentDatabase;
import static com.ibm.xsp.extlib.util.ExtLibUtil.getCurrentSession;
import java.io.Serializable;
import lotus.domino.Database;
import lotus.domino.Name;
import lotus.domino.Session;
import lotus.domino.View;
import lotus.domino.ViewEntryCollection;
public class ObjectDataVisits implements Serializable {
private static final long serialVersionUID = 1L;
ViewEntryCollection vec = null;
public ObjectDataVisits(){
try {
this.update();
} catch (Exception e) {
System.out.print(e);
}
}
public void update() {
try {
Database _db = getCurrentDatabase();
Session _session = getCurrentSession();
Name nam = _session.createName(_session.getEffectiveUserName());
String username = nam.getAbbreviated().replace(" ", "#").replace("/", "#").toUpperCase();
View view = _db.getView("vw_visit_open");
this.vec = view.getAllEntriesByKey(username);
} catch (Exception e) {
System.out.print(e);
}
}
public ViewEntryCollection getVisits(){
return this.vec;
}
}
The XPage has the following code
When I first load the page, the data is read from the view and the dataTable displays the NoteIDs of all matching documents.
When I refresh the page using the button, I get an "Object has been removed or recycled" error.
Can anyone please show me what I'm doing wrong (and perhaps how to do it right)?
The problem is that Notes objects are not serializable. During the partial refresh, the getVisits() method is executed before the update() method. The ViewEntryCollection is a reference to a view, and this view has already been recycled.
If you just want to store some note IDs, then you could store them in a Vector instead. Otherwise you have to call your update() method in your getVisits() method every time. A sketch of the Vector option follows below.
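This is a minimal sketch based on the original class: it reuses the static imports from above, keeps only the serializable note IDs in the bean, and recycles every Notes object as soon as it has been read:
import java.util.Vector;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;
private Vector<String> noteIds = new Vector<String>();
public void update() {
    try {
        Database _db = getCurrentDatabase();
        Session _session = getCurrentSession();
        Name nam = _session.createName(_session.getEffectiveUserName());
        String username = nam.getAbbreviated().replace(" ", "#").replace("/", "#").toUpperCase();
        View view = _db.getView("vw_visit_open");
        ViewEntryCollection vec = view.getAllEntriesByKey(username);
        ViewEntry entry = vec.getFirstEntry();
        noteIds.clear();
        while (entry != null) {
            noteIds.add(entry.getNoteID()); // note IDs are plain Strings, safe to serialize
            ViewEntry next = vec.getNextEntry(entry); // fetch next before recycling
            entry.recycle();
            entry = next;
        }
        vec.recycle();
    } catch (Exception e) {
        System.out.print(e);
    }
}
public Vector<String> getVisits() {
    return this.noteIds;
}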