Does anyone know how to create a date-typed predicate for Hazelcast?
I use Predicates.equal("date","value"); but it doesn't work properly. I pass a date value that exists in Hazelcast, yet it returns nothing. java.util.Date should be comparable.
I don't know why it doesn't compare properly. Can anybody help? Much appreciated!
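To be concrete, this is roughly the kind of call I mean (the map name, value type and field name below are placeholders, not my real code):

IMap<String, Employee> map = hazelcastInstance.getMap("employees");
Date target = /* the exact java.util.Date stored on the entry */;
// Built-in equality predicate on a Date attribute; the value passed should be
// an actual java.util.Date (Comparable), not its String representation.
Collection<Employee> hits = map.values(Predicates.equal("date", target));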
You can also try your own predicate, i.e. if you have a map whose key is Object and whose value is Date, then you can do the following:
final Date requiredDate = /*your date object*/;
map.values(new Predicate<Object, Date>() {
    public boolean apply(Map.Entry<Object, Date> entry) {
        // keep entries whose value equals the required date
        return requiredDate.equals(entry.getValue());
    }
});
You can do other forms of comparison inside the apply method as well.
Any idea how to achieve a date query in Hazelcast 3.2? I looked at the source code for 3.2 and did not find anything.
Is there something like a DatePredicate with which I can write queries like
new DatePredicate("joiningDate > 1/1/2014 and joiningDate < 10/1/2014")
?
Any help is appreciated.
You can use SqlPredicate in the following way:
new SqlPredicate("joiningDate > " + joiningMinDate + " AND joiningDate < " + joiningMaxDate)
Here joiningMinDate and joiningMaxDate are your values and joiningDate is the attribute of the object stored in the map.
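If building a date string for the SQL predicate is awkward, another option is to combine the built-in comparison predicates and pass real Date objects. A sketch, assuming your value object exposes a joiningDate field of type java.util.Date (the map and class names are illustrative):

IMap<String, Employee> map = hazelcastInstance.getMap("employees");
Date from = /* 1/1/2014 as a java.util.Date */;
Date to = /* 10/1/2014 as a java.util.Date */;
// Range query on the joiningDate attribute without formatting date strings.
Predicate rangePredicate = Predicates.and(
        Predicates.greaterThan("joiningDate", from),
        Predicates.lessThan("joiningDate", to));
Collection<Employee> hits = map.values(rangePredicate);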
If I'm not wrong, it should work using a SqlPredicate just as you wrote above.
Please see: com.hazelcast.query.SqlPredicateTest::testSql_withDate
Why not write a custom predicate:
final Date fromDate = /*1/1/2014 date object*/;
final Date toDate = /*10/1/2014 date object*/;
map.values(new Predicate<Object, Date>() {
    public boolean apply(Map.Entry<Object, Date> entry) {
        // keep entries whose date falls strictly between the two bounds
        Date date = entry.getValue();
        return date.after(fromDate) && date.before(toDate);
    }
});
I was wondering if CsvHelper by Josh Close has anything in the configuration I am missing to translate values to null. I am a huge fan of this library, but I always thought there should be some sort of configuration to let it know what values represent NULL in your file. An example would be a column with the value "NA", "EMPTY", "NULL", etc. I am sure I could create my own TypeConverter, but I was hoping there would be an easier option to set somewhere in a config as this tends to be fairly common with files I encounter.
Is there a configuration setting to do this relatively easily?
I found the type converters in the CsvHelper.TypeConversion namespace, but I am not sure where to apply something like this, or what the correct usage looks like:
new NullableConverter(typeof(string)).ConvertFromString(new TypeConverterOptions(), "NA")
I am also using the latest version 2.2.2
Thank you!
I think that some time in the last seven years and thirteen versions since this question was asked, the options for doing this without a custom class map expanded, e.g.:
csvReader.Context.TypeConverterOptionsCache.GetOptions<string>().NullValues.Add("NULL");
csvReader.Context.TypeConverterOptionsCache.GetOptions<DateTime?>().NullValues.AddRange(new[] { "NULL", "0" });
csvReader.Context.TypeConverterOptionsCache.GetOptions<int?>().NullValues.Add("NULL");
csvReader.Context.TypeConverterOptionsCache.GetOptions<bool>().BooleanFalseValues.Add("0");
csvReader.Context.TypeConverterOptionsCache.GetOptions<bool>().BooleanTrueValues.Add("1");
CsvHelper can absolutely handle nullable types. You do not need to roll your own TypeConverter if a blank column is considered null. For my examples I am assuming you are using user-defined fluent mappings.
The first thing you need to do is construct a CsvHelper.TypeConversion.NullableConverter object for your nullable types. Note that I'm going to use int? since strings allow null values by default.
public class MyClassMap : CsvClassMap<MyClass>
{
public override void CreateMap()
{
CsvHelper.TypeConversion.NullableConverter intNullableConverter = new CsvHelper.TypeConversion.NullableConverter(typeof(int?));
Map(m => m.number).Index(2).TypeConverter(intNullableConverter);
}
}
Next is setting the configuration on your CsvReader object to tolerate missing fields and auto-trim your fields. Personally, I like to do this by creating a CsvConfiguration object with all of my settings prior to constructing my CsvReader object.
CsvConfiguration csvConfig = new CsvConfiguration();
csvConfig.RegisterClassMap<MyClassMap>();
csvConfig.WillThrowOnMissingField = false;
csvConfig.TrimFields = true;
Then you can call myReader = new CsvReader(stream, csvConfig) to build the CsvReader object.
If you need specific values to be treated as null, such as "NA" == null, then you will need to roll your own CsvHelper.TypeConversion converter. I recommend extending the NullableConverter class and overriding the constructor and ConvertFromString method. Using blank values as null is really your best bet though.
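If you go that route, here is a rough sketch. The class name and the set of null markers are made up for illustration, and it assumes the 2.x ConvertFromString(TypeConverterOptions, string) signature used elsewhere in this question; if that method is not overridable in your version, implement ITypeConverter and delegate to a NullableConverter instead.

public class NullMarkerConverter : CsvHelper.TypeConversion.NullableConverter
{
    // Values that should be read as null; adjust to whatever your files use.
    private static readonly string[] NullMarkers = { "NA", "EMPTY", "NULL" };

    public NullMarkerConverter(Type type) : base(type) { }

    public override object ConvertFromString(TypeConverterOptions options, string text)
    {
        if (text == null || Array.IndexOf(NullMarkers, text.Trim().ToUpperInvariant()) >= 0)
        {
            return null;
        }
        return base.ConvertFromString(options, text);
    }
}

// Registered like the NullableConverter above, e.g.:
// Map(m => m.number).Index(2).TypeConverter(new NullMarkerConverter(typeof(int?)));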
I used "ConvertUsing"...
public class RecordMap : CsvHelper.Configuration.ClassMap<Record>
{
public RecordMap()
{
AutoMap();
Map(m => m.TransactionDate).ConvertUsing( NullDateTimeParser );
Map(m => m.DepositDate).ConvertUsing( NullDateTimeParser );
}
public DateTime? NullDateTimeParser(IReaderRow row)
{
//"CurrentIndex" is a bit of a misnomer here - it's the index of the LAST GetField call so we need to +1
//https://github.com/JoshClose/CsvHelper/issues/1168
var rawValue = row.GetField(row.Context.CurrentIndex+1);
if (rawValue == "NULL")
return null;
else
return DateTime.Parse(rawValue);
}
}
I am struggling to bind values retrieved from the database to a grid. I have a database column of type DateTime which is nullable, so when I try to bind a null value, an error is thrown while assigning that column value to the object property. To avoid this, before assigning values fetched from the database, I use a function that converts the value to the default for its type. Since the default value for the DateTime type is 1/1/0001 12:00:00 AM, wherever null values are present I get this value for the field.
How do I solve this issue? Please give your suggestions.
To explain my scenario , i am adding a piece of code here.
public static T GetValue<T>(object o)
{
T val = default(T);
if (o != null && o != DBNull.Value)
{
val = (T)o;
}
return val;
}
This is the helper function I am using while reading data from the data reader.
Since you declared the column as nullable, instead of converting the value to DateTime, convert it using DateTime?.
That way null values are allowed for the DateTime field.
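For example, with the helper above (the data reader variable and column name are just examples):

// Null / DBNull stays null instead of collapsing to DateTime.MinValue.
DateTime? joinDate = GetValue<DateTime?>(dataReader["JoinDate"]);

// A non-nullable call keeps the old behaviour (default(DateTime) for nulls):
DateTime joinDateOrDefault = GetValue<DateTime>(dataReader["JoinDate"]);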
I am wondering how to search in J2ME. I have been searching on the internet and many results show up; on Java2s.com I found an example that uses RecordFilter and its matches method to search a record store.
But my problem is when I need to pass two or more parameters into it. How can the result be matched against these parameters?
And how do I sort descending or ascending, like a bubble sort?
Concatenate your search terms into a single String variable, separating each of them with ; for example. In the code of the matches method, split the String to get each search criterion.
To put the filter into effect, create an instance of SearchFilter and call the matches method with the concatenated String as its param.
For the sort, implement the RecordComparator interface; implement the compare method to build your sort criteria. Do a Google search for j2me+recordcomparator to see examples of how to sort.
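For instance, a minimal RecordComparator sketch, assuming each record starts with a name written via DataOutputStream.writeUTF (the record layout and class name are illustrative):

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import javax.microedition.rms.RecordComparator;

class NameComparator implements RecordComparator {
    public int compare(byte[] rec1, byte[] rec2) {
        try {
            // Read the first UTF field (the name) from each raw record.
            String name1 = new DataInputStream(new ByteArrayInputStream(rec1)).readUTF();
            String name2 = new DataInputStream(new ByteArrayInputStream(rec2)).readUTF();
            int result = name1.compareTo(name2);
            if (result < 0) return PRECEDES;   // rec1 sorts before rec2 (ascending)
            if (result > 0) return FOLLOWS;    // rec1 sorts after rec2
            return EQUIVALENT;
        } catch (Exception e) {
            return EQUIVALENT;
        }
    }
}

// Usage: pass it to enumerateRecords along with your filter, e.g.
// RecordEnumeration re = recordStore.enumerateRecords(filter, new NameComparator(), false);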
EDIT:
In the code of the matches method, rebuild the String from the byte[] param and split it. Treat each extracted String as one criterion.
As I understand it, you want to pass two strings as search criteria when you wrote:
SearchFilter search = new SearchFilter(txtSearch.getString(), strType);
So the constructor should take two params.
When you want to do the matching, call
if (searchFilter.matches((search1 + ";" + sType).getBytes()))
Then split the candidate param into two Strings when you code the matches method.
When I save my data in RMS I save it as a String[]. For example, to save Name, Age, Salary and EmpID for each employee, I create an array, convert it to bytes, and save it in RMS. When I retrieve it I do the reverse process. Now, if I want to get employees whose name starts with A and whose salary is 10000, I use the following filter:
class UtilFilter implements RecordFilter {
    private String str_searchText;
    private String str_searchText1;

    public UtilFilter(String str_searchText, String str_searchText1)
    {
        this.str_searchText = str_searchText.toLowerCase();
        this.str_searchText1 = str_searchText1.toLowerCase();
    }

    public boolean matches(byte[] bt_byteData)
    {
        // here goes the code for getting your String[] back from RMS, say you get it in Data
        // lower-case both sides so the comparison matches the lower-cased search text
        String str_str = Data[0].trim().toLowerCase();
        String str_str1 = Data[2].trim().toLowerCase();
        return str_searchText != null && str_searchText1 != null
                && str_str.equals(str_searchText)
                && str_str1.equals(str_searchText1);
    }
}
This way I can filter any number of parameters. Hope that helps! :)
SPListItem.GetFormattedValue seems to have a strange behavior for DateTime fields.
It retrieves the DateTime value through SPListItem's indexer which according to this MSDN article returns local time.
Here's a snippet from Reflector
public string GetFormattedValue(string fieldName)
{
SPField field = this.Fields.GetField(fieldName);
if (field != null)
{
return field.GetFieldValueAsHtml(this[fieldName]);
}
return null;
}
So it uses SPListItem's indexer to retrieve the value and then SPField.GetFieldValueAsHtml to format it. GetFieldValueAsHtml seems to assume the date is in UTC and converts it to local time no matter what Kind it is. (Reflector shows that it uses GetFieldValueAsText, which uses value.ToString(), but for some reason it assumes the time to be UTC.)
The end result is that the string representation of a time field obtained through listItem.GetFormattedValue() (at least in my case) is incorrect, being local time + (local time - UTC).
Has anybody encountered the same issue with SPListItem.GetFormattedValue(), and what was your workaround?
Converting the date back to universal time before calling GetFieldValueAsHtml works just fine.
// The indexer returns local time, but localTime.Kind reports Unspecified.
DateTime localTime = (DateTime)item["DueDate"];

// GetFieldValueAsHtml treats the date as universal time,
// so give it universal time.
DateTime universalTime = SPContext.Current.Web
    .RegionalSettings.TimeZone.LocalTimeToUTC(localTime);

string correctFormattedValue =
    item.Fields["DueDate"].GetFieldValueAsHtml(universalTime);
I have run into a recognised bug with the date conversion from UTC in SharePoint; it was fixed in SP1.