I am using Groovy to access Gmail and read the inbox. It is regular JavaMail, so I won't describe the connection details here.
So for simplicity, after I connect to the store, I use this:
folder.open(Folder.READ_ONLY)
folder.messages.each { msg ->
    // ... doSomething with msg ...
}
This works fine. However, I have a performance issue: messages[] can be big. Some folders contain more than 1000 messages, and checking them all takes time.
I am looking for a quicker way to get only the most recent emails (for example, messages from the last 5 days or so).
Of course I have the date information in each msg and could do the comparison myself, but that is not efficient, since it still loops through the entire collection.
Is there a better way to get those messages?
If you have JavaMail issue a SEARCH command with the criterion SINCE 04-JAN-2011, you'll get back the set of messages in the currently selected folder delivered since January 4th. (SENTSINCE 04-JAN-2011 will do the same thing, only based on the "Date" message header.)
Something along the lines of this:
folder.search(new ReceivedDateTerm(ComparisonTerm.GE, sinceDate));
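Putting it together, a minimal Groovy sketch (the 5-day cutoff and the recent variable name are just illustrations; folder is the already-opened folder from the question). Because the search executes on the IMAP server, only the matching messages are transferred, which is what makes it faster than looping over folder.messages:

import javax.mail.search.ComparisonTerm
import javax.mail.search.ReceivedDateTerm

// build a date 5 days in the past; note IMAP's SINCE has day granularity
def cal = Calendar.instance
cal.add(Calendar.DAY_OF_MONTH, -5)
def sinceDate = cal.time

// ReceivedDateTerm maps to SINCE; use SentDateTerm for SENTSINCE semantics
def recent = folder.search(new ReceivedDateTerm(ComparisonTerm.GE, sinceDate))
recent.each { msg ->
    // doSomething with msg
}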
I feel silly asking this, but it's doing my head in.
If I use 'https://maps.googleapis.com/maps/api/place/autocomplete/json' and set the input parameter to, say, 'Palazzo Cast', I get about 5 suggestions, none of which is the one I'm looking for. If I set input to 'Palazzo Castellania', I get zero results, even though there is a place with that name (see below). I've set the region parameter to 'mt'.
If I use 'https://maps.googleapis.com/maps/api/place/findplacefromtext' and set the input parameter to 'Palazzo Castellania', I get 'the Ministry of Health', which is correct. However, if I put in a partial string I get only a single candidate, which is something different. There doesn't seem to be a way to get multiple place candidates?
I'm guessing that on the API side I have to do a multi-step process, but it would be good to get some input.
My thoughts:
I start with 'https://maps.googleapis.com/maps/api/place/autocomplete/json'; if I get an empty result, I try 'https://maps.googleapis.com/maps/api/place/findplacefromtext'.
If I get a single result from either, I can pass the place ID to the Place Details API to get more detailed data.
Make sense? It feels ugly.
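Something like this rough Groovy sketch is what I have in mind (YOUR_API_KEY is a placeholder, and I'm restricting to Malta with components=country:mt, which as far as I know is how autocomplete expresses the region restriction):

import groovy.json.JsonSlurper

def key = 'YOUR_API_KEY'
def input = URLEncoder.encode('Palazzo Castellania', 'UTF-8')
def slurper = new JsonSlurper()

// step 1: try autocomplete first
def auto = slurper.parseText(new URL(
    "https://maps.googleapis.com/maps/api/place/autocomplete/json" +
    "?input=${input}&components=country:mt&key=${key}").text)
def placeIds = auto.predictions*.place_id

// step 2: fall back to findplacefromtext on an empty result
if (!placeIds) {
    def find = slurper.parseText(new URL(
        "https://maps.googleapis.com/maps/api/place/findplacefromtext/json" +
        "?input=${input}&inputtype=textquery&fields=place_id&key=${key}").text)
    placeIds = find.candidates*.place_id
}

// step 3: a single place_id can then be passed to the Place Details endpoint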
Edit
So, watching how https://www.google.com.mt/ does it... while typing it uses suggest (and never gives the right answer, just like the API), and then when I hit enter it uses search and gives the correct answer... leading me to the conclusion that there are actually two databases involved!
Basically, "it's by design". There is no fix as of Feb 2023. My plan is to cache results and do a first search against that; otherwise I'll probably use Bing or HERE.
I have two lists of messages. The first contains short messages; the second is a master file with longer texts that include the short messages from the first list, but also many new messages. I want to find the entries in the master file (second list) that have no partial match in the first list.
Something like the above; a result of "NO" would then mean they are new errors.
I tried =IF(ISERROR(VLOOKUP("*"&A2&"*",C:C,1,0)),"No","Yes"), but it works the other way around: it finds whether each short message appears somewhere in the master file's long messages. I want to check each long message in the master file (which has the short messages inside) against the list of short messages, and if it contains no (partial) match, label it as new.
This should work; I can't test it at the moment, though:
=IF(SUMPRODUCT(--ISNUMBER(SEARCH($A$2:$A$8,B2)))>0,"YES","NO")
Try:
=IF(OR(ISNUMBER(FIND(" "&$A$2:$A$8&" "," "&B2&" "))),"YES","NO")
Note the use of spaces; otherwise aaa would be found in kkaaa. (Depending on your Excel version, this may need to be confirmed as an array formula with Ctrl+Shift+Enter.)
I am new to Logstash, Elasticsearch and Kibana (ELK).
I know that I can create filters that parse specific logs and extract some fields from them. It looks like I have to configure a specific filter for each type of log. As I have around 20 different services, each writing around a hundred different types of log, this looks too difficult to me.
By "type of log" I mean logs that share a specific template, with parameters that change.
This is an example of some logs:
Log1: User Peter has logged in
Log2: User John has logged in
Log3: Message "hello" sent by Peter
Log4: Message "bye" sent by John
I want ELK to discover automatically that here we have two types of log:
Type1: User %1 has logged in
Type2: Message "%1" sent by %2
Is that possible? Is there any example of doing that? I don't want to write the template for each type of log manually; I want it to be discovered automatically.
Then it should also extract the parameters. This is what I would like to see in the index:
Log1: Type1, params: Peter
Log2: Type1, params: John
Log3: Type2, params: hello, Peter
Log4: Type2, params: bye, John
After that I would like ELK to scan my index again and discover that param %1 of Type1 is usually param %2 of Type2 (the user name). It should also discover that Log1 and Log3 are related (same user).
The last thing it should do is find unusual sequences of actions (logins without the corresponding logout, for example).
Is any of this possible without having to manually configure all types of logs? If not, can you point me to some example of this multi-pass indexing, even if it involves manual configuration?
Logstash has no discovery like this; you'll have to do the language parsing yourself. It's tedious and repetitive, but it gets the job done. You have a few options here, depending on your ability to influence other areas:
If the format of those logs is changeable, consider pushing for an authentication-logging standard. That way you only need one pattern.
Consider a modular approach to generating your filter pipeline: Log1 patterns go in one module, Log2 in another. It makes maintenance easier.
You have my sympathy with this problem. I've had to integrate Logstash with the authentication-logging of many systems by now, and each one describes what they're doing somewhat differently, all based on the whim of the developer who wrote it (which may have happened 25 years ago in some cases).
For the products we develop, I can at least influence how the logging looks. Moving away from a natural-language grok format to something else, such as kv or even json, goes a long way towards simplifying the parsing problem for me. The trick is convincing people: since we only look at the logs through Kibana anyway, why do we need:
User %{user} logged into application %{app} in zone %{zone}
When we can have
user="%{user}" app="%{app}" zone=%{zone}
Or even:
{ "user": %{user}, "app": %{app}, "zone": %{zone} }
Since that's what it'll be when Logstash is done with it anyway.
I have been using NetSuite for only a short time, and I already hate it. I am sorry if this is a stupid question, but I haven't been able to find an answer so far, either in the NetSuite docs, on Stack Overflow, or on other websites. In fact, the answers I found have resulted in an error.
My company requires a script to transfer inventory based on an EDI input file. Reading the file is no problem, even parsing it is working. However, actually inserting the data is proving problematic.
I have been able to insert normal records, but Inventory Transfer records are giving me problems.
From Stack Overflow I found and adapted some code into the following:
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);
nlapiSelectNewLineItem('invt');
nlapiSetLineItemValue("invt","invtid",1, 189);
nlapiSetLineItemValue("invt","adjustqtyby", 1, "5");
nlapiCommitLineItem('invt');
var id = nlapiSubmitRecord(xfer);
The FormatDate function just exchanges the date from the text file into a system date NetSuite can understand.
However, when I run this code I get the following error:
USER_ERROR: You must enter at least one line item for this transaction.
I thought inserting the line item was the reason to use nlapiSelectNewLineItem, but I guess not. Also nlapiCreateNewLineItem doesn't seem to exist.
The values I am inserting are all just test data, as I'm testing this in the debugger. Location 9 exists, as does item 189.
My full script finds these id's based on string values from the text files. But since this is the section that doesn't work I have set it apart to test.
Can anyone help with this?
You did not specify the type of script you are using, but it looks like you are not setting the line items on the record object you created; the global nlapi* line-item functions operate on the current record instead. Below is the suggested code.
Also, there is no sublist named invt; it should be inventory. Likewise, there is no field invtid; you most probably want to set the item, whose field ID is item. You might want to refer to the SuiteScript Record Browser for help with the correct IDs.
var xfer = nlapiCreateRecord("inventorytransfer");
xfer.setFieldValue("trandate", FormatDate("20160101"));
xfer.setFieldValue("location", 9);
xfer.setFieldValue("transferlocation", 9);

// work the sublist through the record object itself,
// not through the global current-record functions
xfer.selectNewLineItem('inventory');
xfer.setCurrentLineItemValue("inventory", "item", 189);
xfer.setCurrentLineItemValue("inventory", "adjustqtyby", "5");
xfer.commitLineItem('inventory');

var id = nlapiSubmitRecord(xfer);
If you are using Bin/Lot Numbered Items, please see help topic "Sample Scripts for Advanced Bin / Numbered Inventory Management"
My company uses script-generated emails to correspond with clients. Until now, we've had to manually sort through these emails, look up client info, print and file them. I'm writing a script that does this automatically and it was working fine until 10 minutes ago when Google stopped sending the subject with imap_fetch_overview().
Here's how I'm doing it:
$msgov = imap_fetch_overview($inbox, $uid, FT_UID);
$msgsub = $msgov[0]->subject;
$msgfr = $msgov[0]->from;
$msgid = $msgov[0]->uid;
$message = imap_fetchbody($inbox, $uid, 1, FT_UID);
// echo message info, then message
echo "...";
And that worked fine until about 10 minutes ago when I started getting this error: Notice: Undefined property: stdClass::$subject in C:\wamp\www\gmil\index.php on line 113
So I proceeded to var_dump($msgov); and suddenly it's not showing the subject anymore. According to the manual it should be giving me the subject. Am I doing something wrong, or am I just unlucky enough to be doing this at the exact time Google decided to stop sending it?
I'm dumb.
One message didn't contain a subject, so the overview object for it simply has no subject property. I solved it like this:
if (isset($overview[0]->subject)) {
    $sub = $overview[0]->subject;
} else {
    $sub = "No Subject";
}
and then called $sub instead of $overview[0]->subject.