Use-case: I have an integration in place that creates multiple vendor bills in NetSuite every 5 minutes. I want to export the vendor bills created in that window to FTP. For that I need to create a saved search that returns the vendor bills created in the past five minutes. Is there any criteria in a NetSuite saved search to accomplish that?
Please advise.
This runs into a NetSuite inconsistency (search date/time filters don't accept the seconds that format.format produces), so I keep a snippet for this:
// SuiteScript 2.x, requires the N/format module
function toNSLegalDatetime(date) {
    var formatted = format.format({ value: date, type: format.Type.DATETIMETZ });
    return formatted.replace(/(:\d{2}):\d{2}/, '$1'); // strip the seconds
}
Then you can do:
const lastDT = new Date(Date.now() - 5 * 60000); // 5 minutes ago
search.create({
    type: 'vendorbill',
    filters: [
        search.createFilter({ name: 'datecreated', operator: search.Operator.ONORAFTER, values: toNSLegalDatetime(lastDT) }),
        ...
BUT
timing like this is very tricky, because small delays could cause you to miss transactions. If you can keep track of the last internal id reported, your next search can just use that, and it won't matter what the lag was.
search.createFilter({name:'internalidnumber', operator:search.Operator.GREATERTHAN, values:lastIdReported})
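Putting that together, a rough sketch of the internal-id approach in SuiteScript 2.x might look like the following; how lastIdReported is persisted (script parameter, custom record, etc.) is left to you, and the column names are illustrative:

define(['N/search'], function (search) {
    // Returns the internal ids of vendor bills created after the last one reported.
    function getNewVendorBills(lastIdReported) {
        var ids = [];
        search.create({
            type: 'vendorbill',
            filters: [
                search.createFilter({
                    name: 'internalidnumber',
                    operator: search.Operator.GREATERTHAN,
                    values: lastIdReported
                })
            ],
            columns: ['internalid', 'tranid']
        }).run().each(function (result) {
            ids.push(result.id);
            return true; // keep iterating
        });
        return ids; // export these, then persist the highest id as the new lastIdReported
    }
    return { getNewVendorBills: getNewVendorBills };
});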
I am trying to use zenpy to get information on tickets created or modified in the last week. This is the code I am using:
stream_status = False  # exit condition
while stream_status == False:
    print(start_date)
    all_tickets = zendesk.tickets.incremental(start_time=start_date, include=['users', 'organizations'])
    print(len(all_tickets))
    print(all_tickets.end_time)
    for ticket in all_tickets:
        # grab ticket fields and store them in a DataFrame
        df.loc[len(df)] = [ticket_id, created_at, requester, organization, product, subject, assignee, status, priority, opened_at, opened_by, solved_at, solved_by, closed_at, closed_by]
        count += 1
        print(count, ticket_id)
    start_date = all_tickets.end_time
    stream_status = all_tickets.end_of_stream
    print(start_date, stream_status)

today = datetime.now().strftime("%Y%m%d%H%M")
df.to_excel('Ticket_report{0}.xlsx'.format(today))
Now there are several issues here. The date is calculated correctly and it is indeed 7 days ago, but the tickets I am getting go back to at least April and they have definitely not been modified since. I stopped it at this point because we have thousands of tickets.
Also, the incremental method returns a max of 1000 ticket objects, but even after 1000 the loop doesn't seem to restart (the print statement at the end doesn't trigger). I am not sure I am using the stream_status flag correctly. Any advice is more than welcome. Thank you!
Zenpy documentation on incremental can be found here: http://docs.facetoe.com.au/zenpy.html#incremental-exports
It was an API issue after all.
For anyone facing similar issues: it turns out that the start date is compared against the generated_timestamp field instead of updated_at or created_at. updated_at holds the time of the last action that generated a ticket event (e.g. a change), but generated_timestamp is updated every time the ticket is affected, even by the system. That leads to grabbing tickets updated or created before the input date. A solution is to filter the results after the call:
from dateutil import parser  # for parsing the ISO timestamp strings
all_tickets = zendesk_client.incremental(start_time=start_date)
accurate_tickets = [ticket for ticket in all_tickets if parser.parse(ticket.updated_at) > start_date]
Not very sure why they designed the API like this, seems a bit wrong, but that's how it works unfortunately.
Source: https://developer.zendesk.com/documentation/ticketing/managing-tickets/using-the-incremental-export-api/#excluding-system-updated-tickets-time-based-exports
So I am making a bot that tells the user my shop's operating hours. The code works, but the problem is that Dialogflow can't seem to recognize shortforms like 'next sat', 'on mon', 'ltr', 'tmr'. I tried using custom entities, but I can't tell Dialogflow to recognize the custom entity as a system parameter like #sys.date, and I'm having trouble extracting the dates from phrases like these. I could hard-code it, for example:
var date = new Date();
var short = agent.parameters.shortform;
var day = date.getDay(); // 0 = Sunday ... 6 = Saturday
if (short == 'tmr') {
    if (day == 6) { day = 0; } else { day += 1; } // tomorrow wraps Saturday -> Sunday
}
but if I do this, I'd have to take into account every possible shortform my user could write and write code for each of them, including every day of the week and every other shortform like nxt, ltr, hrs, mins. Is there an easier way?
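For reference, the hard-coded approach described above, generalized into a lookup table, might look something like this sketch; the entries and the helper name are illustrative, not a complete list:

var SHORTFORMS = {
    'tmr': 'tomorrow',
    'ltr': 'later',
    'nxt': 'next',
    'mon': 'monday',
    'sat': 'saturday',
    'hrs': 'hours',
    'mins': 'minutes'
};

// Replace each known shortform with its full word before handing the text
// to Dialogflow (or a date library) to resolve into an actual date.
function expandShortforms(text) {
    return text.split(/\s+/).map(function (word) {
        var key = word.toLowerCase();
        return SHORTFORMS[key] || word;
    }).join(' ');
}

// expandShortforms('nxt sat') => 'next saturday'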
I want to know if there's a way to insert data into the database at a specific moment.
Example:
I want to send a text to the database in 3 hours: as soon as I click the button on my client side, I want the document to be created 3 hours later.
Is it possible to do something like this?
You can simply use setTimeout like this:
let time = 60000 * 60 * 3; // 3 hours in milliseconds
function myFunc(arg) {
    console.log(`arg was => ${arg}`);
}
setTimeout(myFunc, time, 'funky'); // 'funky' is passed to myFunc as arg
or use a cron job, for example with this npm package: https://www.npmjs.com/package/node-cron
I don't think that's a good way to create documents. I don't know your exact problem, but I would solve it differently.
You could create the document when the button is clicked, but add a new property to the document, such as active_at, holding a date value (a timestamp or similar). This value would always be the current date + 3 hours. In your application, you would then only select/get documents where active_at is before the current date. This way, you only get documents created at least 3 hours ago.
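A minimal sketch of that active_at idea, assuming Node.js with the official MongoDB driver; the collection name is illustrative and `db` is an open database handle (e.g. obtained from MongoClient.connect):

// Store the document immediately, stamped with the moment it becomes "active".
async function createDelayedDocument(db, text) {
    const activeAt = new Date(Date.now() + 3 * 60 * 60 * 1000); // now + 3 hours
    await db.collection('documents').insertOne({ text: text, active_at: activeAt });
}

// Only return documents whose activation time has already passed.
async function getActiveDocuments(db) {
    return db.collection('documents')
        .find({ active_at: { $lte: new Date() } })
        .toArray();
}

The advantage over a plain setTimeout is that nothing is lost if the server restarts during the 3-hour wait.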
You can use the setTimeout function in a Node.js backend:
function myFunc(arg) {
    // insert the text into MongoDB here
}
setTimeout(myFunc, 10800000); // myFunc will be called after 3 hours (10,800,000 ms)
You can find more detail about the setTimeout function here: https://nodejs.org/api/timers.html#timers_settimeout_callback_delay_args
I'm developing an auction-style web app, where products are available for a certain period of time.
I would like to know how you would model that.
So far, what I've done is store products in the DB:
{
...
id: p001,
name: Product 1,
expire_date: 'Mon Oct 7 2013 01:23:45 UTC',
...
}
Whenever a client requests that product, I test *current_date < expire_date*.
If true, I show the product data and, client side, a countdown timer. If the timer reaches 0, I disable the related controls.
But, server side, there are some operations that need to be done even if nobody has requested that product, for example, notifying the owner that his product has ended.
I could scan the whole collection of products on each request, but seems cumbersome to me.
I thought about triggering a routine with cron every n minutes, but I would like to know if you can think of any better solutions.
Thank you!
Some thoughts:
Index the expire_date field. You'll want this if you're scanning for auction items older than a certain date.
Consider adding a second field such as expired (or active) so you can also do other types of non-date searches (as you can always, and should anyway, reject auctions that have expired).
Assuming you add a second field active, for example, you can further limit the scans to only those auction items that are active and beyond the expiration date. Consider a compound index for those cases (as over time you'll have more and more expired items you don't need to scan through).
Yes, you should add a timed task using your favorite technique to scan for expired auctions. There are lots of ways to do this -- your infrastructure will help determine what makes sense.
Keep a local cache of current auction items in memory if possible to make scanning as efficient as possible. There's no reason to hit the database if nothing is expiring.
Again, always check when retrieving from the database to confirm that items are still active, as there easily could be race conditions where items expire while being retrieved for display.
You'll possibly want to store the state of status e-mails, etc. in the database so that any server restarts are handled properly.
It might be something like:
{
...
id: p001,
name: "Product 1",
expire_date: ISODate("Mon Oct 7 2013 01:23:45 UTC"),
active: true,
...
}
// console
db.auctions.ensureIndex({expire_date: -1, active: 1})
// javascript idea:
var theExpirationDate = new Date(2013, 10, 6, 0, 0, 0); // months are zero-based: 10 = November
db.auctions.find({ expire_date: { "$lte": theExpirationDate }, active: true })
Scanning the entire collection on each request sounds like a huge waste of processing time.
I would use something like pm2 to handle both keeping track of your main server process and running periodic tasks with its built-in cron-like functionality.
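To make the timed-task suggestion concrete, here is a minimal sketch of such a periodic expiry sweep, using node-cron for scheduling; the collection name, the open `db` handle and the notifyOwner helper are assumptions, not part of any existing code:

const cron = require('node-cron');

// Every 5 minutes: find auctions past their expire_date that are still marked
// active, flip them to inactive and notify their owners.
// Assumes `db` is an open MongoDB database handle and notifyOwner is your own helper.
cron.schedule('*/5 * * * *', async () => {
    const expired = await db.collection('auctions')
        .find({ active: true, expire_date: { $lte: new Date() } })
        .toArray();

    for (const auction of expired) {
        await db.collection('auctions')
            .updateOne({ _id: auction._id }, { $set: { active: false } });
        await notifyOwner(auction); // hypothetical notification helper
    }
});

The compound index suggested above ({expire_date: -1, active: 1}) keeps this scan cheap even as expired items accumulate.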
I want a list of all of yesterday's emails from Gmail. I am trying to process it using Google Apps Script, by writing a query on my inbox and then using GmailApp.search. The after: and before: search operators in Gmail return unexpected results, since the query searches on the basis of the SMTP server time that the mail is sent from (which is Google's server). Hence, being in a different time zone, the search yields inappropriate results for me. Is there a way to search Gmail using a time criterion, so that I can accommodate the time zone difference?
Please note that the local time zone, calendar, Gmail, etc. are correctly configured for my time zone, so the emails that I see in my inbox are correctly timed. Only the search is creating an issue.
Figured out a way after some trial and error: it is indeed possible to search Gmail emails by time. Notice that the Date() returned in Google Apps Script is according to your time zone.
The code below will return all of the previous day's emails in the inbox, assuming new Date() gives the date and time in your time zone. The division by 1000 is there because getTime() returns milliseconds, while the newer:/older: search operators expect seconds.
var month = new Date().getMonth();
var date = new Date().getDate();
var year = new Date().getFullYear();
var time1 = new Date(year, month, date, 0, 0, 0).getTime(); // midnight today (local time), in ms
var time2 = time1 - 86400000;                               // midnight yesterday, in ms
var query = "newer:" + time2 / 1000 + " older:" + time1 / 1000 + " in:inbox"; // epoch seconds
var conversations = GmailApp.search(query);
Can you give the exact search string you are using, along with how you construct the before and after dates?
You can use the Utilities.formatDate() function to format the date string to the timezone you are in.
An alternate solution is to fetch all mails (maybe 100 or so) and then discard all those which do not fit in the time period you are interested in.
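A minimal sketch of the Utilities.formatDate() suggestion: format yesterday's and today's dates in the script's own time zone and build a date-level after:/before: query. This works at date granularity only; for time-of-day precision, the epoch-seconds newer:/older: approach shown above is the way to go. The function name is illustrative.

function listYesterdaysThreads() {
    var tz = Session.getScriptTimeZone();
    var now = new Date();
    var yesterday = new Date(now.getTime() - 24 * 60 * 60 * 1000);

    // after:/before: accept yyyy/MM/dd, so this filters by date, not time of day.
    var query = 'in:inbox after:' + Utilities.formatDate(yesterday, tz, 'yyyy/MM/dd') +
                ' before:' + Utilities.formatDate(now, tz, 'yyyy/MM/dd');

    var threads = GmailApp.search(query);
    Logger.log('%s threads found for query: %s', threads.length, query);
}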