I have recently noticed that retrieving LastDateTimeModified from Acumatica through the Web Service API gives me the date and time in a very different time zone (I am guessing GMT).
However, when I view it through a Generic Inquiry, it seems to show the correct time, based on the time zone set in my user profile.
Is there a way to get LastDateTimeModified in the correct time zone when retrieving it from the Web Service API? I have attempted changing the time zone for the SDK user, with no success.
Thanks,
G
For most screens, except a few CRM screens, LastDateTimeModified and CreatedDateTime are stored in the same time zone as the database server machine. When reading them using web services, you are retrieving the raw value from the database, with no time zone conversion. It is up to you to convert it to the desired time zone.
The Help->Audit History panel does a manual conversion to the current user's time zone. I have not been able to get a generic inquiry to show the time as you mention in your question; it only shows the date.
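If it helps, here is a minimal C# sketch of the conversion described above, assuming the database server runs in GMT (as the question guesses) and the target is US Eastern; the zone id and the sample value are illustrative, not part of the Acumatica API.

    using System;

    class LastModifiedExample
    {
        static void Main()
        {
            // Raw value as returned by the web service: server-local time,
            // with no offset information attached (sample value).
            DateTime raw = DateTime.Parse("2014-05-01 18:30:00");

            // Treat it as GMT/UTC (the assumed server time zone)...
            DateTime utc = DateTime.SpecifyKind(raw, DateTimeKind.Utc);

            // ...and convert to the desired user time zone (assumed Eastern here).
            TimeZoneInfo userZone = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
            Console.WriteLine(TimeZoneInfo.ConvertTimeFromUtc(utc, userZone));
            // -> 5/1/2014 2:30:00 PM
        }
    }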
I am working on an online SharePoint site collection that has the following regional settings (a UTC+4 time zone):
I hosted a remote event receiver inside an Azure web app; the remote event receiver fires when a list item is added or updated. Inside the remote event receiver I have the following code to get a DateTime value named ApproveDate:
DateTime approveBy = (DateTime)projectItem["ApproveDate"];
But if the user enters a value for ApproveDate of, say, 30/09/2020 in the SharePoint form, then the DateTime value inside the remote event receiver is 9/29/2020 8:00:00 PM instead of 9/30/2020 00:00:00 AM. Why am I facing this issue? Is it because the Azure web app has a different time zone than the SharePoint site? And how can I fix it?
I think this is not a problem.
Assuming the web application server (IIS) runs in UTC, the 9/30/2020 00:00:00 AM value entered in SharePoint is converted to UTC by subtracting 4 hours (the site's UTC+4 regional setting), so the time stored on the IIS server is correct. When returning it to the front end, add the 4 hours back.
In our program, we should read the SharePoint site's regional settings (time zone and so on) from the front end, and then apply the corresponding conversion in code when returning values.
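As a minimal sketch of that conversion (assuming the site's regional setting is UTC+4, which matches the 4-hour shift above; "Arabian Standard Time" is just an illustrative Windows zone id for UTC+4, not something read from SharePoint):

    using System;

    class ApproveDateExample
    {
        static void Main()
        {
            // Value as it arrives in the remote event receiver: UTC (sample value).
            DateTime approveByUtc = new DateTime(2020, 9, 29, 20, 0, 0, DateTimeKind.Utc);

            // Assumed UTC+4 zone matching the site's regional settings.
            TimeZoneInfo siteZone = TimeZoneInfo.FindSystemTimeZoneById("Arabian Standard Time");

            DateTime approveByLocal = TimeZoneInfo.ConvertTimeFromUtc(approveByUtc, siteZone);
            Console.WriteLine(approveByLocal); // -> 9/30/2020 12:00:00 AM
        }
    }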
When processing webhook posts, I see the dates are all in my (Eastern US) timezone. This is reported correctly if I apply the option to include Time Zone Information.
I can work with this, but I have a few questions:
1. Why not send in UTC, which (I believe) is used consistently throughout the API, or at least the REST API?
2. Is Time Zone Information working correctly? I ask because this report says it doesn't correctly support fractional offsets, such as India's UTC+5:30.
3. What time zone is used in the webhook posts? I have changed both my personal preference and my account setting to Katmandu, but I still get Eastern. (Thus, I cannot test #2 myself.)
Thank you
1.) Most datetimes coming back from the API are UTC; however, SOAP and Connect messages follow a hidden account setting. You will need to ask Support to change it. It's called "Time Zone used for Connect and SOAP".
2.) Fractional time zone offsets should work fine.
3.) Webhook and SOAP currently use the same time zone setting mentioned above. The UI preferences you see (account settings and personal preferences) are aimed at UI users; since you are an API integrator, your messages follow a different setting.
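One practical consequence: with Time Zone Information included, the webhook timestamps carry an explicit offset, so you can parse and normalize them without knowing the hidden account setting. A small C# sketch (the sample timestamp is illustrative):

    using System;

    class WebhookTimestampExample
    {
        static void Main()
        {
            // Sample timestamp with a fractional offset (India, UTC+5:30).
            DateTimeOffset stamp = DateTimeOffset.Parse("2020-09-30T10:15:00+05:30");

            // Normalize to UTC regardless of which zone the account setting uses.
            Console.WriteLine(stamp.ToUniversalTime());
            // -> 9/30/2020 4:45:00 AM +00:00
        }
    }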
Does IBM Domino track the last login date for web users (username/password and internet certificate)? I know the access logs contain this information, but I wanted to know if there may be something built into Domino (maybe in the Address Book). I'm trying to come up with a method to disable web accounts that have not accessed a Domino server in a specified time period.
Thanks,
Kev
The User Activity area in the Database Properties picks up from the log.nsf, which is where this information is stored. But, typically, the log.nsf will only have a few days' worth of information. When I've had this requirement before, I've manually captured it via a custom login page or an initUser function I've had in applications.
One of the easiest solutions is to trigger an action from a live web page that generates a database.nsf?OpenAgent event, for example an image reference like:
<img src="/database.nsf/myagent?OpenAgent">
or a plain link/AJAX request to:
/database.nsf/myagent?OpenAgent
Ideally you'd have the agent print a content type and a response, but if you don't, browsers cope fairly well with invalid responses from servers.
Inside your "myagent" you will have the user's name available to you, to write to a document.
Your next challenge will be getting the agent to trigger, but not too often; ideally only on login.
When a user uses a custom login form, it submits the username/password and redirection URL via POST. You could change that redirection to ...?openagent&nexturl=/blablabla.nsf
Your tiny little agent would then run once and only once upon login and update a document in your custom logging database.
That's a developer's solution.
There are also admin solutions. The server does keep track of active web sessions, but it does not drop them into the log.nsf upon session end the way it does for Notes sessions. I don't think it would be too much work from an admin standpoint to get that information; there are a lot of event triggers available to you. It's just been way too long since I worked on any server where anyone cared about statistics.
I am trying to create sales orders using Web Services from our CRM (Salesforce) to NetSuite. I am having an issue with international sales orders; in particular, I am hitting it with the United Kingdom.
If I create a sales order in the UI and set a bunch of field values and then set the Address to United Kingdom, I get a popup with the following message:
The address you have selected is based in a nexus for which you are required to charge a different kind of tax. Click OK to change the form to one that is applicable for that Nexus.
In the UI, when you click "OK", the page reloads and a few new tax fields appear (the fields are built in, denoted by their field ids). The problem is that when the page reloads, all of the data is wiped out.
I did this UI testing to determine what was causing the behavior. With Web Services, however, all of the data is set at once (there is no way to apply it in any particular "order"). What happens with my web service call is that it hits this same warning, all of the data it tried to send is lost, and then it tries to submit the record, hitting the validation rules we have in place.
Has anyone hit this before? Is there a field or something I can set via web services that matches what NS is doing on the back end when you click "OK"?
I am open to any solution. I do have a ticket open with NS Support; however, so far they have not been helpful. If I get a resolution, I intend to post it for others.
To get around it, you can access the vendor first, then select the dropdown icon to open the purchase order.
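Another direction to test from Web Services is pre-selecting the nexus-appropriate transaction form on the order before submitting, so the server does not need the UI-style form reload. Below is a hedged sketch using the SuiteTalk .NET proxy classes (SalesOrder, RecordRef); the internal ids are placeholders, and I cannot confirm this is exactly what NetSuite does behind the "OK" click.

    // Assumes "service" is an authenticated SuiteTalk NetSuiteService instance
    // and the generated proxy classes (SalesOrder, RecordRef) are referenced.
    SalesOrder order = new SalesOrder();

    // Placeholder internal id for the UK/VAT sales order form to try.
    order.customForm = new RecordRef { internalId = "121" };

    order.entity = new RecordRef { internalId = "4567" }; // placeholder customer id

    // ...populate addresses, lines, and the other fields as usual, then submit:
    WriteResponse response = service.add(order);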
I've created a document and I'm sure it is saved in the Vault too, since I can fetch it, but the TrueVault dashboard shows me "0 DOCUMENTS Stored In TrueVault". Is this a bug?
The document count takes some time to load after the dashboard refreshes; you should see the value update after waiting a short time (less than a minute). A revamp of the dashboard is on our roadmap.