I've hit a bit of a wall trying to figure out how to bulk remove old emails within NetSuite (i.e. transactions emailed to clients, statements, etc.).
I have over 40,000 emails taking up way too much storage space within NS.
I can't seem to do it via Workflow, Mass Update, etc.
Is there a simple method I'm missing? Or am I going to have to write some SuiteScript to do it?
Cheers
I've used a Mass Delete Script before, and it works great on other records. I haven't used it on messages, but I did confirm that you can deploy it on messages, so it should work. Steps to use it:
Install bundle 18108.
Create a script deployment for the record type you want to delete (Message in your case).
Go to Mass Updates and expand Custom Updates. You should see Mass_Delete_Script in there under Message.
Create the mass update as you usually would - TAKING EXTRA CARE NOT TO CLICK "PERFORM UPDATE" UNLESS YOU'RE ABSOLUTELY SURE YOUR SEARCH IS CORRECT! Beware that you can of course cause a lot of grief very quickly with a tool like this. On the other hand you can save a lot of time too :)
The question here is: am I on the right path (this is the first time I'm trying this), and if not, what would be smarter to try? If this is the right path, can you offer suggestions on how best to do it? If this works, I'm going to use the approach often for a lot of different tasks in this app.
I'm running a PowerApps canvas app. As part of its logic, I want it to be able to reference (read-only) a collection of data. That data lives in ServiceNow, and my group is not permitted to access ServiceNow via its API.
During testing of the app, I just had it reference a SharePoint list (which I had filled with some dummy data), but I can re-code those lines as needed to pull from some other data source.
Because I am touching a few different systems here, I'm not sure this is the right way to go, and I'm afraid I'll spend too long trying only to find out it would never have worked because of x. Thus my question.
This is what I think will work. Am I headed in the right direction?
Set up the scheduled report in ServiceNow. (Done!)
Program ServiceNow to email the Excel file output. Make sure it is always the same title. (Done!)
Build a Power Automate flow to capture that email and save the attached file to a location (OneDrive?) that can be accessed by the app. If there is a file there already, delete it first.
Add the Excel file as a data source to the app, and start referencing it as needed.
8-12 hours later, ServiceNow pushes out another scheduled data drop, and the whole thing updates again.
In my perfect world, this system would work completely unattended.
Offhand, a glitch I can see is that ServiceNow generates an Excel file, but the data in it isn't formatted as a table, and I think PowerApps can only ingest an Excel file as a data source if the data is in a table. But (shrug) I might be wrong.
Am I thinking of this correctly? Is this the best avenue to follow?
I'm creating a contract-based API solution to keep items in sync between multiple tenants. Is there any way to track changes to InventoryCD? In this case one franchisor would like to update items in their 6 franchisees. It's easy for me to find the records that changed, but harder to know when the CD has changed (and, importantly, what it changed FROM). Certainly I could write a customization to do it, but I thought maybe Acumatica has some option built in.
Ideally I'd like to have a log of the changes with old and new CD. It's hosted, so I don't think I can make it happen with DB triggers (which is how pre-Acumatica me would have handled it).
Thanks in advance.
It depends on the Acumatica version, but have you tried looking at Business Events? I believe they give you access to the old and new values.
Also look at Acumatica's audit history capabilities, but be careful to turn on only the fields you need to track, as the DB can grow very large if you turn on all fields on the Stock Item screen, or on any screen.
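If you do end up writing a customization, here is a minimal sketch of one approach, assuming a recent Acumatica version where the Events.FieldUpdated handler exposes the old value; the logging target is just a placeholder:

using PX.Data;
using PX.Objects.IN;

// Illustrative graph extension on the Stock Items screen (InventoryItemMaint).
// The FieldUpdated event hands us the old value of InventoryCD; the new value
// is already on the row.
public class InventoryItemMaintExt : PXGraphExtension<InventoryItemMaint>
{
    protected virtual void _(Events.FieldUpdated<InventoryItem, InventoryItem.inventoryCD> e)
    {
        if (e.Row == null || Equals(e.OldValue, e.Row.InventoryCD)) return;

        // PXTrace is a stand-in; a real solution would insert a row into a
        // small custom audit DAC so the contract-based API can query the log.
        // Note this fires on the cache update; to log only committed changes,
        // capture here and write out on persist instead.
        PXTrace.WriteInformation("InventoryCD changed from '{0}' to '{1}' (InventoryID {2})",
            e.OldValue, e.Row.InventoryCD, e.Row.InventoryID);
    }
}

Since contract-based API calls execute through the same graphs, this handler should also fire when the CD is changed via your sync integration, not just from the screen.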
I have 2 custom entities, e.g. Building and Business, in CRM on-premise. A Building has multiple Units/Suite #s, and each Unit/Suite # is occupied by a Business. If a Building Unit/Suite is under renovation, then the Business has to be temporarily closed.
How can I automatically update the Business Open/Close status when the Building Unit/Suite status changes? The update does not need to happen instantly. I need to search around 20,000 records to update the correct Business entity. Also, there are fields in Business like start and end date, which are retrieved from Building, and Closure Duration is updated with end minus start date.
Are plugins the only way, and how can I achieve this using plugins? How difficult would it be, and what would the impact on the server be? I am a mid-level C# dev. Please provide any links in the right direction. The environment is CRM 2011 on-premise.
Thank you very much !!!
A straightforward solution can be built with a plugin. A synchronous plugin can push the status change to the Business entity immediately and (in the PreOperation or PostOperation stage of the Update message) even within the same database transaction.
Generally speaking, with plugins you can build the most efficient and seamlessly integrated business logic possible.
However, often you can actually achieve pretty much the same using a workflow. Some advantages of building workflows:
Does not require a skilled software developer to build;
Workflows can be modified ('configured') quickly;
Execution of workflows can be postponed (e.g. until a condition is met or a date has passed).
Some downsides of workflows are:
In CRM 2011 your code always runs asynchronously, outside of the original database transaction;
Some time may pass until the action takes place; the user does not get immediate feedback;
Querying and selecting related data is limited to n:1 relationships (from the n-side to the 1-side, not vice versa);
Execution of workflows requires more resources than plugins;
Extensive use of workflows can easily lead to spaghetti systems that are really hard to maintain and perform badly.
In your scenario it looks like the requirements for selecting the appropriate Business record are too complex to handle in a workflow. In a workflow you basically can only navigate from one record to the other by following lookup references on the record at hand. This means you can only get from one record to the other when there is a n:1 relationship and when you navigate from the n-side to the 1-side.
In plugins you do not have this limitation; there you can write a QueryExpression or Linq-query to get the records you need. So, in your case a plugin seems to be the right choice to me.
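To make that concrete, here is a minimal sketch of such a plugin. All schema names (new_unit, new_business, new_unitid, new_status, new_startdate, new_enddate, new_closuredays) and the option value are hypothetical placeholders for your actual customization:

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Hypothetical plugin, registered on the Update message of the Unit entity
// (PostOperation). Every schema name below is a placeholder.
public class CloseBusinessOnRenovation : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider
            .GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        var unit = (Entity)context.InputParameters["Target"];
        if (!unit.Contains("new_status"))
            return; // the status field was not part of this update

        // Assumed option value for "Under renovation".
        bool underRenovation = ((OptionSetValue)unit["new_status"]).Value == 100000001;

        // QueryExpression: find the Business record(s) occupying this unit.
        var query = new QueryExpression("new_business")
        {
            ColumnSet = new ColumnSet("new_startdate", "new_enddate")
        };
        query.Criteria.AddCondition("new_unitid", ConditionOperator.Equal, unit.Id);

        foreach (Entity business in service.RetrieveMultiple(query).Entities)
        {
            if (underRenovation)
            {
                // Closure duration = end date minus start date; update while the
                // record is still active, because inactive records reject updates.
                if (business.Contains("new_startdate") && business.Contains("new_enddate"))
                {
                    double days = ((DateTime)business["new_enddate"]
                                 - (DateTime)business["new_startdate"]).TotalDays;
                    var update = new Entity("new_business") { Id = business.Id };
                    update["new_closuredays"] = (int)days;
                    service.Update(update);
                }
            }

            // In CRM 2011 a record is (de)activated with SetStateRequest,
            // not by updating statecode directly.
            service.Execute(new SetStateRequest
            {
                EntityMoniker = business.ToEntityReference(),
                State = new OptionSetValue(underRenovation ? 1 : 0), // 1 = Inactive
                Status = new OptionSetValue(-1) // -1 = default status for that state
            });
        }
    }
}

Since you say the update does not need to happen instantly and may touch thousands of records, consider registering the step asynchronously: a synchronous plugin that loops over many records makes the user wait on every save.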
Updating a performance test script, e.g. with LoadRunner, can take a lot of time and be quite frustrating. If there have been updates to the application, you usually have to run the script, find out what has to be changed, update it, run it again, and so on. Does anyone have some concrete best practices for easing this updating inferno? One obvious thing is good communication with developers.
It depends on the kind of update. If the update is dramatic, like adding new fields for the user to fill in, then someone has to manually touch up the test scripts.
If, however, the update is minor, for example, some changes to the hidden fields or changes to the internal names of user-facing fields, then it's possible to write a script that checks the change and automatically updates the test script.
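As an illustration of that idea, here is a rough sketch (in C#, but any language works): fetch the current form page, pull out its hidden-field name/value pairs, and diff them against a baseline captured when the script was recorded. The URL, the baseline file, and the deliberately naive regex are all assumptions:

using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

// Sketch: report hidden-field drift between a saved baseline and the live page,
// so you know exactly what to fix in the test script before running it.
class HiddenFieldDiff
{
    // Naive pattern; assumes type/name/value attribute order. Real markup varies.
    static readonly Regex Hidden = new Regex(
        @"<input[^>]*type=[""']hidden[""'][^>]*name=[""']([^""']+)[""'][^>]*value=[""']([^""']*)[""']",
        RegexOptions.IgnoreCase);

    static async Task Main()
    {
        using var http = new HttpClient();
        string html = await http.GetStringAsync("https://app.example.com/order/form"); // placeholder URL

        var current = Hidden.Matches(html).Cast<Match>()
            .ToDictionary(m => m.Groups[1].Value, m => m.Groups[2].Value);

        // Baseline file: one "name=value" per line, captured at record time.
        var baseline = File.ReadAllLines("hidden-fields.baseline")
            .Select(l => l.Split(new[] { '=' }, 2))
            .ToDictionary(p => p[0], p => p.Length > 1 ? p[1] : "");

        foreach (var name in current.Keys.Except(baseline.Keys))
            Console.WriteLine($"NEW hidden field:     {name}");
        foreach (var name in baseline.Keys.Except(current.Keys))
            Console.WriteLine($"REMOVED hidden field: {name}");
        foreach (var name in current.Keys.Intersect(baseline.Keys).Where(n => current[n] != baseline[n]))
            Console.WriteLine($"CHANGED: {name}: '{baseline[name]}' -> '{current[name]}'");
    }
}

From there it is a small step to rewriting the affected lines of the test script automatically rather than just reporting them.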
One of the performance test platforms, NetGend, automatically takes care of the hidden fields and the internal names of user-facing fields, so it's very easy to create a script to performance-test an HTML form. A tester only needs to fill in the values that he/she would enter in a browser, so no correlation is necessary there. Please send me a message if you need to know more about it.
There are many things you can do to insulate your scripts from build-to-build variability. The higher up the OSI stack you go, the lower the maintenance charge, but the higher the resource cost for the virtual user type. Assuming changes are limited to page-level resources and a few hidden fields here and there for web sites or applications, you can record in HTML mode. You can blast away the EXTRARES sections, as the page parser in HTML mode will automatically parse the page and load the page resources even without an explicit reference - it can be a real pain to keep these sections in sync if you have developers who are experimenting quite a bit.
Next up, for forms which have a very high velocity of change, consider the use of a web_custom_request() for the one form. You can use correlation statements to pick up all of the name|value pairs as needed and build the form submit dynamically. There will be a little more up-front work for this, but you should see payoffs at around the fourth changed build, where you would normally have been rebuilding some scripts.
Take a look at all of the hosts referenced in your code and parameterize them. I have a template that I use for web virtual users which pairs a default value with the ability to change any of the host names via the control panel's extra attributes section. Take a look at the example for lr_get_attrib_string() for how you might implement the pickup; pair that with a check for NULL and a fallback to a default value in your code.
This is going to seem counterintuitive, but comment your script heavily around the parts that change often, so you know where to spend the extra labor up front to handle a more dynamic data set.
Almost nothing you do with any tool can save you from structural changes in the design and flow of the app, such as the insertion of a new page in the workflow, but paying attention to the design of the high-change pages, of which there are typically a small number, can result in test code with a very long life.
Of course, if your application is web-services based, then there is a natural long life to the use of exposed public services. Code may change on the back end of the service, but typically the exposed public interface is very stable.
Is there any way to automatically generate the result of an SAP transaction? Let's say I want to see the production orders for one MRP controller (I have the COOIS transaction for this). Is there any way to generate an XML feed with the result of that transaction and refresh it, let's say, every 10 minutes?
Or to auto-export an .xls file with the result somewhere...? I know I have jobs and spools, but I have to manually download the result from the SAP GUI.
I don't have access to ABAP, so I would like to know if there are other methods to get data from SAP.
Since "a transaction" might be anything from a simple report to a complex interactive application that does not even have a simple "result", I doubt that there's a way to provide any generic tool for this. You might try the following:
Schedule a job and have the result sent to some mailbox instead of printing it. Then use the programming language of your choice to grab and process the mail.
Check whether there are BAPIs available (BAPI_PRODORD_* or something like that - I'm not a CO expert, so I wouldn't know which one to use). You can call these BAPIs from an external program without having to write ABAP yourself - however, you'll most likely need the help of someone who knows ABAP in order to get the interface documentation and understand the concepts.
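For example, here is a minimal sketch of that second option using the SAP .NET Connector (NCo 3). The destination name, the BAPI name, and the parameter/table/field names are all assumptions - look up the real interface in transaction BAPI or SE37:

using System;
using SAP.Middleware.Connector;

// Sketch: poll production orders via a BAPI instead of scraping the SAP GUI.
class ProdOrderPoll
{
    static void Main()
    {
        // Assumes a destination named "PRD" is registered via app.config or
        // an IDestinationConfiguration implementation.
        RfcDestination dest = RfcDestinationManager.GetDestination("PRD");

        // BAPI and parameter names are placeholders - check SE37/BAPI.
        IRfcFunction fn = dest.Repository.CreateFunction("BAPI_PRODORD_GET_LIST");
        fn.SetValue("MRP_CONTROLLER", "101");
        fn.Invoke(dest);

        IRfcTable orders = fn.GetTable("ORDER_HEADER"); // assumed table name
        foreach (IRfcStructure row in orders)
            Console.WriteLine("{0} {1}",
                row.GetString("ORDER_NUMBER"), // assumed field names
                row.GetString("MATERIAL"));
    }
}

Run something like this from a scheduler (e.g. Windows Task Scheduler) every 10 minutes and serialize the rows to XML or a spreadsheet with whatever library you prefer.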