I need to get data from both SOLine and SOLineSplit in a report, but when I join them I get duplicate records. Can anyone suggest how to avoid the duplicates?
Can I go with a SQL view, since I need the data for display purposes only, not for inserts or updates? Please advise.
I have an entity Contact which has an array entity Specialties - a standard one-to-many relationship: one contact, many specialties. (The Specialties entity has a few columns, which may not be relevant here.) I have a PCF screen to add a list of Specialties to a contact, and I want to add a custom Remove All button on the Contact screen that deletes every value in the array for that specific contact. A contact can have a large number of specialties (~10,000).
What is the best way to delete all the elements in the array?
Currently I have the function below on the button's action property, and it is clocking and timing out.
for (spec in contact.Specialties) {
  contact.removeFromSpecialties(spec) // OOTB remove-from-array method
}
Is there a better way to remove ~10,000 records from the array entity?
From your question above, I assume that the "Remove All" button on the PCF screen will have an action that deletes all the Speciality records associated with that particular contact, and that you never delete only a subset of them (one or a few records).
I also assume the entity type of the Speciality entity is "Editable" or "Retireable".
With those assumptions in mind, you can create a new function and call it from the "Remove All" button's action property.
The code below hard-deletes all of the records from the database table, just as if you had executed a DELETE query against it:
function removeAllSpecialitiesForContact(contactFromPCF : Contact) {
  // Select every Speciality row whose Contact foreign key points at this contact
  var specialityQuery = Query.make(entity.Speciality).compare(Speciality#Contact, Equals, contactFromPCF.ID)
  // DeleteBuilder (an internal com.guidewire class) turns the query into a single
  // bulk DELETE, so no beans are loaded into memory
  var deleteBuilder = new com.guidewire.pl.system.database.query.DeleteBuilder.DeleteBuilder(specialityQuery)
  deleteBuilder.execute()
}
Because this approach never loads the individual rows into memory, it should avoid the timeout you ran into with the PCF action.
How do I get transaction data based on an item filter for journals with a null ITEM_ID? I'm a database guy, not a finance guy, and my finance people can pull a report from NetSuite that shows each transaction with its posting period, filtered by item. I'm unable to do the same because the Transaction_Lines table has null ITEM_IDs for the entries I'm looking for, presumably because they are journal entries. This particular ITEM_ID does not appear in the Transaction_Lines table at all, so I'm assuming that's because it is always handled by journal entries. Any help would be appreciated.
It appears that you're referring to the NetSuite.com data source queried via ODBC. If so, I believe you're looking for the Posting_account_activity_pe table, which should contain all journal transactions, whether or not they have an ITEM_ID associated with them.
More details can be found in the Connect Browser:
https://system.netsuite.com/help/helpcenter/en_US/srbrowser/Browser2020_1/odbc/record/posting_account_activity_pe.html
The answer ended up being to point to the Revenue_Plans, Revenue_Plan_Lines, and Revenue_Elements tables, but I had to be careful to join on the posting period rather than the planned period. Then I could run two SELECT queries, one for journals and one for all other income types.
I have created a customization which contains a DAC extension that adds a few extra fields to the Stock Items screen (InventoryItem table). After publishing, the customer required one of the fields to change its data type from string to bool. I updated the customization and published it, but the column was not updated in the database, and as a result the screen throws an error. Is there anything I am missing here? Please advise.
Acumatica doesn't drop or alter existing database columns as a result of a change in your customization. You have two options here: you can drop the column with a SQL command and publish again, or, more realistically, you can just create a new UDF with a different name and change your code to use it.
If I have a list of items in a table that are related to a master item in another table, the identifier of the master item comes from a field in the first table, and that field is indexed, is there a way to delete all those items with one call? Or do I have to query the index, retrieve all the items, then loop through them one after another and delete each by its hash key? We are using Node.js.
You can use BatchWriteItem to delete up to 25 items per call. But you still need to run a query first to retrieve the keys of the items you want to delete.
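Here is a minimal sketch of that query-then-batch-delete pattern using the AWS SDK for JavaScript v3 in TypeScript. The table name (Items), index name (masterIdIndex), and attribute names (itemId, masterId) are placeholders for your own schema:
import { DynamoDBClient } from "@aws-sdk/client-dynamodb"
import { DynamoDBDocumentClient, QueryCommand, BatchWriteCommand } from "@aws-sdk/lib-dynamodb"

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}))

// Delete every item belonging to the given master record, one query page at a time.
async function deleteItemsForMaster(masterId: string): Promise<void> {
  let lastKey: Record<string, unknown> | undefined
  do {
    // 1. Query the index for the keys of the items to delete.
    const page = await ddb.send(new QueryCommand({
      TableName: "Items",
      IndexName: "masterIdIndex",
      KeyConditionExpression: "masterId = :m",
      ExpressionAttributeValues: { ":m": masterId },
      ProjectionExpression: "itemId", // only the table's primary key is needed
      ExclusiveStartKey: lastKey,
    }))
    const items = page.Items ?? []
    // 2. BatchWriteItem accepts at most 25 requests per call, so delete in chunks.
    for (let i = 0; i < items.length; i += 25) {
      await ddb.send(new BatchWriteCommand({
        RequestItems: {
          Items: items.slice(i, i + 25).map(item => ({
            DeleteRequest: { Key: { itemId: item.itemId } },
          })),
        },
      }))
    }
    lastKey = page.LastEvaluatedKey
  } while (lastKey)
}
Note that BatchWriteItem can return UnprocessedItems when throttled, so a production version should also retry those; DynamoDB has no single-call "delete by query" operation.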
I am running into an issue when adding a new field to an entity. The entity had 526 fields; when I tried to add a new one, I got the error message "Attribute xxx cannot be created because we have hit a maximum number of attributes allowed for an entity (1000)", even though I only have 526 fields in the entity. Furthermore, I deleted a field I no longer need and tried to add my new field again, but I still get the same error message. Could anyone help me out, please? I do appreciate it. Thanks!
Walloud,
You will be hitting the column limit, even with only 526 fields.
You have to remember that CRM adds its own fields in addition to the ones you create, such as the base-currency counterparts of currency fields. These can take you over the limit.
Also consider that the filtered views CRM automatically creates have extra columns for lookup (name), optionset (name), and date (local) fields, which can also mean you exceed the limit there.
I think your only options are to delete more fields to free up room, or to re-architect your entity design to split it over multiple related entities instead.
It looks like you are hitting the SQL Server 8 KB row-size limit.
You can check the field definitions to verify this.
More information here:
http://mileyja.blogspot.com/2013/09/dealing-with-8k-sql-server-row-limits.html
You need to remember that some fields, such as currency fields, are backed by two separate columns in SQL, so each one you create increments the column count by two. If you have 500+ fields on a single entity, it is time to normalize the data and start breaking it out into related entities. It will be better for performance, and for your data, in the long run.
http://en.wikipedia.org/wiki/Database_normalization
Cheers!