I have an object which contains a MutableList:
object TrackingEventsList {
    var EventDate: String = ""
    var EventDescription: String = ""
}

object Waybill {
    var WaybillNumber: String = ""
    var OriginHub: String = ""
    var TrackingEvents: MutableList<TrackingEventsList> = ArrayList()
}
When I try to add to Waybill.TrackingEvents, all previously added entries are overwritten and end up duplicating the last TrackingEvent added.
private fun fillTracking(events: NodeList) {
    var list = TrackingEventsList
    for (x: Int in 0 until events.length) {
        var tName = (events.item(x) as Element).tagName
        var event = (events.item(x).firstChild as Text).wholeText
        if (tName == "EventDate") {
            list.EventDate = event
        }
        if (tName == "EventDescription") {
            list.EventDescription = event
        }
    }
    Waybill.TrackingEvents.plus(list)
}
The result after calling fillTracking 3 times:
Waybill.TrackingEvents[0].EventDescription = "Event3"
Waybill.TrackingEvents[1].EventDescription = "Event3"
Waybill.TrackingEvents[2].EventDescription = "Event3"
An object in Kotlin is a singleton, which means you can't instantiate it yourself and it only ever has one instance globally. So when you change the properties of that single instance, you overwrite the previous data.
You should change both of them (or at least TrackingEventsList) to a class instead. If Waybill's data is also instance-specific, it needs to be a class too, but nothing in the code you posted suggests it isn't used as a singleton, so I left it as one.
class TrackingEventsList(
    var eventDate: String = "",
    var eventDescription: String = "")

/**
 * Also want to point out that this is still a singleton. If the data inside is instance-specific, you need to change it
 * to a class.
 */
object Waybill {
    var waybillNumber: String = ""
    var originHub: String = ""
    var trackingEvents: MutableList<TrackingEventsList> = ArrayList()
}
private fun fillTracking(events: NodeList) {
    val item = TrackingEventsList()
    for (x: Int in 0 until events.length) {
        var tName = (events.item(x) as Element).tagName
        var event = (events.item(x).firstChild as Text).wholeText
        if (tName == "EventDate") {
            item.eventDate = event
        }
        if (tName == "EventDescription") {
            item.eventDescription = event
        }
    }
    Waybill.trackingEvents.add(item)
}
You should also look into the naming conventions for Kotlin; property names never start with an upper-case letter, unless they are constants (in which case they are all upper-case).
Since TrackingEventsList is an object, it is a singleton (i.e., it only has a single instance). When you go through your loop, you're always updating that same instance of your TrackingEventsList object.
Change TrackingEventsList to be this:
data class TrackingEventsList(var eventDate: String, var eventDescription: String)
Collect the values as you go through your loop, then create a new instance and add it to your list at the end:
private fun fillTracking(events: NodeList) {
    var eventDate: String = ""
    var eventDescription: String = ""
    for (x: Int in 0 until events.length) {
        val tName = (events.item(x) as Element).tagName
        val event = (events.item(x).firstChild as Text).wholeText
        if (tName == "EventDate") {
            eventDate = event
        }
        if (tName == "EventDescription") {
            eventDescription = event
        }
    }
    Waybill.TrackingEvents.add(TrackingEventsList(eventDate, eventDescription))
}
Is it possible to change the default option of all column filters? I'm pretty sure I can accomplish this with some JavaScript, but I'd like to know if there's any way within the Acumatica framework to change this.
The answer will be no. The filter lives inside PX.Web.UI.dll, in the PXGridFilter class, which is an internal class. The property you are interested in is Condition.
The value is set inside one of the private methods of the PXGrid class. The code of the method is below:
private IEnumerable<PXGridFilter> ab()
{
List<PXGridFilter> list = new List<PXGridFilter>();
if (this.FilterID != null)
{
Guid? filterID = this.FilterID;
Guid guid = PXGrid.k;
if (filterID == null || (filterID != null && filterID.GetValueOrDefault() != guid))
{
using (IEnumerator<PXResult<FilterRow>> enumerator = PXSelectBase<FilterRow, PXSelect<FilterRow, Where<FilterRow.filterID, Equal<Required<FilterRow.filterID>>, And<FilterRow.isUsed, Equal<True>>>>.Config>.Select(this.DataGraph, new object[]
{
this.FilterID.Value
}).GetEnumerator())
{
while (enumerator.MoveNext())
{
FilterRow row = enumerator.Current;
string dataField = row.DataField;
PXCache pxcache = PXFilterDetailView.TargetCache(this.DataGraph, new Guid?(this.FilterID.Value), ref dataField);
if (this.Columns[row.DataField] != null)
{
List<PXGridFilter> list2 = list;
int valueOrDefault = row.OpenBrackets.GetValueOrDefault();
string dataField2 = row.DataField;
string dataField3 = row.DataField;
int value = (int)row.Condition.Value;
object value2 = pxcache.ValueFromString(dataField, row.ValueSt);
string valueText = row.ValueSt.With((string _) => this.Columns[row.DataField].FormatValue(_));
object value3 = pxcache.ValueFromString(dataField, row.ValueSt2);
string value2Text = row.ValueSt2.With((string _) => this.Columns[row.DataField].FormatValue(_));
int valueOrDefault2 = row.CloseBrackets.GetValueOrDefault();
int? @operator = row.Operator;
int num = 1;
list2.Add(new PXGridFilter(valueOrDefault, dataField2, dataField3, value, value2, valueText, value3, value2Text, valueOrDefault2, @operator.GetValueOrDefault() == num & @operator != null));
}
}
return list;
}
}
}
if (this.FilterRows != null && this.FilterRows.Count > 0)
{
for (int i = 0; i < this.FilterRows.Count; i++)
{
PXFilterRow row = this.FilterRows[i];
list.Add(new PXGridFilter(row.OpenBrackets, row.DataField, row.DataField, (int)row.Condition, row.Value, row.Value.With(delegate(object _)
{
if (this.Columns[row.DataField] == null)
{
return _.ToString();
}
return this.Columns[row.DataField].FormatValue(_);
}), row.Value2, row.Value2.With(delegate(object _)
{
if (this.Columns[row.DataField] == null)
{
return _.ToString();
}
return this.Columns[row.DataField].FormatValue(_);
}), row.CloseBrackets, row.OrOperator));
}
}
return list;
}
UPDATE
The below JS code allows you to change the condition of the last set filter:
px_all.ctl00_phG_grid_fd.show();
px_all.ctl00_phG_grid_fd_cond.items.items[7].setChecked(true);
px_all.ctl00_phG_grid_fd_ok.element.click();
For reference, the condition items map to the following values:
px_all.ctl00_phG_grid_fd_cond.items.items[7].value  // "EQ"
px_all.ctl00_phG_grid_fd_cond.items.items[8].value  // "NE"
px_all.ctl00_phG_grid_fd_cond.items.items[9].value  // "GT"
px_all.ctl00_phG_grid_fd_cond.items.items[13].value // "LIKE"
px_all.ctl00_phG_grid_fd_cond.items.items[14].value // "LLIKE"
px_all.ctl00_phG_grid_fd_cond.items.items[15].value // "RLIKE"
px_all.ctl00_phG_grid_fd_cond.items.items[16].value // "NOTLIKE"
px_all.ctl00_phG_grid_fd_cond.items.items[18].value // "ISNULL"
px_all.ctl00_phG_grid_fd_cond.items.items[19].value // "ISNOTNULL"
I'm using the SuiteTalk (API) service for NetSuite to retrieve a list of Assemblies. I need to load the InventoryDetails fields on the results to view the serial/lot numbers assigned to the items. This is the current code I'm using, but those fields still come back as NULL in the results, even though I can see the other fields of the AssemblyBuild object. How do I get the inventory details (serial/lot numbers) to return on a transaction search?
public static List<AssemblyBuildResult> Get()
{
var listAssemblyBuilds = new List<AssemblyBuildResult>();
var service = Service.Context();
var ts = new TransactionSearch();
var tsb = new TransactionSearchBasic();
var sfType = new SearchEnumMultiSelectField
{
@operator = SearchEnumMultiSelectFieldOperator.anyOf,
operatorSpecified = true,
searchValue = new string[] { "_assemblyBuild" }
};
tsb.type = sfType;
ts.basic = tsb;
ts.inventoryDetailJoin = new InventoryDetailSearchBasic();
// perform the search
var response = service.search(ts);
response.pageSizeSpecified = true;
// Process response
if (response.status.isSuccess)
{
// Process the records returned in the response
// Get more records with pagination
if (response.totalRecords > 0)
{
for (var x = 1; x <= response.totalPages; x++)
{
var records = response.recordList;
foreach (var t in records)
{
var ab = (AssemblyBuild) t;
listAssemblyBuilds.Add(GetAssemblyBuildsResult(ab));
}
if (response.pageIndex < response.totalPages)
{
response = service.searchMoreWithId(response.searchId, x + 1);
}
}
}
}
// Parse and return NetSuite WorkOrder into assembly WorkOrderResult list
return listAssemblyBuilds;
}
After much pain and suffering, I was able to solve this problem with the following code:
/// <summary>
/// Returns List of AssemblyBuilds from NetSuite
/// </summary>
/// <returns></returns>
public static List<AssemblyBuildResult> Get(string id = "", bool getDetails = false)
{
// Object to populate and return results
var listAssemblyBuilds = new List<AssemblyBuildResult>();
// Initiate Service and SavedSearch (TransactionSearchAdvanced)
var service = Service.Context();
var tsa = new TransactionSearchAdvanced
{
savedSearchScriptId = "customsearch_web_assemblysearchmainlist"
};
// Filter by ID if specified
if (id != "")
{
tsa.criteria = new TransactionSearch()
{
basic = new TransactionSearchBasic()
{
internalId = new SearchMultiSelectField
{
@operator = SearchMultiSelectFieldOperator.anyOf,
operatorSpecified = true,
searchValue = new[] {
new RecordRef() {
type = RecordType.assemblyBuild,
typeSpecified = true,
internalId = id
}
}
}
}
};
}
// Construct custom columns to return
var tsr = new TransactionSearchRow();
var tsrb = new TransactionSearchRowBasic();
var orderIdCols = new SearchColumnSelectField[1];
var orderIdCol = new SearchColumnSelectField();
orderIdCols[0] = orderIdCol;
tsrb.internalId = orderIdCols;
var tranDateCols = new SearchColumnDateField[1];
var tranDateCol = new SearchColumnDateField();
tranDateCols[0] = tranDateCol;
tsrb.tranDate = tranDateCols;
var serialNumberCols = new SearchColumnStringField[1];
var serialNumberCol = new SearchColumnStringField();
serialNumberCols[0] = serialNumberCol;
tsrb.serialNumbers = serialNumberCols;
// Perform the Search
tsr.basic = tsrb;
tsa.columns = tsr;
var response = service.search(tsa);
// Process response
if (response.status.isSuccess)
{
var searchRows = response.searchRowList;
if (searchRows != null && searchRows.Length >= 1)
{
foreach (SearchRow t in searchRows)
{
var transactionRow = (TransactionSearchRow)t;
listAssemblyBuilds.Add(GetAssemblyBuildsResult(transactionRow, getDetails));
}
}
}
// Parse and return NetSuite WorkOrder into assembly WorkOrderResult list
return listAssemblyBuilds;
}
private static string GetAssemblyBuildLotNumbers(string id)
{
var service = Service.Context();
var serialNumbers = "";
var tsa = new TransactionSearchAdvanced
{
savedSearchScriptId = "customsearch_web_assemblysearchlineitems"
};
service.searchPreferences = new SearchPreferences { bodyFieldsOnly = false };
tsa.criteria = new TransactionSearch()
{
basic = new TransactionSearchBasic()
{
internalId = new SearchMultiSelectField
{
@operator = SearchMultiSelectFieldOperator.anyOf,
operatorSpecified = true,
searchValue = new[] {
new RecordRef() {
type = RecordType.assemblyBuild,
typeSpecified = true,
internalId = id
}
}
}
}
};
// Construct custom columns to return
var tsr = new TransactionSearchRow();
var tsrb = new TransactionSearchRowBasic();
var orderIdCols = new SearchColumnSelectField[1];
var orderIdCol = new SearchColumnSelectField();
orderIdCols[0] = orderIdCol;
tsrb.internalId = orderIdCols;
var serialNumberCols = new SearchColumnStringField[1];
var serialNumberCol = new SearchColumnStringField();
serialNumberCols[0] = serialNumberCol;
tsrb.serialNumbers = serialNumberCols;
tsr.basic = tsrb;
tsa.columns = tsr;
var response = service.search(tsa);
if (response.status.isSuccess)
{
var searchRows = response.searchRowList;
if (searchRows != null && searchRows.Length >= 1)
{
foreach (SearchRow t in searchRows)
{
var transactionRow = (TransactionSearchRow)t;
if (transactionRow.basic.serialNumbers != null)
{
return transactionRow.basic.serialNumbers[0].searchValue;
}
}
}
}
return serialNumbers;
}
private static AssemblyBuildResult GetAssemblyBuildsResult(TransactionSearchRow tsr, bool getDetails)
{
if (tsr != null)
{
var assemblyInfo = new AssemblyBuildResult
{
NetSuiteId = tsr.basic.internalId[0].searchValue.internalId,
ManufacturedDate = tsr.basic.tranDate[0].searchValue,
SerialNumbers = tsr.basic.serialNumbers[0].searchValue
};
// If selected, this will do additional NetSuite queries to get detailed data (slower)
if (getDetails)
{
// Look up Lot Number
assemblyInfo.LotNumber = GetAssemblyBuildLotNumbers(tsr.basic.internalId[0].searchValue.internalId);
}
return assemblyInfo;
}
return null;
}
What I learned about pulling data from NetSuite:
Using SavedSearches is the best method to pull data that doesn't automatically come through in the API objects
It is barely supported
Don't specify an ID on the SavedSearch, specify a criteria in the TransactionSearch to get one record
You will need to specify which columns to actually pull down. NetSuite doesn't just send you the data from a SavedSearch automatically
You cannot view data in a SavedSearch that contains a Grouping
In the Saved Search, use the criteria Main Line = true/false to read data from the main record (top of the UI screen) or from the line items (bottom of the screen); a sketch of expressing the same thing through SuiteTalk follows below.
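As a rough sketch of that last point through SuiteTalk: the mainLine criterion and the bodyFieldsOnly preference are standard SuiteTalk members, but this exact usage is illustrative rather than lifted from the code above (Service.Context() is the helper used earlier):
var service = Service.Context();
// Ask NetSuite to return full records rather than body-only fields.
service.searchPreferences = new SearchPreferences { bodyFieldsOnly = false };

var search = new TransactionSearch
{
    basic = new TransactionSearchBasic
    {
        // mainLine = false returns the line items; true returns only the main (body) record.
        mainLine = new SearchBooleanField { searchValue = false, searchValueSpecified = true }
    }
};
var response = service.search(search);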
I was wondering if OrmLite has a QueryMultiple solution like Dapper's.
My use case is in getting paged results.
return new {
    Posts = conn.Select<Post>(q => q.Where(p => p.Tag == "Chris").Limit(20, 10)),
    TotalPosts = conn.Count<Post>(q.Where(p => p.Tag == "Chris"))
};
I also have a few other cases where I'm calculating some other stats in addition to a main query, and I'm keen to avoid multiple roundtrips.
(Probably unrelated, but I'm using PostgreSQL)
You can probably do something like this:
var bothThings = db.Exec(cmd => {
    cmd.CommandText = @"
        select * from TableA;
        select * from TableB;";
    var both = new BothAandB();
    using (var reader = cmd.ExecuteReader())
    {
        both.a = reader.ConvertToList<A>();
        reader.NextResult();
        both.b = reader.ConvertToList<B>();
    }
    return both;
});
It might be possible to wrap this up in an extension method, but nothing clever is coming to mind.
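For what it's worth, here's a rough sketch of what such an extension method could look like, reusing the ConvertToList<T> reader extension from the snippet above and assuming it returns a List<T>; the method name, the fixed two-result-set shape, and the Tuple return type are my own choices, not an OrmLite API:
using System;
using System.Collections.Generic;
using System.Data;

public static class MultiQueryExtensions
{
    // Runs a batch containing two SELECT statements and maps each result set to a typed list.
    public static Tuple<List<T1>, List<T2>> SelectMulti<T1, T2>(this IDbCommand cmd, string sql)
    {
        cmd.CommandText = sql;
        using (var reader = cmd.ExecuteReader())
        {
            var first = reader.ConvertToList<T1>();
            reader.NextResult();
            var second = reader.ConvertToList<T2>();
            return Tuple.Create(first, second);
        }
    }
}
That would let the paged-results example above collapse into a single db.Exec call that returns both the page of posts and the total count.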
You can create some helper OrmLite extensions (works in v 3.9.55.0) pretty easily that will NOT wrap the reader. It is rather easy since the methods you need are public. Here is how I did it.
public static class MultiResultReaderOrmLiteExtensions
{
    public static IList CustomConvertToList<T>(this IDataReader dataReader)
    {
        var modelDef = ModelDefinition<T>.Definition;
        var type = typeof(T);
        var fieldDefs = modelDef.AllFieldDefinitionsArray;
        var listInstance = typeof(List<>).MakeGenericType(type).CreateInstance();
        var to = (IList)listInstance;
        var indexCache = dataReader.GetIndexFieldsCache(modelDef);
        while (dataReader.Read())
        {
            var row = type.CreateInstance();
            row.PopulateWithSqlReader(dataReader, fieldDefs, indexCache);
            to.Add(row);
        }
        return to;
    }

    public static Dictionary<string, int> GetIndexFieldsCache(this IDataReader reader,
        ModelDefinition modelDefinition = null)
    {
        var cache = new Dictionary<string, int>();
        if (modelDefinition != null)
        {
            foreach (var field in modelDefinition.IgnoredFieldDefinitions)
            {
                cache[field.FieldName] = -1;
            }
        }
        for (var i = 0; i < reader.FieldCount; i++)
        {
            cache[reader.GetName(i)] = i;
        }
        return cache;
    }
}
Then you can call it like this:
using (var db = _connectionFactory.OpenDbConnection())
{
    var cmd = db.api_GetSprocWithMultResults(id);
    using (IDataReader reader = cmd.DbCommand.ExecuteReader())
    {
        meta = reader.CustomConvertToList<Element_Media_Meta>().Cast<Element_Media_Meta>().ToList();
        reader.NextResult();
        queues = reader.CustomConvertToList<Element_Media_ProcessQueue>().Cast<Element_Media_ProcessQueue>().ToList();
    }
}
I need to get the relative path from one directory to another file, e.g. myfunc(from: c:\my\dir, to: c:\my\other\file.ext) ==> ..\other\file.ext.
new Uri() need not apply, unless there's a remedy for it returning URI format rather than Windows filename format; .LocalPath fails.
This should do what you want.
string firstDirectory = "c:\\my\\dir";
string secondDirectory = "c:\\my\\other\\file.ext";

var first = firstDirectory.Split('\\');
var second = secondDirectory.Split('\\');

var directoriesToGoBack = first.Except(second);
var directoriesToGoForward = second.Except(first);

StringBuilder directory = new StringBuilder();
bool initial = true;
foreach (string s in directoriesToGoBack)
{
    if (initial)
    {
        initial = false;
    }
    else
    {
        directory.Append('\\');
    }
    directory.Append("..");
}
foreach (string s in directoriesToGoForward)
{
    directory.Append('\\');
    directory.Append(s);
}
Console.WriteLine(directory.ToString());
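Worth noting: Except is set-based, so while this prints ..\other\file.ext for the example above, it can drop segments when the same folder name appears in both paths at different depths. A more conservative sketch that compares segments positionally is below (the class and method names are made up; it assumes absolute Windows paths on the same drive):
using System;
using System.Collections.Generic;

static class RelativePathHelper
{
    public static string GetRelativePath(string fromDirectory, string toFile)
    {
        var from = fromDirectory.TrimEnd('\\').Split('\\');
        var to = toFile.Split('\\');

        // Length of the common leading segments, compared case-insensitively.
        int common = 0;
        while (common < from.Length && common < to.Length &&
               string.Equals(from[common], to[common], StringComparison.OrdinalIgnoreCase))
        {
            common++;
        }

        var parts = new List<string>();
        for (int i = common; i < from.Length; i++) parts.Add("..");  // climb out of 'from'
        for (int i = common; i < to.Length; i++) parts.Add(to[i]);   // descend into 'to'

        return string.Join("\\", parts);
    }
}

// RelativePathHelper.GetRelativePath(@"c:\my\dir", @"c:\my\other\file.ext") returns "..\other\file.ext"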
I have a method that calls a stored procedure and returns the data after executing a DataReader.
I am trying to test the method using a mock, but I am not sure how to return a value.
Has anyone done this? I'd appreciate your responses.
Here is my code:
// Call the StoredProcedure
public List<string> GetCompletedBatchList(int fileId)
{
    List<string> completedBatches = new List<string>();
    StoredProcedure sp = new StoredProcedure("GetDistributedBatches", this.dataProvider);
    sp.Command.AddParameter("FileID", fileId, DbType.Int32, ParameterDirection.Input);
    sp.Command.AddParameter("Result", null, DbType.Int32, ParameterDirection.InputOutput);
    using (var rdr = sp.ExecuteReader())
    {
        while (rdr != null && rdr.Read())
        {
            if (rdr[0] != null)
            {
                completedBatches.Add(rdr[0].ToString());
            }
        }
    }
    return completedBatches;
}
Here is the Test Method:
[Test]
public void Can_get_completedBatches()
{
    var file = new File() { FileID = 1, DepositDate = DateTime.Now };
    repo.Add<File>(file);
    CompletedBatches completedBatches = new CompletedBatches(provider.Object);
    // Here I am not sure how to Return
    provider.Setup(x => x.ExecuteReader(It.IsAny<QueryCommand>())).Returns(cmd =>
    {
        cmd.OutputValues.Add(0);
    });
    var completedBatchesList = completedBatches.GetCompletedBatchList(file.FileID);
    Assert.AreEqual(0, completedBatchesList.Count());
}
If you want to create a DataReader of a certain shape and size, then I suggest you look at DataTable.CreateDataReader. You can then set up the ExecuteReader in your example to return this data reader.
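For example, a minimal sketch of that idea; the column name and the provider/QueryCommand types are borrowed from the question, and the row value is made up:
// Build an in-memory table shaped like the stored procedure's result set,
// then hand its DataTableReader to the mocked provider.
var table = new DataTable();
table.Columns.Add("BatchID", typeof(string));
table.Rows.Add("1767");

provider
    .Setup(x => x.ExecuteReader(It.IsAny<QueryCommand>()))
    .Returns(table.CreateDataReader());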
The following link helped me:
How to mock an SqlDataReader using Moq - Update
I used the MockDbDataReader method from it to mock the data:
[Test]
public void Can_get_completedBatches_return_single_batch()
{
    var date = DateTime.Now;
    var file = new File() { FileID = 202, DepositDate = DateTime.Now };
    var batch1 = new Batch() { FileID = 202, BatchID = 1767, LockboxNumber = "1", IsLocked = true, LockedBy = "testUser" };
    var transaction1 = new Transaction() { BatchID = 1767, TransactionID = 63423, CheckAmount = 100.0 };
    var distribution1 = new Distribution() { TransactionID = 63423, InvoiceNumber = "001", Amount = 100.0, DateCreated = date, DateModified = date, TransType = 2 };
    repo.Add<File>(file);
    repo.Add<Batch>(batch1);
    repo.Add<Transaction>(transaction1);
    repo.Add<Distribution>(distribution1);
    CompletedBatches completedBatches = new CompletedBatches(provider.Object);
    provider.Setup(x => x.ExecuteReader(It.IsAny<QueryCommand>())).Returns(MockDbDataReader());
    var completedBatchesList = completedBatches.GetCompletedBatchList(202);
    Assert.AreEqual(1, completedBatchesList.Count());
}
// You should pass here a list of test items, their data
// will be returned by IDataReader
private DbDataReader MockDbDataReader(List<TestData> objectsToEmulate)
{
    var moq = new Mock<DbDataReader>();
    // This var stores the current position in the 'objectsToEmulate' list
    int count = -1;
    moq.Setup(x => x.Read())
        // Return 'True' while the list still has an item
        .Returns(() => count < objectsToEmulate.Count - 1)
        // Go to next position
        .Callback(() => count++);
    moq.Setup(x => x["BatchID"])
        // Again, use lazy initialization via lambda expression
        .Returns(() => objectsToEmulate[count].ValidChar);
    return moq.Object;
}
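For completeness, a hypothetical call wiring that helper into the setup shown earlier; TestData and ValidChar come from the linked answer, and the sample value here is made up (note the test above calls MockDbDataReader() without an argument, so adjust it to pass the list you want emulated):
var rows = new List<TestData> { new TestData { ValidChar = "1767" } };
provider
    .Setup(x => x.ExecuteReader(It.IsAny<QueryCommand>()))
    .Returns(MockDbDataReader(rows));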