For some reason the TimeSpan property on my class is not being persisted to the database by SubSonic; it is simply being ignored. All other properties are saved fine. I am using SimpleRepository with RunMigrations, SubSonic v3.0.0.3.
public TimeSpan Time { get; set; }
Are TimeSpans not supported?
TimeSpan does not map to a valid SQL Server 2005/2008 data type.
Store it as a numeric SQL data type instead. Convert your TimeSpan to an appropriate duration based on the accuracy you require:
// Define an interval of 1 day, 15 hours, 42 minutes, 45.75 seconds.
TimeSpan interval = new TimeSpan(1, 15, 42, 45, 750);
Console.WriteLine("Value of TimeSpan: {0}", interval);
Console.WriteLine("{0:N5} minutes, as follows:", interval.TotalMinutes);
Beware that there is a distinct difference between interval.Minutes and interval.TotalMinutes!
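For the interval above, interval.Minutes returns 42 (just the minutes component), while interval.TotalMinutes returns 2382.7625 (the whole interval expressed in minutes).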
http://msdn.microsoft.com/en-us/library/system.timespan.totalminutes.aspx
Thus:
// My duration, truncated to whole minutes (TotalMinutes returns a double)
int duration = (int)Time.TotalMinutes;
// Now insert this value into the database
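Reading the value back is the reverse conversion; a minimal sketch, with illustrative names:
// Reconstruct the TimeSpan from the numeric value loaded from the database
int storedMinutes = 105; // e.g. the value read from the duration column
TimeSpan time = TimeSpan.FromMinutes(storedMinutes);
Console.WriteLine(time); // prints 01:45:00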
We have a customer who has requested that the "Duration" or "End Time" field (whichever makes more sense) be rounded up to the nearest quarter hour on the Service Appointment Entry screen.
With the service appointments there's a lot going on. Does anyone have any thoughts on the best way to approach this? There are a lot of events that modify the duration and end time, and I wouldn't want to have to modify all of them. I'm wondering if it's possible to just modify something on the DAC to automatically round the end time up to the quarter hour.
I got pretty far with modifying the DAC. There are no errors in the code, and as I step through it I can see it's rounding the time like I want it to. However, the field isn't being set on the screen. Am I missing something silly?
namespace PX.Objects.FS
{
    public class FSAppointmentDetServiceExt : PXCacheExtension<PX.Objects.FS.FSAppointmentDetService>
    {
        #region ActualDateTimeEnd
        [PXDBDateAndTime(UseTimeZone = true, PreserveTime = true, DisplayNameDate = "Actual Date End", DisplayNameTime = "Actual Time End - Nicole")]
        [PXUIField(DisplayName = "Actual Date", Visibility = PXUIVisibility.SelectorVisible)]
        public virtual DateTime? ActualDateTimeEnd
        {
            get
            {
                return this._ActualDateTimeEnd;
            }
            set
            {
                // Round any assigned end time up to the next quarter hour
                this._ActualDateTimeEnd = RoundUp(value, TimeSpan.FromMinutes(15));
            }
        }
        #pragma warning disable PX1026 // Underscores cannot be used in DAC declarations
        public DateTime? _ActualDateTimeEnd;
        #pragma warning restore PX1026 // Underscores cannot be used in DAC declarations

        public static DateTime? RoundUp(DateTime? dt, TimeSpan d)
        {
            // Test the nullable input directly; Convert.ToDateTime(null) silently
            // yields DateTime.MinValue, so the old "nt != null" check could never fail.
            if (dt == null)
            {
                return null;
            }
            DateTime nt = dt.Value;
            // Round the tick count up to the next multiple of the interval
            return new DateTime((nt.Ticks + d.Ticks - 1) / d.Ticks * d.Ticks, nt.Kind);
        }
        #endregion
    }
}
I'm wondering if it's possible to just modify something on the DAC to automatically round the end time up to the quarter hour.
Yes, this is possible, and I would recommend it since it's the easiest (though perhaps not the best) approach.
Create a DAC extension in the CODE section, and override (replace) the date fields you need to round. Add a backing field to the property and modify the setter to round the time.
Example:
[PXDBDateAndTime(UseTimeZone = true, PreserveTime = true, DisplayNameDate = "Actual Date Time Begin", DisplayNameTime = "Actual Start Time")]
[PXUIField(DisplayName = "Actual Date", Visibility = PXUIVisibility.SelectorVisible)]
public virtual DateTime? ActualDateTimeBegin
{
    get { return _ActualDateTimeBegin; }
    set
    {
        _ActualDateTimeBegin = RoundTime(value);
    }
}
public DateTime? _ActualDateTimeBegin;

public DateTime? RoundTime(DateTime? dateTime)
{
    // Return the rounded DateTime; here, rounded up to the nearest quarter hour
    if (dateTime == null) return null;
    long interval = TimeSpan.FromMinutes(15).Ticks;
    return new DateTime((dateTime.Value.Ticks + interval - 1) / interval * interval, dateTime.Value.Kind);
}
I have a problem with the cardinality estimator service in Hazelcast.
In the following code, I add visitors to a cardinality estimator; messageID is unique. In the test environment, this code filled it with 5K unique items.
public void AppendVisitor(String messageID, String visitor) {
    CardinalityEstimator cs = this.hazelcast.getCardinalityEstimator(messageID);
    cs.add(visitor);
    this.viewsList.putIfAbsent(messageID, Long.valueOf(System.currentTimeMillis() / 1000L));
}
The expiration listener for viewsList is written in another class. The entry event key is the messageID from the previous code. But when this function is called, visit is 0. It seems to get another object that is empty.
@Override
public void entryEvicted(EntryEvent<String, Long> entryEvent) {
    // The evicted key is the messageID used when the estimator was populated
    long visit = hazelcast.getCardinalityEstimator(entryEvent.getKey()).estimate();
}
I’m using contract-based SOAP APIs to try to import about 25,000 Journal Entry lines from a banking system into a single Acumatica GL batch.
If I try to add all the records at once to the same GL batch, my
request times out after a few hours. Since it uses the same GL
batch, this solution does not leverage multi-threading.
I've also tried adding the 25,000 lines one at a time to a single GL batch; the requests do not time out, but performance starts degrading significantly after approximately 3,000 records have been added to the GL batch. This process takes several hours to run, and since it uses the same GL batch, this solution does not leverage multi-threading.
I also looked into multi-threading to import the data as several smaller GL batches of 5,000 lines each, and that works without any timeout issues, but it still takes about an hour and a half to run. Also, the customer does not accept this multi-batch approach; they want all their daily data in a single GL batch.
25,000 records does not seem like a lot to me, so I wonder if Acumatica's APIs were simply not built for this volume of lines in a single transaction. All I'm doing in my code is building the entity info by reading a text file and then calling the Put method to create the GL batch using that entity with 25,000 line records.
I've read a couple of articles about optimizing the APIs, but they primarily deal with different instances of an entity, as in several different GL batches or several different Stock Items for example. In those cases, multi-threading is a great asset because you can have multiple threads creating multiple "different" GL batches, but multi-threading is not helpful when updating the same GL batch.
Here's what I've read so far:
https://asiablog.acumatica.com/2016/12/optimizing-large-import.html
https://adn.acumatica.com/blog/contractapioptimization/
I'm at a loss here, so any pointers would be greatly appreciated.
I look forward to your response.
Here's my code:
public static void CreateMultipleLinesPerJournalEntryBatchContractTEST(MyStoreContract.DefaultSoapClient soapClient, List<JournalEntry> journalEntries)
{
    string myModuleForBatchLookup = "GL";
    // List holding the values of all the records belonging to the batch in process
    List<JournalEntry> allBatchItems = journalEntries;
    // List used to store objects in the format required by Acumatica
    List<MyStoreContract.JournalTransactionDetail> myJournalTransactionsFormatted = new List<MyStoreContract.JournalTransactionDetail>();
    try
    {
        // Create a header and return a batch number to be used for all line iterations
        JournalEntry myHeaderJournalEntryContract = allBatchItems.First();
        string myBatchNumberToProcess = AddGLBatchHeaderContractTEST(soapClient, myHeaderJournalEntryContract);
        // Move each item from the original unformatted list to the formatted list
        foreach (JournalEntry je in allBatchItems)
        {
            myJournalTransactionsFormatted.Add(new MyStoreContract.JournalTransactionDetail
            {
                BranchID = new MyStoreContract.StringValue { Value = je.Branch },
                Account = new MyStoreContract.StringValue { Value = je.Account },
                Subaccount = new MyStoreContract.StringValue { Value = je.Subaccount },
                ReferenceNbr = new MyStoreContract.StringValue { Value = je.RefNumber },
                DebitAmount = new MyStoreContract.DecimalValue { Value = je.DebitAmount },
                CreditAmount = new MyStoreContract.DecimalValue { Value = je.CreditAmount },
                TransactionDescription = new MyStoreContract.StringValue { Value = je.TransactionDescription },
                UsrTransactionTime = new MyStoreContract.StringValue { Value = je.UsrTransactionTime },
                UsrTransactionType = new MyStoreContract.StringValue { Value = je.UsrTransactionType },
                UsrTranSequence = new MyStoreContract.StringValue { Value = je.UsrTranSequence },
                UsrTellerID = new MyStoreContract.StringValue { Value = je.UsrTellerID }
            });
        }
        // Specify the values of the new Journal Entry using all the collected line details
        MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
        {
            // These two search fields look up the batch created above so it can be updated
            BatchNbr = new MyStoreContract.StringSearch { Value = myBatchNumberToProcess },
            Module = new MyStoreContract.StringSearch { Value = myModuleForBatchLookup },
            // This line adds the array containing all the line details
            Details = myJournalTransactionsFormatted.ToArray()
        };
        soapClient.Put(journalToBeCreated);
        Console.WriteLine("Added " + allBatchItems.Count.ToString() + " line transactions");
        Console.WriteLine();
        Console.WriteLine("Press any key to continue");
        Console.ReadLine();
    }
    catch (Exception e)
    {
        Console.WriteLine("The following error was encountered and all entries for this batch need to be logged in the error table");
        Console.WriteLine();
        Console.WriteLine(e.Message);
        Console.WriteLine();
        Console.WriteLine("Press any key to continue");
        Console.ReadLine();
    }
}
public static string AddGLBatchHeaderContractTEST(MyStoreContract.DefaultSoapClient soapClient, JournalEntry je)
{
    try
    {
        // Specify the values of a new Journal Entry batch header
        MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
        {
            BranchID = new MyStoreContract.StringValue { Value = "PRODWHOLE" }, // This is the default branch
            TransactionDate = new MyStoreContract.DateTimeValue { Value = je.TransactionDate.AddDays(-1) }, // Reduced by 1 day for the batch
            CurrencyID = new MyStoreContract.StringValue { Value = je.CurrencyCode }, // Currency to be used for the batch
            Description = new MyStoreContract.StringValue { Value = je.TransactionDescription },
            Hold = new MyStoreContract.BooleanValue { Value = true }
        };
        // Create a Journal Entry with the specified values and return its batch number
        MyStoreContract.JournalTransaction newJournalTransaction = (MyStoreContract.JournalTransaction)soapClient.Put(journalToBeCreated);
        return newJournalTransaction.BatchNbr.Value;
    }
    catch (Exception e)
    {
        Console.WriteLine("Error was caught while trying to create the header for the batch...");
        Console.WriteLine();
        Console.WriteLine(e);
        Console.WriteLine();
        return null;
    }
}
My custom class for legacy system line items, which I then need to format into Acumatica's format:
class JournalEntry
{
    public DateTime TransactionDate { get; set; }
    public string CurrencyCode { get; set; }
    public string Description { get; set; }
    public string Branch { get; set; }
    public string Account { get; set; }
    public string Subaccount { get; set; }
    public string RefNumber { get; set; }
    public decimal DebitAmount { get; set; }
    public decimal CreditAmount { get; set; }
    public string TransactionDescription { get; set; }
    // Custom fields added for the customer
    public string UsrTellerID { get; set; }
    public string UsrTransactionType { get; set; }
    public string UsrTransactionTime { get; set; }
    public string UsrTranSequence { get; set; }
    // Original file data for the line
    public string FileLineData { get; set; }
}
I tried Yuriy's approach described below, but my custom fields are not updating; only standard fields are. Which command should I use to update the extension (custom) fields? See the code below:
// Here I create an instance of GLTran
GLTran row = graph.GLTranModuleBatNbr.Cache.CreateInstance() as GLTran;
// Here I get a handle to the DAC extension GLTranExt to be able to use the added fields
var rowExt = row.GetExtension<GLTranExt>();
row = graph.GLTranModuleBatNbr.Insert(row);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "AccountID", JE.Account);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "SubID", JE.Subaccount);
row.TranDesc = "my line description";
row.Qty = 1.0m;
row.CuryDebitAmt = JE.DebitAmount;
row.CuryCreditAmt = JE.CreditAmount;
rowExt.UsrTellerID = "Test teller";
rowExt.UsrTransactionTime = "Test Transaction Time";
rowExt.UsrTransactionType = "Test Transaction Type";
rowExt.UsrTranSequence = "Test Transaction Sequence";
row = graph.GLTranModuleBatNbr.Update(row);
graph.Actions.PressSave();
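A hedged note for anyone hitting the same symptom: GLTranModuleBatNbr.Insert(row) returns the row instance that actually lives in the cache, so an extension handle taken from the pre-insert instance may point at an object the graph never saves. A minimal sketch of taking the handle after the Insert instead:
GLTran row = graph.GLTranModuleBatNbr.Cache.CreateInstance() as GLTran;
row = graph.GLTranModuleBatNbr.Insert(row);
// Get the extension from the inserted (cached) instance before setting custom fields
var rowExt = row.GetExtension<GLTranExt>();
rowExt.UsrTellerID = "Test teller";
row = graph.GLTranModuleBatNbr.Update(row);
graph.Actions.PressSave();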
With multi-threaded import of Sales Orders I got 18,000 lines per hour (4 cores, 32 GB RAM), so your 25,000 is very similar to what I got (each Sales Order had 1-6 lines). For the second link that you provided, what were the parameters of your API call, and what were the specs of your Acumatica instance (CPU, RAM, SQL Server parameters)?
I propose that you consider scaling Acumatica horizontally and also scaling your database via SQL sharding.
Edit
In case you need to have one GL batch with 25,000 lines in it, I propose the following workaround:
1. Create one more Acumatica page that has a text box and an Import button.
2. In the code of the Import button:
2.1 Read the text box content as XML (or JSON).
2.2 Create an instance of the GL graph.
2.3 Insert the needed number of lines (in your case 25,000) via the graph.
2.4 Call graph.Actions.PressSave().
3. Send your Web API request not to GL Batch but to the page you created, as in the sketch below.
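A minimal sketch of steps 2.2-2.4, assuming the stock JournalEntry graph and GLTran DAC; ParseLines, ImportedLine, and importText are hypothetical stand-ins for however the page reads its text box:
// 2.2: create an instance of the GL Journal Entry graph
JournalEntry graph = PXGraph.CreateInstance<JournalEntry>();
foreach (ImportedLine line in ParseLines(importText)) // hypothetical text-box parser
{
    // 2.3: insert each line through the graph
    GLTran row = graph.GLTranModuleBatNbr.Insert();
    graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "AccountID", line.Account);
    graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "SubID", line.Subaccount);
    row.CuryDebitAmt = line.DebitAmount;
    row.CuryCreditAmt = line.CreditAmount;
    graph.GLTranModuleBatNbr.Update(row);
}
// 2.4: one PressSave persists the whole batch in a single transaction
graph.Actions.PressSave();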
I know this is an old question, but I am answering here for the benefit of anyone who stumbles across this page. There's a performance issue in the Journal Transaction screen where the time to create a transaction increases non-linearly with the number of rows to insert.
A customization-based workaround was provided to us by Acumatica support, which significantly improved performance. I don't have the exact version where this fix was included, but builds newer than today (September 2021) should already include it.
Customization fix:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using PX.Api;
using PX.Data;
using PX.Common;
using PX.Objects.Common;
using PX.Objects.Common.Extensions;
using PX.Objects.CS;
using PX.Objects.CM;
using PX.Objects.CA;
using PX.Objects.Common.Bql;
using PX.Objects.Common.GraphExtensions.Abstract;
using PX.Objects.Common.GraphExtensions.Abstract.DAC;
using PX.Objects.Common.GraphExtensions.Abstract.Mapping;
using PX.Objects.GL.DAC;
using PX.Objects.GL.FinPeriods;
using PX.Objects.GL.JournalEntryState;
using PX.Objects.GL.JournalEntryState.PartiallyEditable;
using PX.Objects.GL.Overrides.PostGraph;
using PX.Objects.GL.Reclassification.UI;
using PX.Objects.PM;
using PX.Objects.TX;
using PX.Objects.Common.Tools;
using PX.Objects.GL.DAC.Abstract;
using PX.Objects.Common.EntityInUse;
using PX.Objects.GL.FinPeriods.TableDefinition;
using PX.Data.SQLTree;
using PX.Objects.CR;
using PX.Data.BQL.Fluent;
using PX.Data.BQL;
using PX.Objects;
using PX.Objects.GL;
namespace PX.Objects.GL
{
    public class JournalEntry_Extension : PXGraphExtension<JournalEntry>
    {
        public delegate void PopulateSubDescrDelegate(PXCache sender, GLTran Row, Boolean ExternalCall);

        [PXOverride]
        public void PopulateSubDescr(PXCache sender, GLTran Row, Boolean ExternalCall, PopulateSubDescrDelegate baseMethod)
        {
            // Skip the per-row base logic during import/export and contract-based API calls
            if (Base.IsImport || Base.IsExport || Base.IsContractBasedAPI)
            {
                return;
            }
            baseMethod(sender, Row, ExternalCall);
        }
    }
}
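With this extension in place, the base JournalEntry.PopulateSubDescr logic, which the graph otherwise runs for every inserted row, is skipped during import, export, and contract-based API calls; that per-row work appears to be what made insert time grow non-linearly.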
I am using the same method for my time picker as I am for my date picker. My date picker is working perfectly fine, but the time dialog keeps showing me an error. I have looked into similar issues, but none of them match my situation.
This is the code for the time picker dialog:
timePickerDialog = new TimePickerDialog(AddReminderActivity.this,
        new TimePickerDialog.OnTimeSetListener() {
            @Override
            public void onTimeSet(TimePicker view, int hourOfDay, int minute) {
                Calendar calendar = Calendar.getInstance();
                // set(field, value): assign the chosen hour and minute to the calendar
                calendar.set(Calendar.HOUR_OF_DAY, hourOfDay);
                calendar.set(Calendar.MINUTE, minute);
                SimpleDateFormat format = new SimpleDateFormat("hh:mm a");
                String timeString = format.format(calendar.getTime());
                r_time.setText(timeString);
            }
        }, mHour, mMinute);
timePickerDialog.show();
Here is also the error I keep getting:
Cannot resolve constructor 'TimePickerDialog(json.google_services.newreminderapp.AddReminderActivity, anonymous android.app.TimePickerDialog.OnTimeSetListener, int, int)'
Thank you.
After mMinute, add true or false as another parameter: true if you want 24-hour time, false if you want 12-hour time.
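For example (a one-line change to the constructor call from the question; timeSetListener stands in for the anonymous listener shown above):
timePickerDialog = new TimePickerDialog(AddReminderActivity.this,
        timeSetListener, mHour, mMinute, false); // false = 12-hour clock with AM/PM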
I am trying to perform an Azure Table query.
My table (which stores logs) has thousands of rows of data, and it gets populated with more every second.
Right now I have only one partition key, but that doesn't affect the next question.
How can I get back, let's say, only the 100 latest results?
This is my entity:
public MainServerLogEntity(string Message)
{
    this.PartitionKey = "mainserverlogs";
    // Reverse-chronological RowKey: newer entities sort first within the partition
    this.RowKey = (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString();
    this.Message = Message;
    this.Date = DateTime.UtcNow;
}
public MainServerLogEntity() { }
public string Message { get; set; }
public DateTime Date { get; set; }
Right now this is the query I am performing inside a Web API controller:
[Route("MainServerLogs")]
[HttpGet]
public IEnumerable<MainServerLogEntity> GetMainServerLogs()
{
    CloudTable table = AzureStorageHelpers.GetWebApiTable(connectionString, "mainserverlogs");
    TableQuery<MainServerLogEntity> query = new TableQuery<MainServerLogEntity>()
        .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "mainserverlogs"));
    return table.ExecuteQuery(query);
}
But the problem is that I am getting a lot of data back, and I am requesting this API every few seconds in order to update the UI.
What should I do? Is it possible to specify in the query that I only want the first 100 rows?
If that is not possible, what other technique should I use?
Try implementing a .Take(100) on the query like so:
[Route("MainServerLogs")]
[HttpGet]
public IEnumerable<MainServerLogEntity> GetMainServerLogs()
{
    CloudTable table = AzureStorageHelpers.GetWebApiTable(connectionString, "mainserverlogs");
    TableQuery<MainServerLogEntity> query = new TableQuery<MainServerLogEntity>()
        .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "mainserverlogs"))
        .Take(100);
    return table.ExecuteQuery(query);
}