How to read last message in MessageQueue - c#-4.0

I am trying to read the messages in a MessageQueue, starting from the newest one. I know I can use GetAllMessages() and then loop through, but that is quite slow when there is a large number of messages in the queue. Is there a better way to do this?
Thanks

In MSMQ 3 there is a feature that may be what you're looking for. The ReceiveByLookupId method can get the last message on the queue.
http://msdn.microsoft.com/en-us/library/3w50th9h.aspx
You would use the following action:
MessageLookupAction.Last: Receives the last message in the queue and
removes it from the queue. The lookupId parameter must be set to 0.
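A minimal sketch of that call (the queue path and the PersonDetails formatter are assumed from the example further down, not part of the documentation):
using System.Messaging;

var queue = new MessageQueue(@".\private$\PersonQueue");
queue.Formatter = new XmlMessageFormatter(new[] { typeof(PersonDetails) });
// Receives (and removes) the newest message; use PeekByLookupId with the same action to leave it on the queue.
Message last = queue.ReceiveByLookupId(MessageLookupAction.Last, 0, MessageQueueTransactionType.None);
var details = (PersonDetails)last.Body;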
If you're using an earlier version of MSMQ, you will probably have to use Peek, similar to how this blog post explains getting a count of the messages in the queue. When you get to the last one, you could then ReceiveById (a sketch of that idea follows the link below).
http://jopinblog.wordpress.com/2008/03/12/counting-messages-in-an-msmq-messagequeue-from-c/
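A rough sketch of the Peek-then-ReceiveById idea (queue path assumed; nothing is removed until the final ReceiveById):
var queue = new MessageQueue(@".\private$\PersonQueue");
Message lastSeen = null;
using (MessageEnumerator cursor = queue.GetMessageEnumerator2())
{
    // The enumerator only peeks, so the queue is left intact while we walk to the end.
    while (cursor.MoveNext())
    {
        lastSeen = cursor.Current;
    }
}
if (lastSeen != null)
{
    Message received = queue.ReceiveById(lastSeen.Id);
}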

Assuming that the performance hit is in the looping through the messages:
There is no need to loop through all the messages if all you want is the last one. Since GetAllMessages() returns an array of type Message, just go to the last element in the array by index, as in the sketch below.
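A minimal sketch of that (queue path and formatter assumed from the example in the other answer):
var queue = new MessageQueue(@".\private$\PersonQueue");
queue.Formatter = new XmlMessageFormatter(new[] { typeof(PersonDetails) });
Message[] messages = queue.GetAllMessages();
if (messages.Length > 0)
{
    // GetAllMessages() still snapshots the whole queue, so this only skips the per-message processing.
    var newest = (PersonDetails)messages[messages.Length - 1].Body;
}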

public class PersonDetails
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public DateTime DateTime { get; set; }
}

const string queueName = @".\private$\PersonQueue";

public void GetMessageFromQueue(string queueName)
{
    MessageQueue personMessageQueue = new MessageQueue(queueName);
    try
    {
        XmlMessageFormatter xmlMessageFormatter =
            new XmlMessageFormatter(new Type[] { typeof(PersonDetails) });
        personMessageQueue.Formatter = xmlMessageFormatter;
        personMessageQueue.Refresh();
        // Receive every message; the last iteration yields the newest one.
        for (int i = personMessageQueue.GetAllMessages().Length; i != 0; i--)
        {
            var personDetailsFromQueue =
                (PersonDetails)personMessageQueue.Receive(MessageQueueTransactionType.Automatic).Body;
            Console.WriteLine("FirstName : {0} \n LastName : {1} \n Date Time : {2}",
                personDetailsFromQueue.FirstName,
                personDetailsFromQueue.LastName,
                personDetailsFromQueue.DateTime);
        }
    }
    finally
    {
        personMessageQueue.Dispose();
    }
}

Delete entity after specified time with JpaRepository

I am using Spring Boot and an H2 database. I have a Product entity and I want my application to be able to remove a product from the database, but my requirement is this: first set the active flag to false (this row will then not be taken into account during fetching), and after a specific period of time completely remove this row from the db.
@Entity
@Table(name = "products")
public class Product {
    @Id
    @GeneratedValue(generator = "inc")
    @GenericGenerator(name = "inc", strategy = "increment")
    private int id;
    private boolean active = true;
    // getters & setters
}
And here is my method from the service layer responsible for setting the active flag to false and, later, the complete deletion (I have nothing yet that does the second part of my requirement - complete deletion after a specific period of time):
@Transactional
public void deleteProduct(int id) {
    var target = repository.findProductById(id)
            .orElseThrow(() -> new IllegalArgumentException("No product with given id"));
    target.setActive(false);
    // what should I add here to remove the target after a specific time?
}
EDIT
OK, I solved my problem:
@Transactional
public void deleteProduct(int id) {
    var target = repository.findProductByIdAndActiveTrue(id)
            .orElseThrow(() -> new IllegalArgumentException("No product with given id"));
    target.setActive(false);
    // complete removal after 150 seconds
    new Thread(() -> {
        try {
            Thread.sleep(150000);
            repository.deleteById(id);
        } catch (Exception e) {
            logger.error("Error removing the product");
        }
    }).start();
}
But now my question is whether this is a safe solution, since users may now start too many threads; I think there is a better (safer in terms of multithreading) solution to my problem.
I am not an expert, but I think what you are trying to achieve is bad practice.
I believe you should use scheduling instead, for example once per day.
Update the active value in the db, then set up a scheduled task that checks the entries and deletes those whose active value is false. Something like this:
public void deleteProduct(int id) {
    // update value to false
    repository.updateProductValue(id, false);
}
and your scheduling method:
@Scheduled(fixedRate = 150000)
public void deleteNonActiveProducts() {
    List<Product> products = repository.findAllByFalseValue();
    products.forEach(product -> repository.deleteById(product.getId()));
}
With this, the task is repeated every 150000 milliseconds, and each execution of the task is independent and non-parallel.
Hope this is useful to you.

Importing Thousands of Records Into Acumatica via SOAP Contract-Based API

I’m using contract-based SOAP APIs to try to import about 25,000 Journal Entry lines from a banking system into a single Acumatica GL batch.
If I try to add all the records at once to the same GL batch, my request times out after a few hours. Since it uses the same GL batch, this solution does not leverage multi-threading.
I've also tried adding the 25,000 lines one line at a time to a single GL batch; the request does not time out, but performance starts degrading significantly after approximately 3,000 records or so have been added to the GL batch. This process takes several hours to run and, since it uses the same GL batch, it does not leverage multi-threading either.
I also looked into multi-threading to import the data into several smaller GL batches of 5,000 lines each, and that works without any timeout issues, but it still takes about an hour and a half to run. Also, the customer does not accept this multi-batch approach; they want all their daily data in a single GL batch.
25,000 records does not seem like a lot to me, so I wonder if Acumatica’s APIs were not built for this volume of lines in a single transaction. All I’m doing in my code is building the entity info by reading a text file and then calling the put method to create the GL batch using that entity with 25,000 line records.
I've read a couple of articles about optimizing the APIs, but they primarily deal with different instances of an entity, as in several different GL batches or several different Stock Items for example. In those cases, multi-threading is a great asset because you can have multiple threads creating multiple "different" GL batches, but multi-threading is not helpful when updating the same GL batch.
Here's what I've read so far:
https://asiablog.acumatica.com/2016/12/optimizing-large-import.html
https://adn.acumatica.com/blog/contractapioptimization/
I'm at a loss here, so any pointers would be greatly appreciated.
I look forward to your response.
Here's my code:
public static void CreateMultipleLinesPerJournalEntryBatchContractTEST(MyStoreContract.DefaultSoapClient soapClient, List<JournalEntry> journalEntries)
{
string myModuleForBatchLookup = "GL";
//list holding the values of all the records belonging to the batch in process
List<JournalEntry> allBatchItems = journalEntries;
//List used to store objects in format required by Acumatica
List<MyStoreContract.JournalTransactionDetail> myJournalTransactionsFormatted = new List<MyStoreContract.JournalTransactionDetail>();
try
{
//Creating a header and returning a batch value to be used for all line iterations.
JournalEntry myHeaderJournalEntryContract = allBatchItems.First();
string myBatchNumberToProcess = AddGLBatchHeaderContractTEST(soapClient, myHeaderJournalEntryContract);
// Do something with the n number of items defined in the processing subBatch size, or the remaining items if smaller
foreach (JournalEntry je in allBatchItems)
{
//Moving the items in each batch from the original unformatted list to the formatted list one at a time
myJournalTransactionsFormatted.Add(new MyStoreContract.JournalTransactionDetail
{
BranchID = new MyStoreContract.StringValue { Value = je.Branch },
Account = new MyStoreContract.StringValue { Value = je.Account },
Subaccount = new MyStoreContract.StringValue { Value = je.Subaccount },
ReferenceNbr = new MyStoreContract.StringValue { Value = je.RefNumber },
DebitAmount = new MyStoreContract.DecimalValue { Value = je.DebitAmount },
CreditAmount = new MyStoreContract.DecimalValue { Value = je.CreditAmount },
TransactionDescription = new MyStoreContract.StringValue { Value = je.TransactionDescription },
UsrTransactionTime = new MyStoreContract.StringValue { Value = je.UsrTransactionTime },
UsrTransactionType = new MyStoreContract.StringValue { Value = je.UsrTransactionType },
UsrTranSequence = new MyStoreContract.StringValue { Value = je.UsrTranSequence },
UsrTellerID = new MyStoreContract.StringValue { Value = je.UsrTellerID }
});
}
//Specify the values of a new Journal Entry using all the collected elements from the batch (list) created
MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
{
//Header data and details added by list generated by loop
BatchNbr = new MyStoreContract.StringSearch { Value = myBatchNumberToProcess }, //This is one of two lines used to lookup/search the batch needing to be updated
Module = new MyStoreContract.StringSearch { Value = myModuleForBatchLookup }, //This is one of two lines used to lookup/search the batch needing to be updated
Details = myJournalTransactionsFormatted.ToArray() // this is the line adding the array containing all the line details
};
soapClient.Put(journalToBeCreated);
Console.WriteLine("Added " + allBatchItems.Count.ToString() + " line transactions");
Console.WriteLine();
Console.WriteLine("Press any key to continue");
Console.ReadLine();
}
catch (Exception e)
{
Console.WriteLine("The following error was encountered and all entries for this batch need to be logged in error table");
Console.WriteLine();
Console.WriteLine(e.Message);
Console.WriteLine();
Console.WriteLine("Press any key to continue");
Console.ReadLine();
}
}
public static string AddGLBatchHeaderContractTEST(MyStoreContract.DefaultSoapClient soapClient, JournalEntry je)
{
try
{
//Specify the values of a new Journal Entry Batch header
MyStoreContract.JournalTransaction journalToBeCreated = new MyStoreContract.JournalTransaction
{
//Header data
BranchID = new MyStoreContract.StringValue { Value = "PRODWHOLE" }, //This is the default branch
TransactionDate = new MyStoreContract.DateTimeValue { Value = je.TransactionDate.AddDays(-1) }, //Reduced 1 day from the batch
CurrencyID = new MyStoreContract.StringValue { Value = je.CurrencyCode }, //Currency to be used for the batch
Description = new MyStoreContract.StringValue { Value = je.TransactionDescription },
Hold = new MyStoreContract.BooleanValue { Value = true }
};
//Create a Journal Entry with the specified values
MyStoreContract.JournalTransaction newJournalTransaction = (MyStoreContract.JournalTransaction)soapClient.Put(journalToBeCreated);
string myBatchToProcess = newJournalTransaction.BatchNbr.Value;
return myBatchToProcess;
}
catch (Exception e)
{
Console.WriteLine("Error was caught while trying to create the header for the batch...");
Console.WriteLine();
Console.WriteLine(e);
Console.WriteLine();
return null;
}
}
My custom class for legacy system line items which I then need format into Acumatica's format:
class JournalEntry
{
public DateTime TransactionDate { get; set; }
public string CurrencyCode { get; set; }
public string Description { get; set; }
public string Branch { get; set; }
public string Account { get; set; }
public string Subaccount { get; set; }
public string RefNumber { get; set; }
public decimal DebitAmount { get; set; }
public decimal CreditAmount { get; set; }
public string TransactionDescription { get; set; }
//Added custom fields for customer
public string UsrTellerID { get; set; }
public string UsrTransactionType { get; set; }
public string UsrTransactionTime { get; set; }
public string UsrTranSequence { get; set; }
//Adding original file data for the line
public string FileLineData { get; set; }
}
I tried Yuriy's approach described below, but my custom fields are not updating; only the standard fields are being updated. Which command should I use to update the extension (custom) fields? See code below:
//Here I create instance of GLTran
GLTran row = graph.GLTranModuleBatNbr.Cache.CreateInstance() as GLTran;
//here I get a handle to graph extension GLTranExt to be able to use the added fields.
var rowExt = row.GetExtension<GLTranExt>();
row = graph.GLTranModuleBatNbr.Insert(row);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "AccountID", JE.Account);
graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "SubID", JE.Subaccount);
row.TranDesc = "my line description";
row.Qty = 1.0m;
row.CuryDebitAmt = (JE.DebitAmount);
row.CuryCreditAmt = (JE.CreditAmount);
rowExt.UsrTellerID = "Test teller";
rowExt.UsrTransactionTime = "Test Transaction Time";
rowExt.UsrTransactionType = "Test Transaction Type";
rowExt.UsrTranSequence = "Test Transaction Sequence";
row = graph.GLTranModuleBatNbr.Update(row);
graph.Actions.PressSave();
In a multi-threaded import of Sales Orders I got 18,000 lines per hour (4 cores, 32 GB RAM), so your 25,000 is very similar to what I got (one Sales Order had 1-6 lines). For the second link that you provided, what were the parameters of your API call, and how many Acumatica instances did you have (CPU, RAM, SQL Server parameters)?
I propose you consider scaling Acumatica horizontally and also scaling your database via SQL sharding.
Edit
In case you need to have one GL Batch with 25,000 lines on it, I propose the following workaround:
1. Create one more Acumatica page that has a text box and an Import button.
2. In the code of the Import button (a rough sketch follows this list):
2.1 Read the text box content as XML (or JSON).
2.2 Create an instance of the GL graph.
2.3 Insert the needed number of lines (in your case 25,000) via the graph.
2.4 Call graph.PressSave().
3. Send your Web API request not to GL Batch but to the page you created.
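A very rough sketch of steps 2.2-2.4, reusing the view and field names from the code in the question (ParseLines and importText are placeholders for however you read the posted data, not Acumatica APIs):
var graph = PXGraph.CreateInstance<PX.Objects.GL.JournalEntry>();
// Header: insert a Batch row through the BatchModule view and set its values.
var batch = graph.BatchModule.Insert(new PX.Objects.GL.Batch());
batch.Description = "Imported from banking system";
graph.BatchModule.Update(batch);
// Lines: insert each parsed record through the GLTranModuleBatNbr view.
foreach (var je in ParseLines(importText)) // e.g. 25,000 parsed lines
{
    var row = (PX.Objects.GL.GLTran)graph.GLTranModuleBatNbr.Cache.CreateInstance();
    row = graph.GLTranModuleBatNbr.Insert(row);
    graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "AccountID", je.Account);
    graph.GLTranModuleBatNbr.Cache.SetValueExt(row, "SubID", je.Subaccount);
    row.CuryDebitAmt = je.DebitAmount;
    row.CuryCreditAmt = je.CreditAmount;
    graph.GLTranModuleBatNbr.Update(row);
}
// One save for the whole batch.
graph.Actions.PressSave();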
I know this is an old question, but I am answering here for the benefit of anyone who stumbles across this page. There's a performance issue in the Journal Transaction screen where the time to create a transaction increases non-linearly with the number of rows to insert.
A customization-based workaround was provided to us by Acumatica support which significantly improved performance. I don't have the exact version where this fix was included, but builds newer than today (September 2021) should include this already.
Customization fix:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using PX.Api;
using PX.Data;
using PX.Common;
using PX.Objects.Common;
using PX.Objects.Common.Extensions;
using PX.Objects.CS;
using PX.Objects.CM;
using PX.Objects.CA;
using PX.Objects.Common.Bql;
using PX.Objects.Common.GraphExtensions.Abstract;
using PX.Objects.Common.GraphExtensions.Abstract.DAC;
using PX.Objects.Common.GraphExtensions.Abstract.Mapping;
using PX.Objects.GL.DAC;
using PX.Objects.GL.FinPeriods;
using PX.Objects.GL.JournalEntryState;
using PX.Objects.GL.JournalEntryState.PartiallyEditable;
using PX.Objects.GL.Overrides.PostGraph;
using PX.Objects.GL.Reclassification.UI;
using PX.Objects.PM;
using PX.Objects.TX;
using PX.Objects.Common.Tools;
using PX.Objects.GL.DAC.Abstract;
using PX.Objects.Common.EntityInUse;
using PX.Objects.GL.FinPeriods.TableDefinition;
using PX.Data.SQLTree;
using PX.Objects.CR;
using PX.Data.BQL.Fluent;
using PX.Data.BQL;
using PX.Objects;
using PX.Objects.GL;
namespace PX.Objects.GL
{
    public class JournalEntry_Extension : PXGraphExtension<JournalEntry>
    {
        public delegate void PopulateSubDescrDelegate(PXCache sender, GLTran Row, Boolean ExternalCall);

        [PXOverride]
        public void PopulateSubDescr(PXCache sender, GLTran Row, Boolean ExternalCall, PopulateSubDescrDelegate baseMethod)
        {
            // Skip populating the subaccount description during import/export/contract-based API calls.
            if (Base.IsImport || Base.IsExport || Base.IsContractBasedAPI)
            {
                return;
            }
            baseMethod(sender, Row, ExternalCall);
        }
    }
}

How to retrieve only a predefined number of results in Azure Tables

I am trying to perform an Azure Table query.
My table (which saves logs) has thousands of rows of data, and it gets populated with more each second.
Right now I have only one partition key, but that doesn't affect the question.
How can I get back, let's say, only the 100 latest results?
This is my entity:
public class MainServerLogEntity : TableEntity
{
    public MainServerLogEntity(string message)
    {
        this.PartitionKey = "mainserverlogs";
        // Reversed ticks: newer entries get smaller RowKeys, so they sort first within the partition.
        this.RowKey = (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString();
        this.Message = message;
        this.Date = DateTime.UtcNow;
    }

    public MainServerLogEntity() { }

    public string Message { get; set; }
    public DateTime Date { get; set; }
}
Right now this is the query I am performing inside a Web API I have:
[Route("MainServerLogs")]
[HttpGet]
public IEnumerable<MainServerLogEntity> GetMainServerLogs()
{
CloudTable table = AzureStorageHelpers.GetWebApiTable(connectionString, "mainserverlogs");
TableQuery<MainServerLogEntity> query = new TableQuery<MainServerLogEntity>().Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "mainserverlogs"));
return table.ExecuteQuery(query);
}
But the problem is that I am getting a lot of data, and I am requesting this API every few seconds in order to update the UI.
What should I do? Is it possible to define in the query that I only want the first 100 rows?
If that is not possible, what other technique should I use?
Try implementing a .Take(100) on the query like so:
[Route("MainServerLogs")]
[HttpGet]
public IEnumerable<MainServerLogEntity> GetMainServerLogs()
{
CloudTable table = AzureStorageHelpers.GetWebApiTable(connectionString, "mainserverlogs");
TableQuery<MainServerLogEntity> query = new TableQuery<MainServerLogEntity>().Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "mainserverlogs")).Take(100);
return table.ExecuteQuery(query);
}

What's wrong in this Parallel.For code?

This is the code that I want to run.
Parallel.For(1, itemCount, 1, () =>
{
return new ThreadLocalStateCache()
{
//assigning values to local variables
Receipient = serMailObj.ReceipientList.Dequeue(), //get a single recepeint for the email
mail = serMailObj.Email, //Object of type MailMessage
client = client //object of type SmtpClient
};
}
, (i, loopState) =>
{
doWork(i, loopState.ThreadLocalState);
});
}
//class to store local vairables for each thread
public class ThreadLocalStateCache
{
public KeyValuePair<string, string> Receipient { get; set; }
public MailMessage mail { get; set; }
public SmtpClient client { get; set; }
}
private static void doWork(int instance, ThreadLocalStateCache threadInstance)
{
//send mail
}
and it keeps on saying
The type arguments for method 'System.Threading.Tasks.Parallel.For<TLocal>(long, long, System.Func<TLocal>, System.Func<long, System.Threading.Tasks.ParallelLoopState, TLocal, TLocal>, System.Action<TLocal>)' cannot be inferred from the usage. Try specifying the type arguments explicitly.
I could not find any resource on the internet that clearly explains how to use Parallel.For with thread-local variables. I am trying to process a long list of email recipients and send mails to them. Please tell me how I can use Parallel.For.
EDIT 1: I am trying this code after reading this article http://www.lovethedot.net/2009/02/parallelfor-deeper-dive-parallel.html
The Parallel.For overloads that take a step as the third argument were removed from .NET 4; see the comments to http://blogs.msdn.com/b/pfxteam/archive/2009/05/26/9641563.aspx.
Due to that, your call with 5 arguments is resolved to this overload:
For<TLocal>(Int32, Int32, Func<TLocal>, Func<Int32, ParallelLoopState, TLocal, TLocal>, Action<TLocal>)
And obviously the compiler cannot match the types of the arguments.
Since the step is 1 anyway, just remove it.
Then you will need to fix the body delegate, which must take three parameters (since the thread-local variable is now separate from the loop state object), and add another delegate that will be applied to the thread-local value for the final computation. In the end, it should be something like this:
Parallel.For(1, itemCount,
    () => new ThreadLocalStateCache()
    {
        Receipient = serMailObj.ReceipientList.Dequeue(),
        mail = serMailObj.Email,
        client = client
    },
    (i, loopState, threadLocal) =>
    {
        doWork(i, threadLocal);
        return threadLocal;
    },
    (threadLocal) => { }
);

Retrieving values of ReadOnly fields from DynamicData DetailsView in Edit Mode on Updating using LinqDataSource

I have several tables in my database that have read-only fields that get set on Inserting and Updating, namely: AddDate (DateTime), AddUserName (string), LastModDate (DateTime), LastModUserName (string).
All of the tables that have these fields have been set to implement the following interface:
public interface IUserTrackTable
{
string AddUserName { get; set; }
DateTime AddDate { get; set; }
string LastModUserName { get; set; }
DateTime LastModDate { get; set; }
}
As such, I have the following method on the Edit.aspx page:
protected void DetailsDataSource_Updating(object sender, LinqDataSourceUpdateEventArgs e)
{
IUserTrackTable newObject = e.NewObject as IUserTrackTable;
if (newObject != null)
{
newObject.LastModUserName = User.Identity.Name;
newObject.LastModDate = DateTime.Now;
}
}
However, by the time it hits this method, the e.OriginalObject has already lost the values for all four fields, so a ChangeConflictException gets thrown during the actual Update. I have tried adding the four column names to the DetailsView1.DataKeyNames array in the Init event handler:
protected void Page_Init(object sender, EventArgs e)
{
// other things happen before this
var readOnlyColumns = table.Columns.Where(c => c.Attributes.SingleOrDefaultOfType<ReadOnlyAttribute>(ReadOnlyAttribute.Default).IsReadOnly).Select(c => c.Name);
DetailsView1.DataKeyNames = DetailsView1.DataKeyNames.Union<string>(readOnlyColumns).ToArray<string>();
DetailsView1.RowsGenerator = new CustomFieldGenerator(table, PageTemplates.Edit, false);
// other things happen after this
}
I've tried making that code only happen on PostBack, and still nothing. I'm at a loss for how to get the values for all of the columns to make the round trip.
The only thing the CustomFieldGenerator is handling is the ReadOnlyAttribute, following the details on C# Bits.
UPDATE: After further investigation, the values make the round trip to the DetailsView_ItemUpdating event. All of the values are present in the e.OldValues dictionary. However, they are lost by the time it gets to the LinqDataSource_Updating event.
Obviously, there are the "solutions" of making those columns not participate in Concurrency Checks or other ways that involve hard-coding, but the ideal solution would dynamically add the appropriate information where needed so that this stays as a Dynamic solution.
Hi Drovani, I assume you want data auditing (see Steve Sheldon's A Method to Handle Audit Fields in LINQ to SQL). I would do this in the model; in EF4 you can do it like this:
partial void OnContextCreated()
{
    // Register the handler for the SavingChanges event.
    this.SavingChanges += new EventHandler(context_SavingChanges);
}

private static void context_SavingChanges(object sender, EventArgs e)
{
    // handle auditing for added and modified entities (uses System.Linq and System.Data.Objects)
    var stateManager = ((ObjectContext)sender).ObjectStateManager;
    AuditingHelperUtility.ProcessAuditFields(
        stateManager.GetObjectStateEntries(EntityState.Added).Select(entry => entry.Entity));
    AuditingHelperUtility.ProcessAuditFields(
        stateManager.GetObjectStateEntries(EntityState.Modified).Select(entry => entry.Entity), InsertMode: false);
}
internal static class AuditingHelperUtility
{
    internal static void ProcessAuditFields(IEnumerable<Object> list, bool InsertMode = true)
    {
        foreach (var item in list)
        {
            IAuditable entity = item as IAuditable;
            if (entity != null)
            {
                if (InsertMode)
                {
                    entity.InsertedBy = GetUserId();
                    entity.InsertedOn = DateTime.Now;
                }
                entity.UpdatedBy = GetUserId();
                entity.UpdatedOn = DateTime.Now;
            }
        }
    }
}
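The IAuditable interface itself isn't shown in the answer; from the properties used above it presumably looks something like this:
public interface IAuditable
{
    string InsertedBy { get; set; }
    DateTime InsertedOn { get; set; }
    string UpdatedBy { get; set; }
    DateTime UpdatedOn { get; set; }
}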
Sadly this is not possible with EF v1
