I am trying to solve this problem. I have a WCF service. A client can call a web method on this service, which only "fires" another method (this method just writes data to the database) on another thread.
Code is here:
//this method will write data to the database
public void WriteToDb()
{
}

//this web method will only call the method WriteToDb() on another thread
public void SomeWebMethod()
{
    new Task(WriteToDb).Start();
}
The problem is that 5 clients can call the web method at the same time. This causes WriteToDb to be called 5 times on 5 threads.
In all 5 cases WriteToDb will use the same data.
My aim is to achieve this behavior: 5 clients call the web method SomeWebMethod, and WriteToDb runs on 5 threads, but I would like the first thread to execute, then the second, and so on, with the 5th thread last.
I don't want WriteToDb running on 5 threads at the same time.
So maybe I can use a lock:
private object locker = new object();

//this method will write data to the database
public void WriteToDb()
{
    lock (locker)
    {
        //write to DB
    }
}
I am not sure, because a .NET assembly is hosted in an app domain, and an app domain is hosted in a Windows process. I would like to avoid deadlocks.
What happens if I have a machine with 6 CPUs? Should I use a Mutex instead of lock?
Thank you for your help...
I'm not particularly sure what you are writing to the DB, and to be frank your question is only loosely coupled to WCF; try reading CLR via C# on multithreading, etc.
Also, regarding WCF, you can set up how your service object is created for requests, i.e. per call, per session, or as a singleton, and for the latter you can specify whether its methods will queue up or will be called on the object concurrently.
So depending on the architecture you choose, you can either rely on WCF's ability to host a single object with the logic you described, or you can go the way you tried.
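For illustration, a minimal sketch of that singleton/queued option (the contract and class names are placeholders, not from the question):

using System.ServiceModel;

[ServiceContract]
public interface IMyService
{
    // One-way: the client returns immediately; the call is dispatched on the server.
    [OperationContract(IsOneWay = true)]
    void SomeWebMethod();
}

// A single service instance with ConcurrencyMode.Single means WCF dispatches
// calls to it one at a time, so WriteToDb never runs concurrently.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class MyService : IMyService
{
    public void SomeWebMethod()
    {
        WriteToDb();
    }

    private void WriteToDb()
    {
        //write to DB
    }
}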
Links
http://msdn.microsoft.com/en-us/magazine/cc163590.aspx
http://msdn.microsoft.com/en-us/library/ms731193.aspx
A lock is fine here, but you should make your locker object static so the same object instance is used in the lock every time.
It does not matter how many cores you have - if you hold the lock on an object then any other threads that attempt to acquire the lock will wait until the lock is released.
A deadlock can only occur if you are acquiring multiple locks in different orders in different threads.
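A quick sketch of what that looks like (a new service instance may be created for each call, so the field must be static for every instance to lock on the same object; the class name is just a placeholder):

using System.Threading.Tasks;

public class MyService
{
    // static: shared by every service instance, so all calls lock on the same object
    private static readonly object locker = new object();

    public void SomeWebMethod()
    {
        Task.Run(() =>
        {
            lock (locker)
            {
                WriteToDb(); // only one thread writes to the DB at a time
            }
        });
    }

    private void WriteToDb()
    {
        //write to DB
    }
}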
I suggest you read Joe Albahari's excellent free ebook, Threading in C#.
Related
I want my REST controller POST endpoint to only allow one thread to execute the method; every other thread should get a 429 until the first thread is finished.
@ResponseStatus(code = HttpStatus.CREATED)
@PostMapping(value = "/myApp", consumes = "application/json", produces = "application/json")
public Execution execute(@RequestBody ParameterDTO StartDateParameter)
{
    if (StartDateParameter.getStartDate() == null) {
        throw new ResponseStatusException(HttpStatus.BAD_REQUEST);
    } else {
        if (Executer.isProcessAlive()) {
            throw new ResponseStatusException(HttpStatus.TOO_MANY_REQUESTS);
        } else {
            return Executer.execute(StartDateParameter);
        }
    }
}
When I send requests from multiple threads, every request gets a 201, so I think the requests come in before the isAlive() check is evaluated. How can I change it to only process the first request and "block" every other one?
The lifecycle of a controller in Spring is managed by the container and, by default, it is a singleton, which means that one instance of the bean is created at startup and multiple threads can use it. The only way you can make it single-threaded is to use a synchronized block or to handle the request call through an Executor service. But that defeats the entire purpose of using the Spring framework.
Spring provides other means to make your code thread safe. You can use the @Scope annotation to override the default scope. Since you are using a RestController, you could use the "request" scope (@Scope("request")), which creates a new instance to process each HTTP request. Doing it this way will ensure that only one thread accesses your controller code at any given time.
I'm using MVC4 ApiController to upload data to Azure Blob. Here is the sample code:
public Task PostAsync(int id)
{
    return Task.Factory.StartNew(() =>
    {
        // CloudBlob.UploadFromStream(stream);
    });
}
Does this code even make sense? I think ASP.NET is already processing the request in a worker thread, so running UploadFromStream in another thread doesn't seem to make sense since it now uses two threads to run this method (I assume the original worker thread is waiting for this UploadFromStream to finish?)
So my understanding is that an async ApiController only makes sense if we are using built-in async methods such as HttpClient.GetAsync or SqlCommand.ExecuteReaderAsync. Those methods probably use I/O Completion Ports internally so they can free up the thread while doing the actual work. So I should change the code to this?
public Task PostAsync(int id)
{
    // only to show it's using the proper async version of the method.
    return Task.Factory.FromAsync(BeginUploadFromStream, EndUploadFromStream, ...);
}
On the other hand, if all the work in the Post method is CPU/memory intensive, then the async version PostAsync will not help throughput of requests. It might be better to just use the regular "public void Post(int id)" method, right?
I know it's a lot of questions. Hopefully this will clarify my understanding of async usage in ASP.NET MVC. Thanks.
Yes, most of what you say is correct. Even down to the details with completion ports and such.
Here is a tiny error:
I assume the original worker thread is waiting for this UploadFromStream to finish?
Only your task thread is running. You're using the async pipeline after all. It does not wait for the task to finish, it just hooks up a continuation. (Just like with HttpClient.GetAsync).
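For what it's worth, here is a sketch of a genuinely asynchronous version (assuming the Azure storage client's CloudBlockBlob.UploadFromStreamAsync; the connection string, container name, and blob name are placeholders, and the container is assumed to exist):

using System.IO;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class UploadController : ApiController
{
    public async Task PostAsync(int id)
    {
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("uploads");
        CloudBlockBlob blob = container.GetBlockBlobReference(id.ToString());

        using (Stream stream = await Request.Content.ReadAsStreamAsync())
        {
            // The request thread is released while the upload I/O is in flight;
            // no extra thread sits blocked waiting for it to finish.
            await blob.UploadFromStreamAsync(stream);
        }
    }
}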
I have a .NET 4.5 WCF client app that uses the async/await pattern to make a high volume of calls. My development machine is dual-proc with 8 GB RAM (production will be 5 CPUs with 8 GB RAM at Amazon AWS). The remote WCF service called by my code uses out and ref parameters on a web method that I need. My code instantiates a proxy client each time, writes any results to a public ConcurrentDictionary, and then returns null.
I ran Perfmon, watching the thread count on the system, and it stays between 28 and 30. It takes hours for my client to complete the volume of calls that are made. Yes, hours. The remote service is backed by a big company; they have many servers to receive my WCF calls, so the more calls I can throw at them, the better.
I think that things are actually still happening synchronously, even though the method that makes the WCF call is decorated with "async", because the proxy method cannot be awaited. Is that true?
My code looks like this:
private async void CallMe()
{
    Console.WriteLine( DateTime.Now );
    var workTasks = this.AnotherConcurrentDict
                        .Select( oneB => GetData( etcetcetc ) )
                        .Cast<Task>()
                        .ToList();
    await Task.WhenAll( workTasks );
}

private async Task<WorkingBits> GetData(etcetcetc)
{
    var commClient = new RemoteClient();
    var cpResponse = new GetPackage();
    var responseInfo = commClient.GetData( name, password, ref cpResponse.aproperty, filterid, out cpResponse.Identifiers );
    foreach (var onething in cpResponse.Identifiers)
    {
        // add to the ConcurrentDictionary
    }
    return null; // I already wrote to the ConcurrentDictionary so no need to return anything
}
responseInfo is not awaitable because the WCF call has ref and out parameters.
I was thinking that the way to speed this up is not to put async/await in this method, but instead to create a wrapper method where I can make things async/await; however, I am not sure that is the smartest/safest way to work it.
What is a smart way to get more outbound calls to the service (expand the I/O completion thread pool, trick calls into running in the background so Task.WhenAll can complete more quickly)?
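For reference, the kind of wrapper I have in mind would look roughly like this (only a sketch; RemoteClient and GetPackage are the generated proxy types from the code above, and the parameter names are placeholders):

private Task GetDataAsync(string name, string password, int filterid)
{
    // Task.Run moves the blocking proxy call onto a thread-pool thread,
    // so many calls can be in flight while Task.WhenAll awaits them all.
    return Task.Run(() =>
    {
        var commClient = new RemoteClient();
        var cpResponse = new GetPackage();
        var aproperty = cpResponse.aproperty;      // locals for the ref/out arguments
        var identifiers = cpResponse.Identifiers;
        commClient.GetData(name, password, ref aproperty, filterid, out identifiers);
        foreach (var onething in identifiers)
        {
            // add each item to the ConcurrentDictionary here
        }
    });
}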
Thanks for all ideas/samples/pointers. I am hitting a bottleneck somewhere.
1) Make sure you're really calling it asynchronously, rather than just blocking on the calls. Code samples would help here.
2) You may need to do this:
ServicePointManager.DefaultConnectionLimit = 100;
By default it only allows 2 simultaneous connections to the same server.
3) Make sure you dispose the proxy object after the call is complete so you're not tying up resources.
If you're doing things asynchronously the threadpool size shouldn't be a bottleneck. To get a better idea of what kind of problem you're having, you can use Interlocked.Increment and Interlocked.Decrement to track the number of pending calls and see if it's being limited somewhere.
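A rough sketch of that kind of instrumentation (the names are placeholders; wrap each outbound call in Track to see how many are genuinely in flight):

using System;
using System.Threading;
using System.Threading.Tasks;

static class CallCounter
{
    private static int _pending;

    public static async Task Track(Func<Task> call)
    {
        // print the number of calls currently in flight
        Console.WriteLine("pending calls: " + Interlocked.Increment(ref _pending));
        try
        {
            await call();
        }
        finally
        {
            Interlocked.Decrement(ref _pending);
        }
    }
}

If the count never climbs above 2, the DefaultConnectionLimit mentioned above is the likely culprit.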
You could also substitute your real call with a call to a very simple method that you know will not have any bottlenecks, to see if the problem is in the client or server.
I have a very long-running query that takes too long for my client to stay connected. I want to make a call into my DomainService, create a new worker thread, then return from the service so that my client can begin polling to see if the long-running query is complete.
The problem I am running into is that since my calling thread exits right away, exceptions are thrown when my worker tries to access any entities, because the ObjectContext gets disposed when the original thread ends.
Here is how I create the new context and call from my Silverlight client:
MyDomainContext context = new MyDomainContext();
context.SearchAndStore(_myParm, SearchQuery,
    p => {
        if (p.HasError) {
            // Do some work and return to start
            // polling the server for completion...
        }
    }, null);
The entry method on the server:
[Invoke]
public int SearchAndStore(object parm)
{
    Thread t = new Thread(new ParameterizedThreadStart(WorkerProc));
    t.Start(parm);
    return 0;
    // Once this method returns, I get ObjectContext already Disposed exceptions
}
Here is the WorkerProc method that gets called with the new Thread. As soon as I try to iterate through my query1 object, I get the ObjectContext already Disposed exception.
private void WorkerProc(object o)
{
    HashSet<long> excludeList = new HashSet<long>();
    var query1 = from doc in this.ObjectContext.Documents
                 join filters in this.ObjectContext.AppliedGlobalFilters
                                     .Where(f => f.FilterId == 1)
                 on doc.FileExtension equals filters.FilterValue
                 select doc.FileId;

    foreach (long fileId in query1) // Here the exception occurs because the
    {                               // ObjectContext is already disposed of.
        excludeList.Add(fileId);
    }
}
How can I prevent this from happening? Is there a way to create a new context for the new thread? I'm really stuck on this one.
Thanks.
Since you're using WCF RIA, I have to assume that you're implementing two parts:
A WCF Web Service
A Silverlight client which consumes the WCF Service.
So, this means that you have two applications: the service running on IIS, and the Silverlight client running in the web browser. These applications have different life cycles.
The Silverlight application starts living when it's loaded in the web page, and it dies when the page is closed (or an exception happens). On the other hand (at the server side), the WCF web service's life is quite short. Your application starts living when the service is requested, and it dies once the request has finished.
In your case the server request finishes when the SearchAndStore method finishes. Thus, when this particular method starts, you create a Thread which starts running in the background (on the server), and your method continues executing, which finishes within a couple of lines.
If I'm right, you don't need to do this. You can call your method without using a thread; in theory it does not matter if it takes a while to respond, because the Silverlight application (on the client) won't be blocked waiting. In Silverlight all the operations are asynchronous (which means they run on their own thread). Therefore, when you call the service method from the client, you only have to wait until the callback is invoked.
If it's really taking a long time, you will more likely want a mechanism to keep the connection between your Silverlight client and your web server alive for longer, I think by modifying the service configuration.
Here is a sample of what I'm saying:
https://github.com/hmadrigal/CodeSamples/tree/master/wcfria/SampleWebApplication01
In the sample you can see the different times on client and server side. You click the button and have to wait 30 seconds to receive a response from the server.
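As for keeping the connection alive longer: I'm not sure of the exact knobs for RIA Services (its binding normally lives in web.config), but as an illustration of the idea, with a plain WCF channel you would raise the timeouts roughly like this (IMyLongRunningService and the address are placeholders):

using System;
using System.ServiceModel;

static class TimeoutExample
{
    public static IMyLongRunningService CreateClient()
    {
        var binding = new BasicHttpBinding
        {
            SendTimeout = TimeSpan.FromMinutes(10),   // how long a single call may take
            ReceiveTimeout = TimeSpan.FromMinutes(10) // idle time before the channel drops
        };
        var factory = new ChannelFactory<IMyLongRunningService>(
            binding, new EndpointAddress("http://localhost/MyService.svc"));
        return factory.CreateChannel();
    }
}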
I hope this helps,
Best regards,
Herber
I am writing a web service which has to be able to reply to multiple http requests.
From what I understand, I will need to deal with HttpListener.
What is the best method to receive an HTTP request (or better, multiple HTTP requests), translate it, and send the results back to the caller? How safe is it to use HttpListener across threads?
Thanks
You typically set up a main thread that accepts connections and passes the request to be handled by either a new thread or a free thread in a thread pool. I'd say you're on the right track though.
You're looking for something similar to:
while (boolProcessRequests)
{
    // this line blocks until a new request arrives
    HttpListenerContext context = listener.GetContext();

    Thread T = new Thread((new YourRequestProcessorClass(context)).ExecuteRequest);
    T.Start();
}
Edit - Detailed Description: If you don't have access to a web server and need to roll your own web service, you would use the following structure:
One main thread that accepts connections/requests and, as soon as they arrive, passes the connection to a free thread to process. Sort of like the hostess at a restaurant who passes you to a waiter/waitress who will process your request.
In this case, the Hostess (main thread) has a loop:
- Wait at the door for new arrivals
- Find a free table and seat the patrons there and call the waiter to process the request.
- Go back to the door and wait.
In the code above, the request is packaged inside the HttpListenerContext object. Once it arrives, the main thread creates a new thread and a new request-processor object that is initialized with the request data (context). The request processor then uses the Response object inside the context to respond to the request. Obviously you need to create YourRequestProcessorClass and a method like ExecuteRequest to be run by the thread.
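A minimal sketch of what that class might look like (the class and method names follow the snippet above; the response body is just a placeholder):

using System.Net;
using System.Text;

public class YourRequestProcessorClass
{
    private readonly HttpListenerContext _context;

    public YourRequestProcessorClass(HttpListenerContext context)
    {
        _context = context;
    }

    // Runs on its own thread: reads the request and writes the response.
    public void ExecuteRequest()
    {
        byte[] body = Encoding.UTF8.GetBytes("Hello from " + _context.Request.Url);
        _context.Response.ContentType = "text/plain";
        _context.Response.ContentLength64 = body.Length;
        _context.Response.OutputStream.Write(body, 0, body.Length);
        _context.Response.OutputStream.Close(); // completes and sends the response
    }
}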
I'm not sure what platform you're on, but you can see a .NET example for threading here and for HttpListener here.