Which IdTCPClient functions are thread-safe?

I've already seen some Indy implementations in which the main thread sends data and a worker thread reads.
My question is about IdTCPClient.Disconnect: is it really thread-safe? If it were somehow called from the worker thread and an external thread at the same time, would it really be safe?

Related

How exactly do thread pools work in ASP.NET Core?

I suppose there is a thread pool which the web server uses to serve requests, so the controllers run within one of the threads of this thread pool. Say it is the 'serving' pool.
In one of my async action methods I use an async method:
var myResult = await myObject.MyMethodAsync();
// my completion logic here
As explained in many places, we do this so as not to block the valuable serving-pool thread; instead, MyMethodAsync executes on some other background thread... and then the completion logic continues again on a serving-pool thread, probably a different one, but with the HTTP context and a few other minor things marshaled there correctly.
So the background thread on which MyMethodAsync runs must come from another thread pool, otherwise the whole thing makes no sense.
Question
Please confirm or correct my understanding, and if it is correct, I still don't see why a thread in one pool would be a more valuable resource than a thread in another pool. At the end of the day, the whole thing runs on the same hardware, with a given number of cores and CPU performance...
There is only one thread pool in a .NET application. It has both worker threads and I/O threads, which are treated differently, but there is only one pool.
I suppose there is a thread pool which the web server uses to serve requests, so the controllers run within one of the threads of this thread pool. Say it is the 'serving' pool.
ASP.NET uses the .NET thread pool to serve requests, yes.
As explained in many places, we do this so as not to block the valuable serving-pool thread; instead, MyMethodAsync executes on some other background thread... and then the completion logic continues again on a serving-pool thread, probably a different one, but with the HTTP context and a few other minor things marshaled there correctly.
So the background thread on which MyMethodAsync runs must come from another thread pool, otherwise the whole thing makes no sense.
This is the wrong part.
With truly asynchronous methods, there is no thread (as described on my blog). While the code within MyMethodAsync will run on some thread, there is no thread dedicated to running MyMethodAsync until it completes.
You can think about it this way: asynchronous code usually deals with I/O, so let's say, for example, that MyMethodAsync is posting something to an API. Once the post is sent, there's no point in having a thread just block waiting for a response. Instead, MyMethodAsync just wires up a continuation and returns. As a result, most asynchronous controller methods use zero threads while waiting for external systems to respond. There's no "other thread pool" because there's no "other thread" at all.
Which is kind of the point of asynchronous code on ASP.NET: to use fewer threads to serve more requests. Also see this article.
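That answer is about C#, but the same mechanism shows up in any async/await runtime. A minimal Node.js/TypeScript sketch of the idea (the URL and function name are made up for illustration): while the awaited request is in flight, no thread is blocked on it; the runtime just registers a continuation and returns.

// Hypothetical sketch: an async function awaiting a network call.
async function fetchReport(): Promise<string> {
  // The request is handed to the OS; the async function suspends here
  // without holding a thread while the response is pending.
  const response = await fetch("https://example.invalid/api/report");

  // Continuation: runs only after the response arrives (in Node.js,
  // back on the event-loop thread).
  return (await response.text()).toUpperCase();
}

// Nothing blocks here waiting for the report; the promise completes later.
fetchReport().then(r => console.log(r.length)).catch(console.error);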

C++11 thread event: std::future vs std::condition_variable

I'm writing a network test program.
The idea is to have 2 threads: the client and the server.
I want to add some eventing between the 2 threads.
Basically:
The main thread runs the server and creates a new thread for the client.
The server thread waits for the client to connect.
The client thread sends some data and notifies the server thread.
The client thread waits.
The server reads the data, checks whether the data is intact, and notifies the client thread to send more data.
This repeats a number of times in a loop, but the server does not know how many times.
After all the data has been sent, the client thread notifies the server thread to exit its loop. The server thread (the main thread) joins the client thread.
I have implemented this using a global std::condition_variable and global variables, and it works. I'm writing several of these test functions. Each test function does what I described above, but with different data.
Here are some questions that I have:
I found std::promise and std::future. I like the fact that they wait for a value to be set in another thread. Could I use this instead of std::condition_variable? In general, what are the use cases for choosing one method over the other when waiting for a variable to be set? What are the differences and advantages/disadvantages?
Would it be better to declare the std::condition_variable and the variables locally in each test function and pass references to the thread instead of using global variables? For some reason I don't like using global variables... What would be better practice?
Do you need to join a thread if you are certain it will end before the main thread? My client thread notifies the server thread (the main thread) when it is done sending and will exit, so really the server thread is waiting for the client thread to exit. Do I still need to join it in the main thread?

How is blocking I/O handled via the thread pool in Node.js?

I have recently studied Node.js and tried to understand its architecture in depth. But even after going through multiple articles and links such as Stack Overflow and Node.js blogs, I am confused about how the single thread with the event loop and the multi-threaded blocking I/O requests that are part of client requests or events can happen at the same time.
According to my study, the single thread with the event loop keeps polling the event queue to see whether a client request has arrived. As soon as an event or request is found, it checks whether it is a blocking I/O operation or a non-blocking one. If it is non-blocking, the response is sent back to the client immediately. But if the request involves blocking I/O, it is assigned a thread from the thread pool, and the single thread continues with the other requests. Essentially, this means that every blocking I/O operation within the client requests is assigned a thread, and here the system is working as a multithreaded one.
My confusion is how that single thread and the threaded blocking I/O operations can run at the same time. The single thread executes concurrently, but the blocking I/O operations are also happening at the same time. How can this be achieved on a machine that has both the single thread and the blocking I/O threads executing in parallel on a single-core processor, when the CPU can execute only one thread at a time? Also, are these threads, both the single event-loop thread and the thread-pool threads, user-level threads?
Although I know that the blocking I/O threads are handled by the libraries of the external modules, those modules will still be using up threads and executing them in the same space as the single main thread. So how are the two executed?
I am new to this framework.
A Node.js process consists of the main thread, which runs an event loop, and worker threads. These worker threads are not explicitly available to the coder.
Now, when you do a syscall from your code (i.e. call a Node.js function which internally does a syscall), then depending on whether it is blocking (e.g. file I/O) or non-blocking (e.g. socket I/O), the job might be sent to a worker thread. However, a callback is always registered with the event loop. So when the worker finishes processing a job, it notifies the event loop, and from the coder's point of view the operation was asynchronous.
Now, it doesn't really matter whether the CPU is multi-threaded or single-threaded. That's because reading from a disk takes some time, and during that time the CPU is not busy on that thread. The OS knows that it should switch context during that time. So even on a single-threaded CPU, the event loop gets the majority of the CPU time.
Also, by threads I mean real kernel-space threads, not user-space threads. Doing this with user-space threads would be pointless, since you would block the whole kernel-space thread during blocking I/O.
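A small sketch of that flow from the coder's side, using only built-in Node.js modules (the file path is just an example): the file read is dispatched, on most platforms to a libuv worker thread, a callback is registered with the event loop, and the main thread stays free in the meantime.

import { readFile } from "node:fs";

console.log("before readFile");

// Dispatched to a libuv worker thread on most platforms; the callback is
// registered with the event loop and runs once the worker has finished.
readFile("/etc/hosts", "utf8", (err, data) => {
  if (err) throw err;
  console.log("read finished:", data.length, "characters");
});

// The event-loop thread is not blocked while the read is in progress.
console.log("after readFile (file not read yet)");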

Thread Pool in Node.js

I understand that Node.js uses a thread pool for blocking I/O calls. What does it do if all the threads in the thread pool are busy with some work and another request comes in?
In a case where the thread pool is needed and no workers are available, the request would be queued until a worker is free. The thread pool is not the sole approach, though. There are three operation types that utilize the thread pool in libuv as documented at the bottom of the page here under the title File I/O.
These operation types are:
Filesystem operations
DNS functions (getaddrinfo and getnameinfo)
User-specified code
While not a direct answer to your question, I believe this post by Jason does a wonderful job of explaining thread pools in Node.js. Without going extremely in-depth, it introduces you to the functionality provided by the libuv library and has links to very informative literature on the subject of the thread pool.
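A common way to observe that queueing is to submit more thread-pool work than there are workers; libuv's default pool size is 4 and can be changed with the UV_THREADPOOL_SIZE environment variable. A rough sketch using crypto.pbkdf2, which runs on the thread pool (the iteration count is arbitrary, just large enough to keep the workers busy):

import { pbkdf2 } from "node:crypto";

const start = Date.now();

// Submit 8 CPU-heavy jobs that libuv executes on its thread pool.
for (let i = 0; i < 8; i++) {
  pbkdf2("password", "salt", 500_000, 64, "sha512", (err) => {
    if (err) throw err;
    // With the default UV_THREADPOOL_SIZE of 4, four jobs run at once and
    // the rest are queued until a worker frees up, so they report later.
    console.log(`job ${i} done after ${Date.now() - start} ms`);
  });
}

Running the same script with UV_THREADPOOL_SIZE=8 lets all eight jobs finish together, which is an easy way to confirm that the delay really comes from queueing for a worker.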

What kind of operations are handled by Node.js worker threads?

I need some clarification on what exactly the Node.js worker threads are doing.
I found contradictory information on this one. Some people say worker threads handle all I/O; others say they handle only blocking POSIX requests (for which there is no async version).
For example, assume that I am not blocking the main loop myself with some unreasonable processing. I am just invoking functions from available modules and providing the callbacks. I can see that if this requires some blocking or computationally expensive operation, it is handed off to a worker thread. But for some async I/O, is it initiated from the main libuv loop? Or is it passed to a worker thread, to be initiated from there?
Also, would a Node.js worker thread ever initiate a blocking (synchronous) I/O operation when the OS supports an async mode for doing the same thing? Is it documented anywhere what kinds of operations may end up blocking a worker thread for a longer time?
I'm asking this because there is a fixed-size worker pool and I want to avoid making mistakes with it. Thanks.
Network I/O is done on the main thread on all platforms. File I/O is a different story: on Windows it is done truly asynchronously and non-blocking, but on all other platforms synchronous file I/O operations are performed in a thread pool so that they appear asynchronous and non-blocking.
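One way to see that split in practice, as a rough sketch with built-in modules only: occupy every default thread-pool worker with CPU-heavy pbkdf2 jobs, then talk to a local TCP server. The socket round trip completes almost immediately because socket I/O is serviced by non-blocking sockets on the event-loop thread, not by the (currently busy) thread pool. The iteration count and timings are illustrative.

import { pbkdf2 } from "node:crypto";
import { createServer, connect } from "node:net";

const start = Date.now();

// Occupy all four default thread-pool workers with long-running jobs.
for (let i = 0; i < 4; i++) {
  pbkdf2("password", "salt", 2_000_000, 64, "sha512", () =>
    console.log(`pool job ${i}: ${Date.now() - start} ms`));
}

// A local TCP echo: this never touches the thread pool, so it finishes
// long before the pbkdf2 jobs even though they were started first.
const server = createServer(socket => socket.end("hello"));
server.listen(0, () => {
  const { port } = server.address() as { port: number };
  const client = connect(port, "127.0.0.1", () => {
    client.on("data", chunk => {
      console.log(`tcp reply "${chunk}" after ${Date.now() - start} ms`);
      client.end();
      server.close();
    });
  });
});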
