How many cores does a process occupy? - linux

Let's say I have 4 cores on my machine and a process that spawns 4 threads. While this process is the one scheduled, are all 4 of those cores reserved for the process's 4 threads?

That is a fairly involved question, but I can help. As a general rule, 1 process uses 1 core at a time; more precisely, 1 thread can only be executed by 1 core at a time. If you have a dual-core processor, it is essentially 2 CPUs packaged together in the same machine. These are called physical cores, and each physical core executes 1 thread at a time. Some CPUs, though, have 2 physical cores but are capable of running 4 threads simultaneously. Those extra 2 threads run on logical cores, which do not exist as separate hardware but are presented to the OS as if they did.
If by process you mean a single thread, then yes: 1 thread, 1 core. And you can run 4 threads at once on a CPU with 4 compute cores (a name that covers both physical and logical cores, since a single-core CPU may have only 1 compute core).
If by process you mean program or process in the processes tab in the task manager, then it depends on how the program is written.
Judging by your question, if a process spawns 4 threads, it depends on where those threads sit in the scheduler's queue. There can be thousands of threads waiting to be executed, and the threads from a given program or executable do not all have to be executed at the same time.

The 4 threads of your process are scheduled independently - the process itself isn't scheduled.
If all 4 threads are runnable at the same time, and there are no other higher-priority runnable threads in the system, then all 4 threads may be scheduled simultaneously on your 4 cores.
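Here is a minimal sketch of that (Linux with glibc assumed, built with gcc -pthread): it spawns 4 threads and each one reports which core it happens to be on via sched_getcpu(). On an idle 4-core box you will often see 4 different core numbers, but the scheduler makes no promise of that and may move threads at any time.

    /* Sketch: spawn 4 threads and report which core each is running on.
       Linux/glibc assumed; build with: gcc -pthread cores.c */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    static void *worker(void *arg)
    {
        long id = (long)arg;
        /* sched_getcpu() returns the core this thread is on right now;
           the scheduler is free to migrate it later. */
        printf("thread %ld on core %d\n", id, sched_getcpu());
        return NULL;
    }

    int main(void)
    {
        pthread_t t[4];
        for (long i = 0; i < 4; i++)
            pthread_create(&t[i], NULL, worker, (void *)i);
        for (int i = 0; i < 4; i++)
            pthread_join(t[i], NULL);
        return 0;
    }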

Related

What is the difference between CPU threads and program threads

For example, an i5 7600K has 4 threads, but a game can have more than 4 threads. What is the difference, and why do they share the same name?
A CPU that has 4 threads (really a CPU with 4 cores, or possibly a 2 core CPU with Hyperthreading) can execute 4 separate threads simultaneously. A program can have more threads than that, but only 4 of them can be executing at any given time - the others would be in a sleep/wait state while they wait for the CPU to become available.
As for how the CPU "becomes available" for other threads when there are more threads than it can execute at a given time, that's a function of the operating system scheduler. The operating system scheduler rotates threads on and off the CPU periodically (typically every few milliseconds) so that every thread that wants to execute eventually gets its turn on the CPU.
There's more to it than that, but hopefully that covers the gist of your question.
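If you want to know how many threads your machine can execute at the same instant, you can simply ask the OS; here is a small sketch (Linux/glibc assumed). Everything beyond that number is time-sliced as described above.

    /* Sketch: query how many logical CPUs are available.  That number is
       the upper bound on threads executing at the same instant. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        long online = sysconf(_SC_NPROCESSORS_ONLN);  /* logical CPUs online now */
        long conf   = sysconf(_SC_NPROCESSORS_CONF);  /* logical CPUs configured */
        printf("%ld logical CPUs online (of %ld configured)\n", online, conf);
        return 0;
    }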

When would 2 threads be executed on 1 physical CPU core on a multi-core machine?

Let's say there's a machine with an 8-core CPU.
I'm creating 2 posix threads using standard pthread_create(...) function.
As far as I know there are no guarantees that these threads will always be executed on 2 different physical cores, but in practice they run simultaneously (in parallel) about 90% of the time. At least in my cases I have seen that the top command shows 2 CPUs busy, i.e. around 160-180% CPU usage.
The question is:
What would be a scenario where 2 threads within a single process run on only 1 physical core?
Two cases:
1) The other physical cores are busy doing other stuff, so only one core gets used by this process. The two threads run in alternation on that core.
2) The physical core supports executing more than one thread concurrently using hyperthreading or something similar. The other physical cores are busy doing other stuff, so the best the scheduler can do is run both threads in a single physical core.
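You can reproduce the single-core case artificially, without waiting for a busy system, by pinning both threads to CPU 0 with an affinity mask. A rough sketch, assuming Linux and glibc's pthread_setaffinity_np() (the busy-work loop is arbitrary):

    /* Sketch: force two threads onto one physical core via CPU affinity,
       so they take turns instead of running in parallel.  Linux/glibc. */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    static void *spin(void *arg)
    {
        volatile unsigned long n = 0;
        while (n < 500000000UL)            /* plain busy work */
            n++;
        printf("thread %ld finished on core %d\n", (long)arg, sched_getcpu());
        return NULL;
    }

    int main(void)
    {
        cpu_set_t one_core;
        CPU_ZERO(&one_core);
        CPU_SET(0, &one_core);             /* allow CPU 0 only */

        pthread_t t[2];
        for (long i = 0; i < 2; i++) {
            pthread_create(&t[i], NULL, spin, (void *)i);
            pthread_setaffinity_np(t[i], sizeof(one_core), &one_core);
        }
        for (int i = 0; i < 2; i++)
            pthread_join(t[i], NULL);
        return 0;
    }

With that mask in place, top shows the process at roughly 100% instead of ~160-180%, because the two threads now alternate on a single core; a loaded system produces the same effect with no pinning at all.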

How does more than one thread execute on a processor core

I wanted to know how a multi-threaded program with more threads than cores executes on a processor. For example, my program has 12 threads and I am running it on an Intel Core i5 machine, which has four cores. Will each core run 3 threads? I am confused because I have seen programs with 30 threads running on a 4-core machine.
Thanks
Each core can execute one thread at a time. So if there are 30 threads and 4 cores, 26 threads will be waiting to be context-switched in and executed. Something like: threads 1-4 run for 200 ms, then threads 5-8 run for 200 ms, and so on.
A processor core is capable of executing one thread at a time, so in a quad core, 4 threads are executed simultaneously. Not all the user-space threads are executed simultaneously; kernel threads also run, to schedule the next thread or to do other kernel tasks.
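One rough way to see this is to compare wall-clock time with the total CPU time the process consumed. A sketch follows (Linux/POSIX assumed; the 12 threads and the iteration count are arbitrary). On a 4-core machine the CPU time comes out at roughly 4x the wall time, because only about 4 of the 12 threads are executing at any instant.

    /* Sketch: 12 CPU-bound threads on a 4-core machine.  Total CPU time
       ends up ~4x the elapsed wall time, since only ~4 threads run at once. */
    #include <pthread.h>
    #include <stdio.h>
    #include <time.h>

    #define NTHREADS 12

    static void *burn(void *arg)
    {
        (void)arg;
        volatile unsigned long n = 0;
        while (n < 300000000UL)            /* arbitrary busy work */
            n++;
        return NULL;
    }

    static double now(clockid_t clk)
    {
        struct timespec ts;
        clock_gettime(clk, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        double wall0 = now(CLOCK_MONOTONIC);
        double cpu0  = now(CLOCK_PROCESS_CPUTIME_ID);

        pthread_t t[NTHREADS];
        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, burn, NULL);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);

        printf("wall: %.2f s   cpu: %.2f s\n",
               now(CLOCK_MONOTONIC) - wall0,
               now(CLOCK_PROCESS_CPUTIME_ID) - cpu0);
        return 0;
    }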

Does a process run threads in a sequential order?

The question is about multithreading. Say I have 3 threads: the main one, child1, and child2. Does the processor work on one thread for a short amount of time, then switch to another, and keep switching back and forth, or do the threads run without ever being stopped? Somewhere I read that a thread gets stopped before it finishes, then another thread is worked on and stopped, then back to thread1, and so forth. But that wouldn't make sense if threads get stopped, since the point of multithreading is that they are all concurrent and all run at the same time. How does the processor do that?
This is in .Net/C#.
The scenario you describe is the way the OS ran threads in the old days before multi-core: the OS scheduled threads sequentially based on their priorities. Nowadays, I suppose you have at least 2 cores, where 2 threads can run concurrently, and the 3rd thread will be scheduled by interrupting one of the others.
The scenario you're describing is correct, except that normally one thread will be running at any given time per processor core.
Simplified: if 3 threads are active on 4 cores, they will always be allowed to run, since there is always an available core for each of them; if 3 threads are active on 2 cores, only two can run at any time, so they will have to take turns.
Operating systems schedule threads to execute on the available CPU cores (either real or virtual). In the past, most computers had single core CPUs, and thus only one thread could be executed at a time. Modern CPUs are typically 2, 4, or 8 core systems. Some of these cores are virtual, like Intel's hyperthreading CPUs which have twice as many virtual cores as physical cores.
However, there are almost always more threads than CPU cores available, so the OS will prioritize all of the threads on the system in order to run them as efficiently as possible. The threads created by your process may or may not truly run in parallel over any given time span, but you should assume that they will.

Threads vs Cores

Say I have a processor that reports # cores = 4, # threads = 4, and no Hyper-threading support.
Does that mean I can run 4 simultaneous program/process (since a core is capable of running only one thread)?
Or does that mean I can run 4 x 4 = 16 program/process simultaneously?
From my digging, with no Hyper-threading there will be only 1 thread (process) per core. Correct me if I am wrong.
A thread differs from a process; a process can have many threads. A thread is a sequence of instructions that have a certain order, and a logical core can execute one such sequence at a time. The operating system distributes all the threads across the available logical cores, and if there are more threads than cores, the threads are placed in a queue and each core switches from one to another very fast.
It will look like all the threads run simultaneously, when actually the OS distributes CPU time among them.
Having multiple cores gives the advantage that fewer concurrent threads are placed on any single core, and less switching between threads = greater speed.
Hyper-threading creates 2 logical cores on 1 physical core, and makes switching between threads much faster.
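To see what the OS actually exposes, a small Linux sketch can read this process's affinity mask with sched_getaffinity() and, on recent kernels, check /sys/devices/system/cpu/smt/active to learn whether SMT/Hyper-threading is enabled (that sysfs path is an assumption about your kernel version):

    /* Sketch: count the logical CPUs this process may run on, and check
       whether SMT (Hyper-threading) is active.  Linux-specific. */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void)
    {
        cpu_set_t mask;
        if (sched_getaffinity(0, sizeof(mask), &mask) == 0)   /* 0 = this process */
            printf("logical CPUs usable: %d\n", CPU_COUNT(&mask));

        FILE *f = fopen("/sys/devices/system/cpu/smt/active", "r");  /* recent kernels */
        if (f) {
            int smt = 0;
            if (fscanf(f, "%d", &smt) == 1)
                printf("SMT active: %s\n", smt ? "yes" : "no");
            fclose(f);
        }
        return 0;
    }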
That's basically correct, with the obvious qualifier that most operating systems let you execute far more tasks simultaneously than there are cores or threads, which they accomplish by interleaving the execution of instructions.
A system with hyperthreading generally has twice as many hardware threads as physical cores.
The term thread is generally used to describe an operating system concept that has the potential to execute independently of other threads. Whether it does so depends on whether it is stuck waiting for some event (disk or screen I/O, a message queue), or whether there are enough physical CPUs (hyperthreaded or not) to allow it to run in the face of other non-waiting threads.
Hyperthreading is a CPU vendor term meaning a single core that can multiplex its attention between two computations. The easy way to think about a hyperthreaded core is as if you had two real CPUs, both slightly slower than what the manufacturer says the core can actually do.
Basically this is up to the OS. A thread is a high-level construct holding an instruction pointer, and the OS places a thread's execution on a suitable logical processor. So with 4 cores you can basically execute 4 instruction streams in parallel, whereas a thread simply contains information about which instructions to execute and where those instructions are placed in memory.
An application normally uses a single process during execution, and the OS switches between processes to give all processes "equal" processor time. When an application deploys multiple threads, the process occupies more than one slot for execution but shares memory between its threads.
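A tiny sketch of that shared-memory point (plain POSIX threads, nothing OS-specific): two threads in the same process update one global counter directly, with a mutex to keep the increments from racing, something separate processes could only do through explicit shared memory or IPC.

    /* Sketch: threads share their process's address space, so both can
       update the same global counter; the mutex prevents lost updates. */
    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;                               /* visible to all threads */
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    static void *add(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 1000000; i++) {
            pthread_mutex_lock(&lock);
            counter++;
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, NULL, add, NULL);
        pthread_create(&b, NULL, add, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("counter = %ld\n", counter);                /* prints 2000000 */
        return 0;
    }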
Normally you distinguish between concurrent and parallel execution. Parallel execution is when instructions actually execute on more than one logical processor at the same time, while concurrent execution is the frequent switching of a single logical processor between threads, giving the appearance of parallel execution.
