A thread is an independent execution path, able to run alongside other threads. C# supports parallel execution of code through multithreading.
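To make the idea concrete, here is a minimal sketch that starts a second thread; the class and method names are illustrative:

using System;
using System.Threading;

class ThreadDemo
{
    static void Main()
    {
        // Start a new thread that runs WriteY; Main continues in parallel.
        Thread t = new Thread(WriteY);
        t.Start();

        // Meanwhile, the main thread does its own work.
        for (int i = 0; i < 1000; i++) Console.Write("x");
    }

    static void WriteY()
    {
        for (int i = 0; i < 1000; i++) Console.Write("y");
    }
}

The output interleaves runs of x's and y's as the scheduler switches between the two threads.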
How Threading Works
Multithreading is managed internally by a
thread scheduler, a function the CLR typically delegates to the operating
system. A thread scheduler ensures all active threads are allocated appropriate
execution time, and that threads that are waiting or blocked do not consume CPU
time.
On a single-processor computer: the thread scheduler performs time-slicing, rapidly switching execution between each of the active threads.
On a multi-processor computer:
Multithreading is implemented with a mixture of time-slicing and genuine
concurrency, where different threads run code simultaneously on different CPUs.
It’s almost certain there will still be some time-slicing, because of the
operating system’s need to service its own threads — as well as those of other
applications.
A thread is said to be preempted when
its execution is interrupted due to an external factor such as time-slicing.
When to Use
- Maintaining a responsive user interface
- Making efficient use of an otherwise blocked CPU
- Parallel programming
- Speculative execution
Thread Pooling
Whenever you start a thread, a few hundred microseconds are spent
organizing such things as a fresh private local variable stack. The thread
pool cuts these overheads by sharing and recycling threads, allowing
multithreading to be applied at a very granular level without a performance
penalty.
The thread pool also keeps a lid on the total number of worker threads it
will run simultaneously. Too many active threads throttle the operating system
with administrative burden and render CPU caches ineffective. Once a limit is
reached, jobs queue up and start only when another finishes.
The thread pool starts out with
one thread in its pool. As tasks are assigned, the pool manager “injects” new
threads to cope with the extra concurrent workload, up to a maximum limit.
After a sufficient period of inactivity, the pool manager may “retire” threads
if it suspects that doing so will lead to better throughput.
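As a rough sketch of pool usage, ThreadPool.QueueUserWorkItem hands a delegate to a recycled pool thread rather than constructing a fresh Thread (class name illustrative):

using System;
using System.Threading;

class PoolDemo
{
    static void Main()
    {
        // The delegate runs on a recycled pool thread, avoiding thread-creation overhead.
        ThreadPool.QueueUserWorkItem(_ =>
            Console.WriteLine("On pool thread: " + Thread.CurrentThread.IsThreadPoolThread));

        Console.ReadLine();   // Keep the process alive so the queued work item can run.
    }
}

In Framework 4.0, Task.Factory.StartNew offers a higher-level route onto the same pool.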
You can set the upper limit of threads that the pool will create by calling ThreadPool.SetMaxThreads; the defaults are:
- 1023 in Framework 4.0 in a 32-bit environment
- 32768 in Framework 4.0 in a 64-bit environment
- 250 per core in Framework 3.5
- 25 per core in Framework 2.0
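A small sketch for inspecting and adjusting these limits (the ceiling of 100 is purely an example value):

using System;
using System.Threading;

class PoolLimits
{
    static void Main()
    {
        int workers, io;
        ThreadPool.GetMaxThreads(out workers, out io);
        Console.WriteLine("Max worker threads: {0}, max I/O threads: {1}", workers, io);

        // Lower the ceilings; 100 is an arbitrary illustrative figure.
        ThreadPool.SetMaxThreads(100, 100);
    }
}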
Synchronization
Synchronization constructs can be divided into four categories: blocking, locking, signaling, and nonblocking.
Blocking:
These wait for another thread to finish or for a period of time to elapse. Sleep, Join, and Task.Wait are simple blocking methods.
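For instance, a minimal sketch combining Sleep and Join (class name illustrative):

using System;
using System.Threading;

class BlockingDemo
{
    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            Thread.Sleep(1000);   // Block the worker for one second.
            Console.WriteLine("Worker done");
        });
        worker.Start();

        worker.Join();            // Block Main until the worker finishes.
        Console.WriteLine("Main done");
    }
}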
Locking:
These limit the number of threads that can perform some activity or execute a section of code at a time. Exclusive locking constructs include Monitor, Mutex, and SpinLock; nonexclusive locking constructs include Semaphore, SemaphoreSlim, and the reader/writer locks.
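As an illustrative sketch of exclusive locking, C#'s lock statement (shorthand for Monitor.Enter/Exit) serializes access to a shared counter:

using System;
using System.Threading;

class LockingDemo
{
    static readonly object _locker = new object();
    static int _count;

    static void Main()
    {
        Thread a = new Thread(Increment);
        Thread b = new Thread(Increment);
        a.Start(); b.Start();
        a.Join(); b.Join();
        Console.WriteLine(_count);   // Reliably 2000000 because increments are serialized.
    }

    static void Increment()
    {
        for (int i = 0; i < 1000000; i++)
            lock (_locker) _count++; // Only one thread at a time inside this block.
    }
}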
Signaling:
These allow a thread to pause until it receives a notification from another. Event wait handles and Monitor's Wait/Pulse methods are the main signaling constructs; Framework 4.0 introduces the CountdownEvent and Barrier classes.
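A minimal signaling sketch with a ManualResetEvent (the one-second delay is just for demonstration):

using System;
using System.Threading;

class SignalingDemo
{
    static readonly ManualResetEvent _ready = new ManualResetEvent(false);

    static void Main()
    {
        new Thread(() =>
        {
            Console.WriteLine("Waiting for signal...");
            _ready.WaitOne();     // Pause until another thread calls Set.
            Console.WriteLine("Got signal");
        }).Start();

        Thread.Sleep(1000);
        _ready.Set();             // Release the waiting thread.
    }
}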
Nonblocking:
These protect access to a common field by calling upon processor primitives. They include Thread.MemoryBarrier, Thread.VolatileRead, Thread.VolatileWrite, the volatile keyword, and the Interlocked class.
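For comparison with the locking example above, an Interlocked-based sketch achieves the same safe counter without blocking:

using System;
using System.Threading;

class NonBlockingDemo
{
    static int _count;

    static void Main()
    {
        Thread a = new Thread(Increment);
        Thread b = new Thread(Increment);
        a.Start(); b.Start();
        a.Join(); b.Join();
        Console.WriteLine(_count);   // Reliably 2000000; no lock required.
    }

    static void Increment()
    {
        for (int i = 0; i < 1000000; i++)
            Interlocked.Increment(ref _count);   // Atomic increment via a processor primitive.
    }
}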