Operating System: Question Set – 17
What are thread pools?
A thread pool is a collection of pre-created threads maintained by the runtime or application. Instead of spawning a new thread for every task, work is submitted to the pool and executed by the existing threads, which reduces the overhead of repeatedly creating and destroying threads (a short sketch follows the list below).
- Advantages:
  - Efficient resource management.
  - Limits the number of concurrent threads to avoid resource exhaustion.
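The idea can be sketched with Java's standard `ExecutorService`; the pool size and task count below are illustrative choices, not part of the question set:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolSketch {
    public static void main(String[] args) throws InterruptedException {
        // Pool of 4 pre-created worker threads; tasks are queued and reuse them.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 10; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    "Task " + taskId + " on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                            // stop accepting new tasks
        pool.awaitTermination(5, TimeUnit.SECONDS); // wait for queued tasks to finish
    }
}
```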
How do threads communicate?
Threads within the same process can communicate easily using:
- Shared memory.
- Global variables.
- Thread-specific (thread-local) data, used when each thread needs its own private copy rather than overwriting shared data.
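A minimal Java sketch of these mechanisms, using a shared counter for the shared-memory case and `ThreadLocal` for thread-specific data (class and variable names are illustrative):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ThreadCommunicationSketch {
    // Shared state visible to all threads in the process (acts like a "global").
    private static final AtomicInteger sharedCounter = new AtomicInteger(0);

    // Thread-specific data: each thread sees its own independent copy.
    private static final ThreadLocal<Integer> perThreadValue =
            ThreadLocal.withInitial(() -> 0);

    public static void main(String[] args) throws InterruptedException {
        Runnable worker = () -> {
            sharedCounter.incrementAndGet();              // communicate via shared memory
            perThreadValue.set(perThreadValue.get() + 1); // private to this thread only
        };

        Thread t1 = new Thread(worker);
        Thread t2 = new Thread(worker);
        t1.start(); t2.start();
        t1.join();  t2.join();

        System.out.println("Shared counter: " + sharedCounter.get());              // 2
        System.out.println("Main thread's local value: " + perThreadValue.get());  // still 0
    }
}
```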
What are examples of multithreaded applications?
- Web Servers: Handle multiple client requests simultaneously.
- Media Players: Separate threads for decoding, playback, and UI.
- Online Games: Threads for networking, rendering, and game logic.
- Database Servers: Concurrent processing of multiple queries.
What is a context switch in threading?
A context switch occurs when the CPU switches from one thread to another. This involves saving the state of the current thread and loading the state of the next thread.
- Overhead: A thread context switch is cheaper than a process context switch because threads of the same process share the same address space, so the memory mappings do not need to be replaced.
What is the difference between a daemon thread and a user thread?
- Daemon Thread: Runs in the background, providing support services for user threads (e.g., garbage collection). It is terminated automatically when all user threads finish.
- User Thread: A thread explicitly created by the user. The program does not terminate until all user threads have completed.
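In Java this distinction maps directly onto `Thread.setDaemon(true)`; a small illustrative sketch (the timings are arbitrary):

```java
public class DaemonThreadSketch {
    public static void main(String[] args) throws InterruptedException {
        Thread daemon = new Thread(() -> {
            while (true) {
                System.out.println("background housekeeping...");
                try { Thread.sleep(200); } catch (InterruptedException e) { return; }
            }
        });
        daemon.setDaemon(true);   // must be set before start(); marks it a daemon thread
        daemon.start();

        // Main (user) thread does some work, then exits.
        Thread.sleep(600);
        System.out.println("last user thread finished; the JVM exits and the daemon is killed");
    }
}
```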
What is thread affinity?
Thread affinity refers to binding a thread to a specific CPU core. This can improve performance by:
- Reducing cache misses.
- Ensuring consistent execution on the same processor.
What is the role of a thread scheduler?
The thread scheduler is responsible for:
- Allocating CPU time to threads.
- Managing thread priorities.
- Handling blocking and waking up threads.
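The scheduler itself lives inside the OS or runtime and is not programmed directly; the sketch below only shows a thread blocking on `wait()` and later being made runnable again by `notify()`, the kind of state transition the scheduler manages (names and timings are illustrative):

```java
public class SchedulerBlockingSketch {
    private static final Object lock = new Object();
    private static boolean ready = false;

    public static void main(String[] args) throws InterruptedException {
        Thread waiter = new Thread(() -> {
            synchronized (lock) {
                while (!ready) {
                    try {
                        lock.wait();   // blocks: the scheduler takes this thread off the run queue
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
            System.out.println("woken up and rescheduled");
        });
        waiter.start();

        Thread.sleep(200);             // give the waiter time to block first
        synchronized (lock) {
            ready = true;
            lock.notify();             // the waiter becomes runnable and is scheduled again
        }
        waiter.join();
    }
}
```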
How does threading differ in multicore systems?
In multicore systems:
- Parallelism: Multiple threads can truly execute simultaneously on different cores.
- Synchronization: Becomes more critical as shared resources may lead to contention.
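As an illustration, the sketch below partitions a summation across as many threads as there are cores, giving each thread its own partial result so the threads can run in parallel without contending on shared state (the workload and names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MulticoreSumSketch {
    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        long n = 100_000_000L;
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Each core gets its own slice and its own partial sum, avoiding contention.
        List<Future<Long>> partials = new ArrayList<>();
        long chunk = n / cores;
        for (int i = 0; i < cores; i++) {
            long start = i * chunk + 1;
            long end = (i == cores - 1) ? n : (i + 1) * chunk;
            partials.add(pool.submit(() -> {
                long sum = 0;
                for (long x = start; x <= end; x++) sum += x;
                return sum;
            }));
        }

        long total = 0;
        for (Future<Long> f : partials) total += f.get();
        pool.shutdown();
        System.out.println("sum = " + total);   // equals n*(n+1)/2
    }
}
```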
What is a lightweight process (LWP)?
Some operating systems provide a thread-like entity called a lightweight process (LWP) that bridges user-level and kernel-level threads. Multiple user-level threads can be multiplexed onto an equal or smaller number of LWPs, each of which is backed by a kernel thread.
What is thread priority?
Thread priority influences the order in which threads are scheduled for execution. Higher-priority threads are generally given CPU time before lower-priority threads, although the precise behavior depends on the scheduling algorithm, and priority is usually treated as a hint rather than a guarantee.
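In Java, priority is set with `Thread.setPriority()`; the sketch below is illustrative only, since the actual effect of priorities is platform-dependent:

```java
public class ThreadPrioritySketch {
    public static void main(String[] args) {
        Runnable busyWork = () -> {
            long count = 0;
            for (long i = 0; i < 500_000_000L; i++) count++;
            System.out.println(Thread.currentThread().getName() + " finished");
        };

        Thread low = new Thread(busyWork, "low-priority");
        Thread high = new Thread(busyWork, "high-priority");

        low.setPriority(Thread.MIN_PRIORITY);   // 1
        high.setPriority(Thread.MAX_PRIORITY);  // 10

        low.start();
        high.start();
        // Priority is only a hint to the scheduler; the finishing order may vary by platform.
    }
}
```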
What is a thread group?
A thread group collects related threads together to simplify administration and resource sharing. Programming frameworks frequently use thread groups to control several threads at once (e.g., interrupting or stopping all threads in a group).
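Java's `ThreadGroup` class illustrates the idea (modern code usually prefers executor services, but the API still exists); the group and thread names below are illustrative:

```java
public class ThreadGroupSketch {
    public static void main(String[] args) throws InterruptedException {
        ThreadGroup workers = new ThreadGroup("workers");

        // Each worker loops until it is interrupted.
        Runnable loop = () -> {
            while (!Thread.currentThread().isInterrupted()) {
                try { Thread.sleep(100); } catch (InterruptedException e) { return; }
            }
        };

        for (int i = 0; i < 3; i++) {
            new Thread(workers, loop, "worker-" + i).start();
        }

        System.out.println("active threads in group: " + workers.activeCount());
        Thread.sleep(300);
        workers.interrupt();   // signal every thread in the group to stop
    }
}
```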
What is reentrant code, and why is it important for threads?
Reentrant code is code that can be safely executed by multiple threads at the same time (or re-entered before a previous invocation has finished). It relies on parameters, local variables, or thread-local storage rather than shared or global mutable state. Reentrancy is essential for thread safety in multithreaded environments.
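A small illustrative Java contrast between a reentrant function (locals and parameters only) and a non-reentrant one that leans on shared mutable state (the names are made up):

```java
public class ReentrantSketch {
    // NOT reentrant: the result depends on shared mutable state that another
    // thread could overwrite between the two statements.
    private static int sharedTemp;
    static int squareNonReentrant(int x) {
        sharedTemp = x;
        return sharedTemp * sharedTemp;
    }

    // Reentrant: uses only its parameter and local variables, so any number
    // of threads can execute it concurrently without interfering.
    static int squareReentrant(int x) {
        int local = x;
        return local * local;
    }

    public static void main(String[] args) throws InterruptedException {
        Thread a = new Thread(() -> System.out.println(squareReentrant(3)));
        Thread b = new Thread(() -> System.out.println(squareReentrant(4)));
        a.start(); b.start();
        a.join();  b.join();
    }
}
```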