Multithreading Microprocessors

Multithreading has been proposed as a method of tolerating the long memory latencies found in multiprocessor systems. It is advantageous when the processor has large on-chip caches with an associativity of about two and an off-chip memory access cost of roughly 50 instructions. A small number of threads is enough, so the thread-switching mechanism need not be extraordinarily fast. Miss ratios are significantly lower under switch-on-miss multithreading because it introduces an unfair scheduling of threads: the running thread keeps the processor (and hence the cache) until it misses, which tends to limit interference between the threads' working sets.
A thread runs until it misses in the on-chip cache; the miss request is then issued to the off-chip memory system and the processor switches to another thread. The cost-effectiveness of a multithreaded processor comes from the fact that the large intermediate-level cache can be eliminated while the processor and the memory system are used concurrently.
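As a rough illustration of the switch-on-miss mechanism described above, the toy C sketch below simulates a few hardware contexts that each run until a (simulated) cache miss, then hand the processor to another ready context while the miss is serviced off chip. The context count, miss pattern and 50-cycle latency are assumptions chosen only to mirror the numbers in this post, not a model of any real machine.

/* Toy sketch of switch-on-miss multithreading (illustrative only).
 * Each hardware context runs until its access misses in the on-chip cache;
 * the miss is sent to off-chip memory and the processor switches to the
 * next ready context, overlapping computation with the memory access. */
#include <stdio.h>
#include <stdbool.h>

#define NUM_CONTEXTS  4        /* hardware thread contexts (assumed)   */
#define MISS_LATENCY  50       /* off-chip access cost, in cycles      */

typedef struct {
    int  pc;                   /* simplified program counter           */
    int  stall_until;          /* cycle when outstanding miss returns  */
    bool ready;
} Context;

/* Hypothetical cache probe: returns true on an on-chip hit (~10% misses). */
static bool cache_hit(int pc) { return (pc % 10) != 0; }

int main(void) {
    Context ctx[NUM_CONTEXTS] = {0};
    for (int i = 0; i < NUM_CONTEXTS; i++) ctx[i].ready = true;

    int current = 0;
    for (int cycle = 0; cycle < 200; cycle++) {
        /* Wake contexts whose miss has returned from memory. */
        for (int i = 0; i < NUM_CONTEXTS; i++)
            if (!ctx[i].ready && cycle >= ctx[i].stall_until)
                ctx[i].ready = true;

        if (!ctx[current].ready) {
            /* Switch to another ready context, if any (else the cycle idles). */
            for (int i = 0; i < NUM_CONTEXTS; i++)
                if (ctx[i].ready) { current = i; break; }
        }

        if (ctx[current].ready) {
            ctx[current].pc++;
            if (!cache_hit(ctx[current].pc)) {
                /* Miss: issue request off chip and mark the context stalled. */
                ctx[current].ready = false;
                ctx[current].stall_until = cycle + MISS_LATENCY;
            }
        }
    }
    printf("final pcs:");
    for (int i = 0; i < NUM_CONTEXTS; i++) printf(" %d", ctx[i].pc);
    printf("\n");
    return 0;
}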
Data-Driven Multithreading (DDM) is a non-blocking multithreading execution model that tolerates inter-node latency by scheduling threads for execution based on data availability. Scheduling on data availability can also be exploited by cache-management policies that significantly reduce cache misses; one such policy, called CacheFlow, fires a thread for execution only if its data has already been placed in the cache. The main component is a memory-mapped hardware module attached directly to the processor's bus, known as the Thread Synchronization Unit (TSU), which is responsible for thread scheduling.
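The small C sketch below is a software analogue, written only for illustration, of the data-driven firing rule: each thread template keeps a count of pending inputs, and when the count reaches zero the thread is pushed onto a ready queue. This is roughly the scheduling decision the TSU makes in hardware, and the one CacheFlow refines by also requiring the data to be resident in the cache; none of the names or structures here come from the actual DDM implementation.

/* Minimal software analogue of data-driven thread scheduling.
 * A thread template counts outstanding inputs; when the count reaches
 * zero (all data available), the thread is pushed onto a ready queue
 * and can be fired. Names and structure are illustrative assumptions. */
#include <stdio.h>

#define MAX_THREADS 8

typedef struct {
    int  id;
    int  pending_inputs;       /* producers that have not yet delivered */
    void (*body)(int id);      /* code to run once all inputs arrived   */
} DThread;

static DThread table[MAX_THREADS];
static int ready_q[MAX_THREADS], head = 0, tail = 0;

/* Called when a producer delivers data to consumer thread `id`. */
static void data_arrived(int id) {
    if (--table[id].pending_inputs == 0)
        ready_q[tail++] = id;              /* thread becomes fireable */
}

static void run_body(int id) { printf("firing thread %d\n", id); }

int main(void) {
    /* Thread 0 needs two inputs, thread 1 needs one. */
    table[0] = (DThread){0, 2, run_body};
    table[1] = (DThread){1, 1, run_body};

    data_arrived(1);       /* thread 1 now ready */
    data_arrived(0);
    data_arrived(0);       /* thread 0 now ready */

    while (head < tail) {  /* fire ready threads in arrival order */
        int id = ready_q[head++];
        table[id].body(id);
    }
    return 0;
}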

Studies of multithreaded architectures have primarily focused on multiprocessor systems in which several threads of parallel programs are maintained on each processor. Adding more threads leads to very fine-grained time sharing of the processor, network and memory-system resources.

Multithreaded cache behaviour
The miss rate determines the thread run length, which in turn determines the number of independent threads required to mask a given memory latency (see the sketch after this list). Several factors must be considered when designing a switch-on-miss cache:
- cache size
- cache partitioning
- cache associativity
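A back-of-the-envelope calculation shows why a small number of threads suffices: the average run length between misses is roughly 1 / miss rate instructions, and about 1 + latency / run length contexts are needed to keep the processor busy. The 1% miss rate below is an assumed figure; the 50-cycle latency mirrors the memory cost quoted earlier in this post.

/* Back-of-the-envelope sketch: run length ~ 1 / miss_rate instructions,
 * contexts needed ~ 1 + latency / run_length. Numbers are illustrative. */
#include <stdio.h>

int main(void) {
    double miss_rate   = 0.01;   /* misses per instruction (assumed)  */
    double latency     = 50.0;   /* off-chip access cost in cycles    */

    double run_length  = 1.0 / miss_rate;             /* ~100 instructions   */
    double threads_req = 1.0 + latency / run_length;  /* ~1.5 -> 2 contexts  */

    printf("run length  : %.0f instructions\n", run_length);
    printf("contexts req: %.1f (round up)\n", threads_req);
    return 0;
}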

Given the current growth rate of the gap between processor and memory speeds, multithreaded processors are likely to become superior to conventional single-threaded designs in the near future.

full seminar report download:
http://seminarsprojects.in/attachment.php?aid=971