
Task parallelism

Known as: control parallelism, task-level parallelism (TLP)
Task parallelism (also known as function parallelism and control parallelism) is a form of parallelization of computer code across multiple… (Wikipedia)
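As a minimal sketch of the idea above (not taken from the page itself), the following Python example runs two unrelated functions as separate concurrent tasks using the standard-library concurrent.futures module; the function names and inputs are illustrative placeholders only.

from concurrent.futures import ThreadPoolExecutor


def build_index(documents):
    # Task A: one kind of work (e.g., indexing strings).
    return {i: doc.lower() for i, doc in enumerate(documents)}


def compute_mean(numbers):
    # Task B: a different kind of work (e.g., aggregation).
    return sum(numbers) / len(numbers)


if __name__ == "__main__":
    docs = ["Alpha", "Beta", "Gamma"]
    nums = [1, 2, 3, 4]

    # The two distinct tasks are submitted independently and may
    # execute on different threads at the same time.
    with ThreadPoolExecutor(max_workers=2) as pool:
        index_future = pool.submit(build_index, docs)
        mean_future = pool.submit(compute_mean, nums)
        print(index_future.result())
        print(mean_future.result())

The point of the sketch is the distinguishing feature of task (control) parallelism: the concurrent units of work perform different computations, rather than the same computation on different pieces of data.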

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2007
Many sorting algorithms have been studied in the past, but there are only a few algorithms that can effectively exploit both SIMD… 
Highly Cited · 2006
In this paper, we show how 3D stacking technology can be used to implement a simple, low-power, high-performance chip… 
Highly Cited · 2006
Previous proposals for power-aware thread-level parallelism on chip multiprocessors (CMPs) mostly focus on multiprogrammed… 
Highly Cited · 2005
Chip multi-threaded (CMT) processors provide support for many simultaneous hardware threads of execution in various ways… 
Highly Cited · 2005
Over the past few years, the ARM reduced-instruction-set computing (RISC) processor has evolved to offer a family of chips that… 
Highly Cited · 2004
This paper provides a general technical description of several types of floating platforms for wind turbines. Platform topologies… 
Highly Cited · 2004
Highly Cited · 1999
Much emphasis is now being placed on chip-multiprocessor (CMP) architectures for exploiting thread-level parallelism in… 
Highly Cited · 1996
This paper presents a new concurrent multiple-threaded architectural model, called superthreading, for exploiting thread-level…
Highly Cited · 1993
For many applications, achieving good performance on a private memory parallel computer requires exploiting data parallelism as…
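Several of the snippets above (for example, the 1993 and 2007 entries) contrast task parallelism with data parallelism. As a hedged companion to the earlier sketch, this Python fragment shows the data-parallel counterpart: the same operation mapped over partitions of one data set; all names are illustrative.

from concurrent.futures import ProcessPoolExecutor


def square_chunk(chunk):
    # Data parallelism: the *same* operation is applied to each
    # partition of a single data set.
    return [x * x for x in chunk]


if __name__ == "__main__":
    data = list(range(8))
    chunks = [data[0:4], data[4:8]]

    with ProcessPoolExecutor(max_workers=2) as pool:
        results = pool.map(square_chunk, chunks)

    # Flatten the per-chunk results back into one list.
    print([y for part in results for y in part])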