Task parallelism

Also known as: control parallelism, task-level parallelism, TLP
Task parallelism (also known as function parallelism and control parallelism) is a form of parallelization of computer code across multiple… 
Source: Wikipedia
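To make the definition concrete, here is a minimal sketch of task parallelism in Python: two *different* functions run concurrently over the same data, in contrast to data parallelism, where one function is applied across partitions of the data. The function names (`summarize`, `extremes`, `run_tasks`) are illustrative and not taken from any of the papers below.

```python
# Task parallelism: distinct tasks (functions) execute concurrently.
from concurrent.futures import ThreadPoolExecutor

def summarize(numbers):
    # Task A: compute aggregate statistics.
    return {"total": sum(numbers), "count": len(numbers)}

def extremes(numbers):
    # Task B: find the minimum and maximum.
    return {"min": min(numbers), "max": max(numbers)}

def run_tasks(numbers):
    # Submit two distinct tasks; each may run on its own thread.
    with ThreadPoolExecutor(max_workers=2) as pool:
        a = pool.submit(summarize, numbers)
        b = pool.submit(extremes, numbers)
        return a.result(), b.result()

if __name__ == "__main__":
    print(run_tasks([3, 1, 4, 1, 5]))
```

Because the two tasks are independent, a scheduler is free to run them on separate cores; the same structure applies whether the units of work are threads, processes, or nodes in a cluster.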

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2012
This paper presents an implementation of the Advanced Encryption Standard (AES) algorithm using parallel computing. Most of the research for… 
2010
Finding an efficient schedule for a task graph on several processors is a trade-off between maximising concurrency and minimising… 
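The trade-off described above can be illustrated with a greedy list scheduler. This is an illustrative sketch, not the algorithm from the cited paper: tasks are visited in topological order and each is assigned to whichever processor becomes free earliest; the `durations`/`deps` structure and function name `list_schedule` are assumptions for the example.

```python
# Greedy list scheduling of a task graph onto several processors.
def list_schedule(durations, deps, num_procs):
    # durations: {task: time}; deps: {task: set of predecessor tasks}
    finish = {}                      # task -> finish time
    proc_free = [0.0] * num_procs    # when each processor is next free
    # Topological order via a simple Kahn-style sweep (assumes a DAG).
    order, pending = [], {t: set(d) for t, d in deps.items()}
    while pending:
        ready = [t for t, d in pending.items() if not d]
        for t in sorted(ready):
            order.append(t)
            del pending[t]
        for d in pending.values():
            d.difference_update(ready)
    for t in order:
        # A task may start once all predecessors have finished.
        ready_at = max((finish[p] for p in deps[t]), default=0.0)
        # Assign to the processor that is free earliest.
        proc = min(range(num_procs), key=lambda i: proc_free[i])
        start = max(ready_at, proc_free[proc])
        finish[t] = start + durations[t]
        proc_free[proc] = finish[t]
    return finish
```

For two independent tasks A (2 units) and B (3 units) followed by C (1 unit) on two processors, A and B run concurrently and C starts at time 3, giving a makespan of 4; with one processor the same graph takes 6. The greedy choice maximises concurrency locally but, as the paper notes, can lose to schedules that accept less concurrency to reduce communication.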
2009
The performance of single-threaded programs and legacy binary code is of critical importance in many everyday applications… 
2007
An important question is whether emerging and future applications exhibit sufficient parallelism, in particular thread-level… 
2007
Multicore microprocessors have been largely motivated by the diminishing returns in performance and the increased power… 
2007
In this paper, a genetic algorithm (GA) optimization technique is applied to design Flexible AC Transmission System (FACTS)-based… 
1991
This paper presents Proteus, an architecture-independent language suitable for prototyping parallel and distributed programs… 
1976
The SIFT (Software Implemented Fault Tolerance) computer is a fault-tolerant computer in which fault tolerance is achieved…