High-throughput computing

Known as: Throughput computing, High Throughput Computing, HTC
High-throughput computing (HTC) is a computer science term to describe the use of many computing resources over long periods of time to accomplish a… (Wikipedia)
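
To make the definition concrete, here is a minimal sketch of the HTC pattern in Python, assuming a hypothetical run_job task and using the standard concurrent.futures module. Real HTC systems such as HTCondor schedule independent jobs across many machines over long periods; a single process pool is used here only to illustrate the idea that success is measured as jobs completed per unit time, not per-job speed.

    import concurrent.futures
    import time

    def run_job(job_id):
        """One independent unit of work (hypothetical placeholder)."""
        time.sleep(0.1)  # stands in for a long-running simulation or analysis
        return job_id

    if __name__ == "__main__":
        start = time.time()
        # HTC emphasizes aggregate jobs completed over long periods rather
        # than the latency of any single job, so the pattern is simply to
        # farm out many independent tasks to whatever workers are available.
        with concurrent.futures.ProcessPoolExecutor() as pool:
            results = list(pool.map(run_job, range(200)))
        elapsed = time.time() - start
        print(f"{len(results)} jobs completed in {elapsed:.1f} s "
              f"({len(results) / elapsed:.1f} jobs/s sustained throughput)")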

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Review
2017
Recent observations of N2 fixation rates (NFR) and the presence of nitrogenase (nifH) genes from heterotrophic N2-fixing…
Highly Cited
2015
MOTIVATION: A large choice of tools exists for many standard tasks in the analysis of high-throughput sequencing (HTS) data…
Highly Cited
2014
Ribosomal Database Project (RDP; http://rdp.cme.msu.edu/) provides the research community with aligned and annotated rRNA gene…
Highly Cited
2013
MOTIVATION: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of…
Highly Cited
2012
The Parboil benchmarks are a set of throughput computing applications useful for studying the performance of throughput…
Highly Cited
2010
Recent advances in computing have led to an explosion in the amount of data being generated. Processing the ever-growing data in…
Highly Cited
2005
CMT processors offer a way to significantly improve the performance of computer systems. The return on investment for…
Highly Cited
2003
A high-throughput memory-efficient decoder architecture for low-density parity-check (LDPC) codes is proposed based on a novel…
Highly Cited
1998
Conventional resource management systems use a system model to describe resources and a centralized scheduler to control their…
Highly Cited
1997
For many experimental scientists, scientific progress and quality of research are strongly linked to computing throughput. In…