
Semantic Scholar uses AI to extract papers important to this topic.

2019: This manuscript presents some new impossibility results on adversarial robustness in machine learning, a very important yet…

2018: This manuscript presents some new impossibility results on adversarial robustness in machine learning, a very important yet…

2014: One of the most important stages in many areas of engineering and applied sciences is modeling and the use of optimization…

2011 (Highly Cited): Differential privacy is a powerful tool for providing privacy-preserving noisy query answers over statistical databases. It…

2008 (Highly Cited): We consider two widely used notions in machine learning, namely: sparsity and algorithmic stability. Both notions are deemed…

2005: We present a preliminary analysis of the fundamental viability of meta-learning, revisiting the No Free Lunch (NFL) theorem. The…

2004: The sharpened No-Free-Lunch theorem (NFL theorem) states that, regardless of the performance measure, the performance of all…

2001: This note discusses the recent paper "Some technical remarks on the proof of the no free lunch theorem" by Köppen (2000). In that…

2001: The No Free Lunch Theorem of Optimization (NFLT) is an impossibility theorem telling us that a general-purpose universal…

1999: The no free lunch (NFL) theorem tells us that without any structural assumptions on an optimization problem, no algorithm can…
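The NFL claim in the 1999 snippet can be checked exhaustively on a toy problem. The sketch below is illustrative only and comes from none of the listed papers: it compares two fixed search orders over a three-point domain, averaged over every possible objective function f: {0,1,2} → {0,1}, using "evaluations until a point with f(x) = 1 is found" as the performance measure.

```python
from itertools import product

X = range(3)                        # tiny search space {0, 1, 2}
orders = {"ascending": [0, 1, 2],   # two fixed search orders ("algorithms")
          "descending": [2, 1, 0]}

def cost(order, f):
    """Evaluations needed to find a point with f(x) == 1 (worst case: |X|)."""
    for i, x in enumerate(order, start=1):
        if f[x] == 1:
            return i
    return len(order)

# Average the cost over ALL 2^3 = 8 objective functions f: X -> {0, 1}.
for name, order in orders.items():
    total = sum(cost(order, f) for f in product([0, 1], repeat=3))
    print(name, total / 8)          # both averages come out identical: 1.75
```

Both orders average 1.75 evaluations, as NFL predicts: with no structural assumption about f, any advantage one search order gains on some functions is exactly offset by losses on others.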