Robust algorithms for learning from modern data

CORDIS provides links to public deliverables and publications of HORIZON projects.

Links to deliverables and publications from FP7 projects, as well as links to some specific result types such as datasets and software, are dynamically retrieved from OpenAIRE.

Publications

Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation

Author(s): Etienne de Klerk; François Glineur; Adrien B. Taylor
Published in: SIAM Journal on Optimization, Issue 21, 2020, ISSN 1052-6234
Publisher: Society for Industrial and Applied Mathematics
DOI: 10.1137/19m1281368
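
As a rough illustration of the kind of method analysed in the entry above (not the paper's semidefinite-programming performance-estimation machinery itself), the Python sketch below runs gradient descent on a strongly convex quadratic when the gradient is only available up to a bounded relative error; the matrix, step size and error level are arbitrary choices for the toy example.

import numpy as np

rng = np.random.default_rng(0)
d = 20
A = np.diag(np.linspace(1.0, 10.0, d))   # quadratic f(x) = 0.5 * x^T A x, mu = 1, L = 10
x = rng.standard_normal(d)
eps = 0.2                                # relative error level on the gradient
step = 2.0 / (1.0 + 10.0)                # classical step size 2 / (mu + L)
for _ in range(100):
    g = A @ x                            # true gradient
    noise = rng.standard_normal(d)
    noise *= eps * np.linalg.norm(g) / np.linalg.norm(noise)   # ||noise|| = eps * ||g||
    x = x - step * (g + noise)           # inexact gradient step
print(np.linalg.norm(x))                 # distance to the minimizer x* = 0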

On the oracle complexity of smooth strongly convex minimization

Author(s): Y. Drori; Adrien B. Taylor
Published in: Journal of Complexity, Issue 35, 2021, ISSN 1076-2787
Publisher: Elsevier
DOI: 10.1016/j.jco.2021.101590

On the Effectiveness of Richardson Extrapolation in Data Science

Author(s): F. Bach.
Published in: SIAM Journal on Mathematics of Data Science, 2021, ISSN 2577-0187
Publisher: SIAM
DOI: 10.1137/21m1397349
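
The entry above concerns Richardson extrapolation; the toy Python sketch below shows the basic idea on a scalar sequence with an O(1/t) bias (the combination 2·x_{2t} − x_t cancels the leading error term). It is a generic illustration, not the specific estimators or step-size schemes studied in the paper.

def x_seq(t, limit=1.0, c=3.0):
    # toy sequence x_t = limit + c/t + O(1/t^2)
    return limit + c / t + 0.5 / t**2

for t in [10, 100, 1000]:
    plain = x_seq(2 * t)                          # error of order 1/t
    richardson = 2 * x_seq(2 * t) - x_seq(t)      # leading 1/t term cancels
    print(t, abs(plain - 1.0), abs(richardson - 1.0))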

A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives

Author(s): Mathieu Barré, Adrien Taylor & Francis Bach
Published in: Open Journal of Mathematical Optimization, 2022, ISSN 2777-5860
Publisher: Mersenne
DOI: 10.5802/ojmo.12
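
For context on the entry above, here is a standard accelerated forward-backward (FISTA-type) iteration for the lasso with an exact soft-thresholding proximal step. The paper studies the behaviour of such accelerated schemes when the proximal step is computed only approximately; this baseline sketch does not model the inexactness.

import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=300):
    # accelerated forward-backward for min_x 0.5*||A x - b||^2 + lam*||x||_1
    L = np.linalg.norm(A, 2) ** 2                      # Lipschitz constant of the smooth gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - b)                       # forward (gradient) step ...
        x_new = soft_threshold(z - grad / L, lam / L)  # ... then backward (prox) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov-type extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = rng.standard_normal(100) * (rng.random(100) < 0.1)   # sparse ground truth
x_hat = fista(A, A @ x_true, lam=0.1)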

Accelerated Gossip in Networks of Given Dimension using Jacobi Polynomial Iterations

Author(s): Raphaël Berthier; Francis Bach; Pierre Gaillard
Published in: SIAM Journal on Mathematics of Data Science, Issue 13, 2020, ISSN 2577-0187
Publisher: SIAM
DOI: 10.1137/19m1244822
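
To make the setting of the entry above concrete, the Python sketch below runs plain synchronous gossip averaging on a cycle graph (every node repeatedly averages with its neighbours and converges to the global mean). The paper's contribution is an accelerated, Jacobi-polynomial-based recursion tuned to the graph's dimension, which is not implemented here.

import numpy as np

n = 20
W = np.zeros((n, n))                 # gossip matrix of a cycle graph
for i in range(n):
    W[i, i] = 1.0 / 3.0
    W[i, (i - 1) % n] = 1.0 / 3.0
    W[i, (i + 1) % n] = 1.0 / 3.0

x = np.random.default_rng(1).standard_normal(n)   # initial values held by the nodes
target = x.mean()
for _ in range(200):
    x = W @ x                        # one synchronous gossip round
print(np.max(np.abs(x - target)))    # distance to consensus on the mean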

Efficient first-order methods for convex minimization: a constructive approach

Author(s): Yoel Drori, Adrien B. Taylor
Published in: Mathematical Programming, 2018, ISSN 0025-5610
Publisher: Springer Verlag
DOI: 10.1007/s10107-019-01410-2

Explicit Regularization of Stochastic Gradient Methods through Duality.

Author(s): A. Raj, F. Bach.
Published in: Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS), 2021
Publisher: AISTATS

A Dimension-free Computational Upper-bound for Smooth Optimal Transport Estimation.

Author(s): A. Vacher, B. Muzellec, A. Rudi, F. Bach, F.-X. Vialard.
Published in: Proceedings of the Conference on Learning Theory (COLT), 2021
Publisher: COLT

Dual-Free Stochastic Decentralized Optimization with Variance Reduction.

Author(s): H. Hendrikx, F. Bach, L. Massoulié.
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2020
Publisher: NeurIPS

Non-parametric Models for Non-negative Functions.

Author(s): U. Marteau-Ferey, F. Bach, A. Rudi.
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2020
Publisher: NeurIPS

Batch Normalization Provably Avoids Rank Collapse for Randomly Initialised Deep Networks.

Author(s): H. Daneshmand, J. Kohler, F. Bach, T. Hofmann, A. Lucchi.
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2020
Publisher: NeurIPS
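
Since the entry above studies batch normalization in randomly initialised networks, the following Python snippet spells out the batch-normalization operation itself (per-feature standardisation of a mini-batch followed by an affine rescaling); it illustrates the operation the paper analyses, not the paper's rank analysis.

import numpy as np

def batch_norm(H, gamma=1.0, beta=0.0, eps=1e-5):
    # standardise each feature over the mini-batch, then rescale and shift
    mu = H.mean(axis=0)
    var = H.var(axis=0)
    return gamma * (H - mu) / np.sqrt(var + eps) + beta

H = np.random.default_rng(2).standard_normal((32, 8))       # mini-batch: 32 samples, 8 features
out = batch_norm(H)
print(out.mean(axis=0).round(6), out.std(axis=0).round(3))  # ~0 mean, ~1 std per feature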

A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip

Author(s): M. Even, R. Berthier, F. Bach, N. Flammarion, P. Gaillard, H. Hendrikx, L. Massoulié, A. Taylor.
Published in: Advances in Neural Information Processing Systems, 2021
Publisher: NeurIPS

Learning with Differentiable Perturbed Optimizers

Author(s): Berthet, Quentin; Blondel, Mathieu; Teboul, Olivier; Cuturi, Marco; Vert, Jean-Philippe; Bach, Francis
Published in: Advances in Neural Information Processing Systems (NeurIPS), Issue 39, 2020
Publisher: NeurIPS
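
The entry above builds on perturbing the inputs of discrete optimizers to make them differentiable; the Python sketch below is a minimal Monte Carlo illustration of that perturbation idea for a plain argmax (noisy scores, one-hot solutions averaged into a smooth output). It is not the paper's full framework, which also covers gradients, Fenchel duality and general structured optimizers.

import numpy as np

def perturbed_argmax(theta, eps=0.5, n_samples=1000, seed=0):
    # Monte Carlo estimate of E[ one_hot(argmax(theta + eps * Z)) ], Z standard Gaussian
    rng = np.random.default_rng(seed)
    out = np.zeros_like(theta, dtype=float)
    for _ in range(n_samples):
        z = rng.standard_normal(theta.shape)
        out[np.argmax(theta + eps * z)] += 1.0
    return out / n_samples            # a smooth, softmax-like relaxation of argmax

print(perturbed_argmax(np.array([1.0, 1.2, 0.3])))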

Relating Leverage Scores and Density using Regularized Christoffel Functions

Author(s): Pauwels, Edouard; Bach, Francis; Vert, Jean-Philippe
Published in: Advances in Neural Information Processing Systems (NIPS), 2018
Publisher: NIPS Foundation

Statistical Optimality of Stochastic Gradient Descent on Hard Learning Problems through Multiple Passes

Author(s): Loucas Pillaud-Vivien, Alessandro Rudi, Francis Bach
Published in: Advances in Neural Information Processing Systems (NIPS), 2018
Publisher: NIPS Foundation
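
As a small illustration of the multi-pass regime in the entry above, the Python sketch below runs several passes of SGD with iterate averaging on an ordinary least-squares problem; the paper's analysis is for kernel / non-parametric least squares, so this finite-dimensional toy only mirrors the algorithmic setup (all problem sizes and step sizes are arbitrary).

import numpy as np

rng = np.random.default_rng(3)
n, d = 200, 10
theta_star = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = X @ theta_star + 0.1 * rng.standard_normal(n)

theta = np.zeros(d)
theta_avg = np.zeros(d)
step, k = 0.01, 0
for epoch in range(5):                        # several passes over the same n points
    for i in rng.permutation(n):
        grad = (X[i] @ theta - y[i]) * X[i]   # stochastic gradient of 0.5*(x_i^T theta - y_i)^2
        theta -= step * grad
        k += 1
        theta_avg += (theta - theta_avg) / k  # running (Polyak-Ruppert) average
print(np.linalg.norm(theta_avg - theta_star))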

On Fast Leverage Score Sampling and Optimal Learning

Author(s): Rudi, Alessandro; Calandriello, Daniele; Carratino, Luigi; Rosasco, Lorenzo
Published in: Advances in Neural Information Processing Systems (NIPS), Issue 28, 2018
Publisher: NIPS Foundation
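
The entry above (and the Christoffel-function entry a few items earlier) revolves around statistical leverage scores; the short Python sketch below computes exact leverage scores of a tall matrix via a thin QR factorization. The papers work with fast approximate leverage scores for kernel methods, which this exact computation does not attempt.

import numpy as np

def leverage_scores(X):
    # exact leverage scores: diagonal of X (X^T X)^{-1} X^T, via a thin QR factorization
    Q, _ = np.linalg.qr(X)
    return np.sum(Q**2, axis=1)

X = np.random.default_rng(4).standard_normal((100, 5))
scores = leverage_scores(X)
print(scores.sum())                  # sums to rank(X) = 5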

Exponential convergence of testing error for stochastic gradient methods

Author(s): Pillaud-Vivien, Loucas; Rudi, Alessandro; Bach, Francis
Published in: Proceedings of COLT, Issue 5, 2018
Publisher: COLT

On the Global Convergence of Gradient Descent for Over-parameterized Models using Optimal Transport

Author(s): Chizat, Lenaic; Bach, Francis
Published in: Advances in Neural Information Processing Systems (NIPS), Issue 1, 2018
Publisher: NIPS Foundation

Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions

Author(s): Adrien Taylor, Francis Bach
Published in: Proceedings of the Conference on Learning Theory (COLT), 2019
Publisher: COLT

Affine Invariant Covariance Estimation for Heavy-Tailed Distributions

Author(s): Ostrovskii, Dmitrii; Rudi, Alessandro
Published in: COLT 2019 - 32nd Annual Conference on Learning Theory, Issue 5, 2019
Publisher: N/A

An accelerated decentralized stochastic proximal algorithm for finite Sums

Author(s): Hadrien Hendrikx, Francis Bach, Laurent Massoulié
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2019
Publisher: N/A

Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses

Author(s): Marteau-Ferey, Ulysse; Bach, Francis; Rudi, Alessandro
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2019
Publisher: N/A

On Lazy Training in Differentiable Programming

Author(s): Chizat, Lenaic; Oyallon, Edouard; Bach, Francis
Published in: NeurIPS 2019 - 33rd Conference on Neural Information Processing Systems, Vancouver, Canada, Issue 15, Dec 2019
Publisher: N/A

Deep Equals Shallow for ReLU Networks in Kernel Regimes.

Author(s): A. Bietti, F. Bach.
Published in: Proceedings of the International Conference on Learning Representations (ICLR), 2021
Publisher: ICLR

Batch Normalization Orthogonalizes Representations in Deep Random Networks

Author(s): H. Daneshmand, A. Joudaki, F. Bach.
Published in: Advances in Neural Information Processing Systems (NeurIPS), 2021
Publisher: NeurIPS

Fast rates in structured prediction.

Author(s): V. Cabannes, F. Bach, A. Rudi.
Published in: Proceedings of the Conference on Learning Theory (COLT), 2021
Publisher: COLT

Finite-sample analysis of M-estimators using self-concordance

Author(s): Dmitrii Ostrovskii; Francis Bach
Published in: Electronic Journal of Statistics, Issue 27, 2021, ISSN 1935-7524
Publisher: Institute of Mathematical Statistics
DOI: 10.1214/20-ejs1780
