Alessandro Favero
Verified email at epfl.ch - Homepage
Title
Cited by
Year
Task Arithmetic in the Tangent Space: Improved Editing of Pre-Trained Models
G Ortiz-Jimenez*, A Favero*, P Frossard
Advances in Neural Information Processing Systems 36, 66727-66754, 2024
Cited by 118 · 2024
Multi-Modal Hallucination Control by Visual Information Grounding
A Favero, L Zancato, M Trager, S Choudhary, P Perera, A Achille, ...
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 14303 …, 2024
Cited by 61 · 2024
A phase transition in diffusion models reveals the hierarchical nature of data
A Sclocchi, A Favero, M Wyart
Proceedings of the National Academy of Sciences 122 (1), e2408799121, 2025
Cited by 34* · 2025
How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model
F Cagnetta, L Petrini, UM Tomasini, A Favero, M Wyart
Physical Review X 14 (3), 031001, 2024
Cited by 30 · 2024
What Can Be Learnt With Wide Convolutional Neural Networks?
F Cagnetta*, A Favero*, M Wyart
International Conference on Machine Learning, PMLR 202, 3347-3379, 2023
Cited by 25 · 2023
Locality defeats the curse of dimensionality in convolutional teacher-student scenarios
A Favero*, F Cagnetta*, M Wyart
Advances in Neural Information Processing Systems 34, 9456-9467, 2021
Cited by 23 · 2021
Relative stability toward diffeomorphisms indicates performance in deep nets
L Petrini, A Favero, M Geiger, M Wyart
Advances in Neural Information Processing Systems 34, 8727-8739, 2021
Cited by 16 · 2021
LiNeS: Post-Training Layer Scaling Prevents Forgetting and Enhances Model Merging
K Wang, N Dimitriadis, A Favero, G Ortiz-Jimenez, F Fleuret, P Frossard
International Conference on Learning Representations (ICLR), 2025
Cited by 4 · 2025
Probing the Latent Hierarchical Structure of Data via Diffusion Models
A Sclocchi*, A Favero*, NI Levi*, M Wyart
International Conference on Learning Representations (ICLR), 2025
Cited by 4 · 2025
How compositional generalization and creativity improve as diffusion models are trained
A Favero, A Sclocchi, F Cagnetta, P Frossard, M Wyart
arXiv preprint arXiv:2502.12089, 2025
2025
Computational complexity of deep learning: fundamental limitations and empirical phenomena
B Barak, A Carrell, A Favero, W Li, L Stephan, A Zlokapa
Journal of Statistical Mechanics: Theory and Experiment 2024 (10), 104008, 2024
2024
What can be learnt with wide convolutional neural networks?*
F Cagnetta, A Favero, M Wyart
Journal of Statistical Mechanics: Theory and Experiment 2024 (10), 104020, 2024
2024
Unraveling the Latent Hierarchical Structure of Language and Images via Diffusion Models
A Sclocchi, NI Levi, A Favero, M Wyart
NeurIPS 2024 Workshop on Scientific Methods for Understanding Deep Learning, 2024
2024
Task Addition and Weight Disentanglement in Closed-Vocabulary Models
A Hazimeh*, A Favero*, P Frossard
ICML 2024 Workshop on Efficient Systems for Foundation Models II, 2024
2024
Statistical Mechanics of Infinitely-Wide Convolutional Networks
A Favero, F Cagnetta, M Wyart
Bulletin of the American Physical Society, 2023
2023
Diffeomorphisms invariance is a proxy of performance in deep neural networks
L Petrini, A Favero, M Geiger, M Wyart
Bulletin of the American Physical Society, 2023
2023
Locality defeats the curse of dimensionality in convolutional teacher-student scenarios*
A Favero, F Cagnetta, M Wyart
Journal of Statistical Mechanics: Theory and Experiment 2022 (11), 114012, 2022
2022
Relative stability toward diffeomorphisms indicates performance in deep nets*
L Petrini, A Favero, M Geiger, M Wyart
Journal of Statistical Mechanics: Theory and Experiment 2022 (11), 114013, 2022
2022
Spectral analysis of infinitely wide convolutional neural networks
A Favero
Politecnico di Torino, 2020
2020
Articles 1–19