Welcome!

I’m currently a research fellow at the Simons Institute for the Theory of Computing. Starting in January 2025, I will be a postdoctoral fellow at the Institute for Foundations of Machine Learning (UT Austin). I received my PhD from the Massachusetts Institute of Technology (MIT), where I was advised by Jonathan Kelner and Ronitt Rubinfeld.

Before starting the PhD program, I completed my undergraduate studies at MIT with a major in Computer Science and a double minor in Physics and Philosophy. Earlier, I lived in Yerevan, Armenia, where I focused primarily on competitions such as the International Physics Olympiad (IPhO).

My research interests include:

  • Computational learning theory
  • Computational statistics
  • Distribution learning and testing

Feel free to contact me via e-mail: MyFirstNameMyLastName at gmail.com

Publications:

Local Lipschitz Filters for Bounded-Range Functions
Jane Lange, Ephraim Linder, Sofya Raskhodnikova, Arsen Vasilyan
36th ACM-SIAM Symposium on Discrete Algorithms (SODA 2025, to appear).

Tolerant Algorithms for Learning with Arbitrary Covariate Shift
Surbhi Goel, Abhishek Shetty, Konstantinos Stavropoulos, Arsen Vasilyan
38th Conference on Neural Information Processing Systems (NeurIPS 2024, to appear).
Accepted as a spotlight.

Efficient Discrepancy Testing for Learning with Distribution Shift
Gautam Chandrasekaran, Adam R. Klivans, Vasilis Kontonis, Konstantinos Stavropoulos, Arsen Vasilyan
38th Conference on Neural Information Processing Systems (NeurIPS 2024, to appear).

Plant-and-Steal: Truthful Fair Allocations via Predictions
Ilan Reuven Cohen, Alon Eden, Talya Eden, Arsen Vasilyan
38th Conference on Neural Information Processing Systems (NeurIPS 2024, to appear).

Learning Intersections of Halfspaces with Distribution Shift: Improved Algorithms and SQ Lower Bounds
Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
37th Conference on Learning Theory (COLT 2024).

Testable Learning with Distribution Shift
Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
37th Conference on Learning Theory (COLT 2024).

An Efficient Tester-Learner for Halfspaces
Aravind Gollakota, Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
12th International Conference on Learning Representations (ICLR 2024).

Tester-Learners for Halfspaces: Universal Algorithms
Aravind Gollakota, Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
37th Conference on Neural Information Processing Systems (NeurIPS 2023).
Accepted for oral presentation.

Agnostic Proper Learning of Monotone Functions: Beyond the Black-box Correction Barrier
Jane Lange, Arsen Vasilyan
64th IEEE Symposium on Foundations of Computer Science (FOCS 2023).
Invited to special issue.

Testing Distributional Assumptions of Learning Algorithms
Ronitt Rubinfeld, Arsen Vasilyan
55th ACM Symposium on Theory of Computing (STOC 2023).

Properly Learning Monotone Functions via Local Reconstruction
Jane Lange, Ronitt Rubinfeld, Arsen Vasilyan
63rd IEEE Symposium on Foundations of Computer Science (FOCS 2022).

Monotone Probability Distributions over the Boolean Cube Can Be Learned with Sublinear Samples
Ronitt Rubinfeld, Arsen Vasilyan
11th Innovations in Theoretical Computer Science Conference (ITCS 2020).

Approximating the Noise Sensitivity of a Monotone Boolean Function
Ronitt Rubinfeld, Arsen Vasilyan
Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019).

Doctoral thesis:

Enhancing Learning Algorithms via Sublinear-Time Methods
Arsen Vasilyan, Massachusetts Institute of Technology, June 2024.