Darshan Patil, PhD student at Mila / University of Montreal.

2025

  1. Experimental Design for Nonstationary Optimization
    Darshan Patil, Pranshu Malviya, Maryam Hashemzadeh, and Sarath Chandar
    [paper]

  1. Interpolate: How Resetting Active Neurons Can Also Improve Generalizability in Online Learning
    Pranshu Malviya, Darshan Patil, Maryam Hashemzadeh, Quentin Fournier, and Sarath Chandar
    [paper]

2024

  1. Toward Debugging Deep Reinforcement Learning Programs with RLExplorer
    Rached Bouchoucha, Ahmed Haj Yahmed, Darshan Patil, Janarthanan Rajendran, Amin Nikanjam, Sarath Chandar, and Foutse Khomh
    International Conference on Software Maintenance and Evolution (ICSME), 2024.
    [paper]

  2. Exploring the Plasticity of Neural Networks for NLP Tasks in Continual Learning
    Maryam Hashemzadeh, Pranshu Malviya*, Darshan Patil*, and Sarath Chandar (*equal contribution)
    Conference on Lifelong Learning Agents (CoLLAs) Workshop Track, 2024.

  3. Intelligent Switching for Reset-Free RL
    Darshan Patil, Janarthanan Rajendran, Glen Berseth, and Sarath Chandar
    International Conference on Learning Representations (ICLR), 2024.
    [paper], [code]

2023

  1. An Empirical Investigation of the Role of Pre-training in Lifelong Learning
    Sanket Vaibhav Mehta, Darshan Patil, Sarath Chandar, and Emma Strubell
    Journal of Machine Learning Research, 2023.
    Theory of Continual Learning Workshop, ICML, 2021 (Spotlight)
    [paper], [code]

2021

  1. Disentangling 3D Prototypical Networks for Few-Shot Concept Learning
    Mihir Prabhudesai*, Shamit Lal*, Darshan Patil* (*equal contribution), Hsiao-Yu Tung, Adam W. Harley, and Katerina Fragkiadaki
    International Conference on Learning Representations (ICLR), 2021.
    Object Representations for Learning and Reasoning Workshop, NeurIPS, 2020 (Spotlight)
    [paper], [code], [project page]

2019

  1. Towards modular and programmable architecture search
    Renato Negrinho, Darshan Patil, Nghia Le, Daniel Ferreira, Matthew Gormley, and Geoffrey Gordon
    Neural Information Processing Systems (NeurIPS), 2019.
    [paper], [code], [framework]