Experience

PhD

I am completing my PhD on "Learning with Functional Data via Operator-valued Kernel Methods and Deep Neural Network Models: Theory, Techniques and Algorithms" under the supervision of Prof. P. Balamurugan at IEOR, IIT Bombay. During this journey, I have worked on the following problems, which will constitute my thesis:

Learning Sparse Graphs for Functional Regression using Graph-induced Operator-valued Kernels

A functional regression problem aims to learn a map \(\mathfrak{F}:\mathcal{Z}\to\mathcal{Y}\), where \(\mathcal{Z}\) is an appropriate input space and \(\mathcal{Y}\) is a space of output functions. When \(\mathcal{Z}\) is also a space of functions, the learning problem is known as function-to-function regression. In this work, we consider the problem of learning a map of the form \(F:\mathcal{Z}^p\to\mathcal{Y}\), a many-to-one function-to-function regression problem in which the aim is to learn a suitable \(F\) that maps \(p\) input functions to an output function. To solve this regression problem, we propose a graph-induced operator-valued kernel (OVK) obtained by imposing a graphical structure that describes the inter-relationships among the \(p\) input functions. When the underlying graphical structure is unknown, we propose to learn an appropriate Laplacian matrix characterizing the graphical structure, which also aids in learning the map \(F\). We formulate a learning problem using the proposed graph-induced OVK and devise an alternating minimization framework to solve it. To learn \(F\) along with the meaningful and important interactions in the graphical structure, a minimax concave penalty (MCP) is used as a sparsity-inducing regularizer on the Laplacian matrix. We further extend the alternating minimization framework to the setting where each of the \(p\) input functions, as well as the output function, is multi-dimensional. To scale the proposed algorithm to large datasets, we design an efficient sample-based approximation algorithm, and we provide bounds on the generalization error of the map obtained by solving the proposed learning problem. An extensive empirical evaluation on both synthetic and real data demonstrates the utility of the proposed framework: simultaneously learning \(F\) and a sparse graphical structure helps discover significant relationships among the input functions and makes the relationships driving the regression problem interpretable.
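
The sketch below is a minimal, self-contained illustration of the idea, not the algorithm from the paper: it assumes a simple instantiation in which a scalar graph-induced kernel couples the \(p\) input functions through a weight matrix, the map \(F\) is fit by kernel ridge regression in closed form, and the graph weights are then updated by a crude finite-difference gradient step under an MCP penalty. The base kernel `k0`, the coupling `A = W + I`, the train/validation objective, and the toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def k0(u, v, gamma=0.5):
    # scalar RBF kernel between two functions sampled on a shared grid
    return np.exp(-gamma * np.mean((u - v) ** 2))

def gram(X, A, gamma=0.5):
    # graph-induced scalar kernel: k(x, x') = sum_{s,t} A[s,t] * k0(x_s, x'_t),
    # where A encodes how the p input functions interact (an assumed form)
    n, p, _ = X.shape
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = sum(A[s, t] * k0(X[i, s], X[j, t], gamma)
                          for s in range(p) for t in range(p))
    return K

def mcp_grad(w, lam=0.2, a=3.0):
    # gradient of the minimax concave penalty for w >= 0 (flat beyond a*lam)
    return np.where(w < a * lam, lam - w / a, 0.0)

# toy data: p = 3 input functions per sample; the third is irrelevant to the output
n, p, m = 24, 3, 20
X = rng.standard_normal((n, p, m))
Y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal((n, m))
tr, va = slice(0, 16), slice(16, 24)      # train/validation split

def val_loss(W, lam=1e-2):
    A = W + np.eye(p)                     # coupling matrix built from graph weights
    K = gram(X, A)
    C = np.linalg.solve(K[tr, tr] + lam * np.eye(16), Y[tr])  # ridge step for F
    return np.sum((Y[va] - K[va, tr] @ C) ** 2)

W = np.full((p, p), 0.5); np.fill_diagonal(W, 0.0)
for it in range(30):                      # alternating minimization, crudely:
    base, G, eps = val_loss(W), np.zeros_like(W), 1e-4
    for s in range(p):                    # finite-difference gradient in the
        for t in range(s + 1, p):         # graph weights (fine for small p)
            Wp = W.copy(); Wp[s, t] += eps; Wp[t, s] += eps
            G[s, t] = G[t, s] = (val_loss(Wp) - base) / eps
    W = np.clip(W - 5e-3 * (G + mcp_grad(W)), 0.0, None)  # projected MCP step
    np.fill_diagonal(W, 0.0)

print("learned coupling weights:\n", np.round(W, 3))
```

The closed-form ridge solve inside `val_loss` plays the role of the \(F\)-update, and the projected gradient step plays the role of the graph update; the paper's framework operates on operator-valued kernels and a Laplacian parameterization rather than this scalar simplification.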

Learning with Operator-valued Kernels in Reproducing Kernel Krein Spaces

Operator-valued kernels have shown promise in supervised learning problems with functional inputs and functional outputs. The crucial (and possibly restrictive) assumption that operator-valued kernels are positive definite has been instrumental in developing efficient algorithms. In this work, we consider operator-valued kernels that are not necessarily positive definite. To tackle this indefiniteness, we harness the machinery of Reproducing Kernel Krein Spaces (RKKS) of function-valued functions. We present a representer theorem that yields a suitable loss stabilization problem for supervised learning with function-valued inputs and outputs, and we analyze the generalization properties of the proposed framework. An iterative Operator-based Minimum Residual (OpMINRES) algorithm is proposed for solving the loss stabilization problem. Experiments with indefinite operator-valued kernels on synthetic and real datasets demonstrate the utility of the proposed approach.
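
As a toy illustration of why indefiniteness need not be fatal: MINRES requires only that the system matrix be symmetric, not positive definite. The sketch below (a stand-in, not the paper's OpMINRES, which works with operator-valued quantities) builds a deliberately indefinite tanh Gram matrix and solves the resulting linear system with SciPy's `minres`. The kernel choice and toy targets are assumptions for illustration.

```python
import numpy as np
from scipy.sparse.linalg import minres

rng = np.random.default_rng(1)
n, m = 40, 30
X = rng.standard_normal((n, m))          # n inputs sampled on an m-point grid

# tanh "kernel": symmetric but in general indefinite (not positive definite)
K = np.tanh(0.1 * X @ X.T - 0.2)
print("smallest eigenvalue:", np.linalg.eigvalsh(K).min())  # typically negative

y = np.sin(X[:, 0])                      # scalar targets, for brevity
c, info = minres(K, y)                   # symmetric *indefinite* linear solve
print("converged:", info == 0, "residual:", np.linalg.norm(K @ c - y))
```

A Cholesky-based kernel ridge solve would fail outright here, since `K` has negative eigenvalues; a Krylov method that exploits symmetry alone is the natural replacement, which is the spirit of OpMINRES.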


Workshop/Summer School

  • Attended Research Week with Google 2023, organized by Google Research India in Bengaluru, January 29–31, 2023.
  • Attended LinkedIn-MSR-IISc Workshop on Fairness, Accountability, Transparency and Ethics in Machine Learning (FATE-ML) 2020 at IISc, Bengaluru.
  • Participated in Statistical Physics of Machine Learning 2020, held at ICTS, Bengaluru.
  • Attended the Indo-French Centre for Applied Mathematics (IFCAM) Summer School/Workshop on Mathematics for Data Science at IISc, Bengaluru, July 15–27, 2019.

Research Internship

  • Research Intern in the Computational and Statistical Physics Lab, Indian Institute of Science, Bangalore.
    During this two-month internship (May 2013 – July 2013) in the Computational and Statistical Physics Lab, Department of Computational and Data Sciences, IISc, I studied the spectral distributions of different random matrices and established their relation to various parameters, both theoretically and empirically using MATLAB; a quick re-creation of this kind of experiment is sketched below. (Link)
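
For illustration only (the original work was done in MATLAB): a short NumPy re-creation of this kind of experiment, comparing the empirical spectral distribution of a Wigner-type random matrix against the semicircle law. The matrix ensemble and bin choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1000
A = rng.standard_normal((N, N))
H = (A + A.T) / np.sqrt(2)                  # symmetric Gaussian (GOE-like) matrix
eigs = np.linalg.eigvalsh(H) / np.sqrt(N)   # scaling puts the support on [-2, 2]

# compare the empirical eigenvalue histogram with the semicircle density
hist, edges = np.histogram(eigs, bins=40, range=(-2, 2), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
semicircle = np.sqrt(np.clip(4 - centers**2, 0, None)) / (2 * np.pi)
print("max deviation from semicircle:", np.abs(hist - semicircle).max())
```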

Academic Background

Teaching Experience

  • Teaching Assistantship
    Courses: IE 663: Advanced Topics in Deep Learning; IE 643: Deep Learning - Theory and Practice; IE 614: Linear Systems; IE 503: Operations Analysis.
  • Assistant Professor
    Courses taught: Discrete Mathematics, Numerical Methods, Probability & Statistics, Probability and Stochastic Processes, Engineering Mathematics.
Akash Saha

akashsaha@iitb.ac.in
akashsaha06@gmail.com

PhD Student
IEOR
IIT Bombay, Mumbai
