| Paper ID | MLSP-34.1 |
| Paper Title | SAMPLE EFFICIENT SUBSPACE-BASED REPRESENTATIONS FOR NONLINEAR META-LEARNING |
| Authors | Ibrahim Gulluk, Bogazici University, Turkey; Yue Sun, University of Washington, United States; Samet Oymak, University of California, Riverside, United States; Maryam Fazel, University of Washington, United States |
| Session | MLSP-34: Subspace Learning and Applications |
| Location | Gather.Town |
| Session Time | Thursday, 10 June, 15:30 - 16:15 |
| Presentation Time | Thursday, 10 June, 15:30 - 16:15 |
| Presentation | Poster |
| Topic | Machine Learning for Signal Processing: [MLR-SBML] Subspace and manifold learning |
| Abstract | Constructing good representations is critical for learning complex tasks in a sample-efficient manner. In meta-learning, representations can be constructed from the common patterns of previously seen tasks so that a future task can be learned quickly. While recent works show the benefit of subspace-based representations, such results are limited to linear-regression tasks. This work explores a more general class of nonlinear tasks, with applications ranging from binary classification and generalized linear models to neural networks. We prove that subspace-based representations can be learned in a sample-efficient manner and provably benefit future tasks in terms of their sample complexity. Numerical results verify the theoretical predictions on classification and neural-network regression tasks. |
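To make the subspace-based recipe in the abstract concrete, the following is a minimal sketch in the linear-regression setting (the paper's theory covers nonlinear tasks; all dimensions, sample counts, and variable names here are illustrative assumptions, not the authors' implementation): estimate each source task's parameter vector independently, recover a shared low-dimensional subspace from the stacked estimates via SVD, then fit a new task with few samples inside that subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): ambient dim d, subspace dim r << d, T source tasks.
d, r, T = 50, 3, 40
n_train = 100   # samples per source task
n_target = 10   # few samples for the new (target) task

# Ground-truth shared subspace: columns of U_true span an r-dim subspace of R^d.
U_true = np.linalg.qr(rng.standard_normal((d, r)))[0]

# Step 1 (meta-training): solve each source task by least squares,
# then extract the shared subspace from the stacked parameter estimates.
thetas = []
for _ in range(T):
    w = U_true @ rng.standard_normal(r)               # task parameter lies in the subspace
    X = rng.standard_normal((n_train, d))
    y = X @ w + 0.1 * rng.standard_normal(n_train)
    thetas.append(np.linalg.lstsq(X, y, rcond=None)[0])
Theta = np.stack(thetas, axis=1)                      # d x T matrix of task estimates
U_hat = np.linalg.svd(Theta, full_matrices=False)[0][:, :r]  # top-r left singular vectors

# Step 2 (meta-test): learn a new task with n_target < d samples by
# regressing in the r-dimensional subspace instead of all of R^d.
w_new = U_true @ rng.standard_normal(r)
X_new = rng.standard_normal((n_target, d))
y_new = X_new @ w_new + 0.1 * rng.standard_normal(n_target)

alpha = np.linalg.lstsq(X_new @ U_hat, y_new, rcond=None)[0]  # only r unknowns
w_sub = U_hat @ alpha

# Baseline: plain least squares over all d coordinates (underdetermined here).
w_full = np.linalg.lstsq(X_new, y_new, rcond=None)[0]

err_sub = np.linalg.norm(w_sub - w_new) / np.linalg.norm(w_new)
err_full = np.linalg.norm(w_full - w_new) / np.linalg.norm(w_new)
print(err_sub < err_full)
```

The point of the sketch is the sample-complexity gap the abstract refers to: the target task needs only enough samples to resolve r coefficients inside the learned subspace, while the unstructured baseline must estimate all d coordinates from the same few samples.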