Multi-task learning (MTL) improves generalization by sharing information among related tasks. Structured sparsity-inducing regularization is widely used in MTL to learn interpretable and compact models, especially in high-dimensional settings. Although these methods have achieved considerable success in practice, they still suffer from key limitations: their generalization ability is limited by committing to a specific sparsity constraint on the parameters; they are usually restricted to parameters in matrix form, ignoring high-order feature interactions among tasks; and they are formulated in disparate forms that require different optimization algorithms. Inspired by the Generalized Lasso, we propose the Generalized Group Lasso (GenGL) to overcome these limitations. GenGL introduces a linear operator that adapts the penalty to diverse sparsity settings and lets it handle hierarchical sparsity and multi-component decomposition in general tensor form, yielding greater flexibility and expressivity. Based on GenGL, we propose a novel framework for Structured Sparse MTL (SSMTL) that unifies a number of existing MTL methods, and we implement two new variants of it in shallow and deep architectures, respectively. We develop an efficient optimization algorithm to solve the unified problem and validate its effectiveness in synthetic and real-world experiments.
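To make the role of the linear operator concrete, the following is a minimal NumPy sketch of a penalty of the form lambda * sum_g ||(D vec(W))_g||_2, the generalized-group-lasso structure described above. The function name gengl_penalty, the choice of D, and the grouping are illustrative assumptions, not the paper's implementation; they only show how a single operator D can encode different structured-sparsity patterns.

```python
import numpy as np

def gengl_penalty(W, D, groups, lam=1.0):
    """Illustrative generalized group lasso penalty:
    lam * sum over groups g of ||(D @ vec(W))_g||_2.

    W      : (d, T) parameter matrix (d features, T tasks)
    D      : linear operator applied to the vectorized parameters
    groups : list of index arrays partitioning the rows of D @ vec(W)
    """
    z = D @ W.reshape(-1)  # apply the linear operator to vec(W), row-major
    return lam * sum(np.linalg.norm(z[g]) for g in groups)

# Example (an assumption for illustration): with D the identity and one
# group per feature row, the penalty reduces to the familiar L2,1
# multi-task group lasso, which couples each feature across all tasks.
d, T = 5, 3
rng = np.random.default_rng(0)
W = rng.standard_normal((d, T))
D = np.eye(d * T)
groups = [np.arange(i * T, (i + 1) * T) for i in range(d)]
print(gengl_penalty(W, D, groups, lam=0.1))
```

Swapping in a different D (for example, a difference operator or a hierarchy-encoding matrix) changes the induced sparsity pattern without changing the solver interface, which is the flexibility the abstract attributes to GenGL.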