2021 IEEE International Conference on Acoustics, Speech and Signal Processing

6-11 June 2021 • Toronto, Ontario, Canada

Extracting Knowledge from Information



Paper Detail

Paper ID: MLSP-32.4
Paper Title: FUSING MULTITASK MODELS BY RECURSIVE LEAST SQUARES
Authors: Xiaobin Li, Lianlei Shan, Weiqiang Wang, University of Chinese Academy of Sciences, China
Session: MLSP-32: Optimization Algorithms for Machine Learning
Location: Gather.Town
Session Time: Thursday, 10 June, 15:30 - 16:15
Presentation Time: Thursday, 10 June, 15:30 - 16:15
Presentation: Poster
Topic: Machine Learning for Signal Processing: [MLR-LEAR] Learning theory and algorithms
Abstract: It is easy to obtain multi-task models from open-source platforms or various organizations. However, using these models at the same time imposes a heavy storage burden and reduces computational efficiency. In this paper, we propose a transformation-based multi-task fusion method, called transformation fusion (TF), which is implemented by recursive least squares. The recursive transformation fusion not only reduces the storage burden of model fusion but also avoids computing the inverse of a high-dimensional matrix. Our multi-model fusion method can also be applied to many mainstream tasks, such as multi-task learning and offline distributed learning. Our fusion method can be applied to data-based fusion tasks as well as data-free fusion tasks. Extensive experiments demonstrate the effectiveness of our fusion method.
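The abstract gives no implementation details, but the recursive least squares (RLS) named in the title is a standard technique. The sketch below is a minimal Python/NumPy illustration of the classic RLS update, not the authors' code; all function and variable names are assumptions. Its key property, maintaining the inverse correlation matrix P incrementally instead of explicitly inverting a high-dimensional matrix, is the one the abstract highlights.

    # Minimal sketch of the standard recursive least-squares (RLS) update.
    # Illustrative only; names are assumptions, not the authors' implementation.
    import numpy as np

    def rls_update(w, P, x, y, lam=1.0):
        """One RLS step: refine weights w and inverse correlation matrix P
        with a new input x and target y; lam is the forgetting factor."""
        x = x.reshape(-1, 1)                    # column vector
        k = P @ x / (lam + x.T @ P @ x)         # gain vector
        err = y - (w.T @ x).item()              # prediction error
        w = w + k * err                         # weight update
        P = (P - k @ x.T @ P) / lam             # update P without any explicit inverse
        return w, P

    # Usage: fit a linear map y = w^T x from streaming samples.
    d = 8
    w = np.zeros((d, 1))
    P = np.eye(d) * 1e3                         # large initial P ~ weak prior
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=(d, 1))
    for _ in range(200):
        x = rng.normal(size=(d,))
        y = (w_true.T @ x.reshape(-1, 1)).item()
        w, P = rls_update(w, P, x, y)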