Class-Incremental Learning with Cross-Space
Clustering and Controlled Transfer
Accepted at ECCV 2022
In class-incremental learning, the model is expected to learn new classes continually while maintaining knowledge of previous classes. The challenge lies in preserving the model's ability to effectively represent prior classes in the feature space, while adapting it to represent incoming new classes. We propose two distillation-based objectives for class-incremental learning that leverage the structure of the feature space to maintain accuracy on previous classes, as well as to enable learning the new classes. In our first objective, termed cross-space clustering (CSC), we use the feature-space structure of the previous model to characterize directions of optimization that maximally preserve a class: directions that all instances of a specific class should collectively optimize towards, and those that they should collectively optimize away from. Apart from minimizing forgetting, this indirectly encourages the model to cluster all instances of a class in the current feature space, and gives rise to a sense of herd immunity, whereby all samples of a class jointly prevent the model from forgetting that class. Our second objective, termed controlled transfer (CT), tackles incremental learning from the understudied perspective of inter-class transfer. CT explicitly approximates the semantic similarities between incrementally arriving classes and prior classes, and conditions the current model on them. This allows the model to learn new classes in a way that maximizes positive forward transfer from similar prior classes, thus increasing plasticity, and minimizes negative backward transfer on dissimilar prior classes, thereby strengthening stability. We perform extensive experiments on two benchmark datasets, adding our method (CSCCT) on top of three prominent class-incremental learning methods, and observe consistent performance improvements across a variety of experimental settings.
Method
We propose two distillation-based objectives for class-incremental learning that leverage the structure of the feature space to maintain accuracy on previous classes, as well as to enable learning the new classes.
Cross-Space Clustering
Our Cross-Space Clustering (CSC) objective alleviates forgetting by distilling class-level semantics and inducing tight clusters in the feature space. CSC leverages points across the entire feature space of the previous model $F^T_{t-1}$ to identify regions that a class is optimized to stay within, and other harmful regions that it is prevented from drifting towards.
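As a loose illustration of the idea (a sketch under our own assumptions, not the paper's exact objective), a CSC-style loss can be written as pulling every current-space feature of a class toward a class anchor computed from the previous model's feature space, so that all samples of the class optimize jointly rather than independently:

```python
import numpy as np

def csc_loss_sketch(feats_new, feats_old, labels):
    """Illustrative cross-space-clustering-style loss (a hypothetical sketch,
    not the authors' exact formulation): for each class, pull all CURRENT
    features toward a class anchor computed in the PREVIOUS model's feature
    space, so the whole class jointly resists drifting away (and hence
    jointly resists forgetting)."""
    classes = np.unique(labels)
    loss = 0.0
    for c in classes:
        mask = labels == c
        anchor = feats_old[mask].mean(axis=0)            # class region in old space
        sq_dist = ((feats_new[mask] - anchor) ** 2).sum(axis=1)
        loss += sq_dist.mean()                           # collective pull per class
    return loss / len(classes)
```

The class-level anchor is what gives the "herd" behavior: every sample of a class shares the same target region, so no single instance can drift off on its own.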
Controlled Transfer
Our Controlled Transfer (CT) objective explicitly approximates the semantic similarities between incrementally arriving classes and prior classes, and conditions the current model on them. This lets the model learn new classes in a way that maximizes positive forward transfer from similar prior classes while minimizing negative backward transfer on dissimilar prior classes.
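One hedged way to read the CT idea (again a sketch under our own assumptions, not the paper's implementation) is to estimate per-class transfer weights from prototype similarity, and then use those weights to modulate how strongly each prior class interacts with a newly arriving class:

```python
import numpy as np

def transfer_weights_sketch(new_protos, old_protos, tau=0.5):
    """Illustrative controlled-transfer-style weighting (hypothetical, not
    the paper's exact formulation): approximate semantic similarity between
    each incoming class and the prior classes via a softmax over prototype
    similarities. A high weight suggests encouraging transfer from that
    prior class; a low weight suggests decoupling the new class from it."""
    sims = new_protos @ old_protos.T / tau     # similarity scores, temperature tau
    sims -= sims.max(axis=1, keepdims=True)    # shift for numerical stability
    w = np.exp(sims)
    return w / w.sum(axis=1, keepdims=True)    # each row is a distribution over old classes
```

A row of this matrix answers, for one new class, "which prior classes is it semantically close to?"; the learning objective can then be conditioned on that distribution.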
Results
Arjun Ashok, K J Joseph, Vineeth Balasubramanian. Class-Incremental Learning with Cross-Space Clustering and Controlled Transfer. In ECCV 2022. [Paper] [GitHub]
Bibtex
Acknowledgements

We are grateful to the Department of Science and Technology, India, as well as Intel India, for the financial support of this project through the IMPRINT program (IMP/2019/000250) and the DST ICPS Data Science Cluster program. KJJ thanks TCS for their PhD Fellowship. We also thank the anonymous reviewers and Area Chairs for their valuable feedback in improving the presentation of this paper.

This template was originally made by for a colorful ECCV project.




