Loss-Curvature Matching for Dataset Selection and Condensation

Seungjae Shin, Heesun Bae, Donghyeok Shin, Weonyoung Joo, Il Chul Moon

Research output: Contribution to journal › Conference article › peer-review

Abstract

Training neural networks on a large dataset incurs substantial computational cost. Dataset reduction selects or synthesizes data instances from the large dataset while minimizing the degradation in generalization performance relative to the full dataset. Because existing methods utilize a neural network during the reduction procedure, the model parameters become an important factor in preserving performance after reduction. Motivated by this importance of the parameters, this paper introduces a new reduction objective, coined LCMat, which Matches the Loss Curvatures of the original and reduced datasets over the model parameter space rather than at a single parameter point. This objective induces a better adaptation of the reduced dataset to perturbed parameter regions than exact point matching. In particular, we identify the worst case of the loss-curvature gap over a local parameter region, and we derive an implementable upper bound on this worst case with theoretical analyses. Our experiments on both coreset selection and dataset condensation benchmarks illustrate that LCMat achieves better generalization performance than existing baselines.
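The abstract describes the objective only at a high level. Below is a minimal, hypothetical PyTorch sketch of what matching loss curvatures between a full and a reduced batch could look like; it is not the authors' implementation. The function names, the perturbation radius rho, the choice of matching direction, and the weighting of the first- and second-order terms are all assumptions made for illustration.

```python
# Hypothetical sketch of a loss-curvature matching score (not LCMat's
# actual objective). Suitable for scoring candidate reduced sets; for
# condensation one would additionally set create_graph=True on the
# Hessian-vector products to backpropagate into synthetic data.
import torch

def flat_grad(loss, params):
    """Concatenate dL/dtheta into one vector, keeping the graph so we
    can differentiate again for Hessian-vector products."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

def curvature_matching_score(model, loss_fn, full_batch, reduced_batch, rho=0.05):
    params = [p for p in model.parameters() if p.requires_grad]

    g_full = flat_grad(loss_fn(model, full_batch), params)
    g_red = flat_grad(loss_fn(model, reduced_batch), params)

    # First-order term: gradient mismatch at the current parameters.
    grad_gap = (g_full - g_red).norm()

    # Second-order term: mismatch of Hessian-vector products along the
    # normalized gradient-gap direction, a crude proxy for the
    # loss-curvature gap over a rho-ball around the current parameters.
    v = ((g_full - g_red) / (grad_gap + 1e-12)).detach()
    hv_full = torch.autograd.grad(g_full @ v, params, retain_graph=True)
    hv_red = torch.autograd.grad(g_red @ v, params, retain_graph=True)
    hv_gap = torch.cat(
        [(a - b).reshape(-1) for a, b in zip(hv_full, hv_red)]
    ).norm()

    return grad_gap + 0.5 * rho * hv_gap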

Original language: English
Pages (from-to): 8606-8628
Number of pages: 23
Journal: Proceedings of Machine Learning Research
Volume: 206
State: Published - 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: 25 Apr 2023 - 27 Apr 2023

Bibliographical note

Publisher Copyright:
Copyright © 2023 by the author(s)
