DEvS: data distillation algorithm based on evolution strategy.

Annual Conference on Genetic and Evolutionary Computation (GECCO), 2022

Abstract
The development of machine learning solutions often relies on training with large labeled datasets. This raises challenges in terms of data storage, data privacy protection, and model training time. One possible way to overcome these problems is dataset distillation: the process of creating a smaller dataset while preserving as much of its task-related information as possible. In this paper, a new dataset distillation algorithm called DEvS is proposed, which uses an evolution strategy to condense the training samples initially available for an image classification task while minimizing the loss of classification accuracy. Experiments on CIFAR-10 demonstrate the competitiveness of the proposed approach. Also, contrary to recent trends, DEvS performs derivative-free image generation, and therefore scales better to larger input images.
Keywords
dataset distillation, image classification, neural networks, evolution strategy, optimization
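The abstract describes condensing a training set with a derivative-free evolution strategy: candidate distilled samples are mutated, scored by how well a classifier trained on them performs, and the best candidates are kept, with no gradients flowing through the model. The sketch below illustrates that general idea on a toy problem; it is not the authors' DEvS implementation. The two-class Gaussian data, the nearest-prototype classifier used as a cheap fitness function, and the (1+λ) strategy with parameters `lam` and `sigma` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a labeled image dataset: 2 classes, 8-dimensional samples.
# (The real DEvS targets CIFAR-10 images; this is only a sketch.)
n_classes, dim, per_class = 2, 8, 1
real_X = np.vstack([rng.normal(loc=c * 3.0, scale=1.0, size=(50, dim))
                    for c in range(n_classes)])
real_y = np.repeat(np.arange(n_classes), 50)

def fitness(distilled):
    """Accuracy on the real data of a nearest-prototype classifier
    built from the distilled set (a cheap proxy for retraining a model)."""
    d = np.linalg.norm(real_X[:, None, :] - distilled[None, :, :], axis=2)
    pred = np.argmin(d, axis=1) // per_class
    return float(np.mean(pred == real_y))

# (1+lambda) evolution strategy: perturb the distilled samples with Gaussian
# noise, keep the best candidate. No gradients are needed anywhere.
distilled = rng.normal(size=(n_classes * per_class, dim))
best = fitness(distilled)
lam, sigma = 8, 0.5            # illustrative hyperparameters
for _ in range(200):
    cands = distilled + sigma * rng.normal(size=(lam,) + distilled.shape)
    scores = [fitness(c) for c in cands]
    i = int(np.argmax(scores))
    if scores[i] >= best:      # elitist selection: never lose fitness
        distilled, best = cands[i], scores[i]

print(best)
```

Because selection is elitist, fitness is non-decreasing over generations; on this well-separated toy problem the distilled prototypes quickly approach the class means. The same loop structure carries over when the fitness call is replaced by training and evaluating a network on the candidate distilled images.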