ABC: Abstract prediction Before Concreteness

Last modified: Wed Oct 7 20:30:58 2020 GMT.
Authors
Jung-Eun Kim, Richard Bradford, Man-Ki Yoon, Zhong Shao

Abstract
Learning techniques are advancing the utility and capability of modern embedded systems. However, incorporating learning modules into embedded systems is challenging because computing resources are scarce. For such resource-constrained environments, we have developed a framework that learns abstract information early and learns more concretely as time allows. The intermediate results can be used to prepare for early decisions or actions as needed. To apply this framework to a classification task, the datasets are organized into an abstraction hierarchy; the framework then classifies intermediate labels from the most abstract level to the most concrete. Our proposed method outperforms existing approaches and reference baselines in terms of accuracy. We demonstrate our framework with different architectures and on the benchmark datasets CIFAR-10, CIFAR-100, and GTSRB, and we measure prediction times on GPU-equipped embedded computing platforms.

Published
In Proceedings of the 2020 Design, Automation, and Test in Europe Conference & Exhibition (DATE'20), Grenoble, France, May 2020.
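The coarse-to-fine idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the two-level hierarchy (CIFAR-10-style labels grouped under "vehicle" and "animal"), the `classify` function, and the evidence-summing rule are all illustrative assumptions; they only show how an abstract prediction can be produced before, and then refined into, a concrete one.

```python
# Hedged sketch of abstract-before-concrete classification.
# HIERARCHY and the scoring scheme are illustrative assumptions,
# not the framework described in the paper.
HIERARCHY = {
    "vehicle": ["airplane", "automobile", "ship", "truck"],
    "animal": ["bird", "cat", "deer", "dog", "frog", "horse"],
}

def classify(scores):
    """scores: dict mapping concrete label -> confidence.

    Returns (coarse_label, concrete_label). The coarse label is
    available first and can drive an early decision; the concrete
    label refines it as time allows.
    """
    # Abstract stage: aggregate evidence for each coarse class.
    coarse = max(HIERARCHY,
                 key=lambda c: sum(scores[l] for l in HIERARCHY[c]))
    # Concrete stage: refine only within the chosen coarse class.
    concrete = max(HIERARCHY[coarse], key=lambda l: scores[l])
    return coarse, concrete

scores = {"airplane": 0.05, "automobile": 0.60, "ship": 0.10,
          "truck": 0.10, "bird": 0.02, "cat": 0.05, "deer": 0.02,
          "dog": 0.03, "frog": 0.01, "horse": 0.02}
print(classify(scores))  # -> ('vehicle', 'automobile')
```

In the paper's setting the two stages are intermediate outputs of a network under a time budget; here they are separated into two dictionary passes purely to make the early/late split explicit.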
Copyright © 1996-2025 The FLINT Group <flint at cs dot yale dot edu>, Yale University Department of Computer Science