Classification and Characterization of Memory Reference Behavior in Machine Learning Workloads

Seokmin Kwon, Hyokyung Bahn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

With the recent penetration of artificial intelligence (AI) technologies into many areas of computing, machine learning is being incorporated into modern software design. As the in-memory data of AI workloads continues to grow, it is important to characterize the memory reference behavior of machine learning workloads. In this paper, we perform a characterization study of memory references in machine learning workloads while varying the learning type (i.e., supervised vs. unsupervised) and the problem domain (i.e., classification, regression, and clustering). From this study, we uncover the following five characteristics. First, machine learning workloads exhibit memory reference patterns that differ significantly from those of traditional workloads, but these patterns are similar regardless of learning type and problem domain. Second, in all workloads, memory reads and writes occur across a wide range of memory addresses, but there is a specific time period during which only reads appear. Third, among references to the memory areas (i.e., code, data, heap, stack, and library), the library area accounts for about 90% of total memory references. Fourth, there is a low popularity bias among the memory pages referenced in machine learning workloads, especially for writes. Fifth, when estimating the likelihood of re-referencing, temporal locality is dominant for the top 100 memory pages, but access frequency provides better information beyond that ranking. We expect that the characterization of memory references conducted in this paper will be helpful in the design of memory management policies for machine learning workloads.
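To make the kind of analysis described in the abstract concrete, the sketch below shows one way such a characterization could be computed from a memory trace: classifying each reference into a memory area (code, data, heap, stack, library) and comparing recency (temporal locality) against cumulative access frequency as predictors of re-reference. This is a minimal illustration, not the authors' tooling; the trace format of (address, op) pairs, the segment boundaries in SEGMENTS, and the helper names are assumptions made up for this example. In a real study the segment ranges would come from something like /proc/<pid>/maps and the trace from a binary-instrumentation tracer.

```python
# Minimal sketch (assumed trace format and segment layout, not the paper's code):
# classify memory references by area and compare recency vs. frequency
# as predictors of re-reference.

from collections import Counter, OrderedDict

PAGE_SHIFT = 12  # assume 4 KiB pages

# Hypothetical segment layout: (name, start, end) -- placeholder address ranges.
SEGMENTS = [
    ("code",    0x0000_0000_0040_0000, 0x0000_0000_0080_0000),
    ("data",    0x0000_0000_0080_0000, 0x0000_0000_0100_0000),
    ("heap",    0x0000_0000_0100_0000, 0x0000_0000_4000_0000),
    ("library", 0x0000_7000_0000_0000, 0x0000_7fff_0000_0000),
    ("stack",   0x0000_7fff_0000_0000, 0x0000_8000_0000_0000),
]

def area_of(addr):
    """Map a virtual address to one of the five memory areas."""
    for name, start, end in SEGMENTS:
        if start <= addr < end:
            return name
    return "other"

def characterize(trace):
    """trace: iterable of (addr, op) with op in {'R', 'W'}.

    Returns per-area reference shares and per-(page, op) reference counts.
    """
    area_counts = Counter()
    page_counts = Counter()
    for addr, op in trace:
        area_counts[area_of(addr)] += 1
        page_counts[(addr >> PAGE_SHIFT, op)] += 1
    total = sum(area_counts.values()) or 1
    shares = {a: c / total for a, c in area_counts.items()}
    return shares, page_counts

def recency_vs_frequency_hits(trace, top_n=100):
    """Count how often the next referenced page falls within the top_n pages
    ranked by recency (LRU order) versus by cumulative access frequency."""
    lru = OrderedDict()          # page -> None, most recently used at the end
    freq = Counter()             # page -> reference count so far
    recency_hits = frequency_hits = 0
    for addr, _op in trace:
        page = addr >> PAGE_SHIFT
        if page in lru:
            rank = list(reversed(lru)).index(page)  # 0 = most recently used
            if rank < top_n:
                recency_hits += 1
        if freq and page in {p for p, _ in freq.most_common(top_n)}:
            frequency_hits += 1
        lru.pop(page, None)      # move page to the most-recent position
        lru[page] = None
        freq[page] += 1
    return recency_hits, frequency_hits
```

Feeding both functions the same trace makes the abstract's fourth and fifth observations checkable: page_counts exposes how skewed (or not) the page popularity is for reads versus writes, and the two hit counters show in which ranking range recency or frequency predicts re-references better.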

Original language: English
Title of host publication: Proceedings - 2022 IEEE/ACIS 24th International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022
Editors: Shu-Ching Chen, Her-Terng Yau, Roland Stenzel, Hsiung-Cheng Lin
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 103-108
Number of pages: 6
ISBN (Electronic): 9798350310412
DOIs
State: Published - 2022
Event: 24th IEEE/ACIS International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022 - Taichung, Taiwan, Province of China
Duration: 7 Dec 2022 - 9 Dec 2022

Publication series

Name: Proceedings - 2022 IEEE/ACIS 24th International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022

Conference

Conference: 24th IEEE/ACIS International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022
Country/Territory: Taiwan, Province of China
City: Taichung
Period: 7/12/22 - 9/12/22

Bibliographical note

Funding Information:
This work was partly supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2021-0-02068, Artificial Intelligence Innovation Hub) and (No. RS-2022-00155966, Artificial Intelligence Convergence Innovation Human Resources Development (Ewha Womans University)).

Publisher Copyright:
© 2022 IEEE.

Keywords

  • artificial intelligence
  • characterization
  • classification
  • clustering
  • machine learning
  • memory reference
