With the recent penetration of artificial intelligence (AI) technologies into many areas of computing, machine learning is being incorporated into modern software design. As the in-memory footprint of AI workloads continues to grow, it is important to characterize the memory reference behavior of machine learning workloads. In this paper, we perform a characterization study of memory references in machine learning workloads as the learning type (i.e., supervised vs. unsupervised) and the problem domain (i.e., classification, regression, and clustering) are varied. From this study, we uncover the following five characteristics. First, machine learning workloads exhibit memory reference patterns significantly different from those of traditional workloads, but the patterns are similar across learning types and problem domains. Second, in all workloads, memory reads and writes appear continuously over a wide range of memory addresses, but there is a specific time period in which only reads appear. Third, among references to the memory areas (i.e., code, data, heap, stack, and library), the library area accounts for about 90% of total memory references. Fourth, there is a low popularity bias among the memory pages referenced in machine learning workloads, especially for writes. Fifth, when estimating the likelihood of re-reference, temporal locality is dominant for the top 100 memory pages, but access frequency provides better information beyond that ranking. We expect that the characterization of memory references conducted in this paper will be helpful in designing memory management policies for machine learning workloads.
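The fifth finding contrasts two standard signals for estimating re-reference likelihood: temporal locality (recency of access, as in LRU) and access frequency (as in LFU). The minimal sketch below illustrates the distinction on a hypothetical reference trace; the trace and ranking logic are illustrative assumptions, not the paper's actual workloads or methodology.

```python
from collections import Counter

def rank_pages(trace):
    """Rank pages in a reference trace two ways:
    by recency of last access (LRU-style) and by access count (LFU-style)."""
    last_seen = {}    # page -> index of its most recent access
    freq = Counter()  # page -> total number of accesses
    for i, page in enumerate(trace):
        last_seen[page] = i
        freq[page] += 1
    # Most recently used first; most frequently used first.
    by_recency = sorted(last_seen, key=last_seen.get, reverse=True)
    by_frequency = [p for p, _ in freq.most_common()]
    return by_recency, by_frequency

# Hypothetical trace: page 2 was hot early on; page 1 is hot and recent.
trace = [2, 2, 2, 2, 3, 4, 1, 5, 1, 1]
by_recency, by_frequency = rank_pages(trace)
print(by_recency[:3])    # [1, 5, 4] -- recency favors the recent page 1
print(by_frequency[:3])  # frequency still favors the once-hot page 2
```

A recency ranking promotes page 1, while a frequency ranking keeps page 2 on top; the paper's observation suggests using the former signal for the hottest pages and the latter below that tier.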
|Title of host publication||Proceedings - 2022 IEEE/ACIS 24th International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022|
|Editors||Shu-Ching Chen, Her-Terng Yau, Roland Stenzel, Hsiung-Cheng Lin|
|Publisher||Institute of Electrical and Electronics Engineers Inc.|
|Number of pages||6|
|State||Published - 2022|
|Event||24th IEEE/ACIS International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022 - Taichung, Taiwan, Province of China|
Duration: 7 Dec 2022 → 9 Dec 2022
|Name||Proceedings - 2022 IEEE/ACIS 24th International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022|
|Conference||24th IEEE/ACIS International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, SNPD 2022|
|Country/Territory||Taiwan, Province of China|
|Period||7/12/22 → 9/12/22|
Bibliographical note
Funding Information: This work was partly supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2021-0-02068, Artificial Intelligence Innovation Hub) and (No. RS-2022-00155966, Artificial Intelligence Convergence Innovation Human Resources Development (Ewha Womans University)).
© 2022 IEEE.
- artificial intelligence
- machine learning
- memory reference