ARTICLE
TITLE

QZRAM: A Transparent Kernel Memory Compression System Design for Memory-Intensive Applications with QAT Accelerator Integration

Chi Gao    
Xiaofei Xu    
Zhizou Yang    
Liwei Lin and Jian Li    

Abstract

In recent decades, memory-intensive applications have boomed, e.g., machine learning, natural language processing (NLP), and big data analytics. Such applications often experience out-of-memory (OOM) errors, which cause processes to exit unexpectedly and without warning, degrading a system's performance and stability. To mitigate OOM errors, many operating systems implement memory compression (e.g., Linux's ZRAM) to provide flexible and larger memory space. However, these schemes suffer from two problems: (1) high-compression algorithms consume significant CPU resources, which inevitably degrades application performance; and (2) compromise algorithms with low latency also have low compression ratios, yielding only insignificant increases in memory space. In this paper, we propose QZRAM, which achieves a high compression ratio without high computing consumption by integrating QAT (an ASIC accelerator) into ZRAM. To enhance hardware and software collaboration, a page-based parallel write module is introduced to provide a more efficient request-processing flow. More importantly, a QAT offloading module is introduced to asynchronously offload compression to the QAT accelerator, reducing CPU computing resource consumption and addressing two challenges: long CPU idle time and low usage of the QAT unit. A comprehensive evaluation validates that QZRAM can reduce CPU resource consumption by up to 49.2% on the FIO micro-benchmark, increase memory space (1.66×) compared to ZRAM, and alleviate the memory-overflow phenomenon in the Redis benchmark.
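
Note on the offloading idea (not from the paper): the asynchronous QAT offload described above can be illustrated with a small user-space C sketch. POSIX threads and zlib stand in for the QAT accelerator and its hardware compressor; all identifiers (page_req, qat_worker, PAGE_SZ) are hypothetical and do not come from QZRAM.

/*
 * Illustrative user-space analogue of asynchronous compression offload.
 * A worker thread plays the role of the QAT accelerator: the submitting
 * "CPU" hands off a page and is free to do other work until the
 * completion is signalled. Build: gcc sketch.c -lpthread -lz
 */
#include <pthread.h>
#include <stdio.h>
#include <string.h>
#include <zlib.h>

#define PAGE_SZ 4096

struct page_req {
    unsigned char in[PAGE_SZ];       /* page to compress                */
    unsigned char out[PAGE_SZ * 2];  /* generous output buffer          */
    uLongf out_len;
    int done;
    pthread_mutex_t lock;
    pthread_cond_t cv;
};

/* Stand-in for the accelerator: compress the page off the caller's path. */
static void *qat_worker(void *arg)
{
    struct page_req *req = arg;

    req->out_len = sizeof(req->out);
    compress2(req->out, &req->out_len, req->in, PAGE_SZ, Z_BEST_COMPRESSION);

    pthread_mutex_lock(&req->lock);
    req->done = 1;                   /* "completion interrupt"           */
    pthread_cond_signal(&req->cv);
    pthread_mutex_unlock(&req->lock);
    return NULL;
}

int main(void)
{
    struct page_req req = { .done = 0 };
    pthread_t tid;

    pthread_mutex_init(&req.lock, NULL);
    pthread_cond_init(&req.cv, NULL);
    memset(req.in, 'A', PAGE_SZ);    /* a highly compressible page       */

    /* Submit asynchronously: the caller does not block on compression. */
    pthread_create(&tid, NULL, qat_worker, &req);

    /* ...the CPU could service further page writes here...             */

    /* Reap the completion before the compressed page is stored.        */
    pthread_mutex_lock(&req.lock);
    while (!req.done)
        pthread_cond_wait(&req.cv, &req.lock);
    pthread_mutex_unlock(&req.lock);
    pthread_join(tid, NULL);

    printf("page compressed: %d -> %lu bytes\n", PAGE_SZ,
           (unsigned long)req.out_len);
    return 0;
}

The point of the sketch is the split between submission and completion: the submitting thread returns immediately and only later reaps the result, which mirrors how the abstract claims to cut CPU consumption and idle time by letting the accelerator perform the heavy compression.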

 Similar articles

       
 
Viktar Atliha and Dmitrij Šešok    
Image captioning is a very important task, which lies at the boundary between natural language processing (NLP) and computer vision (CV). The current quality of captioning models allows them to be used for practical tasks, but they require both large compu...
Journal: Applied Sciences

 
Yeong-Mo Yeon, Ki-Nam Hong and Sang-Won Ji    
Many studies have been conducted on introducing self-prestress to structures using Fe-based shape memory alloys (Fe-SMAs). Technology to introduce self-prestress using Fe-SMAs can resolve the disadvantages of conventional prestressed concrete. However,...
Journal: Applied Sciences

 
Ryan Feng, Yu Yao and Ella Atkins    
Autonomous vehicles require fleet-wide data collection for continuous algorithm development and validation. The smart black box (SBB) intelligent event data recorder has been proposed as a system for prioritized high-bandwidth data capture. This paper ex...
Journal: Algorithms

 
Héctor Migallón, Otoniel López-Granado, Miguel O. Martínez-Rach, Vicente Galiano and Manuel P. Malumbres    
The proportion of video traffic on the internet is expected to reach 82% by 2022, mainly due to the increasing number of consumers and the emergence of new video formats with more demanding features (depth, resolution, multiview, 360, etc.). Efforts are ...
Journal: Algorithms

 
Xiaofei Chao, Xiao Hu, Jingze Feng, Zhao Zhang, Meili Wang and Dongjian He    
The fast and accurate identification of apple leaf diseases is beneficial for disease control and management of apple orchards. An improved network for apple leaf disease classification and a lightweight model for mobile terminal usage were designed in th...
Journal: Applied Sciences