As the mobile computing environment evolves, users demand high-quality apps and a better user experience. Consequently, memory demand in mobile devices has soared. Device manufacturers have met this demand by equipping devices with more RAM. However, such a hardware approach is only a temporary solution and does not scale well in the resource-constrained mobile environment.
Meanwhile, mobile systems have adopted a new app life cycle and a memory reclamation scheme tailored to this life cycle. When a user leaves an app, the app is not terminated but cached in memory as long as there is enough free memory. When free memory runs low, a victim app is terminated and the memory associated with the app is reclaimed. This process-level approach has worked well in the mobile environment. However, user experience can be impaired severely because the victim selection policy does not take it into account.
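To make the process-level reclamation concrete, the following is a minimal sketch in the spirit of the conventional Android low-memory killer: when free memory falls below a threshold, the least recently used cached app is terminated and its memory is reclaimed. The structures, thresholds, and the `terminate` callback are simplified assumptions for illustration, not the actual Android implementation.

```c
#include <stddef.h>

/* A cached (background) app; fields are illustrative assumptions. */
struct cached_app {
    int  pid;         /* process id of the cached app                */
    long last_used;   /* timestamp of the app's last foreground use  */
    long rss_pages;   /* resident memory currently held by the app   */
};

/* Pick the least recently used cached app as the victim. */
static struct cached_app *select_victim(struct cached_app *apps, size_t n)
{
    struct cached_app *victim = NULL;
    for (size_t i = 0; i < n; i++) {
        if (victim == NULL || apps[i].last_used < victim->last_used)
            victim = &apps[i];
    }
    return victim;
}

/* Reclaim memory whenever free memory drops below the threshold:
 * terminate one victim at a time until enough pages are free. */
void reclaim_if_low(struct cached_app *apps, size_t n,
                    long free_pages, long min_free_pages,
                    void (*terminate)(int pid))
{
    while (free_pages < min_free_pages && n > 0) {
        struct cached_app *v = select_victim(apps, n);
        if (v == NULL)
            break;
        free_pages += v->rss_pages;   /* memory returned by killing v */
        terminate(v->pid);
        *v = apps[--n];               /* drop v from the cached set   */
    }
}
```

Note that this policy looks only at recency, not at what losing the app costs the user, which is exactly the shortcoming the article addresses.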
In this article, we propose a novel memory reclamation scheme called SmartLMK. SmartLMK minimizes the impact of process-level reclamation on user experience. The worth of keeping an app in memory is modeled using the user-perceived app launch time and app usage statistics. The memory footprint and the impending memory demand are estimated from the memory usage history. Using these models and estimates, SmartLMK selects the least valuable apps and terminates them at once. Our evaluation on a real Android-based smartphone shows that SmartLMK effectively identifies the valuable apps among the cached apps and keeps them in memory. As a result, the user-perceived app launch time can be improved by up to 13.2%.
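The following is a hypothetical sketch of value-based victim selection in the spirit of SmartLMK: each cached app is scored by the launch-time saving it would provide if kept in memory, weighted by how likely it is to be reused, and normalized by its memory footprint. The field names, the scoring formula, and the `terminate` callback are assumptions for illustration, not the paper's exact model.

```c
#include <stddef.h>

/* Per-app statistics; fields are illustrative assumptions. */
struct app_stats {
    int    pid;
    double warm_launch_ms;   /* launch time when resumed from memory   */
    double cold_launch_ms;   /* launch time when started from scratch  */
    double reuse_prob;       /* estimated probability of being reused  */
    long   rss_pages;        /* memory footprint of the cached app     */
};

/* Expected launch-time saving per page of memory kept. */
static double keep_value(const struct app_stats *a)
{
    double saving = a->cold_launch_ms - a->warm_launch_ms;
    return a->reuse_prob * saving / (double)a->rss_pages;
}

/* Terminate the least valuable apps until the estimated memory
 * demand (in pages) is covered. */
void reclaim_least_valuable(struct app_stats *apps, size_t n,
                            long demand_pages,
                            void (*terminate)(int pid))
{
    while (demand_pages > 0 && n > 0) {
        size_t worst = 0;
        for (size_t i = 1; i < n; i++) {
            if (keep_value(&apps[i]) < keep_value(&apps[worst]))
                worst = i;
        }
        demand_pages -= apps[worst].rss_pages;
        terminate(apps[worst].pid);
        apps[worst] = apps[--n];      /* remove the victim from the set */
    }
}
```

Because the estimated demand is known up front, all selected victims can be terminated in one pass rather than one kill per low-memory event.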