Software@LTH events

CS MSc Thesis Presentation 17 May 2022

Lecture

From: 2022-05-17 11:30 to 13:00
Place: E:2405 (Glasburen)
Contact: birger [dot] swahn [at] cs [dot] lth [dot] se


One Computer Science MSc thesis to be presented on 17 May

On Tuesday, 17 May, there will be a master's thesis presentation in Computer Science at Lund University, Faculty of Engineering.

The presentation will take place in E:2405 (Glasburen).

Note to potential opponents: register as an opponent for the presentation of your choice by sending an email to the examiner for that presentation (firstname.lastname@cs.lth.se). Do not forget to specify which presentation you are registering for! Note that the number of opponents may be limited (often to two), so you may have to choose another presentation if you register too late. Registrations are individual, just as the oppositions are. More instructions are found on this page.


11:15-12:00 in E:2405 (Glasburen)

Presenters: Ivar Henckel, David Söderberg
Title: Database Loading Strategies for an In-Memory Cache in Java
Examiner: Jonas Skeppstedt
Supervisors: Alma Orucevic-Alagic (LTH), Inger Klintmalm (Nasdaq Technology AB)

The efficiency of software systems can be negatively affected by database latency, which can account for a significant fraction of execution time. To mitigate run-time latency, different levels of caching can be introduced, with different strategies for loading the cache. In this thesis such strategies are investigated, focusing mainly on lazy loading and parallel preloading of the cache. We implement some of the identified strategies and conduct an experimental analysis of their performance. All of the strategies are implemented in Java together with the Hibernate ORM framework, but they could be translated to any other ORM framework. After running experiments and comparing the collected measurements, we conclude that one of the lazy loading solutions, which relies on Hibernate proxies, is inefficient. Another lazy loading solution, based on lookup tables, effectively moves the latency from startup to run-time while also removing the cost of fetching data preemptively. The solution using inter-query parallelism with parallel preloading achieves efficient startup and run-time latency when not all data is requested from the cache directly at startup. In conclusion, the caches using parallel preloading and lookup tables perform best and are recommended, either by themselves or in combination.
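As a rough illustration of the two recommended approaches, the sketch below contrasts lazy loading with parallel preloading using plain Java collections and an ExecutorService. It is not taken from the thesis: the names (InMemoryCache, preload, slowQuery) are illustrative placeholders, and the simulated query stands in for the Hibernate-based lookups the authors actually evaluate.

import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;

// Hypothetical in-memory cache illustrating two loading strategies:
// lazy loading on first access, and parallel preloading at startup.
public class CacheLoadingDemo {

    static class InMemoryCache<K, V> {
        private final ConcurrentHashMap<K, V> store = new ConcurrentHashMap<>();
        private final Function<K, V> loader; // stands in for a database/ORM query

        InMemoryCache(Function<K, V> loader) {
            this.loader = loader;
        }

        // Lazy loading: the value is fetched from the backing store only when
        // it is first requested, moving latency from startup to run-time.
        V get(K key) {
            return store.computeIfAbsent(key, loader);
        }

        // Parallel preloading: all known keys are fetched concurrently at
        // startup (inter-query parallelism), so later reads hit memory only.
        void preload(List<K> keys, int threads) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            for (K key : keys) {
                pool.submit(() -> store.computeIfAbsent(key, loader));
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulated slow database query (placeholder for an ORM lookup).
        Function<Integer, String> slowQuery = id -> {
            try {
                Thread.sleep(50);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "row-" + id;
        };

        InMemoryCache<Integer, String> cache = new InMemoryCache<>(slowQuery);

        // Strategy 1: preload a known working set in parallel before serving requests.
        cache.preload(List.of(1, 2, 3, 4, 5, 6, 7, 8), 4);

        // Strategy 2: any key not preloaded is loaded lazily on first access.
        System.out.println(cache.get(42));
    }
}

The two strategies are complementary, which matches the thesis conclusion that they can be used either by themselves or in combination: preloading covers data known to be needed at startup, while lazy loading handles the remainder without paying the preemptive fetching cost.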

Link to presentation: https://lu-se.zoom.us/j/2576059411

Link to popular science summary: https://fileadmin.cs.lth.se/cs/Education/Examensarbete/Popsci/220617_1130HenckelSöderberg.pdf