From May 5 to May 9, researchers from the departments of Electrical and Information Technology (EIT), Computer Science, and Automatic Control, as well as the Centre for Mathematical Sciences, at Lund University conducted an extensive measurement campaign in the Lund University Humanities Lab's motion capture studio.
The measurement campaign was carried out to create a dataset that combines different sensors in the given environment to position devices accurately (to within centimeters) in real time. A robot was equipped with a camera, a speaker, and a radio user equipment (capable of transmitting signals to and receiving signals from the EIT department's own massive MIMO base station, LuMaMi) and programmed to move along different trajectories.
The dataset will be used to develop hardware-friendly positioning algorithms that fuse information from vision, audio, and wireless sensors. The motion capture system tracks the robot and the equipment mounted on it, providing a ground truth against which the results of the new positioning algorithms will be verified.
Overall, the dataset will contribute to an in-depth understanding and further development of AI/machine learning and 6G applications.
Contributors: Ilayda Yaman, Guoda Tian, Henrik Garde, Martin Larsson, Patrik Persson, Michiel Sandra, Alexander Dürr, Erik Tegler, Nikhil Challa, Sirvan Abdollah Poor, Steffen Malkowsky, Ove Edfors, Karl Åström, Fredrik Tufvesson, Volker Kruger, Anders Robertsson, Liang Liu.
Text & contact: Ilayda Yaman