Academic staff
Leonie Laskowitz
Her research focuses on motion capture technology.
Projects
Martial arts training with technical support
Kinematic motion analysis based on motion capture technology was used to analyze and optimize the training of beginners in the martial art of Muay Thai. Traditionally, movement sequences are guided and evaluated by a trainer. In cooperation with the university, however, the athletes trained not in a dojo but in the motion capture lab at Sanderheinrichsleitenweg. This allowed the movement sequences to be monitored in great detail, and possible sources of error could be corrected early on.
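The article does not describe the analysis pipeline itself, but the core step of such a kinematic analysis can be sketched: computing joint angles from the captured 3D marker positions and flagging frames that deviate from a reference movement. The following Python sketch is purely illustrative; the marker names, the random stand-in data, the reference value, and the 15° tolerance are assumptions, not details from the project.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b in degrees, given 3D positions of markers a, b, c."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical per-frame marker positions (hip, knee, ankle) for one kick;
# random numbers stand in for real MoCap recordings.
frames = 100
hip, knee, ankle = (np.random.rand(frames, 3) for _ in range(3))

angles = np.array([joint_angle(h, k, a) for h, k, a in zip(hip, knee, ankle)])

# Flag frames where the knee angle deviates from a reference trajectory
# (e.g., a trainer's demonstration) by more than a tolerance.
reference = angles.mean()   # placeholder; a real reference would be external
tolerance = 15.0            # degrees; illustrative threshold only
error_frames = np.flatnonzero(np.abs(angles - reference) > tolerance)
print(f"frames flagged for correction: {error_frames[:10]}")
```

In practice the reference would come from a trainer's demonstration or an averaged expert trajectory rather than the mean of the athlete's own motion.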
For more information, visit: www.mainpost.de/regional/wuerzburg/kampfsport-training-mit-technischer-unterstuetzung-art-11034350
Photo (L. Laskowitz): Measurement of the Muay Thai technique using a MoCap skeleton
Evaluation of a Motion Capture Training Approach as an Indicator for Performance Improvement in Muay Thai
A study at the Technische Hochschule Würzburg-Schweinfurt investigated the use of motion capture (MoCap) in the martial art of Muay Thai. Muay Thai requires precise movements, which previously could only be evaluated subjectively and in a time-consuming manner. The study combined MoCap with traditional training methods to enhance the athletes' performance and demonstrated significant progress.
For more information, visit:
www.thws.de/service/news-presse/pressemeldungen/thema/evaluation-eines-motion-capture-trainingsansatzes-als-indikator-zur-leistungssteigerung-im-muay-thai/
www.linkedin.com/posts/leonie-laskowitz_wilkommen-kaeuppele-festung-activity-6991408408348307456-teJP
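The press release does not say how this progress was quantified. A common way to test whether the same athletes improved over a training intervention is a paired pre/post comparison; the sketch below uses invented scores purely to illustrate that statistical step, not data from the study.

```python
from scipy import stats
import numpy as np

# Hypothetical technique scores for eight athletes before and after
# MoCap-assisted training (invented stand-in numbers, not study data).
before = np.array([62, 58, 71, 65, 60, 68, 55, 63], dtype=float)
after  = np.array([70, 66, 75, 72, 64, 74, 61, 69], dtype=float)

# Paired t-test: did the same athletes score significantly higher afterwards?
t_stat, p_value = stats.ttest_rel(after, before)
print(f"mean improvement: {np.mean(after - before):.1f} points, p = {p_value:.4f}")
```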
Photo: Leonie Laskowitz and two athletes whose movement sequences are captured using motion capture and a visual…
Publications
Book contributions, journals
Laskowitz, L. & Müller, N. H. (2023). Introduction and Evaluation of an Alternative Training Approach as Indicator of Performance Improvement in Martial Arts with the Help of Kinematic Motion Analysis Using Motion Capture. Proceedings of ACHI 2023, The Sixteenth International Conference on Advances in Computer-Human Interactions, May 2023, pp. 152–158.