2025 Ignite Grant Recipients
LIVE is thrilled to announce the winners from the latest cycle of its Ignite Tech Development Grant, showcasing the diverse potential of emerging learning technology innovation. This year’s competition attracted a strong and diverse applicant pool from across Vanderbilt’s schools and all academic levels, from undergraduates to faculty.
The winning proposals focus on learning that spans the spectrum from math to empathy to AI ethics. Recipients were chosen based on their project’s novelty, field significance, feasibility, scalability, sustainability and future research potential. Over the award period, LIVE’s research engineer, Albert Na, will collaborate with the awardees to transform their ideas into tangible, practical and impactful tools, pushing the boundaries of learning technology development.
2024 Ignite Grant Outcomes
On October 22, the inaugural Ignite Grant recipients presented the outcomes of their work on the grant. Read on to learn about each of the three projects.
Doctoral students Zachary Karas and Zihan Fang worked with LIVE Research Engineer Albert Na on their project "Real-Time Cognitive Load Detection using Eye-Tracking in a Classroom Setting." They successfully demonstrated the feasibility of using eye-tracking technology to monitor students' cognitive load during learning activities, establishing a scalable protocol for future research.
Dr. Akos Ledeczi's project, presented by doctoral student Saman Kittani and developed with LIVE Postdoctoral Scholar Gordon Stein, produced BloxBuddy, an intelligent coding assistant designed specifically for the block-based NetsBlox platform. BloxBuddy interprets block-based code to support students learning to program: students can now request an explanation of their current project, ask for new ideas to explore, and get help finding bugs.
Finally, Dr. Vishesh Kumar worked with LIVE Postdoctoral Scholar Gordon Stein to create a tool that enables learners to train and use multimodal small ML models to recognize embodied movements. Kumar shared that dance and sports provide powerful use cases for revealing the computational AI that underlies gesture detection systems in popular devices such as smartwatches and dance games, giving youth greater control and flexibility over such tools. His Ignite Tech Grant project created a tool for building models from both video-based pose data and sensor-based movement data, exporting them in a lightweight, flexible format that can be used across other creative platforms such as Scratch and NetsBlox.
Nov 12
AI In Legal Practice
Mark Williams, Vanderbilt Law

Nov 19
Multimodal Analytics & the Classroom of the Future
Gautam Biswas, Computer Science
You can find the latest information about the LIVE Learning Innovation Incubator and our Learning Innovation Series on our website. Was this announcement forwarded to you? Sign up for our mailing list to receive invitations to our events and find out about our latest initiatives.
Email us: LIVE@vanderbilt.edu
1400 18th Avenue South | Suite 3002 | Nashville, TN 37212 US