Other Developed Projects

Using Leap Motion for Mobile Robotics Interaction

 

Non-verbal communication is essential to humans and is developed during early childhood. Children with autism spectrum disorder (ASD), however, struggle to learn this form of communication, which hinders their social skills. Inspired by the need for assistive tools to support ASD therapy and by the success of previous projects, this paper presents the development of an assistive technology (AT) to help children with ASD develop non-verbal communication skills. The AT consists of a virtual reality environment that employs hand tracking and gesture recognition to interact with a virtual robot, adding game elements to create an engaging scenario for the child. The paper describes this system and discusses its preliminary results.
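
As a rough illustration of the interaction loop (not the project's actual code), the sketch below classifies a tracked hand pose into a simple gesture and maps it to a command for the virtual robot. The input fields mirror data such as palm position and grab strength reported by the Leap Motion SDK; the thresholds, gesture names, and command names are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical container for one frame of hand-tracking data
# (palm position in millimetres, normalized grab strength 0..1).
@dataclass
class HandFrame:
    palm_x: float        # left/right relative to the sensor
    palm_y: float        # height above the sensor
    palm_z: float        # forward/backward
    grab_strength: float

def classify_gesture(hand: HandFrame) -> str:
    """Small rule-based gesture classifier (illustrative thresholds only)."""
    if hand.grab_strength > 0.8:
        return "grab"            # closed fist
    if hand.palm_x < -80:
        return "swipe_left"
    if hand.palm_x > 80:
        return "swipe_right"
    if hand.palm_y > 250:
        return "raise"
    return "idle"

# Hypothetical mapping from gestures to virtual-robot actions used by the game layer.
GESTURE_TO_COMMAND = {
    "grab": "stop",
    "swipe_left": "turn_left",
    "swipe_right": "turn_right",
    "raise": "move_forward",
    "idle": "wait",
}

def command_for(hand: HandFrame) -> str:
    return GESTURE_TO_COMMAND[classify_gesture(hand)]

if __name__ == "__main__":
    sample = HandFrame(palm_x=120.0, palm_y=180.0, palm_z=0.0, grab_strength=0.1)
    print(command_for(sample))   # -> "turn_right"
```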

Work developed with friend and colleague Silas dos Reis Alves.
 

Using Leap Motion for Quantifying Hand Exercising

 

Various professions have an increased rate of carpal tunnel syndrome (CTS) given their reliance on regular and repetitive movements of the hand and wrist. With the widespread use of computing devices, the popularity of video games, and the ubiquitous nature of mobile devices, the occurrence of CTS is also increasing amongst the general public. Given this rise in CTS and its implications, including reduced workplace productivity and impacts on quality of life, there is significant interest in devising effective interventions to prevent it. Non-intrusive approaches include various hand-stretching exercises that have been shown to be effective. However, as with any exercise program, motivation to continue quickly decreases. Here we describe a hand motion tracking approach coupled with an engaging game-based 3D user interface (3DUI) to promote hand-stretching exercises. The exercises are tracked and feedback on the motions is provided, ultimately helping users perform them correctly. Preliminary results indicate that the system can be used to promote hand exercises in a fun and engaging manner.
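
To give a concrete sense of how a stretch can be quantified and fed back to the user, here is a minimal sketch that takes a trace of wrist-extension angles sampled by the tracker and reports completed repetitions and the range of motion achieved. The thresholds and function names are illustrative assumptions, not values from the actual system.

```python
def count_repetitions(angles_deg, target_deg=45.0, rest_deg=10.0):
    """Count wrist-extension repetitions from a sampled angle trace.

    A repetition is counted when the angle rises above `target_deg`
    and then returns below `rest_deg` (the hysteresis avoids double
    counting). Thresholds are illustrative, not clinically validated.
    """
    reps = 0
    extended = False
    for angle in angles_deg:
        if not extended and angle >= target_deg:
            extended = True
        elif extended and angle <= rest_deg:
            extended = False
            reps += 1
    return reps

def range_of_motion(angles_deg):
    """Largest extension angle reached during the set."""
    return max(angles_deg) if angles_deg else 0.0

if __name__ == "__main__":
    trace = [2, 15, 38, 52, 47, 20, 6, 12, 44, 55, 30, 8]
    print(count_repetitions(trace))  # -> 2
    print(range_of_motion(trace))    # -> 55
```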

Work developed with Saskia Ortiz, Bill Kapralos and David Rojas.
 

Stereo 3D and Augmented Reality for Eye Examination Training

 

The fundus eye exam, an important ophthalmologic assessment procedure for examining the eye's health, is taught by demonstration and guided practice, whereby trainees practice on each other and expertise is gained through experience with an ophthalmoscope. However, in addition to the issues associated with such an apprenticeship model, the anatomy of the eye's intricate oculomotor system is conceptually difficult for novice trainees to grasp. The examination is based on 2D fundus images, and without proper training and skill, abnormalities in the eye can be overlooked. Although virtual anatomy resources and simulators are available to alleviate some of these issues, they still require significant investment and infrastructure and are typically limited to one user at a time. Our ongoing work involves the development of an engaging and interactive stereoscopic augmented reality app. The app allows a student to navigate, in an immersive stereoscopic 3D environment, the inner volumetric anatomy of the eye, which is important for detecting features and pathologies.
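
The core of any stereoscopic presentation is rendering the scene from two horizontally offset viewpoints. The sketch below (plain Python, illustrative values only; a rendering engine would normally handle this internally) computes the left and right eye positions for a given interpupillary distance.

```python
def stereo_eye_positions(head_pos, right_dir, ipd_m=0.064):
    """Return (left_eye, right_eye) positions for stereoscopic rendering.

    head_pos  -- (x, y, z) of the viewer/head-tracked camera
    right_dir -- unit vector pointing to the viewer's right
    ipd_m     -- interpupillary distance in metres (0.064 m is a common average)
    """
    half = ipd_m / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

if __name__ == "__main__":
    left_eye, right_eye = stereo_eye_positions((0.0, 1.6, 0.0), (1.0, 0.0, 0.0))
    print(left_eye)   # (-0.032, 1.6, 0.0)
    print(right_eye)  # (0.032, 1.6, 0.0)
```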

Work developed with Carlos Soto, Bill Kapralos and Norman Jaimes.
 

Orthodontic Treatment Effects Under Alveolar Bone Loss

 


Orthodontic treatment in patients with bone loss is a subject of study because the applied forces can cause extrusion of the teeth. In this study, the orthodontic procedure was simulated as the retraction of the incisor using brackets and an elastic band generating a 1 Newton force. Three finite element models were created with three different levels of alveolar bone loss (2 mm, 4 mm and 6 mm), each composed of the mandible, the periodontal membrane (PDM) and the orthodontic components. This project was developed in partnership with friend and colleague Byron Alfonso Pérez.
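
As a back-of-the-envelope illustration of why bone loss matters (the distances below are assumed for the example, not taken from the finite element models): with a fixed 1 N retraction force applied at the bracket, the moment about the tooth's centre of resistance grows as alveolar bone loss shifts that centre apically, which favours tipping and extrusion.

```python
def tipping_moment(force_n, bracket_to_crest_mm, crest_to_cres_mm, bone_loss_mm):
    """Moment (N*mm) of a horizontal retraction force about the centre of resistance.

    Bone loss shifts the alveolar crest, and with it the centre of
    resistance, apically, lengthening the force's lever arm.
    All distances are illustrative assumptions, not measured values.
    """
    lever_arm = bracket_to_crest_mm + crest_to_cres_mm + bone_loss_mm
    return force_n * lever_arm

if __name__ == "__main__":
    for bone_loss in (2, 4, 6):  # the three alveolar bone loss levels modelled
        m = tipping_moment(force_n=1.0, bracket_to_crest_mm=10.0,
                           crest_to_cres_mm=4.0, bone_loss_mm=bone_loss)
        print(f"{bone_loss} mm bone loss -> {m:.1f} N*mm")
```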

Multimedia HTML-Based Tool for Virtual Ear Anatomy Study

 


An interactive tool for studying the anatomical characteristics of the inner, middle and outer ear. The tool followed a set of requirements established in partnership with specialist physicians and engineers, among them: 3D models so the anatomy could be explored without constraints, an animation of how the chain of bones (the ossicles) works, and detailed information that could be read while navigating the virtual environment and selecting the desired part. This project was developed in partnership with friend and colleague Byron Alfonso Pérez.
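
The actual tool is HTML-based; the Python sketch below only illustrates how each anatomical structure could be linked to a 3D model and descriptive text shown when the structure is selected in the viewer. File names and descriptions are placeholders.

```python
from dataclasses import dataclass

@dataclass
class EarStructure:
    name: str
    region: str        # "outer", "middle" or "inner" ear
    model_file: str    # path to the 3D model shown in the viewer (placeholder)
    description: str   # text displayed while navigating

# Illustrative catalogue; entries are placeholders, not the tool's real content.
CATALOGUE = [
    EarStructure("Tympanic membrane", "middle", "models/tympanic_membrane.obj",
                 "Thin membrane that transmits sound vibrations to the ossicles."),
    EarStructure("Ossicular chain", "middle", "models/ossicles.obj",
                 "Malleus, incus and stapes; animated to show how vibration is conducted."),
    EarStructure("Cochlea", "inner", "models/cochlea.obj",
                 "Spiral organ that converts vibration into nerve impulses."),
]

def info_for(selected_name: str) -> str:
    """Return the text shown when a structure is selected in the 3D scene."""
    for s in CATALOGUE:
        if s.name.lower() == selected_name.lower():
            return f"[{s.region} ear] {s.name}: {s.description}"
    return "No information available for this structure."

if __name__ == "__main__":
    print(info_for("cochlea"))
```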

 

Virtual Environment for Cartesian Robot Teaching

 


A virtual tool embedded in an HTML file. Through the 3D model of a Cartesian robot, the basic operation commands are provided to support teaching, training, or becoming familiar with the device, aiding those processes when either the device or the user is not physically available. This project was developed in partnership with friend and colleague Augusto Mainardi.
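
To illustrate the idea of basic operation commands (the real tool runs in an HTML page; the command names and workspace limits below are hypothetical), here is a minimal Python sketch of an interpreter that moves a virtual Cartesian robot within its axis limits.

```python
class VirtualCartesianRobot:
    """Minimal state of a 3-axis Cartesian robot (positions in mm)."""

    # Illustrative workspace limits per axis, not the real device's values.
    LIMITS = {"X": (0.0, 300.0), "Y": (0.0, 200.0), "Z": (0.0, 100.0)}

    def __init__(self):
        self.position = {"X": 0.0, "Y": 0.0, "Z": 0.0}

    def execute(self, command: str) -> str:
        """Execute a command such as 'MOVE X 120.5' or 'HOME'."""
        parts = command.strip().upper().split()
        if parts == ["HOME"]:
            self.position = {axis: 0.0 for axis in self.position}
            return "Robot homed."
        if len(parts) == 3 and parts[0] == "MOVE" and parts[1] in self.LIMITS:
            axis, target = parts[1], float(parts[2])
            low, high = self.LIMITS[axis]
            if not (low <= target <= high):
                return f"Error: {axis} target {target} outside [{low}, {high}] mm."
            self.position[axis] = target
            return f"{axis} axis moved to {target} mm."
        return f"Unknown command: {command!r}"

if __name__ == "__main__":
    robot = VirtualCartesianRobot()
    for cmd in ("MOVE X 120.5", "MOVE Z 250", "HOME"):
        print(robot.execute(cmd))
```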

 

3DUIs for Teleoperating Serial and Mobile Robots

 


In these projects, the 3DUI APIs were integrated with the robots' control systems to teleoperate and program them according to the user motion captured by the sensors. The configurations explored are listed below, followed by a minimal sketch of the motion-to-command mapping.

Wiimote and Festo's Robotino (co-developed with Silas Franco dos Reis Alves).

Kinect and Festo's Robotino (co-developed with Silas Franco dos Reis Alves).

Serial legged robot and Wiimote.

Serial legged robot and Kinect.
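
The sketch below gives a minimal, illustrative version of the teleoperation mapping: a displacement of the tracked hand (Kinect) or controller (Wiimote) from a neutral pose is converted into differential-drive velocity commands, with a dead zone around the neutral pose. All constants and the drive convention are assumptions for the example, not the values used with the actual robots.

```python
def displacement_to_velocity(dx_mm, dz_mm, dead_zone_mm=20.0,
                             max_linear=0.3, max_angular=1.0, reach_mm=150.0):
    """Map a tracked hand/controller displacement to (linear, angular) velocity.

    dx_mm -- sideways displacement from the neutral pose (steers the robot)
    dz_mm -- forward/backward displacement (drives the robot)
    Displacements inside the dead zone are ignored; outside it they are
    scaled linearly against `reach_mm` and clamped to the velocity limits.
    """
    def scale(value, limit):
        if abs(value) < dead_zone_mm:
            return 0.0
        ratio = max(-1.0, min(1.0, value / reach_mm))
        return ratio * limit

    linear = scale(dz_mm, max_linear)     # m/s
    angular = scale(-dx_mm, max_angular)  # rad/s; hand to the right -> turn right
    return linear, angular

if __name__ == "__main__":
    print(displacement_to_velocity(dx_mm=60.0, dz_mm=120.0))  # -> (0.24, -0.4)
```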


3D Modeling


3D anatomy models for medical applications, created using Autodesk Maya.