OpenPTrack is an open-source person tracking software developed at REMAP. Since joining the development team in 2018, I have contributed to the project in various capacities:
We are working to add support for the Intel RealSense and Stereolabs ZED depth imagers in OpenPTrack, as well as the new Azure Kinect. This work is under active development over Summer ‘19. In this effort, I am responsible for containerizing the project so that all imagers run from a single Docker image, for system testing and configuration to achieve optimal imager performance, and for documentation.
To show all of OpenPTrack’s data more effectively (spatial position, pose, objects, facial recognition), we are building a Unity-based visualization app. This also serves as a step toward integrating OpenPTrack with mobile AR. C# consumers receive raw tracking data from ROS, and experience-specific effects can be added easily in Unity. The Unity project is available on GitHub.
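As a rough illustration of the ROS-to-Unity handoff, a bridge on the ROS side might serialize each frame of tracks to JSON for the C# consumers to deserialize. The field names below are illustrative assumptions, not OpenPTrack’s actual message schema:

```python
# Hedged sketch: serializing one frame of person tracks to JSON so a
# Unity (C#) consumer can parse it. Field names are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class Track:
    track_id: int   # stable per-person identifier (assumed)
    x: float        # world-frame position in metres (assumed)
    y: float
    height: float   # estimated person height (assumed)

def tracks_to_json(tracks):
    """Pack one frame of tracks into a JSON payload for Unity."""
    return json.dumps({"tracks": [asdict(t) for t in tracks]})

payload = tracks_to_json([Track(7, 1.2, 3.4, 1.75)])
```

On the Unity side, a matching C# class and `JsonUtility.FromJson` (or a similar deserializer) would turn each payload into objects the experience-specific effects can consume.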
Experimental Kalman Filter Module
OpenPTrack now provides several ‘centroids’, or person-position estimates, from sources including a HOG person detector, YOLO object tracking, and pose recognition. Other data, such as phone positions in mobile AR scenarios, can also provide spatial location information depending on the context. I’m working on a Kalman Filter module for sensor fusion that combines all of these detections into a single estimate. The resulting track is more accurate and more persistent (i.e., fewer ‘dropped tracks’). Initial tests have shown the Kalman-filtered position data to be more robust than any single tracking solution; a production-ready module is in the works!
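The fusion idea can be sketched with a constant-velocity Kalman filter that runs one predict step per frame and then folds in each detector’s position report, weighted by that detector’s noise. This is a minimal illustration, not the production module; the noise values and class names are assumptions:

```python
# Minimal sketch: fusing multiple 2-D position detections (e.g. HOG, YOLO,
# pose) with a constant-velocity Kalman filter. Noise levels are made up.
import numpy as np

class FusionKalman:
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                      # state: [px, py, vx, vy]
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)                        # constant-velocity model
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = np.eye(4) * 0.01                 # process noise
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])     # we observe position only

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, r):
        """Fold in one detection z = [px, py] with noise variance r."""
        R = np.eye(2) * r
        y = np.asarray(z, float) - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

kf = FusionKalman()
kf.predict()
# One frame: three detectors report slightly different positions for the
# same person; each update is weighted by its assumed noise variance.
for z, r in [((1.0, 2.0), 0.30), ((1.1, 1.9), 0.20), ((0.9, 2.1), 0.25)]:
    kf.update(z, r)
fused = kf.x[:2]   # fused estimate lands near the detection cluster
```

Because every detector contributes an update, a person who momentarily drops out of one detector (say, HOG under occlusion) is still carried by the others and by the motion model, which is what reduces dropped tracks.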
We have active OpenPTrack installations in UCLA’s Little Theater and the REMAP offices, as well as at our collaborators’ labs, for use in embodied learning for children and immersive performance. As part of this work, I set up and maintain these installations.