This is phase 1 of a mixed reality project, with the possibility of follow-on phases if the work goes well.
As an example of what we are trying to achieve, please see the attached video called 'ZED Mini Greenscreen'. In that video, an Oculus Rift with a ZED Mini mounted on the front is used to create a virtual world, limited by a green screen and a tethered headset. The effect we want is exactly that: seeing your own arms inside a complete 3D virtual world.
Headsets like the Oculus Rift with a ZED Mini are quite expensive considering the end product will be for kids, so as an alternative we think MAXST Sensor Fusion SLAM can do what we need using only a single RGB camera and the IMU from a phone. It's actually quite robust, as you can see in the attached MAXST Sensor Fusion video.
Here is what I need:
Step 1: Use MAXST to create a 3D point cloud map of a basketball court (indoors or outdoors). The point cloud should also capture the basketball backboards (goals).
Step 2: Export the map into Unity, where you will have a 1:1-scale basketball court model. The point cloud map should align with that model in Unity.
Step 3: Add some visual graphics to the court surface (similar to how the character was added in the MAXST video) and test it on your phone, recording some videos for me to review.
Step 4: In the MAXST video you can see that the user's movement within the map is tracked. I need video footage showing the user's movement being tracked within the basketball court.
Step 5: I will need to test it on my own phone, so please export your work as an APK that I can install and try out on a local court here.
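As a side note on Step 2: one quick way to sanity-check that the exported map really is at 1:1 scale before aligning it in Unity is to parse the point cloud and compare its extents against real court dimensions (a FIBA court is 28 m x 15 m). The sketch below assumes the map can be exported as an ASCII PLY file with `x`, `y`, `z` vertex properties; the filename and export format are assumptions on my part, since MAXST's actual export format may differ:

```python
# Sanity-check the scale of an exported point-cloud map before Unity alignment.
# Assumes an ASCII PLY export with x, y, z as the first vertex properties;
# treat this as a sketch, not a description of the actual MAXST export.

def ply_bounds(path):
    """Return ((min_x, min_y, min_z), (max_x, max_y, max_z)) for an ASCII PLY file."""
    with open(path) as f:
        n_vertices = 0
        # Read the header: find the vertex count, stop at end_header.
        for line in f:
            line = line.strip()
            if line.startswith("element vertex"):
                n_vertices = int(line.split()[-1])
            elif line == "end_header":
                break
        # Read the vertex records; the first three fields are x, y, z.
        points = [tuple(map(float, next(f).split()[:3])) for _ in range(n_vertices)]
    mins = tuple(min(p[i] for p in points) for i in range(3))
    maxs = tuple(max(p[i] for p in points) for i in range(3))
    return mins, maxs

# Hypothetical usage ("court_map.ply" is a placeholder filename):
# mins, maxs = ply_bounds("court_map.ply")
# extents = [hi - lo for lo, hi in zip(mins, maxs)]
# The two ground-plane extents (which axes those are depends on the export's
# coordinate convention) should come out near 28 m and 15 m for a full court.
```

If the extents are off by a constant factor, that factor is the scale correction to apply to the map (or the model) before locking in the alignment.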
If you would like to discuss the job in more detail, please let me know.