There are 3 steps to this project, with 3 milestone payments.
We are creating an AR game using Unity as the game engine. We have been using Google ARCore but are unhappy with its drift and tracking performance for our needs.
We have seen many YOLO/YOLACT/Mask R-CNN tracking videos (like the link below) and are impressed with the real-time tracking they can produce.
[login to view URL]
For our app, the user will wear a camera inside a headset (POV), so the solution cannot rely on a fixed camera position.
Here are the 3 steps to the project.
Step 1: Proven ability to track a person's real-time location/direction within a full-sized basketball court. This must be done from a camera (POV).
Step 2: Proven ability to track a basketball and a basketball backboard in real time from a camera (POV).
Step 3: Proven ability to highlight interactions between the ball and the backboard in real time (for example, while both the ball and the backboard are being tracked, show a graphic element every time the ball touches the backboard).
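To make Step 3 concrete, here is a minimal sketch of what we mean by "highlight interactions": given per-frame bounding boxes from whatever detector you choose (YOLO, YOLACT, Mask R-CNN, etc.), decide when the ball and backboard touch so a graphic can be triggered. The `Box` type, `boxes_touch` function, and pixel margin are hypothetical names for illustration only, not a required design.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box in pixel coordinates (hypothetical type)."""
    x1: float
    y1: float
    x2: float
    y2: float

def boxes_touch(a: Box, b: Box, margin: float = 0.0) -> bool:
    """Return True when the two boxes overlap or come within `margin` pixels.

    A detector would supply one box per tracked object per frame;
    this function only decides contact from those boxes.
    """
    return (a.x1 <= b.x2 + margin and b.x1 <= a.x2 + margin and
            a.y1 <= b.y2 + margin and b.y1 <= a.y2 + margin)

def contact_frames(ball_boxes, board_boxes, margin: float = 2.0):
    """Yield frame indices where the graphic element should be shown.

    Either box may be None on frames where that object was not detected.
    """
    for i, (ball, board) in enumerate(zip(ball_boxes, board_boxes)):
        if ball is not None and board is not None and boxes_touch(ball, board, margin):
            yield i
```

In the real app this check would run per frame on the detector's output, and a small margin absorbs bounding-box jitter so brief contacts are not missed.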
PLEASE only apply if you are confident you can achieve these 3 steps; otherwise you will waste both your time and mine.