- Highly reflective, such as water
- Especially dark or bright
- Subject to frequent lighting changes
- Made of closely repeating patterns, such as tiles
- Sound-absorbent, such as thick carpeting
- In motion, such as roads with heavy traffic
• Keep the binocular camera lenses and sensors clean.
• To avoid interference with the Starpoint™ Positioning System, do not use 40 kHz ultrasonic devices, such as ultrasonic rangefinders, fault detectors, cleaners, or welding machines.
Functionality Description
Dynamic Tracking
Dynamic Tracking uses deep learning algorithms to detect six types of subjects in real time: pedestrians, cyclists, cars, trucks, boats, and animals. A real-time tracking algorithm automatically follows the selected subject while avoiding obstacles during flight. The function can track the subject in three modes; for more information, refer to the App Manual.
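As a rough illustration of the detect-then-track idea described above, the following Python sketch shows a per-frame loop that detects candidate subjects and keeps the user's selection locked. The detector, tracker, and all function names here are hypothetical placeholders, not the aircraft's actual onboard software.

```python
# Illustrative sketch only: the onboard detector and tracker are
# proprietary, so detect_subjects() and update_track() below are
# hypothetical placeholders for the real components.

from dataclasses import dataclass

SUBJECT_CLASSES = ("pedestrian", "cyclist", "car", "truck", "boat", "animal")

@dataclass
class Detection:
    label: str            # one of SUBJECT_CLASSES
    box: tuple            # (x, y, width, height) in image pixels
    confidence: float

def detect_subjects(frame):
    """Hypothetical stand-in for the deep-learning detector."""
    return []  # a real detector returns per-frame Detection objects

def update_track(track_box, detections):
    """Hypothetical tracker step: re-associate the selected subject
    with the detection whose centre lies closest to the current box."""
    if not detections:
        return track_box  # keep the last known box if the subject is lost
    cx = track_box[0] + track_box[2] / 2
    cy = track_box[1] + track_box[3] / 2
    def distance_sq(d):
        dx = d.box[0] + d.box[2] / 2 - cx
        dy = d.box[1] + d.box[3] / 2 - cy
        return dx * dx + dy * dy
    return min(detections, key=distance_sq).box

# Conceptual main loop: detect candidates on every frame, then keep the
# user-selected subject locked; flight control handles obstacle avoidance.
selected_box = (100, 80, 40, 90)   # example box chosen by the user
for frame in []:                   # frames would come from the main camera
    selected_box = update_track(selected_box, detect_subjects(frame))
```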
Viewpoint
Viewpoint allows the user to send the aircraft to a selected destination by touching 
a point on the screen.
Gesture Commands
The main camera uses deep learning algorithms to recognize and respond to three 
gesture commands: outstretch your arms to set yourself as a target, raise both arms 
to capture a photo, and raise one arm to start or stop recording. 
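Conceptually, this feature maps a recognized pose to a camera command. The sketch below shows that mapping using the three gestures named in this manual; the handler functions and the dispatch mechanism are hypothetical illustrations, not the aircraft's real software.

```python
# The three gestures are taken from the manual; the handler functions
# and the dispatch table are hypothetical illustrations only.

def set_as_target():
    print("Subject locked as tracking target")

def capture_photo():
    print("Photo captured")

def toggle_recording():
    print("Recording started/stopped")

GESTURE_COMMANDS = {
    "arms_outstretched": set_as_target,    # outstretch both arms
    "both_arms_raised":  capture_photo,    # raise both arms
    "one_arm_raised":    toggle_recording, # raise one arm
}

def handle_gesture(gesture):
    """Dispatch a recognized gesture label to its command, if any."""
    action = GESTURE_COMMANDS.get(gesture)
    if action is not None:
        action()

handle_gesture("both_arms_raised")  # prints: Photo captured
```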
Accurate Landing
Accurate Landing uses the aircraft's bottom binocular vision system to record a series of images, together with the aircraft's altitude and yaw, during takeoff. During the go-home and landing process, the aircraft matches its current altitude and yaw against the images recorded during takeoff and calculates how far it is offset from its takeoff location. The aircraft's position is then precisely controlled based on visual-inertial odometry (VIO) feedback so that it lands at its takeoff position.
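The paragraph above amounts to a record-then-match procedure. The Python sketch below outlines that idea under stated assumptions: the reference-frame structure, the image-matching function, and the control interface are all hypothetical placeholders, not the aircraft's real flight code.

```python
# Conceptual sketch of the record-then-match landing idea; every
# interface here (ReferenceFrame, match_images, the control loop)
# is a hypothetical placeholder, not the aircraft's actual software.

from dataclasses import dataclass

@dataclass
class ReferenceFrame:
    image: object        # downward binocular image saved during takeoff
    altitude_m: float    # altitude at which the image was taken
    yaw_deg: float       # heading at which the image was taken

takeoff_refs = []        # filled while the aircraft ascends

def record_takeoff(image, altitude_m, yaw_deg):
    """Save downward-view frames, tagged with altitude and yaw."""
    takeoff_refs.append(ReferenceFrame(image, altitude_m, yaw_deg))

def match_images(ref_image, current_image):
    """Hypothetical image-registration step (e.g. feature matching);
    a real matcher returns the estimated horizontal offset in metres."""
    return (0.0, 0.0)

def landing_offset(current_image, altitude_m, yaw_deg):
    """Pick the takeoff frame recorded nearest to the current altitude
    and yaw, then estimate how far the aircraft is from its takeoff point."""
    if not takeoff_refs:
        return None  # no references: fall back to a normal landing
    ref = min(takeoff_refs,
              key=lambda r: abs(r.altitude_m - altitude_m)
                            + abs(r.yaw_deg - yaw_deg))
    return match_images(ref.image, current_image)

# During descent, the flight controller would call landing_offset()
# repeatedly and correct position using VIO feedback until touchdown.
```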