Kalman-based sensor fusion using images and mechanical sensors
Common control interfaces translate user input through an unnatural intermediate medium, such as a joystick or vehicle controls. These devices require practice to use properly, which limits user interaction and prevents skills from carrying over between control devices. In this work we propose a control device based on intuitive user motion, where the user's motion is replicated directly on a robotic system rather than mediated by an intermediate controller. The control device is a common camera phone, whose motion is replicated on a 2-degree-of-freedom camera mount. To realize this concept, we first build and test the accuracy of two different rotation-estimation algorithms: one based on images captured by the camera phone, and one based on the sensors contained in the phone itself. The accuracy of the angular rotation estimates is evaluated in two experiments: one using slow rotation, better suited to the image-based approach, and one using a faster rotation speed, where the sensor-based approach is more suitable. The results of the two experiments indicate that a combined estimator is appropriate across a range of rotational velocities. A combined method based on the Kalman filter is therefore developed and tested, and its results are presented.
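The fusion idea described above can be illustrated with a minimal sketch: a one-dimensional Kalman filter that predicts the camera angle by integrating gyroscope rate readings (accurate during fast motion) and corrects the prediction with image-based angle measurements (accurate during slow motion). This is not the paper's implementation; the function name, noise parameters `q` and `r_img`, and the scalar state model are illustrative assumptions.

```python
import numpy as np

def kalman_fuse(angles_img, rates_gyro, dt, q=0.01, r_img=0.05):
    """Scalar Kalman filter fusing two rotation estimates.

    angles_img : noisy angle measurements from the image pipeline (rad)
    rates_gyro : angular-rate readings from the phone's gyroscope (rad/s)
    dt         : sampling interval (s)
    q, r_img   : assumed process and measurement noise variances
    """
    x = angles_img[0]   # state estimate (angle, rad)
    p = 1.0             # estimate variance
    fused = [x]
    for z, w in zip(angles_img[1:], rates_gyro[1:]):
        # Predict: integrate the gyro rate over one time step.
        x = x + w * dt
        p = p + q                     # process noise grows uncertainty
        # Update: correct the prediction with the image-based angle.
        k = p / (p + r_img)           # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        fused.append(x)
    return np.array(fused)
```

The gain `k` automatically shifts trust between the two sources: when the prediction variance `p` is large relative to `r_img` the image measurement dominates, and vice versa, which is what makes a single filter usable across both slow and fast rotation regimes.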