Date on Master's Thesis/Doctoral Dissertation

5-2022

Document Type

Master's Thesis

Degree Name

M.S.

Department

Electrical and Computer Engineering

Degree Program

Electrical Engineering, MS

Committee Chair

Popa, Dan

Committee Co-Chair (if applicable)

Naber, John

Committee Member

McIntyre, Michael

Committee Member

Harnett, Cindy

Committee Member

Zhang, Ruoshi

Author's Keywords

Automation; tactile sensors; computer vision

Abstract

Human-robot interaction is a developing field of science that is poised to augment everything we do in life. Skin sensors that can detect touch, temperature, distance, and other physical interaction parameters at the human-robot interface are essential to enhancing collaboration between humans and machines. As such, these sensors must be efficiently tested and characterized so that they provide accurate feedback to the robot. The objective of this work is to create a diversified software testing suite that removes as much human intervention as possible. The tests and methodology discussed here subject the sensors to multiple realistic scenarios over repeated experiments, enabling repeatable tests without interference from the test engineer and increasing productivity and efficiency. The foundation of this work has two main pieces: force-feedback control to drive the test actuator, and computer vision to guide alignment of the test actuator with sensors arranged in a 2D array. The automated test software was made compatible with the testbench hardware via LabVIEW programs. The program uses set coordinates to complete a raster scan of the SkinCell that locates individual sensors, and tests are then applied at each sensor using a force controller. The force-feedback control system uses a Proportional-Integral-Derivative (PID) controller that reads force measurements from a load cell to correct itself or follow a desired trajectory. The motion of the force actuator was compared against the projected trajectory to evaluate accuracy and time delay. The proposed motor control allows dynamic forces to stimulate the sensors, giving a more realistic test than a static force. A top-facing camera was introduced to capture the starting position of a SkinCell before testing, and computer vision algorithms were proposed to extract the location of the cell and its individual sensors before generating a coordinate plane. This allows the engineer to skip manual alignment of the sensors, saving time and providing more accurate target positions. Finally, the testbench was applied to numerous sensors developed by the research team at the Louisville Automation and Robotics Research Institute (LARRI) for testing and data analysis. Force loads are applied to the individual sensors while the response is recorded; afterwards, the data were postprocessed to compare responses within the SkinCell as well as against sensors manufactured using different methods.
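As a concrete illustration of the force-feedback loop the abstract describes, the following is a minimal Python sketch of a discrete PID controller tracking a desired force trajectory. The thesis implements this in LabVIEW against the testbench hardware; the SimulatedPlant class, its first-order response, and the gain values here are hypothetical stand-ins for the real load cell and actuator, chosen only so the sketch runs end to end.

```python
import time


class PID:
    """Discrete PID controller for force-trajectory tracking."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = None

    def update(self, setpoint, measured, dt):
        """Return an actuator command from the latest load-cell reading."""
        error = setpoint - measured
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative


class SimulatedPlant:
    """Toy stand-in for the actuator/load-cell hardware (illustration only)."""

    def __init__(self):
        self.force = 0.0

    def apply(self, command, dt):
        self.force += command * dt  # crude first-order response to the command
        return self.force


if __name__ == "__main__":
    pid, plant, dt = PID(kp=5.0, ki=1.0, kd=0.05), SimulatedPlant(), 0.01
    # Dynamic trajectory: ramp to 1 N, then hold, as a stand-in for a test profile.
    trajectory = [min(1.0, 0.02 * i) for i in range(200)]
    measured = 0.0
    for setpoint in trajectory:
        command = pid.update(setpoint, measured, dt)
        measured = plant.apply(command, dt)
    print(f"final force: {measured:.3f} N (target {trajectory[-1]:.3f} N)")
```

In the real testbench the simulated plant would be replaced by the load-cell read and motor command, and the trajectory by the dynamic force profile applied at each sensor site.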
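The camera-based alignment step could be sketched along the lines below. This is not the thesis's actual algorithm, only a minimal OpenCV (4.x) example of extracting candidate sensor-pad centroids from a top-down image via Otsu thresholding and contour moments; the file name skincell_top.png and the area threshold are illustrative assumptions.

```python
import cv2


def locate_sensor_centers(image_path, min_area=50.0):
    """Return (x, y) pixel centroids of candidate sensor pads in a top-down image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    # Otsu thresholding separates the sensor pads from the background.
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue  # discard specks and noise
        m = cv2.moments(contour)
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # Sorting by (y, x) gives an approximate raster order over the array;
    # a real implementation would bin centroids by the known row pitch.
    return sorted(centers, key=lambda c: (c[1], c[0]))


if __name__ == "__main__":
    for x, y in locate_sensor_centers("skincell_top.png"):
        print(f"sensor at ({x:.1f}, {y:.1f}) px")
```

Mapping these pixel centroids through a camera-to-stage calibration would yield the coordinate plane the abstract mentions, letting the raster scan target each sensor without manual alignment.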
