A vision-based error compensation method for accurate path tracking in robotic trimming

dc.contributor.author: Tayaranian Marvian, Keyvan
dc.contributor.examiningcommittee: Salimi, Elham (Electrical and Computer Engineering)
dc.contributor.examiningcommittee: Liang, Xihui (Mechanical Engineering)
dc.contributor.supervisor: Khoshdarregi, Matt
dc.date.accessioned: 2023-09-04T16:24:44Z
dc.date.available: 2023-09-04T16:24:44Z
dc.date.issued: 2023-08-16
dc.date.submitted: 2023-08-16T15:42:56Z (en_US)
dc.degree.discipline: Mechanical Engineering (en_US)
dc.degree.level: Master of Science (M.Sc.)
dc.description.abstract: Trimming is a common step in the fiberglass manufacturing process. During the trimming operation, the outer edges and inner cutouts of molded fiberglass parts are removed to obtain the final part. In many fiberglass manufacturing plants, this process is still performed manually by workers. Robotic automation of trimming is challenging due to the highly variable nature of fiberglass manufacturing: parts typically suffer from manufacturing inconsistencies and deformations, rendering pre-defined offline robot programs impractical. To enable robotic automation of trimming, robots must detect and adjust to part variations. This research develops a methodology in which a fusion of vision and laser sensors, together with advanced image processing and robot control techniques, is used to automatically detect and accurately follow trimming paths on fiberglass parts. A multi-stage real-time and offline error compensation framework is proposed. An external 3D camera and point cloud processing techniques are used to automatically detect trimming paths and generate target points to guide the robot. To improve the accuracy of the robot path, a 2D camera mounted on the robot directly measures and corrects the path deviation. A laser displacement sensor implements real-time height control, ensuring a constant distance between the onboard camera and the surface, and a laser cross sensor measures and corrects orientation errors. Moreover, a deep learning model is developed to improve the robustness of the path detection step. In comparative experiments, variants of the U-Net architecture with different backbones and hyperparameters are compared to find the best-performing model. A U-Net model with an Xception backbone is trained as a classifier, achieving an AUC of 0.99 and 96.38% accuracy on test data. The developed framework is tested on a sample fiberglass part using an industrial robot. The results show that errors can be reduced to less than 0.5 millimeters and 3 degrees, meeting the required tolerances of typical fiberglass manufacturing applications.
dc.description.note: October 2023
dc.description.sponsorship: Research Manitoba Innovation Proof-of-Concept, Project No. 4763, Title: "Autonomous Robotic Platforms for Aerospace and Vehicle Composite Manufacturing.", https://researchmanitoba.ca/funding/programs/innovation-proof-of-concept-grant/
dc.identifier.uri: http://hdl.handle.net/1993/37558
dc.language.iso: eng
dc.rights: open access (en_US)
dc.subject: Robotic Trimming
dc.subject: Vision-Based Error Compensation
dc.subject: Path Tracking
dc.subject: Path Detection
dc.subject: Deep Learning
dc.title: A vision-based error compensation method for accurate path tracking in robotic trimming
dc.type: master thesis (en_US)
local.subject.manitoba: yes
oaire.awardNumber: Project No. IT25100
oaire.awardTitle: Vision-guided autonomous robotic trimming of vehicle fiberglass parts
project.funder.identifier: http://dx.doi.org/10.13039/501100004489
project.funder.name: Mitacs Accelerate
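
The abstract above describes two real-time corrections applied while the robot follows the detected trimming path: a laser displacement sensor keeps the onboard camera at a constant standoff from the surface, and the onboard 2D camera measures the lateral deviation of the path so it can be corrected. The Python sketch below is only an illustration of that general idea under assumed values; the function name, calibration factor, and proportional gains are hypothetical placeholders and are not taken from the thesis.

# Illustrative sketch only: per-cycle height and lateral corrections computed from
# the laser displacement reading and the detected trim-path offset in the image.
# All constants and names here are assumptions, not the thesis implementation.

TARGET_STANDOFF_MM = 120.0   # desired camera-to-surface distance (assumed)
MM_PER_PIXEL = 0.08          # 2D camera pixel-to-millimeter calibration (assumed)
KP_HEIGHT = 0.5              # proportional gain for the height correction (assumed)
KP_LATERAL = 0.5             # proportional gain for the lateral correction (assumed)


def compute_corrections(laser_height_mm: float, path_offset_px: float):
    """Return (dz_mm, dy_mm) tool-frame corrections for one control cycle."""
    # Height error: positive when the camera sits farther from the surface
    # than the target standoff measured by the laser displacement sensor.
    height_error_mm = laser_height_mm - TARGET_STANDOFF_MM
    dz_mm = -KP_HEIGHT * height_error_mm

    # Lateral error: offset of the detected trim path from the image center,
    # converted from pixels to millimeters with the camera calibration factor.
    lateral_error_mm = path_offset_px * MM_PER_PIXEL
    dy_mm = -KP_LATERAL * lateral_error_mm

    return dz_mm, dy_mm


if __name__ == "__main__":
    # Example cycle: camera 2 mm too far from the surface, path 25 px off-center.
    dz, dy = compute_corrections(laser_height_mm=122.0, path_offset_px=25.0)
    print(f"height correction: {dz:+.2f} mm, lateral correction: {dy:+.2f} mm")
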
Files
Original bundle: Marvian_Keyvan.pdf (3.68 MB, Adobe Portable Document Format)
License bundle: license.txt (770 B) - Item-specific license agreed to upon submission