Automation of Unloading Graincars using “Grain-o-bot”

dc.contributor.authorLokhamoorthi, Aravind Mohan
dc.contributor.examiningcommitteeWhite, N.D.G. (Biosystems Engineering) Britton, Myron (Civil Engineering) Venkatesh, Meda (University of Saskatchewan, Food and Bioprocess)en_US
dc.contributor.supervisorJayas, Digvir (Biosystems Engineering)en_US
dc.date.accessioned2012-01-17T00:14:10Z
dc.date.available2012-01-17T00:14:10Z
dc.date.issued2012-01-16
dc.degree.disciplineBiosystems Engineeringen_US
dc.degree.levelDoctor of Philosophy (Ph.D.)en_US
dc.description.abstractLarge quantities of bulk grain are moved using graincars in Canada and other parts of the world. Automation has not progressed significantly in the grain industry, probably because the market for automated systems is limited. A prototype of a robot (“Grain-o-bot”) that uses machine vision to automatically open and close graincar hopper gates and to detect the contents of the graincar was built and studied. The “Grain-o-bot” was a Cartesian robot equipped with two cameras and an opening tool as the end-effector. One camera acted as the eye to determine the sprocket location and guided the end-effector to the sprocket opening. For most applications, machine vision solutions based on pattern recognition have been developed using images acquired in a laboratory setting, and major constraints with these solutions arise when they are implemented in real-world applications. The first step in this automation, therefore, was to correctly identify the hopper-gate sprocket on the graincar. Algorithms were developed that detected and identified the sprocket under proper lighting conditions with 100% accuracy. The performance of the algorithms was also evaluated for identification of the sprocket on a graincar exposed to different lighting conditions, as expected to occur in typical grain unloading facilities. Monochrome images of the sprocket from a model system were acquired under different light sources. Correlation and pattern-recognition techniques using a template image, combined with shape detection, were used for sprocket identification. The images were pre-processed using image-processing techniques prior to template matching. A template image developed under a light source similar to the one used to acquire the images was more successful in identifying the sprocket than a template image developed under a different light source. A sample of the graincar contents was taken by slightly opening and immediately closing the hopper gates. The sample was identified by taking an image with the second camera and performing feature matching. An accuracy of 99% was achieved in identifying Canada Western Red Spring (CWRS) wheat and 100% in identifying barley and canola.en_US
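The sprocket-identification step described in the abstract (image pre-processing followed by correlation-based template matching) can be illustrated with a short sketch. The code below is a minimal, hypothetical example assuming Python with OpenCV; the file names, threshold value, and function name are placeholders and do not reflect the thesis's actual implementation, which additionally combined shape detection with the correlation step.

    # Minimal sketch of sprocket detection by template matching, assuming OpenCV (cv2).
    # File names and the threshold are illustrative placeholders, not the thesis's data.
    import cv2

    def find_sprocket(scene_path, template_path, threshold=0.7):
        """Locate a sprocket-like region in a monochrome graincar image.

        Pre-processes both images (grayscale read + histogram equalization) and then
        applies normalized cross-correlation template matching, broadly following the
        correlation-based approach described in the abstract.
        """
        scene = cv2.imread(scene_path, cv2.IMREAD_GRAYSCALE)
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        if scene is None or template is None:
            raise FileNotFoundError("scene or template image could not be read")

        # Pre-processing: equalize histograms to reduce sensitivity to lighting changes.
        scene_eq = cv2.equalizeHist(scene)
        template_eq = cv2.equalizeHist(template)

        # Normalized cross-correlation: values near 1.0 indicate a strong match.
        result = cv2.matchTemplate(scene_eq, template_eq, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)

        if max_val < threshold:
            return None  # no confident sprocket match under this lighting
        h, w = template_eq.shape
        x, y = max_loc
        return (x, y, w, h), max_val  # bounding box and correlation score

Consistent with the abstract's finding, such a pipeline would normally use a template acquired under lighting similar to that of the unloading facility, since histogram equalization only partially compensates for differences between light sources.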
dc.description.noteFebruary 2012en_US
dc.identifier.urihttp://hdl.handle.net/1993/5097
dc.language.isoengen_US
dc.rightsopen accessen_US
dc.subjectRoboticsen_US
dc.subjectAutomationen_US
dc.subjectImage processingen_US
dc.subjectCorrelationen_US
dc.subjectTemplate matchingen_US
dc.titleAutomation of Unloading Graincars using “Grain-o-bot”en_US
dc.typedoctoral thesisen_US
local.subject.manitobayesen_US
Files
Original bundle (1 of 1)
  Name: Lokhamoorthi_ Aravind Mohan.pdf
  Size: 2.43 MB
  Format: Adobe Portable Document Format
License bundle (1 of 1)
  Name: license.txt
  Size: 2.25 KB
  Format: Item-specific license agreed to upon submission