Automation of Unloading Graincars using “Grain-o-bot”
Lokhamoorthi, Aravind Mohan
Large quantities of bulk grain are moved in graincars in Canada and elsewhere in the world. Automation has progressed little in the grain industry, probably because the market for automated systems is limited. A prototype robot ("Grain-o-bot") that uses machine vision to automatically open and close graincar hopper gates and to identify the contents of the graincar was built and studied. The Grain-o-bot was a Cartesian robot equipped with two cameras and an opening tool as the end-effector. One camera acted as the eye that determined the sprocket location and guided the end-effector to the sprocket opening. In most prior applications, machine vision solutions based on pattern recognition were developed using images acquired in a laboratory setting, and major constraints arose when these solutions were implemented in real-world applications. The first step for this automation was therefore to correctly identify the hopper gate sprocket on the graincar. Algorithms were developed that detected and identified the sprocket with 100% accuracy under proper lighting conditions. The performance of the algorithms was also evaluated for identifying the sprocket on a graincar exposed to the different lighting conditions expected in typical grain unloading facilities. Monochrome images of the sprocket on a model system were acquired under different light sources. Correlation and pattern recognition techniques using a template image, combined with shape detection, were used for sprocket identification. The images were pre-processed using image processing techniques prior to template matching. A template image developed under a light source similar to the one used to acquire the images identified the sprocket more reliably than a template image developed under a different light source. A sample of the graincar contents was taken by slightly opening and immediately closing the hopper gates.
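As an illustration only (not taken from the thesis), the core of correlation-based template matching can be sketched in pure Python: a template is slid over the image and the position with the highest normalized cross-correlation score is reported. The toy image and template below are assumptions for demonstration; a real implementation would operate on camera frames.

```python
import math


def ncc(patch, template):
    """Normalized cross-correlation between two equal-length flat patches."""
    n = len(patch)
    mp = sum(patch) / n
    mt = sum(template) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    den = math.sqrt(sum((p - mp) ** 2 for p in patch) *
                    sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0


def match_template(image, template):
    """Slide template over image; return ((row, col), score) of best match."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    best, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = [image[r + dr][c + dc]
                     for dr in range(th) for dc in range(tw)]
            score = ncc(patch, flat_t)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best


# Toy 5x5 image with the template pattern embedded at row 1, column 2.
image = [[0] * 5 for _ in range(5)]
image[1][2], image[1][3] = 9, 1
image[2][2], image[2][3] = 1, 9
template = [[9, 1], [1, 9]]

pos, score = match_template(image, template)  # pos == (1, 2), score == 1.0
```

In practice the thesis combined such correlation scoring with shape detection and image pre-processing; the sketch shows only the correlation step.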
The sample was identified by taking an image with the second camera and performing feature matching. An accuracy of 99% was achieved in identifying Canada Western Red Spring (CWRS) wheat, and 100% in identifying barley and canola.
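The thesis does not specify the feature-matching method beyond its name, so the following is only a toy illustration of the general idea: extract a simple feature vector from the sample image and assign the label of the nearest labelled reference. The feature choice (mean and standard deviation of intensity) and the reference values are assumptions for demonstration.

```python
import math


def features(image):
    """Toy feature vector: mean and standard deviation of pixel intensities."""
    vals = [v for row in image for v in row]
    n = len(vals)
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / n
    return (mean, math.sqrt(var))


def classify(sample, references):
    """Nearest-neighbour match of the sample's features against labelled
    reference feature vectors; returns the closest label."""
    fs = features(sample)
    return min(references, key=lambda label: math.dist(fs, references[label]))


# Hypothetical reference feature vectors for three grain types.
references = {
    "wheat": (120.0, 30.0),
    "barley": (80.0, 20.0),
    "canola": (40.0, 10.0),
}

sample = [[100, 140], [90, 145]]  # toy 2x2 intensity image
label = classify(sample, references)  # -> "wheat"
```

A real system would use richer descriptors (texture, colour, or keypoint features) rather than two summary statistics, but the nearest-neighbour structure is the same.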
Keywords: Robotics, Automation, Image processing, Correlation, Template matching