A neural network to classify auditory signals for use in autonomous harvester control systems

dc.contributor.author: Simundsson, Avery
dc.contributor.author: Thomas, Gabriel
dc.contributor.author: Mann, Danny
dc.contributor.examiningcommittee: Petkau, Donald (Biosystems Engineering); Thomas, Gabriel (Electrical and Computer Engineering)
dc.contributor.supervisor: Mann, Danny (Biosystems Engineering)
dc.date.accessioned: 2019-08-29T17:11:36Z
dc.date.available: 2019-08-29T17:11:36Z
dc.date.issued: 2019-07-22
dc.date.submitted: 2019-08-17T14:51:01Z
dc.degree.discipline: Biosystems Engineering
dc.degree.level: Master of Science (M.Sc.)
dc.description.abstract: As agricultural machinery moves into the digital era, significant developments in the available technology make autonomous farm vehicles more feasible, affordable, and desirable. One of the challenges of effective autonomous vehicle control specific to agriculture is the ability of the vehicle to interpret and adapt to constantly changing conditions. There are many types of sensors able to identify specific changes in conditions (elevation, temperature, image, etc.), but a single indicator able to signal a variety of changes in operating conditions would be beneficial to a remote human operator, particularly for triggering an automatic shutdown to prevent machinery damage. Auditory information is a primary indicator of changing conditions for an in-cab operator, particularly in detecting mechanical overload in a combine. This paper explores the potential for auditory information, which has proven valuable to an in-cab operator, to be used in autonomous vehicle control. Sound was recorded at a sampling rate of 48 kHz near the combine chopper for three different operating modes during the same canola harvest day. Samples from each clip were segmented and analyzed using the Fast Fourier Transform (FFT) in MATLAB. The FFT generated a power spectral density (PSD) function for each segment, from which eight features were extracted and labelled to create feature vectors. These vectors were used to train a classifier built as a feedforward pattern recognition neural network. The network was trained with scaled conjugate gradient backpropagation and achieved a classification accuracy of 100%. The speed of sampling and analysis is sufficient for real-time machinery analysis and control. (A minimal code sketch of this pipeline appears after the record metadata below.)
dc.description.note: October 2019
dc.identifier.uri: http://hdl.handle.net/1993/34105
dc.language.iso: eng
dc.rights: open access
dc.subject: Autonomous, FFT, Machine Learning, Control system, Auditory, Sonifications
dc.title: A neural network to classify auditory signals for use in autonomous harvester control systems
dc.type: master thesis
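
The abstract above outlines a processing pipeline: segment the 48 kHz recordings, estimate a power spectral density per segment with the FFT, extract spectral features, and train a feedforward pattern recognition network. The Python sketch below illustrates that sequence under stated assumptions: the file names, operating-mode labels, segment length, the particular spectral features, and the network configuration are illustrative placeholders rather than values from the thesis, and scikit-learn's MLPClassifier stands in for the MATLAB scaled conjugate gradient training described in the abstract.

import numpy as np
from scipy.io import wavfile
from scipy.signal import welch
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

FS = 48_000          # sampling rate reported in the abstract (48 kHz)
SEG_LEN = FS // 2    # 0.5 s segments (assumed segment length, not from the thesis)

def psd_features(segment, fs=FS):
    # Welch PSD estimate (FFT-based), then a handful of spectral features.
    # These specific features are illustrative; the thesis extracts eight
    # features that are not enumerated in the abstract.
    f, pxx = welch(segment, fs=fs, nperseg=2048)
    total = pxx.sum() + 1e-12
    centroid = (f * pxx).sum() / total          # spectral centroid
    peak_freq = f[np.argmax(pxx)]               # dominant frequency
    bands = [(0, 1_000), (1_000, 4_000), (4_000, 12_000), (12_000, 24_000)]
    band_power = []
    for lo, hi in bands:                        # fraction of power per coarse band
        mask = (f >= lo) & (f < hi)
        band_power.append(pxx[mask].sum() / total)
    return np.array([centroid, peak_freq, *band_power])

def segments_from_wav(path):
    # Read a recording, mix to mono if needed, split into fixed-length segments.
    rate, data = wavfile.read(path)
    data = data.astype(np.float64)
    if data.ndim > 1:
        data = data.mean(axis=1)
    n_seg = len(data) // SEG_LEN
    return [data[i * SEG_LEN:(i + 1) * SEG_LEN] for i in range(n_seg)]

# Hypothetical file names, one recording per operating mode (labels are placeholders).
recordings = {"mode_a": "mode_a.wav",
              "mode_b": "mode_b.wav",
              "mode_c": "mode_c.wav"}

X, y = [], []
for label, path in recordings.items():
    for seg in segments_from_wav(path):
        X.append(psd_features(seg))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Small feedforward classifier; the solver and layer size are assumptions,
# not the thesis's scaled conjugate gradient configuration.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

In a real-time setting of the kind the abstract envisions, the same feature extraction could run on incoming audio buffers, with the classifier's output used as a trigger for an automatic shutdown.
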
Files
Original bundle (showing 1 of 1):
  Name: Simundsson_Avery.pdf
  Size: 2.7 MB
  Format: Adobe Portable Document Format
License bundle (showing 1 of 1):
  Name: license.txt
  Size: 2.2 KB
  Description: Item-specific license agreed to upon submission