MLP Lab: Wall Following Robot
Alright, are you ready for some fun? Today, we're going to explore the magical world of the Multi-Layer Perceptron (MLP) by helping our friend Robiee navigate along a wall!
To make this happen, we're going to use data from the SCITOS G5 robot. Don't worry if you're not familiar with the robot; we'll guide you through everything step by step.
So buckle up and get ready to witness the power of MLP as we help Robiee on his exciting journey!
Data Set Location:
https://www.kaggle.com/datasets/uciml/wall-following-robot
The data were collected as the SCITOS G5 navigated the room, following the wall in a clockwise direction for 4 rounds. To navigate, the robot uses 24 ultrasound sensors arranged in a circle around its "waist". The numbering of the ultrasound sensors starts at the front of the robot and increases in the clockwise direction.
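To make the layout concrete, here is a minimal sketch of loading the sensor readings into a DataFrame. The file name (`sensor_readings_24.data`), the column names, and the class labels below are assumptions based on the UCI distribution, not taken from this post; a small synthetic stand-in is generated so the sketch runs without the download.

```python
import numpy as np
import pandas as pd

# Assumed layout: 24 ultrasound distance columns followed by a movement class.
cols = [f"US{i}" for i in range(1, 25)] + ["Class"]

# With the real file downloaded, loading would look like:
# df = pd.read_csv("sensor_readings_24.data", header=None, names=cols)

# Synthetic stand-in with the same shape (values and labels are made up):
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.uniform(0.3, 5.0, size=(100, 24)), columns=cols[:-1])
df["Class"] = rng.choice(
    ["Move-Forward", "Sharp-Right-Turn", "Slight-Right-Turn", "Slight-Left-Turn"],
    size=100,
)

X = df[cols[:-1]].to_numpy()  # 24 sensor readings per sample
y = df["Class"].to_numpy()    # movement label to predict
print(X.shape, df["Class"].nunique())
```

Each row is one snapshot of all 24 sensors, and the target is the movement command the robot executed at that moment.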
Acknowledgements
These datasets were downloaded from the UCI Machine Learning Repository.
Lichman, M. (2013). UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.
As observed, the MLP's accuracy is quite impressive at 98.6 percent. Thanks to our assistance, Robiee can now navigate without colliding with the walls. For an in-depth look at the theory behind how it works, see the previous blog post, Exploring the World of Multi-Layer Perceptrons.
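The training-and-evaluation loop behind a result like this can be sketched as follows. This is not the post's actual pipeline: the data here are synthetic (so the accuracy printed will not match the 98.6 percent reported above), and the network architecture is illustrative only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the 24-sensor readings (assumption: features are
# ultrasound distances; the label here is a made-up left/right decision).
rng = np.random.default_rng(42)
X = rng.uniform(0.3, 5.0, size=(400, 24))
y = (X[:, :12].mean(axis=1) > X[:, 12:].mean(axis=1)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Scaling the inputs helps MLP training converge.
scaler = StandardScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)

acc = accuracy_score(y_te, clf.predict(scaler.transform(X_te)))
print(f"test accuracy: {acc:.3f}")
```

Swapping the synthetic arrays for the real sensor readings and class labels, and tuning the hidden-layer sizes, is all that separates this sketch from a full wall-following classifier.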