Development of a real-time framework for farm monitoring using drone technology

Received Apr 26, 2020 · Revised May 31, 2020 · Accepted Jun 9, 2020

This work developed a cost-effective framework that allows agriculturists to regularly monitor their crops against intruding rodents and other security concerns using modern drone technology, through the configuration and deployment of an autonomous UAV that also functions as a remotely piloted vehicle. A quadcopter was configured to cause a disturbance when a rodent is observed, using an inbuilt alarm system whose sound is amplified to be loud enough to drive the animals out of the farm area. A framework for real-time image and live-video transmission from the farm to a designated remote base station was also developed. This was achieved by programming the drone to operate intelligent alarm and object-tracking systems that provide a live feed from the UAV, using the Arduino IDE and Mission Planner for autonomous flight control. The requisite algorithms were developed within the tracking, learning, and detection framework of the OpenCV library. The drone can equally be controlled remotely over a Wi-Fi network, using an ESP8266 Wi-Fi module, to redirect it to monitor specific locations.


INTRODUCTION
Investment in agriculture forms the basis for well-being and poverty reduction in Africa, particularly amongst the poorest people [1]. Optimizing agricultural profit through increased productivity and improved yield has benefited from several innovative developments over the years, including the use of drone technology [2]. Major applications of drones related to the eradication of rodents include animal detection through the use of infra-red cameras, the delivery of baits, securing timely high-resolution imagery of the area of interest, and many more [3].
Unmanned aerial vehicles (UAVs), which are also referred to as drones [4], are flying robots that can fly thousands of kilometers or in confined spaces [5]. These vehicles do not carry a human operator, can fly remotely or autonomously, and can carry lethal or nonlethal payloads [5]. These UAVs can be equipped with different sensors and equipment to perform different tasks ranging from aerial photography [6] to disaster search and rescue [7].
UAVs are semi-autonomous or fully autonomous aircraft that can carry cameras, sensors, communication equipment or other payloads [8]. Developments in the field of unmanned aerial systems (UAS) began several decades ago, before the first manned airplane flight occurred in 1903 [9]. UAVs are classified based on different parameters. In considering civil and military applications, UAVs are classified as MAVs (micro or miniature air vehicles), NAVs (nano air vehicles), VTOL (vertical take-off and landing), LASE (low altitude, short endurance), LASE close, LALE (low altitude, long endurance), MALE (medium altitude, long endurance), and HALE (high altitude, long endurance) [5]. UAVs can also be classified based on their performance characteristics, which include weight, wing span, wing loading, range, maximum altitude, speed, endurance, and production costs [5, 10]. S. Ward et al. [11] developed a UAV that moves ahead of a user, equipped with a low-cost thermal camera and a small onboard computer that identifies heat signatures of a target animal from a predetermined altitude and transmits that target's GPS coordinates. The system, consisting of a quadrotor UAV (3DR IRIS), an autopilot (Pixhawk), a thermal camera (FLIR Lepton), a microcomputer (Raspberry Pi 2) and a GPS (3DR brand) module, was capable of autonomously locating animals from a predetermined height and generating a map showing the location of the animals ahead of the user [11].
The work of [12] focused on the development and implementation of an on-board computing module in a UAV based on a low-cost single-board computer. This module transmits the data gathered by its attached sensors (GPS, image, and other UAV sensors) to a base station, where the GPS coordinates are used to track the trajectory in a user interface based on Google Maps. The mobile station was developed in the C programming language on a Raspberry Pi single-board computer running a Linux distribution. It captures images using a webcam and uses a conventional GPS module to acquire the latitude/longitude. The acquired data is then sent to the base station through a Wi-Fi communication link. The base station comprises a web application, developed in Node.js, with a graphical interface to show the trajectory of the UAV in real time.
A. Mazur [13] developed an autonomous multirotor UAV (hexacopter) operated and controlled through 4G LTE using onboard GPS and image processing which was capable of intelligent remote waypoint navigation and image processing by utilizing modern communication networks. The hexacopter was employed to carry out object tracking and surveillance by coordinating its flight patterns based on image processing algorithms being implemented [13].
A quadcopter was designed to obtain stable flight, gather and store CO2 data using a KK 2.1.5 flight controller board by [14]. In this study, the quadcopter was capable of flying and landing in a stable manner, determining its exact location from GSM data, and also storing and logging CO2 data obtained. The system was implemented using motors, electronic speed controllers, Arduino development board, sensor boards, batteries, a transmitter, a receiver, a GPS module, and SIM card which were all interfaced accordingly while the PID controller was tuned for stability.
An autonomous quadcopter drone, fitted with a GPS tracking system and programmed to autonomously fly from one location to another using GPS coordinates, was developed by [15]. Execution of the drone's software begins with the user being prompted to enter a set of destination latitude and longitude values in decimal form; negative numbers are used for southern and western coordinates [15]. The inputted values are converted into feet and compared with the GPS receiver's values, which are also converted to feet. The difference in these values determines the destination's direction, depending on whether the latitude and longitude differences are positive or negative. The drone would then lift off and maneuver to its destination, maintaining stability along the way using the PID controller. When it arrives at its destination, with a margin of error of two feet in either direction, it slowly spins down its motors until it returns to its preset ground level [15].
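The navigation logic described above can be sketched as follows. This is a minimal illustration, not the code of [15]: the helper names and the feet-per-degree constant are assumptions, and the longitude conversion is scaled by the cosine of the latitude as is standard for small distances.

```python
import math

FEET_PER_DEGREE_LAT = 364000  # approximate feet per degree of latitude (assumed constant)

def heading_from_difference(dest_lat, dest_lon, cur_lat, cur_lon):
    """Convert coordinate differences to feet and derive the travel direction."""
    dlat_ft = (dest_lat - cur_lat) * FEET_PER_DEGREE_LAT
    # Longitude spacing shrinks with the cosine of the latitude
    dlon_ft = (dest_lon - cur_lon) * FEET_PER_DEGREE_LAT * math.cos(math.radians(cur_lat))
    north_south = "north" if dlat_ft > 0 else "south"
    east_west = "east" if dlon_ft > 0 else "west"
    return dlat_ft, dlon_ft, north_south, east_west

def arrived(dlat_ft, dlon_ft, margin_ft=2.0):
    """True once the drone is within the two-foot margin in both axes."""
    return abs(dlat_ft) <= margin_ft and abs(dlon_ft) <= margin_ft
```

In flight, the controller would repeat this comparison against fresh GPS readings until `arrived` becomes true, then spin down the motors.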
This work focuses on developing local capacity in the deployment and re-configuration of an autonomous UAV for general surveillance, with specific application to farm monitoring. Apart from the general framework for real-time image and live-video transmission from the farm to a designated remote base station, the work equally developed a platform through which the drone's movement can be controlled and redirected remotely over a Wi-Fi network. This makes it possible for the drone to change its pre-programmed flight pattern when intruders are sensed in other locations, thereby making the surveillance system more intelligent.

RESEARCH METHOD
2.1. Unmanned aerial vehicle
UAVs usually have three, four, six or even eight rotors for a stable flight [16]. The UAV requires both hardware and software components for its configuration, and it can be deployed either as an autonomous UAV or as a remotely piloted vehicle (RPV). The UAV schematic is shown in Figure 1, while the basic constituent hardware and software components are listed below.
a. Mission Planner
Mission Planner is a ground control station for ArduPilot. It provides setup and flying support, review of recorded flights, point-and-click waypoint entry using Google Maps, configuration of airframe settings, and selection of mission commands. A sample flight pattern obtained from the Mission Planner software is shown in Figure 2.
b. Image processing
During autonomous flight, the system is capable of tracking intruders/rodents detected within the perimeter of the surveillance area. The UAV can also alert specified (security) personnel on detecting an intruder, using a Raspberry Pi 3 B+ board and a Raspberry Pi camera with a night-vision lens. The system was programmed to act on information obtained from the camera feed. The object-tracking algorithm was developed using a convolutional neural network (CNN), which was trained and deployed onto the Raspberry Pi.
Sample images of different objects were obtained from an online source, and the model was trained from a template of a Google pre-trained model. The model was trained to identify objects in a video frame with a confidence of above 60%. A bounding box is created around the objects captured by the camera and compared with the trained model; once the confidence, which is the similarity level, exceeds 70%, the system activates the buzzer to raise an alarm and alert the necessary personnel. The Raspberry Pi was loaded with the Raspbian OS, which enables the tracking algorithms to function well and be implemented easily. A Linux operating system was used as the communication link between the Raspberry Pi and the personal computer. Screenshots of loading the Raspberry Pi with Raspbian OS are shown in Figure 3.
The system is also capable of remote navigation control, whereby it can be controlled and redirected from a remote location. To implement this, an ESP8266 Wi-Fi module and an SFM-27 DC 3-24 V buzzer were used. The Wi-Fi module was connected to the telemetry port of the flight controller via a telemetry cable. On power-up, the Wi-Fi module hosted a network that was joined from a laptop. The Mission Planner software was then connected to the flight controller via the UDP protocol, giving a two-way data connection to and from the UAV.
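The two-threshold alarm logic above can be sketched as a small decision routine. This is an illustrative sketch, not the deployed code: the function names and the target-class list are assumptions, and detections are taken to arrive as (label, confidence) pairs from the CNN.

```python
DETECT_THRESHOLD = 0.60  # detections below this confidence are discarded
ALARM_THRESHOLD = 0.70   # detections above this confidence trigger the buzzer

def filter_detections(detections):
    """Keep only detections the model is more than 60% confident about."""
    return [(label, conf) for label, conf in detections if conf > DETECT_THRESHOLD]

def should_sound_alarm(detections, targets=("person", "rodent")):
    """Activate the buzzer when a target class exceeds the 70% alarm threshold."""
    return any(label in targets and conf > ALARM_THRESHOLD
               for label, conf in filter_detections(detections))
```

On the actual drone, a true result from `should_sound_alarm` would drive the GPIO pin connected to the SFM-27 buzzer.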

RESULTS AND ANALYSIS
The quadcopter was powered by a 5000 mAh LiPo battery. During continuous remote-controlled flight over a portion of the farmland, the flight time was approximately 15 minutes at full throttle and 25 minutes at half throttle. Waypoints were uploaded to the quadcopter, and the drone (Figure 4) was observed to follow the waypoints with some discrepancies due to flight instability. After passing through each waypoint, it returned to the launch position and landed.
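The reported endurance figures imply average current draws that can be checked with a back-of-envelope calculation. The currents below are inferred, not measured in this work, and the 80% usable-capacity factor is a common LiPo rule of thumb assumed here.

```python
def average_current_amps(capacity_mah, flight_minutes, usable_fraction=0.8):
    """Average current (A) implied by a given flight time on a LiPo pack."""
    usable_ah = capacity_mah / 1000 * usable_fraction  # convert mAh to usable Ah
    return usable_ah / (flight_minutes / 60)           # Ah divided by hours
```

For the 5000 mAh pack, 25 minutes at half throttle implies roughly 9.6 A average draw, and 15 minutes at full throttle roughly 16 A, which is plausible for a quadcopter of this class.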
The landing gears of the UAV were designed and customized using the PTC Creo Parametric design software. The designed CAD model was converted to .stl (stereolithography) format, representing a three-dimensional surface as an assembly of planar triangles. The build orientation was chosen so that the model is conveniently positioned in the x-y plane, with its shortest dimension in the z direction, thereby reducing the number of layers and shortening the build time. CURA was the slicing software used to convert the .stl model to G-codes for the eventual 3D printing. Figure 5 shows the customized landing gear designed and 3D-printed for the drone.
The Raspberry Pi, loaded with the Raspbian operating system via the NOOBS installer, was powered on and all components were in good condition. It successfully connected to a hosted Wi-Fi network and could send and receive data. The camera was connected to the camera port and its interface was enabled on the Raspberry Pi. Sample images were taken and videos recorded to test the operation of the camera, as shown in Figure 6.
Real-time imagery and video feed of the drone's movements were transmitted over Wi-Fi to the computer system on which they were viewed. Due to fluctuations in the network, there was some latency in the transmitted video stream, so it can be considered near real-time. The video stream was opened in the VLC media player on a laptop and on several mobile phones to verify its mobility and ease of connection. The system was configured to follow the flight pattern currently active in its memory, and each flight plan has a home location. Different navigation points were developed and uploaded to the flight controller, which was connected via the Wi-Fi network from the remote device. The system was observed to transit between different waypoints, with a small delay caused by the time taken to upload the new waypoints.
Sensors were placed at strategic locations in the area being monitored. These sensors, which are ultrasonic, passive infrared (PIR) and motion sensors, detect unexpected and unwanted motion in the area and raise an alarm by activating a lamp at the base station. Different flight patterns were constructed for each sensor location; the administrator uploads the flight pattern corresponding to the location of interest, and the UAV automatically reprograms its flight route and moves to the new location for surveillance while transmitting imagery to the base station. After satisfactory surveillance, the administrator uploads the default surveillance flight pattern and the UAV returns to its default route.
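The sensor-driven rerouting described above amounts to a mapping from a triggered sensor location to a pre-built flight pattern. The sketch below is illustrative only: the pattern names and waypoint labels are invented for the example and are not the patterns used in this work.

```python
# Each sensor location maps to a pre-built flight pattern; an unknown or
# absent trigger falls back to the default surveillance route.
FLIGHT_PATTERNS = {
    "default": ["WP_HOME", "WP_A", "WP_B", "WP_HOME"],
    "north_fence": ["WP_HOME", "WP_N1", "WP_N2", "WP_HOME"],
    "storage_shed": ["WP_HOME", "WP_S1", "WP_HOME"],
}

def select_pattern(triggered_sensor):
    """Return the flight pattern for a triggered sensor, else the default route."""
    return FLIGHT_PATTERNS.get(triggered_sensor, FLIGHT_PATTERNS["default"])
```

In the deployed system, the selected pattern would be uploaded to the flight controller over the Wi-Fi/UDP link, replacing the route currently in memory.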

CONCLUSION
A surveillance drone has been successfully configured, deployed, and adequately equipped with remote navigation and flight-pattern control for large-scale farm monitoring. Using a Raspberry Pi board, live data was captured as a video stream and transmitted over a Wi-Fi network for display in a VLC player on any device. With the graphics-processing capability of the board, images are processed to detect the presence of animals or human intruders, and an alarm system on board the UAV is triggered.