Design and development of soft robotic hand for vertical farming in spacecraft

Received May 9, 2019; Revised Oct 6, 2019; Accepted Oct 31, 2019

For colonization of deep space, we need to explore the feasibility of a bioregenerative system in microgravity or artificial-gravity environments. The process has various complexities, ranging from biological obstacles to the engineering limitations of the spacecraft. Concentration of microbes in the confines of a spacecraft can be fatal for the crew. In this paper, a solution to elevated microbial levels through farming with robots is discussed. The soft robotic arm is built around an Asymmetric Flexible Pneumatic Actuator (AFPA). Under internal pressure, the AFPA curves toward the side with the greater thickness, since the thinner side (outside radius) expands more than the thicker side (inside radius) due to differential expansion and the moment induced by the eccentricity. Simulation results demonstrate that bending based on the AFPA can meet the design requirements of the application. The AFPA is used for the five fingers of the robotic hand. The safe, soft touch and gentle motion of the bellow (AFPA) give the feel of a real human hand. The internal pressure of the AFPA is controlled using a solenoid valve interfaced with an Arduino microcontroller for hand-like movements. The bending of the fingers and the degrees of freedom (DOF) of the joints of the hand are controlled using an IMU and flex sensors. The wireless connection between the hand and the control system is implemented using an XBee Pro 60 mW with a range of one mile. The pneumatic soft robotic hand is made up of solenoid valves, a mini compressor, AFPA bellows, and servos. This soft robotic hand has many advantages, such as good adaptability, simple structure, small size, high flexibility, and low energy loss. As an extension, manual control of the robot through a virtual reality environment, as well as aspects of an automated farming system, can be considered as future additions.

Keywords: AFPA, Flex Sensor, IMU, Solenoid Valve, Vertical Farming
This is an open access article under the CC BY-SA license.

INTRODUCTION
Millions of people across the world work in environments that contain hazardous materials and could prove deadly. The handling of hazardous materials by humans working in places filled with radiation, or holding considerable amounts of radioactive material, is life-threatening for the workers, and preventive measures are never 100% safe. In 2012, 1133 people died as a result of hazardous-waste accidents. In 2014, 4679 people died on the job in the United States of America; on average, 13 deaths per day were reported. These statistics show a considerable increase in annual deaths. Recently there have been many advancements in preventive measures to avoid these accidents, but they are not capable of eliminating the threat to the safety of the individual. Thus, there is a need for a system capable of handling hazardous materials remotely, instead of a human dealing with the radioactive materials on site. With the advent of new technologies (such as virtual reality), the ease of use of these robots has increased tremendously, which has boosted their use in a variety of domains. These advancements will eventually increase the quality and safety of work at such workplaces.
With the development of modern science and technology, robots have come to be used in a variety of applications. Robotic teleoperation technology can span space, placing people, machines, and task objects in a closed loop to achieve synchronous human-machine interaction with the objective world, and to greatly extend people's perception and reach [1]. For robots in complex operating environments (such as home services), robotic vision alone may not provide enough information to successfully perform a task. Additionally, occlusions often occur in real indoor environments, which make it difficult for the robot, relying solely on its own sensors, to correctly perceive the surrounding environment and make decisions. In this case, an effective way to solve these problems is to put the human, the robot, and the task object in a closed loop and introduce human experience to control the robot [2, 3].
In order to deal with complex operating tasks, the operator must continuously adapt to the actual situation, coordinate the control of the robot, and together with the robot constitute a human-robot working system. Since human beings are good at perceptual understanding, action planning, action resolution, and making decisions based on experience, people play a crucial role in this process. Here, the person completes the intelligent analysis, and the robot then completes the underlying work under body-language control. Through this process, human-machine interaction can achieve better results. The manipulator is the main operating mechanism of the robot; therefore, research on remote control systems based on gesture control is of great significance.
The Micro-Engineering Department (Institut de Systèmes Robotiques: ISR) of the Swiss Federal Institute of Technology (EPFL) is involved in robotic design and development, with a special focus on industrial applications. The classical methods for programming robotic systems (off-line as well as on-line) lack user friendliness and performance. This is why, since 1990, researchers have been developing virtual reality (VR) interfaces to simplify robot task planning, supervision, and control [4]. In addition, the Intelligent Mechanisms Group (IMG) of the NASA Ames Research Center (developers of the Virtual Environment Vehicle Interface [5, 6], a user interface to operate science exploration robots) has shown that a tool to rapidly generate VR interfaces for new robot arm manipulators would provide great benefits [7, 8].
This paper introduces a complete integrated system that combines every necessary aspect of a human hand to act as an intelligent multi-fingered dexterous hand capable of doing tasks that previously required humans. The system consists of two-way communication between two Arduinos through XBee Pro. The communication provides good network conditions and ensures an interruption-free link between the hand and the human with a maximum range of one mile. The robotic hand can be controlled from anywhere within this range, which allows the task to be performed without visiting the site.
Samsung Gear 360 technology is used to give virtual video feedback of the environment in which the robotic hand is working. This enables us to operate the hand in the same manner in which human beings use their own hands, making the task easy to perform. The most important aspect of the system is that it does not require any human presence at the place of work, thus preventing the direct exposure of human life to the hazardous environment. The master control of the robotic hand is a glove with an IMU and multiple flex sensors attached for gesture recognition. The accelerometer gives the acceleration of the hand along the three axes, and a gyroscope gives the angular rotation and angular velocity of the hand. Flex sensors are helpful in capturing finger movements and the pressure applied through the hand. The combination of the IMU and flex sensors helps to mimic our hand movements faithfully [9].
These kinds of systems are also very useful in day-to-day activities besides space applications. Consider a situation where time is short and you want to cook food without wasting time, or you are returning from the office and want to use the travel time to prepare food in your kitchen. Under these conditions, such an arm is beneficial. The system creates a virtual environment of the kitchen and helps the user to perform the cooking task while travelling or studying. The arm in the kitchen carries out the cooking without the human being needing to be physically present. The Gear 360 camera replicates the whole kitchen environment virtually in front of the user. Thus, this system is a salutary contribution to saving time and protecting lives.
If a robotic manipulator is to be controlled, it is usually done using an RF (radio frequency) remote control or by gesture recognition, with both robot and human in the same environment. The problem with conventional systems is that they have difficult and cumbersome user interfaces, which greatly inhibits their usability.
The interactive user interface (360-degree vision) and the easy, mimicking control of the robotic hand based on the human hand set our system apart from the prior art. In this way we could control our robotic arm in space, in households, as well as in places with hazardous materials. The soft feel of the nitrile rubber robotic hand gives the feel of a real human hand, and it is very safe at work. Apart from this, our system is made in such a manner that it also addresses the issue of cost effectiveness.

DESIGN AND ANALYSIS OF ROBOTIC HAND
The Asymmetric Flexible Pneumatic Actuator (AFPA) is designed in such a way that one half has the profile of a bellow while the other half is flat. It is a single-chambered tubular structure with one side thicker than the other. The eccentricity of the bellow generates a differential expansion between the top and bottom of the asymmetric bellow, along with a moment, which causes the actuator to bend toward the thicker side. The asymmetric bellow design gives maximum deflection up to a certain value of eccentricity and is capable of withstanding high pressure compared to normal symmetric designs.
The deflection of the actuator is also affected by the shape of the bellow profile. As a varying internal pressure was applied along the interior of the AFPA, the percentage of expansion increased gradually for the triangular, trapezoidal, U-shaped, and square bellow profiles respectively. It can be inferred that the triangular profile is the most suitable one, since it provides the best deflection. The actuator was made of a two-component nitrile rubber of the RTV (room-temperature vulcanizing) type. The material properties of nitrile rubber are listed in Table 1 [10, 11].

The CAD model of the AFPA is shown in Figure 1. Models were created with varying parameters A and B to analyze and find the optimum design responsible for eccentric actuation, where A and B represent the thickness of the plate at the top and the bellow side respectively. The length of the model is 27.5 mm and its radius is 2.5 mm. Table 2 shows the four models used to measure elastic deformation with respect to the applied pressure. While the thickness of the plate at the bellow side (B) remains constant, the thickness at the top (A) increases from 1.5 mm to 3 mm. Based on our analysis, model 4 gives the least deformation while model 2 gives the best deformation. Thus, the deformation of the asymmetric bellow actuator is influenced by the eccentricity only up to a certain extent. From this phenomenon, it can be inferred that as the eccentricity increases with the thickness of the plate at the top side, there is also a simultaneous increase in stiffness that reduces the deformation of the actuator considerably.
Figure 2 and Figure 3 show the deformation and stress analyses of model 2 of the AFPA using PTC Creo Simulate, where the AFPA is subjected to five internal pressures: 300 kPa, 350 kPa, 400 kPa, 450 kPa, and 500 kPa. The variation between the analysis results and the experimental results is due to error introduced in modeling and analysis by the software. In addition, the AFPA is made of nitrile rubber, a highly elastic material with nonlinear properties, which makes it difficult to analyze large deformations in the software. It is also difficult to theoretically predict the rate at which the AFPA bends at high pressure. It is noticed that the maximum and minimum deformation, as well as the Von Mises stress in the AFPA, increase with the internal pressure. The AFPA finger structure prototype is shown in Figure 4.

The mechanical design of the palm of the robotic hand was realized using 3D printing, and its properties are shown in Table 3. The design was revised three times in PTC Creo to add flexibility and make it look more like a human hand. Once the CAD files were created for every part, as shown in Figure 5, they were converted to STL files and imported into the Mojo 3D printer's file-processing software, Print Wizard, which scaled the parts as necessary in the proper orientations. For printing, all the parts were given an infill of 30%. The five fingers were printed with 2 shells and without any support or raft. The portion of the hand connected to the wrist and the five fingers were printed with 3 shells without any support or raft. Once the parts were printed in ABS using the Fused Deposition Modeling (FDM) process, the parts with support were put in a WaveWash 55, which is an automatic support-removal system. Once it was plugged in, an EcoWorks tablet was dropped in along with the supported parts.
After that, it was filled with tap water and the machine was turned on. It took a considerable amount of time to remove the supports; once the supports were gone, the solution was poured down the drain and the parts were rinsed off. Once all the parts were printed, sandpaper was used to file the edges of the parts to make them smoother. Then the parts were assembled together using various joints and fasteners, including nuts, bolts, washers, and glue. The 3D-printed palm and forearm are shown in Figure 6.

Figure 5. CAD model. Figure 6. 3D-printed palm and forearm.

WORKING PRINCIPLE OF THE SYSTEM
Figure 7 shows the prototype of the system and Figure 8 shows its working principle. The IMU MPU-6050 is used to control two degrees of freedom (DOF) of the robotic arm: the rotational base and the pitch. Further, the 10 DOF of the soft robotic fingers are controlled via the bending of the flex sensors, giving a 12 DOF soft robotic hand. The data from the MPU-6050 are read by the Arduino Uno using the I2C protocol, and the analog data from the flex sensors are read using the analog ports. Both data streams are processed and transmitted by encapsulating them in a packet. The data from the transmitter (glove) are sent to the receiver using an XBee Pro 60 mW, which has a range of one mile [12]. The receiver interprets the received sensor data and controls the actuators of the robotic arm by mapping the glove movement. The 10 DOF bellow fingers are made of nitrile rubber and are extremely soft and flexible, so that the hand feels like a real human hand. All the bellows are connected to ten one-channel solenoid valves through which the air pressure is regulated. A mini compressor supplies air at up to 5 bar to the solenoid valves. The solenoid valves, rated at 12 V, are controlled through a relay system and supply the air pressure required for the desired bending of each finger. Two heavy-duty servos are attached to the soft robotic arm for the extra 2 DOF of movement: one for the 360-degree base rotation and the other for the pitch (up and down) movement. A Gear 360 can also be used to obtain 360-degree video feedback from the arm within range. This allows the hand to be controlled very precisely, as if it were our real hand, and thus solves the issue of high concentrations of microbes while harvesting plants grown inside a space vertical farm [13].
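The paper does not specify the packet format used over the XBee link. The following Python sketch illustrates one plausible encapsulation of the two IMU angles and five flex-sensor readings: a start byte, a fixed-size binary payload, and an XOR checksum. The layout, angle scaling, and checksum scheme are assumptions for illustration, not the authors' actual protocol.

```python
import struct

# Hypothetical packet layout: 1 start byte, 2 signed IMU angles
# (base and pitch, in tenths of a degree), 5 flex-sensor ADC
# readings (0-1023), and 1 XOR checksum byte.
START = 0x7E

def encode_packet(base_deg, pitch_deg, flex):
    """Frame glove data into a 16-byte packet."""
    assert len(flex) == 5
    payload = struct.pack("<hh5H", int(base_deg * 10),
                          int(pitch_deg * 10), *flex)
    checksum = 0
    for b in payload:
        checksum ^= b  # simple XOR over the payload
    return bytes([START]) + payload + bytes([checksum])

def decode_packet(pkt):
    """Validate and unpack a packet back into (base, pitch, flex)."""
    if pkt[0] != START:
        raise ValueError("bad start byte")
    payload, checksum = pkt[1:-1], pkt[-1]
    c = 0
    for b in payload:
        c ^= b
    if c != checksum:
        raise ValueError("checksum mismatch")
    base, pitch, *flex = struct.unpack("<hh5H", payload)
    return base / 10.0, pitch / 10.0, flex
```

On the real hardware this framing would run on the transmitter Arduino, with the receiver performing the matching decode before mapping the values onto the actuators.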

ELECTRICAL SYSTEM AND CONTROL OF THE ROBOTIC ARM
The electrical system mainly consists of two control units:
− Master Control Unit
− Slave Control Unit

Master control unit:
The master control unit is the transmitter part of the system, as shown in Figure 9, and is responsible for controlling the slave control unit. It consists of the following components:
− Microcontroller (Arduino): Arduino is used as the microcontroller in the master control unit because of its high processing speed. Two Arduinos are connected together to establish communication between the master and slave control units. The Arduino is responsible for every processing operation in the system and is connected to an XBee Pro to establish a network between the user and the robotic hand.
− XBee Pro: The XBee Pro is used to establish a fast and reliable network between the master and slave control units. The XBee works on the ZigBee protocol, which enables it to establish an uninterrupted network over shorter distances. The combination of XBee communication and WiFi communication protects the system from network failure. Baud rate of the XBee: 115200. Frequency: 2.4 GHz.
− Sensors: Five flex sensors are used in the glove for replicating the gesture and motion of the fingers; every finger has one flex sensor connected to the microcontroller, whose data the microcontroller analyses. The IMU 6050 defines the acceleration and angular motion of the hand; it consists of an accelerometer, which provides the acceleration of the hand along the three axes, a gyroscope, which provides the angular velocity of the hand, and a magnetometer. The flowchart of the gesture control is shown in Figure 10.
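As a concrete illustration of how a flex-sensor reading might be converted to a finger bend angle on the master Arduino, here is a minimal Python sketch of the usual integer mapping (equivalent to Arduino's map() helper). The calibration constants FLAT_ADC and BENT_ADC are hypothetical; real values depend on the voltage-divider resistor and the individual sensor.

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Integer linear mapping, matching Arduino's map() behaviour.
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# Hypothetical calibration: a flat finger reads ~300 counts on the
# 10-bit ADC, a fully bent finger ~700.
FLAT_ADC, BENT_ADC = 300, 700

def flex_to_angle(adc):
    """Convert a raw ADC reading to a bend angle in 0-90 degrees."""
    adc = max(FLAT_ADC, min(BENT_ADC, adc))  # clamp to the calibrated span
    return arduino_map(adc, FLAT_ADC, BENT_ADC, 0, 90)
```

For example, a mid-span reading of 500 counts maps to a 45-degree bend; readings outside the calibrated span are clamped rather than extrapolated.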

Slave control unit:
The slave control unit is the most important part of the whole system, as shown in Figure 12; it is responsible for conducting the job on behalf of the human. The slave control unit consists of a robotic arm capable of doing every task that a human hand can perform. The robotic hand consists of the following components:
− Servo mechanism: The mechanism shown in Figure 13 is implemented at the base of the system to add 2 DOF. It enables the rotation of the base and the up/down motion, which makes tasks like pick-and-place much easier.
− Microcontroller (Arduino): In the slave control unit, the Arduino is used as the controlling unit of the prosthetic arm. With the help of the XBee Pro, wireless communication is established between the slave and master control units to control the robotic hand over longer distances.
− XBee Pro: It is used to establish a fast and reliable network between the master and slave control units.
In the slave control unit, the coordinator mode of the XBee is used as the receiver. The XBee works on the ZigBee protocol, which enables it to establish an uninterrupted network over shorter distances. The combination of XBee communication and WiFi communication protects the system from network failure.
− 360 video camera: The Gear 360, shown in Figure 14, is used as a medium to see the real environment of the workplace virtually. This technology helps in replicating the working environment in which the robotic hand operates. It also improves the user experience and makes the control of the robotic hand much more comfortable for the operator. It is capable of recording 360-degree video and hence gives real-time streaming of the working environment. This helps the operator to understand the real conditions of the job and enables them to carry out the task without being physically present [14, 15].
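On the receiver side, the decoded glove data must be mapped onto the actuators: finger bend angles drive the relay-switched solenoid valves of the bellow fingers, and the IMU angles drive the two servos. The sketch below illustrates one such mapping in Python; the 30-degree valve threshold and the 0-180 degree servo range are assumptions for illustration, as the paper does not give exact values.

```python
# Hypothetical threshold: a finger bent past this angle opens its valve.
VALVE_OPEN_ANGLE = 30

def valve_states(finger_angles):
    """Return True/False per finger: True opens that finger's
    relay-driven 12 V solenoid valve, pressurizing the bellow."""
    return [angle >= VALVE_OPEN_ANGLE for angle in finger_angles]

def servo_command(imu_angle, lo=0, hi=180):
    """Clamp an IMU-derived angle to the servo's assumed travel range."""
    return max(lo, min(hi, int(imu_angle)))
```

For instance, a glove posture of [0, 45, 30, 10, 90] degrees would open the valves of the second, third, and fifth fingers only, while an out-of-range IMU pitch is simply clamped at the servo limit.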

RESULT AND DISCUSSION
The process was implemented using a Samsung Gear 360, as shown in Figure 15. The robotic arm was tested with hand-gesture control for the pick-and-place of a glass, and the results were satisfactory. (Figure 15. Prototype test setup.)

XBee connectivity test: ZigBee communication was implemented using two XBee Pros, and appropriate results were obtained in controlling the robotic hand with the transmitter glove, as shown before. The three cases indicating different actions of the robot arm are shown in Figures 15-18:
− Case 1: Start from the initial location and grab the object, i.e., the glass. The arm is moved toward the target red glass from its initial location, as shown in Figure 15, to the target location, as shown in Figure 16. This trajectory is performed by mapping the IMU to the real hand movement of the user wearing the transmitter glove. The red glass is then grabbed through the flex-sensor mapping of the fingers of the user's hand.
− Case 2: Moving to the target with the glass. The arm is lifted from the position shown in Figure 16 and moves toward the goal position, as shown in Figure 17. This trajectory is maintained through the IMU mapping of the user's hand with the transmitter glove.
− Case 3: Reaching the target with the glass. The arm is lowered from the position shown in Figure 17 and reaches its final goal position, as shown in Figure 18. This trajectory is maintained through the IMU mapping of the user's hand with the transmitter glove.

CONCLUSION AND APPLICATION
The hand system developed solves the issue of high concentrations of microbes while harvesting plants grown inside a space vertical farm. The robotic system also finds application in various domains of our lives. First, it could be used as a cooking robot: the user could sit in his or her room and control robotic arms fixed in the kitchen. This lets the user cook food without being physically present in the kitchen, while still feeling as if he or she were standing there. Second, the robotic system finds application in dealing with hazardous materials (like nuclear waste and radioactive materials), which is usually done by humans wearing radiation-resistant suits that are never 100 percent secure. In this manner, many lives could be saved that are being sacrificed every year. Further, the system could find application in industries where precision tasks are required that can only be done by robots, as well as in handling very heavy objects. In addition, the user experience is close to reality due to the implementation of Gear 360 technology [16, 17].