Our main goal has been to solve two problems: the burden that repetitive experiments place on researchers, and the lack of access to education caused by issues such as poverty and the pandemic. Engineering success was achieved across the project through the development process described below.
To begin with, we thoroughly discussed how to automate a series of cell-culture experiments (including transduction with the lentiviral system). We found our solution in a robot arm. To guarantee applicability to the original instruments, we conducted 3D modeling of auxiliary items. For more precise and reliable movement, we chose vision control as the coordinate-correction method. Finally, a user-friendly GUI containing full remote-control functions has been improved continually for global versatility.
In addition, every development was accomplished through the following engineering cycle, which we call the 'DDCC process'.
Discuss
Design
Compare
Cooperate
3D modeling of auxiliary items
Discuss:
The experiment automation was planned around a 6-axis robot arm and a two-fingered gripper. To make the prepared machine applicable to the experiment, we carefully identified the instruments that required auxiliary items to be attached: pipettes, the suction machine, tip racks, plates, the incubator, and the overall experiment table.
Design:
Before beginning the designs, we decided that the most important characteristics of the auxiliary tools were creativity and practicality. Each robot-team member drew sketches and shared ideas. In the process, we incorporated not only advice from our mechanical-engineering professor and advisor, but also several ideas from high school students (Hansung Science High School). Although much time was spent on design, we completed practical and aesthetic tools. Detailed information on the hardware is on HARDWARE.
Compare:
Utilizing Inventor, the 3D-modeling tool from Autodesk, Inc., we conducted mechanical analysis, including stress analysis and dynamic simulation, and compared every designed model. Eventually, we selected the models best suited to the movement of the robot arm and two-fingered gripper, markedly reducing trial and error.
Cooperate:
We cooperated in creating every item, making the best use of each member's abilities: operating precision machine tools, working in Inventor, testing the applicability of tools, and so on. Teamwork made it possible to cut down on time and overwork. The finished products have now been applied for patents and play a critical role in the automation.
Vision control as the coordinate correction method
Discuss:
Biochemistry experiments require a high level of precision, which is why automating them is difficult. It was relatively simple to input fixed coordinates into the robot arm to perform certain motions. However, this mechanism was too rough for delicate motions and readily caused problems whenever the location of an instrument changed even slightly. Coordinate correction was necessary, since these problems produced a wide margin of error over the overall procedure. We proposed several correction methods, such as LIDAR, ultrasonic sensors, and vision control.
Design:
With vision control, we designed algorithms to perform motions delicately. We also adopted ArUco markers to process the positional data of items. Two cameras recognize the ArUco markers and correct the location of the gripper, while preset offset values between the markers and instruments guide the rough movement. Additionally, we visited OnRobot, Inc. and received technical counsel to develop an algorithm that measures the distance to every part without any fixed values.
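The marker-plus-offset correction described above can be sketched as follows. This is a minimal Python illustration, not our production code: the marker positions, offset values, and units are hypothetical, and real marker detection would use a vision library such as OpenCV's ArUco module rather than hard-coded coordinates.

```python
# Minimal sketch of offset-based coordinate correction (hypothetical values).
# The cameras report where an ArUco marker actually is; a preset offset maps
# the marker position to the instrument position the gripper must reach.

def correct_target(nominal_marker, detected_marker, marker_to_tool_offset):
    """Shift the tool target by however far the marker has drifted."""
    drift = [d - n for d, n in zip(detected_marker, nominal_marker)]
    # Rough target from the preset offset, then apply the measured drift.
    return [n + o + e for n, o, e in
            zip(nominal_marker, marker_to_tool_offset, drift)]

# Hypothetical example: the tip rack's marker drifted 2 mm in x and 1 mm in y.
nominal = [120.0, 80.0, 0.0]   # taught marker position (mm)
detected = [122.0, 81.0, 0.0]  # marker position seen by the cameras (mm)
offset = [15.0, 0.0, 40.0]     # preset marker-to-instrument offset (mm)

target = correct_target(nominal, detected, offset)
print(target)  # → [137.0, 81.0, 40.0]
```

The key design point is that the preset offsets only need to be measured once per instrument; any later drift of the instrument is absorbed by the per-run marker detection.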
Compare:
In this engineering cycle, we reversed the order of the Design and Compare steps to achieve rapid development. We compared the three coordinate-correction methods mentioned above. LIDAR and ultrasonic sensors share a drawback: they must be controlled through the ROS (Robot Operating System) on Linux, and the algorithm for processing their data was too complicated to run on the robot arm's control box. In contrast, vision control with cameras was advantageous for recognizing short-distance items and for its simplicity of use.
Cooperate:
We quickly focused on modeling the camera station, which, as always, went through the DDCC process. After identifying the reachable distance and posture, the camera station and auxiliary tools were set up immediately, and the robot arm now operates with the precision we required.
User-friendly GUI
Discuss:
We used the programs ‘FFFTP’ and ‘Teraterm’ to test the robot arm and gripper. Although these programs allow developers to run quick trials, they are not intuitive and are not specialized for controlling the robot arm from remote locations (one of our goals is to enable remote education for students as far away as other continents). To achieve this goal, we decided to develop our own GUI.
Design:
We required the GUI to contain the following functions:
- Communicating with the server on the robot arm
- Conducting a partial or full experiment at the user's choice
- Simple and intuitive UI
Since GUI development had not been part of the original plan, we decided to include only the essential functions to save time. We considered several forms for the GUI, such as a mobile application, a website, and an exe-program.
Compare:
Mobile apps and websites have the advantages of high accessibility and real-time updates and maintenance. However, they require proficiency with the ‘flutter’ tool or ‘html’, so the expected development time was not cost-effective. The exe-program, on the other hand, is well suited to the ‘python’ language that we had used throughout the project.
Cooperate:
GUI development began while automation testing was still advancing, so only a few engineers were assigned to the GUI. Despite the small workforce, the members cooperated to design the UI and to implement client-server communication and streaming. The GUI introduced on SOFTWARE is available now, and it is being upgraded into a website for long-term efficiency.
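As an illustration of the client-server communication the GUI relies on, here is a minimal Python sketch. The command name, acknowledgment format, and dispatch logic are hypothetical placeholders, not the actual protocol of our robot-arm server.

```python
import socket
import threading

# Minimal sketch of the GUI <-> robot-arm link: the server acknowledges
# each experiment-step command it receives. Command names are hypothetical.

def robot_server(sock):
    conn, _ = sock.accept()
    with conn:
        command = conn.recv(1024).decode()
        # A real server would dispatch the command to the arm controller here.
        conn.sendall(f"ACK {command}".encode())

def send_command(port, command):
    """What the GUI does when the user starts an experiment step."""
    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(command.encode())
        return client.recv(1024).decode()

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=robot_server, args=(server,), daemon=True).start()
reply = send_command(port, "RUN_STEP pipetting")
print(reply)  # → ACK RUN_STEP pipetting
server.close()
```

A simple request-acknowledge pattern like this lets the GUI report each step's status to a remote user, which is the core of the remote-education goal.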