USF Libraries
USF Digital Collections

Development of an end-effector sensory suite for a rehabilitation robot


Material Information

Title:
Development of an end-effector sensory suite for a rehabilitation robot
Physical Description:
Book
Language:
English
Creator:
Stiber, Stephanie A
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:

Subjects

Subjects / Keywords:
Barrett hand
Camera
Sensors
MatLab
C++
Vision
Laser range finder
Dissertations, Academic -- Mechanical Engineering -- Masters -- USF
Genre:
bibliography   ( marcgt )
theses   ( marcgt )
non-fiction   ( marcgt )

Notes

Abstract:
ABSTRACT: This research presents an approach to assisting the control and operation of a rehabilitation robot manipulator so that it can execute simple grasping tasks for persons with severe disabilities. It outlines the development of an end-effector sensory suite that includes the BarrettHand end-effector, a laser range finder, and a low-cost camera. The approach taken in this research differs greatly from the currently available rehabilitation robot arms in that it requires minimal user instruction, is easy to operate, and is more effective for severely disabled persons. A thorough study of the currently available systems (the Manus, Raptor, and Kares II arms) is also presented. In order to test the end-effector sensory suite, experiments were performed to find the centroid of an object of interest and to direct the robot end-effector towards it with minimal error. Analyses of the centroid location data to ensure accurate results are also presented. The long-term goal of this research is to significantly enhance the ability of severely disabled persons to perform activities of daily living using wheelchair-mounted robot arms. The sensory suite developed through this project is expected to be integrated into a seven-degree-of-freedom wheelchair-mounted robot arm currently under development at the Rehabilitation Robots Laboratory at the University of South Florida.
Thesis:
Thesis (M.A.)--University of South Florida, 2006.
Bibliography:
Includes bibliographical references.
System Details:
System requirements: World Wide Web browser and PDF reader.
System Details:
Mode of access: World Wide Web.
Statement of Responsibility:
by Stephanie A. Stiber.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 96 pages.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001796896
oclc - 156975070
usfldc doi - E14-SFE0001623
usfldc handle - e14.1623
System ID:
SFS0025941:00001




Full Text
PAGE 1

Development of an End-effector Sensory Suite for a Rehabilitation Robot

by

Stephanie A. Stiber

A thesis submitted in partial fulfillment of the requirements for the degree of
Master of Science in Mechanical Engineering
Department of Mechanical Engineering
College of Engineering
University of South Florida

Major Professor: Rajiv Dubey, Ph.D.
Craig Lusk, Ph.D.
Shuh-Jing Ying, Ph.D.

Date of Approval: July 19, 2006

Keywords: Barrett hand, camera, sensors, MatLab, C++, vision, laser range finder

Copyright 2006, Stephanie A. Stiber

PAGE 2

Dedication

I would like to dedicate this work to the memory of my late mother, Judith Stiber.

PAGE 3

Acknowledgments

First and foremost, I would like to thank Dr. Rajiv Dubey for making this wonderful experience possible. Without his gracious financial and intellectual support, this project would not have been possible. I would like to thank Dr. Craig Lusk and Dr. Shuh-Jing Ying for their guidance and recommendations during this project. I would also like to recognize the Mechanical Engineering department, faculty, and staff for offering such a wonderful program where young engineers can really grow and learn. Next, I would like to thank Eduardo Veras. Thank you so much for your patience and for teaching me C++, MatLab, and basically everything computer related. Without you, this project would have been near impossible. I would like to thank Norali Pernalete for her assistance in editing this work. To everyone else in the lab, thank you for your helpful suggestions, constant cooperation, and guidance. It is amazing to work in a lab like this, where great things are made and developed for the well-being of others. I want to thank my husband, Ryan McKeon, for his understanding of my late hours and awkward schedule. I would like to thank the rest of my family members for their constant support and understanding of my goals. Thank you.

PAGE 4

Table of Contents

List of Tables                                            iii
List of Figures                                            iv
Abstract                                                   vi

Chapter One  Introduction                                   1
  1.1 Motivation                                            1
  1.2 Thesis Objective                                      2
  1.3 Contributions                                         2
  1.4 Thesis Outline                                        3
  1.5 Importance                                            3

Chapter Two  Background                                     4
  2.1 Disabilities Worldwide                                4
  2.2 Rehabilitation Arms                                   5
    2.2.1 Manus                                             6
    2.2.2 Raptor                                            8
    2.2.3 Kares II                                          8
    2.2.4 WMRA                                              9
  2.3 Robot End-effectors                                  11
  2.4 Vision Applications in Rehabilitation                11

Chapter Three  Development of the Sensor Suite            14
  3.1 Selection Criteria                                   14
  3.2 Experimental Platform                                15
    3.2.1 Puma Arm                                         15
    3.2.2 Camera Specifications                            22
    3.2.3 Laser Range Finder                               26
    3.2.4 BarrettHand End-effector                         29

Chapter Four  System Integration                           33
  4.1 End-effector Integration                             33
  4.2 SICK Laser Range Finder                              36
  4.3 Camera Set-up                                        37
  4.4 Image Processing                                     43
  4.5 Integration Platform                                 47
  4.6 Operating Procedure                                  55

PAGE 5

Chapter Five  Analysis of Results and Conclusions          57
  5.1 Results                                              57
  5.2 Discussion                                           64
  5.3 Conclusion                                           65
  5.4 Recommendations                                      66

References                                                 67
Bibliography                                               69
Appendices                                                 71
  Appendix A: C++ Coding                                   72
    A.1 Automatic C++ Code                                 74
    A.2 BarrettHand Initialization C++ Code                75
    A.3 BarrettHand Demo C++ Code                          76
    A.4 Laser Range Finder C++ Code                        78
    A.5 Laser Range Finder with BarrettHand C++ Code       79
  Appendix B: MATLAB Code                                  82
    B.1 Laser Range Finder MATLAB Code                     83
    B.2 Centroid MATLAB Code                               86
    B.3 BarrettHand Demo                                   88
    B.4 BarrettHand and Laser Code                         93
    B.5 Final Integration                                  94

PAGE 6

List of Tables

Table 1:  Number of Persons Using Assistive Technology Devices             5
Table 2:  D-H Parameters of Puma Arm                                      18
Table 3:  Technical Specifications of the Bumblebee Stereovision Camera   23
Table 4:  Logitech Camera Specifications                                  24
Table 5:  Technical Specifications of the Sony Camera                     25
Table 6:  Product Specifications for Creative Lab Camera                  26
Table 7:  DME Product Features                                            27
Table 8:  Technical Data                                                  27
Table 9:  BarrettHand Specifications                                      32
Table 10: Common GCL Commands                                             34
Table 11: Calibration Data for Sony Camera                                40
Table 12: Creative Camera Calibration                                     42
Table 13: Before and After Image Processing                               58
Table 14: Actual vs. Calculated Centroid in x, y, z                       60
Table 15: Creative Camera Calibration Error                               61
Table 16: C++ Code Definitions                                            73
Table 17: MATLAB Image Acquisition Commands                               82
Table 18: Common MATLAB Image Processing Commands                         83

PAGE 7

List of Figures

Figure 1:  Manus Arm                                                7
Figure 2:  Manus End-effector                                       7
Figure 3:  Raptor Arm                                               8
Figure 4:  Kares II Robot System                                    9
Figure 5:  New Wheelchair Arm                                      10
Figure 6:  Two Camera Setup                                        12
Figure 7:  The Puma Arm                                            15
Figure 8:  Simple Link                                             16
Figure 9:  Puma Arm Coordinate Configuration and Link Assignment   17
Figure 10: Bumblebee                                               22
Figure 11: Web Cam                                                 24
Figure 12: Sony CCD Camera                                         25
Figure 13: Creative Labs Web Camera for Notebooks                  26
Figure 14: DME2000 Dimension Drawing                               29
Figure 15: BarrettHand End-effector                                30
Figure 16: BarrettHand's Grasping Abilities                        30
Figure 17: Independent Finger Movement                             31
Figure 18: MATLAB Command Flowchart                                35
Figure 19: Laser Range Finder                                      36

PAGE 8

Figure 20: Range Finder Distance Versus Height for Sony                        41
Figure 21: Range Finder Distance Versus Height and Width of a Camera Display   43
Figure 22: Original Image                                                      45
Figure 23: Filtered Image                                                      45
Figure 24: Black and White Image                                               46
Figure 25: Ruler Centroid                                                      46
Figure 26: Program Flow Chart                                                  47
Figure 27: Puma Arm Kinematics Set-up                                          48
Figure 28: Translation of O Frame to Base                                      49
Figure 29: Translation from Base Frame to Workstation Frame                    50
Figure 30: Part with Respect to Workstation                                    51
Figure 31: Part with Respect to Camera                                         52
Figure 32: End-effector Frame with Respect to Frame 6                          53
Figure 33: Transformations of the Puma Arm                                     54
Figure 34: Final Puma Arm Configuration                                        57
Figure 35: Camera in Hand                                                      58
Figure 36: Original Test Image                                                 62
Figure 37: Centroid Test Image                                                 62
Figure 38: Puma Arm Moving                                                     63
Figure 39: Grasping Test                                                       63
Figure 40: Returning to Workspace                                              64

PAGE 9

Development of an End-effector Sensory Suite for a Rehabilitation Robot

Stephanie A. Stiber

ABSTRACT

This research presents an approach to assisting the control and operation of a rehabilitation robot manipulator so that it can execute simple grasping tasks for persons with severe disabilities. It outlines the development of an end-effector sensory suite that includes the BarrettHand end-effector, a laser range finder, and a low-cost camera. The approach taken in this research differs greatly from the currently available rehabilitation robot arms in that it requires minimal user instruction, is easy to operate, and is more effective for severely disabled persons. A thorough study of the currently available systems (the Manus, Raptor, and Kares II arms) is also presented. In order to test the end-effector sensory suite, experiments were performed to find the centroid of an object of interest and to direct the robot end-effector towards it with minimal error. Analyses of the centroid location data to ensure accurate results are also presented.

PAGE 10

The long-term goal of this research is to significantly enhance the ability of severely disabled persons to perform activities of daily living using wheelchair-mounted robot arms. The sensory suite developed through this project is expected to be integrated into a seven-degree-of-freedom wheelchair-mounted robot arm currently under development at the Rehabilitation Robots Laboratory at the University of South Florida.

PAGE 11

Chapter One
Introduction

1.1 Motivation

According to the Census Bureau in 2002, 2.2 million people in the United States over the age of 15 use a wheelchair. Furthermore, 18 million have had difficulty lifting and carrying a ten-pound bag of groceries or grasping small objects [1]. A significant number of these people are severely disabled and unable to manipulate objects to perform activities of daily living. From this arises a need for a user-friendly end-effector that can be used while attached to a wheelchair-mounted robot arm. A user-friendly end-effector is needed in order to grasp everyday items such as water bottles, pens, or even silverware.

People with disabilities often have difficulty navigating through their surroundings. This can make activities of daily living extremely difficult and frustrating. Some activities of daily living include bathing, dressing, walking, eating, toilet use, grooming, and transferring from a bed to a chair [2]. People confined to a wheelchair have a reach limited to a semicircle with a radius no longer than the length of their arm. The purpose of this research is to extend that workspace and allow people with disabilities to perform simple daily tasks. While there are devices that do this as well, those devices are often too complex to control. By integrating vision recognition and a laser range finder with a robot end-effector, the control of a robot arm becomes manageable.

PAGE 12

1.2 Thesis Objectives

Develop an end-effector sensor suite to assist in the control of a rehabilitation robot. This includes:

- Programming and integration of the BarrettHand end-effector with the Puma 560 robot
- Selection and integration of a laser range finder with the manipulator
- Selection and integration of a vision system with the manipulator

Perform experiments with the sensory suite integrated with the manipulator on the arm. This includes:

- Using the end-effector sensor suite's assistance in finding the centroid of an object
- Analyzing object centroid location data to ensure accurate results are obtained
- Developing algorithms for moving the robot hand to grasp the object

1.3 Contribution

The contribution of this research is the unique combination of computer vision and laser range finder technology integrated with a state-of-the-art robot hand. This integration will aid in the execution of tasks with minimal user input. The approach taken uses the BarrettHand end-effector with some sensory assistance. These sensors include a laser range finder and a basic camera. The BarrettHand is a relatively new robot end-effector, and while it has been integrated with cameras, it has never been integrated within the realm of the

PAGE 13

study of rehabilitation robots. The end-effector sensor suite will allow the hand assembly to transfer from one robot arm to another with ease.

1.4 Thesis Outline

The thesis is outlined as follows: Chapter Two contains background information on rehabilitation robot arms, end-effectors, and vision applications in robots. Chapter Three serves as an overview of the design procedure; it discusses the criteria for selection, product specifications, and manipulator details. Chapter Four provides a description of the integration of the robot arm with the end-effector, laser range finder, BarrettHand, and camera system. Finally, Chapter Five contains the results, discussion, conclusion, and recommendations for future research.

1.5 Importance

Similar projects have been conducted worldwide, but most have not been applicable to rehabilitation engineering or to serving people with disabilities. This project integrates a vision system, a laser range finder, and an advanced robotic gripper. The end result is the calculation of the centroid of an object of interest in order to allow an end-effector to work automatically with minimal user interaction. The long-term goal of this research is to enable a robot arm to view an object and allow a person with a severe disability to grasp and manipulate it.

PAGE 14

Chapter Two
Background

2.1 Disabilities Worldwide

At a United Nations meeting, it was agreed that disability is multidimensional; thus, the participants could not ascertain the single true size of the disabled population. Different symptoms are related to different levels of disability [3]. The level of disability that this research will benefit is any permanently wheelchair-bound person. This device should be usable by someone with full use of the upper limbs as well as by someone with limited upper-extremity mobility. The following table gives the demographics of disabilities recorded by the National Center for Health Statistics in 1994 [4]. The table lists the devices used, the age of the users, and the number of users.

PAGE 15

Table 1: Number of Persons Using Assistive Technology Devices by Age of Person and Device: United States, 1994 [4]
(Numbers in thousands)

Assistive Device             All ages   44 years and under   45-64 years   65 years and over
Anatomical devices
  Any artificial limb           199             69                59              70
  Artificial leg or foot        173             58                50              65
  Artificial arm or hand        *21             *9                *6              *6
Any mobility device**         7,394          1,151             1,699           4,544
  Wheelchair                  1,564            335               365             863
Any vision device**             527            123               135             268
  Braille                        59            *28               *23              *8
  Computer equipment            *34            *19                *8              *7

Table 1 demonstrates that there is a large population of mobility-challenged individuals who may be helped by this technology. This research can help people with an artificial arm or hand and also people permanently bound to wheelchairs. This allows for the possibility of helping over one million people.

PAGE 16

discussed by Lancioni et al [5]. The conclusion of this overview was that the majority of available resources for som eone with disabilities have been aimed at promoting a disabled persons direct access to or request of environmental stimulation. These resources have al so been directed towards supporting and increasing a persons orient ation and mobility while r educing their accidental injury rate. The resources that are mo st commonly available to people with disabilities are micro switches and s peech output devices. Other devices, such as robot limbs, are less accessible becaus e they are difficult to implement. 2.2.1 Manus The Manus is a fully functional wh eelchair-mounted robot manipulator that has been built in the Netherlands. The goa l of this wheelchair-mounted robot arm is to provide the disabled with a great er level of personal independence. The Manus has a simple controller and is commercially available through Exact Dynamics. The Manus is a 6 degree of fr eedom robot arm which is mounted on a rotating and telescoping base unit. This a rm can be attached to a variety of electric wheelchairs and is capable of grasping up to 2.2 kg. According to Dallaway [6], the current versions of the Manus have a reac h of approximately 850 mm. A picture of the Manus arm is show n below in Figure 1. This arm has a few preset functions such as a home pos ition and drinking functi on, allowing it to lift a glass while holding the water level even. 6

PAGE 17

Figure 1: Manus Arm

With the arm, a cup can be grasped, held level, and then brought to a person's face. Once at the face, the arm has a preset drinking command. This command tilts the cup about an axis while adjusting the height to allow for easy drinking. Figure 2 shows a close-up view of the Manus end-effector.

Figure 2: Manus End-effector

This end-effector contains two fingers with a rotating wrist. The commands for these fingers consist of simple open and close commands.

PAGE 18

2.2.2 Raptor

The Raptor robot arm is another commercially available wheelchair-mounted robot arm. The Raptor is the first commercially available FDA-approved rehabilitative robot. The Raptor is controlled by a joystick or a sip-and-puff device. It has a robot end-effector similar to that of the Manus and is capable of lifting items off the floor. Figure 3 shows the Raptor arm attached to a wheelchair.

Figure 3: Raptor Arm

2.2.3 Kares II

Bien et al. [7] created a robot system called Kares II (Korea Advanced Institute of Science and Technology, KAIST, Rehabilitation Engineering Service System II). This system was designed to out-perform the Manus and Raptor. The system used two experimental platforms, a mobile platform and a wheelchair platform. The device is capable of twelve major tasks; among these tasks are face

PAGE 19

washing and retrieving a cup. Kares II was designed using task-oriented design. Figure 4 shows the shaving and face-cleaning tasks.

Figure 4: Kares II Robot System

Figure 4 shows just one of the two operating platforms: a mobile unit not attached to a wheelchair. The Kares II robot comes with another platform that can be used on a mobile wheelchair. The researchers concluded that "Further study is needed to design a convenient operation methodology of the system on behalf of novice users and long-term handling. More sensitive and wide intention reading capability of various kinds is desirable for human-friendly interaction" [7].

2.2.4 WMRA

The next arm being discussed, the WMRA (wheelchair-mounted robot arm), currently does not have an end-effector. It is a product designed to out-lift and out-perform both the Raptor and the Manus wheelchair-mounted robot arms. It is pictured in Figure 5. This arm was built in 2005 by Edwards [8].

PAGE 20

Figure 5: New Wheelchair Arm

This arm is a variable-controlled arm with six main joints. The wheelchair arm was created at the University of South Florida in the Rehabilitation department of the College of Engineering. This arm would be optimal because it has a greater payload capacity than the other available arms. The Manus arm has a payload capacity of only about 2.2 kg and the Raptor's capacity is only 1.5 kg. This arm can easily support a load of over 10 kg. It can do this because of the motors used and the solid joint connections. This arm also has more degrees of freedom than the Manus, Raptor, and Puma arms. This allows the arm to reach a given position in multiple ways, thereby imitating the human arm (this is called redundancy). As soon as Cartesian control is implemented, this sensory suite will be switched over to this arm instead of the Puma arm.

PAGE 21

2.3 Robot End-effectors

The end-effector is essentially one of the most important aspects of a robot. Many available end-effectors have a simple open or close function with minimal force feedback. One example of this is shown above in Figure 2. While that robot end-effector does have some amount of force feedback, it lacks the ability to grasp an object evenly. In addition, it does not offer any finger dexterity. The fingers do not bend to grasp an object evenly, making it easy for the object to simply fall through the grasp of the robot.

2.4 Vision Applications in Rehabilitation

There have been two vision-based systems developed at the University of South Florida. One of the systems, developed by Fritz [9], consisted of a seven-degree-of-freedom robot manipulator, an end-effector, a laser range finder, a vision system, and several other sensors. This work was an important basis for the inclusion of the laser range finder in the present study. The system developed by Jurczyk [10] consisted of two cameras that were used to create a stereovision camera system programmed and calibrated to find a handle. The hardware utilized was a Hitachi KP-D50 CCD (charge-coupled device) camera, an Imaging Source CCD, and two Imaging Source DFG/LC1 frame grabber cards. This setup is shown in Figure 6.

PAGE 22

Figure 6: Two Camera Setup

In Martin's research conducted at the Massachusetts Institute of Technology, programming was used to create a vision subsystem for obstacle avoidance [11]. In that research study, proximity sensors were utilized to cut down on the information processing for distance. The research provided a very important four-step outline for programming vision systems: step one is to record, step two is to learn, step three is to build, and step four is to validate. That project is concerned with obstacle avoidance but provides a good basic outline for recognizing and processing images.

Another very similar project operates in a dynamic, unstructured environment. This research was conducted by Kragjic et al. [12]. The experimental setup included a Puma arm, a BarrettHand, a sonar sensor, a laser sensor, a wrist-mounted force-torque sensor, a color CCD camera on the hand, and two cameras for stereovision. This system is simply too bulky for the purposes of rehabilitation engineering, but the researchers' approach to programming was

PAGE 23

very important for the study presented in this thesis. The study broke the programming into several smaller programs instead of one massive program.

One study was conducted by Allen et al. [13] at Columbia University integrating vision, force, and tactile sensing for grasping. They used a number of experiments to show how certain sensors, such as strain gages, vision sensing, and tactile sensors, are integrated. It was demonstrated that the integration of sensors dramatically increases the capabilities of a robot hand for grasping. This conclusion is important, as this investigation can expect to reach the same conclusion.

There are many end-effectors in existence, but few with advanced technology for grasping door handles and bottles. There are currently several common end-effectors used in the field of rehabilitation robots. Many of the end-effectors made are for industrial use, and issues of strength and precision contribute to the lack of end-effectors available for rehabilitation applications.

PAGE 24

Chapter Three
Development of the End-effector Sensor Suite

3.1 Selection Criteria

The approach presented in this study differs greatly from the end-effectors currently in use on wheelchair-mounted robot arms. This project should be relatively low cost, require minimal user instruction, be easy to operate, and be effective. None of the end-effectors used by the Manus, Raptor, or Kares II satisfies these requirements.

Although the Manus arm is capable of relatively easy manipulation after the user grows acquainted with the arm, it can take a few minutes to get to a specified spot, and the grasp provided by the end-effector is not very strong. If the end-effector is not perfectly centered on the object, the object can easily fall on the floor and become even more difficult to grasp.

The Raptor is neither easy to operate nor effective. Its end-effector has the same problems as the Manus, in addition to the arm being much harder to operate. It is controlled using a simple joystick or sip-and-puff device, which provides minimal control. Also, there are not many preset options for operating, such as home positions or assistive features.

The Kares II does not satisfy the condition of being relatively inexpensive. It does, however, implement cameras, force-torque sensors, and other types of sensors to assist in activities of daily living. It is perhaps the most

PAGE 25

versatile of all the devices, but its operation is hard to learn. The researchers at KAIST even state this fact in their documentation.

3.2 Experimental Platform

The following section provides information on the experimental platform. It gives specifications and product selection guidelines for the Puma 560 arm, the BarrettHand, the laser range finder, and four different types of cameras.

3.2.1 Puma Arm

The arm chosen for demonstration of this sensory suite is the Puma 560, shown below in Figure 7. A Puma arm was chosen for testing purposes because it is one of the most widely used arms in laboratory and industrial settings; it also has six degrees of freedom, making it possible to locate and orient an object in physical space. During this set-up, the Robotics Toolbox for MATLAB created by Peter Corke [14] was used. It includes the complete kinematics for the Puma 560 arm.

Figure 7: The Puma Arm
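As a concrete illustration of how the toolbox mentioned above is typically used, the minimal MATLAB sketch below loads the toolbox's built-in Puma 560 model and computes the forward kinematics for one joint configuration. It assumes the naming conventions of Corke's Robotics Toolbox (the puma560 setup script, the p560 robot object, and the fkine function); the joint values shown are purely illustrative and are not taken from the thesis.

% Minimal sketch, assuming Corke's Robotics Toolbox is on the MATLAB path
puma560;                      % setup script that creates the robot object p560 (and poses such as qz)
q = [0 pi/4 -pi 0 pi/4 0];    % an arbitrary set of six joint angles (rad)
T06 = fkine(p560, q);         % 4x4 homogeneous transform of frame 6 with respect to frame 0
disp(T06)                     % last column: position; upper-left 3x3 block: orientation

The transform returned here is the same quantity that equation 3.16 later builds explicitly by multiplying the six link matrices derived from the D-H parameters in Table 2.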

PAGE 26

In order to utilize the Puma arm, the link parameters must be known. Link parameters are used to accurately define the directions and distances associated with the orientation of a robot arm. The Denavit-Hartenberg (D-H) convention is used to describe the positions of links and joint parameters unambiguously. This convention is explained in further detail in the following pages. Figure 8 shows an example of a simple link.

Figure 8: Simple Link [15]

In Figure 8, α(i-1) and a(i-1) are used to describe the kinematic relationship between the two joint axes. The link length is a(i-1) and is measured as the mutual perpendicular between the two axes. The link twist angle is defined as α(i-1) and is used to describe the angle between the two axes [15].

In Figure 8, d is the link offset and θ (theta) is the joint angle. These two parameters are used to describe the connections between adjacent links. The link offset is the distance along a common axis from one link to the next. Theta is the

PAGE 27

joint angle, which defines the rotation about that common axis. These four variables are defined below in equation form [15]:

a(i-1) = the distance from Z(i-1) to Z(i), measured along X(i-1)    (3.1)
α(i-1) = the angle from Z(i-1) to Z(i), measured about X(i-1)    (3.2)
d(i) = the distance from X(i-1) to X(i), measured along Z(i)    (3.3)
θ(i) = the angle from X(i-1) to X(i), measured about Z(i)    (3.4)

The coordinate system for the Puma arm is given below in Figure 9.

Figure 9: Puma Arm Coordinate Configuration and Link Assignment

The link-frame assignment follows a six-step procedure which is explained in full detail in Craig's work [15]. The procedure is as follows:

1. Identify the joint axes.
2. Identify the common perpendicular between them, or a point of intersection. At this point of intersection, or at the point where the common perpendicular meets the i-th axis, assign the link-frame origin.
3. Assign the Z axis pointing along the i-th axis.

PAGE 28

4. Assign the X axis pointing along the common perpendicular, or normal to the plane containing the two axes.
5. Assign the Y axis to complete the right-hand coordinate system.
6. Label as necessary.

The D-H parameters are listed below in Table 2; i in Table 2 represents the link based on the assignment in Figure 9.

Table 2: D-H Parameters of Puma Arm [15]

  i    α(i-1)    a(i-1)    d(i)    θ(i)
  1      0         0        0      θ1
  2    -90°        0        0      θ2
  3      0         a2       d3     θ3
  4    -90°        a3       d4     θ4
  5     90°        0        0      θ5
  6    -90°        0        0      θ6

The D-H parameters are used to calculate the transformation matrices. Transformation matrices are four-by-four matrices that contain the information on an object's rotation and translation. A frame that strictly undergoes translation is shown in equation 3.5 [15]:

{}^{m}_{n}T = \begin{bmatrix} 1 & 0 & 0 & x \\ 0 & 1 & 0 & y \\ 0 & 0 & 1 & z \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.5)

PAGE 29

In this equation, the translational values in the x, y, and z directions are placed in the fourth column of the matrix. The frame stays oriented the same, with the x, y, and z coordinate axes remaining parallel to those of the reference frame.

The rotation portion of a transformation matrix is contained within the first three rows and three columns. When an object is rotated around the x-axis by an angle θ, the x-axis entries of the matrix remain constant and the other axes change, as shown in equation 3.6 [15]:

{}^{m}_{n}T = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta & 0 \\ 0 & \sin\theta & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.6)

Equation 3.6 shows only the rotation about the x axis and does not have any translation components. When an object is rotated around the y-axis by an angle θ, the y values in the second column and second row remain constant and the other axes change, as shown in equation 3.7 [15]:

{}^{m}_{n}T = \begin{bmatrix} \cos\theta & 0 & \sin\theta & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\theta & 0 & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.7)

The rotation about the z axis follows the same pattern, where the z column and row remain constant when the frame is rotated by an angle θ. This is shown in equation 3.8 [15]:

{}^{m}_{n}T = \begin{bmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.8)

PAGE 30

Equation 3.9 provides the general transformation matrix based solely on the link parameters. This equation uses the values from Table 2 to quantify each frame's rotation and translation:

{}^{i-1}_{i}T = \begin{bmatrix} \cos\theta_i & -\sin\theta_i & 0 & a_{i-1} \\ \sin\theta_i\cos\alpha_{i-1} & \cos\theta_i\cos\alpha_{i-1} & -\sin\alpha_{i-1} & -d_i\sin\alpha_{i-1} \\ \sin\theta_i\sin\alpha_{i-1} & \cos\theta_i\sin\alpha_{i-1} & \cos\alpha_{i-1} & d_i\cos\alpha_{i-1} \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.9)

Equation 3.9 is now used in combination with the D-H parameters to obtain the six transformation matrices that govern the Puma arm [15].

In equation 3.10, {}^{0}_{1}T represents the transformation matrix of frame one with respect to the base frame, where θ1 is the rotation component about the z axis [15]:

{}^{0}_{1}T = \begin{bmatrix} \cos\theta_1 & -\sin\theta_1 & 0 & 0 \\ \sin\theta_1 & \cos\theta_1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.10)

In equation 3.11, {}^{1}_{2}T represents the transformation matrix of frame two with respect to frame one, where θ2 is the rotation component about the z axis [15]:

{}^{1}_{2}T = \begin{bmatrix} \cos\theta_2 & -\sin\theta_2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ -\sin\theta_2 & -\cos\theta_2 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.11)

PAGE 31

In equation 3.12, {}^{2}_{3}T represents the transformation matrix of frame three with respect to frame two, where θ3 is the rotation component about the z axis. There is also translation of this frame in the x and z directions [15]:

{}^{2}_{3}T = \begin{bmatrix} \cos\theta_3 & -\sin\theta_3 & 0 & a_2 \\ \sin\theta_3 & \cos\theta_3 & 0 & 0 \\ 0 & 0 & 1 & d_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.12)

In equation 3.13, {}^{3}_{4}T represents the transformation matrix of frame four with respect to frame three, where θ4 is the rotation component about the z axis. There is also translation of this frame in the x and y directions [15]:

{}^{3}_{4}T = \begin{bmatrix} \cos\theta_4 & -\sin\theta_4 & 0 & a_3 \\ 0 & 0 & 1 & d_4 \\ -\sin\theta_4 & -\cos\theta_4 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.13)

In equation 3.14, {}^{4}_{5}T represents the transformation matrix of frame five with respect to frame four, where θ5 is the rotation component about the z axis [15]:

{}^{4}_{5}T = \begin{bmatrix} \cos\theta_5 & -\sin\theta_5 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ \sin\theta_5 & \cos\theta_5 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.14)

In equation 3.15, {}^{5}_{6}T represents the transformation matrix of frame six with respect to frame five, where θ6 is the rotation component about the z axis [15]:

PAGE 32

{}^{5}_{6}T = \begin{bmatrix} \cos\theta_6 & -\sin\theta_6 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ -\sin\theta_6 & -\cos\theta_6 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}    (3.15)

The full forward kinematics are given by {}^{0}_{6}T, shown below in equation 3.16, which will be used later in Chapter Four:

{}^{0}_{6}T = {}^{0}_{1}T \; {}^{1}_{2}T \; {}^{2}_{3}T \; {}^{3}_{4}T \; {}^{4}_{5}T \; {}^{5}_{6}T    (3.16)
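To make the chain of multiplications in equation 3.16 concrete, the following MATLAB sketch builds each link matrix from equation 3.9 using the Table 2 parameters and multiplies them together. It is only an illustration, not code from the thesis appendices; the numeric link dimensions are typical published Puma 560 values assumed here for the example, and the joint angles are arbitrary.

% Minimal sketch: general D-H link transform of equation 3.9
% alpha and a are alpha_(i-1) and a_(i-1); d and theta are d_i and theta_i
dh = @(alpha, a, d, theta) [ ...
    cos(theta),             -sin(theta),            0,            a; ...
    sin(theta)*cos(alpha),   cos(theta)*cos(alpha), -sin(alpha), -d*sin(alpha); ...
    sin(theta)*sin(alpha),   cos(theta)*sin(alpha),  cos(alpha),  d*cos(alpha); ...
    0,                       0,                      0,           1];

q  = [0 -pi/2 0 0 0 0];          % arbitrary joint angles theta_1 ... theta_6 (rad)
a2 = 0.4318; a3 = 0.0203;        % assumed link lengths (m); not listed in the thesis
d3 = 0.1244; d4 = 0.4318;        % assumed link offsets (m); not listed in the thesis

% Compose equation 3.16: T06 = T01 * T12 * T23 * T34 * T45 * T56
T01 = dh(   0,    0,  0,  q(1));
T12 = dh(-pi/2,   0,  0,  q(2));
T23 = dh(   0,   a2,  d3, q(3));
T34 = dh(-pi/2,  a3,  d4, q(4));
T45 = dh( pi/2,   0,  0,  q(5));
T56 = dh(-pi/2,   0,  0,  q(6));
T06 = T01*T12*T23*T34*T45*T56;   % pose of frame 6 expressed in frame 0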

PAGE 33

3.2.2 Camera Specifications

Four cameras were tested in the course of this study: a Sony camera, a Creative Labs Web Cam for Notebooks, a Logitech QuickCam Orbit MP, and Point Grey's Bumblebee camera. The implementation of these will be described in Chapter Four, but the technical information is contained in the tables below.

The first camera tested is Point Grey's Bumblebee, a two-lens stereo vision camera system. It is shown below in Figure 10.

Figure 10: Bumblebee

The Bumblebee camera comes preprogrammed in C++ and can return a three-dimensional point cloud of its field of vision. Programming is done to help recognize a cylinder in the point cloud. The Bumblebee was chosen because of its size, weight, and capabilities. The technical specifications are shown below in Table 3.

Table 3: Technical Specifications of the Bumblebee Stereovision Camera

Imaging device:          1/3 in progressive scan CCDs
Resolution:              640x480 or 1024x768 (VGA format)
Size:                    16 x 4 x 4 cm
Signal-to-noise ratio:   TBD
Consumption:             2.1 W
Power:                   By IEEE-1394
Lens focal length:       High-quality 4 mm focal length pre-focused micro lenses
Baseline:                120 mm
HFOV:                    70 degrees
Synchronization:         Less than 20 s

The second camera tested is Logitech's QuickCam Orbit MP. This camera was chosen because it was inexpensive, lightweight, small, and easy to obtain. The specifications for this camera are shown below in Table 4.

PAGE 34

Table 4: Logitech Camera Specifications [16]

System Requirements:
  Windows 2000 or XP
  CD-ROM drive
  Pentium P4 1.4 GHz or AMD Athlon 1 GHz processor (Pentium P4 2.4 GHz or better recommended*)
  128 MB RAM (256 MB RAM recommended*)
  200 MB free hard disk space
  16-bit color display adaptor
  Windows-compatible sound card and speakers
  Available 1.1 or 2.0 USB port (USB 2.0 High Speed port required for megapixel image capture)

Features:
  Logitech QuickCam Orbit MP camera with motorized camera head
  9-inch stand
  Base
  QuickCam software CD
  6-foot USB cable
  Stereo headset
  High-quality 1.3 megapixel sensor
  Camera set-up guide

A picture of this web cam is shown in Figure 11. The camera can either be mounted directly on the base or on the nine-inch stand.

Figure 11: Web Cam

PAGE 35

The third camera tested was a Sony camera, pictured in Figure 12. It is a digital color charge-coupled device (CCD) camera with a Cosmicar/Pentax lens. The model number is IV-CCAM2, serial number U3000029. The specifications are shown below in Table 5.

Figure 12: Sony CCD Camera

Table 5: Technical Specifications of the Sony Camera

Imaging device:          1/3 in progressive scan CCDs
Resolution:              640x480 or 1024x768 (VGA format)
Size:                    16 x 4 x 4 cm
Signal-to-noise ratio:   TBD
Consumption:             2.1 W
Power:                   By IEEE-1394
Lens focal length:       High-quality 4 mm focal length pre-focused micro lenses
Baseline:                120 mm
HFOV:                    70 degrees
Synchronization:         Less than 20 s

The last camera tested is Creative Labs' Web Cam for Notebooks. This camera was chosen because it is small in size, weighs just ounces, and was the

PAGE 36

least expensive of all the cameras tested. The camera is pictured in Figure 13 and the specifications are shown in Table 6.

Table 6: Product Specifications for Creative Lab Camera [17]

  1-year warranty
  USB port connection
  VGA CMOS sensor
  Portable
  640 x 480 video resolution
  Requires a notebook PC with an Intel Pentium II or AMD Athlon 350 MHz processor; Windows 98, 98 SE, 2000, ME, or XP; 128 MB RAM; an available USB port; and a CD-ROM drive
  Additional requirements: networking hardware and an Internet connection (dial-up, LAN, or wireless)

Figure 13: Creative Labs Web Camera for Notebooks

3.2.3 Laser Range Finder

A SICK laser range finder was used in this experiment as well. Table 7, from source [18], lists the DME 2000 laser range finder's product features. The

PAGE 37

laser range finder operates on time-of-flight technology. This means that the laser beam is pulsed and sent out toward an object in a narrow line. The time that it takes for the laser beam to hit the object and reflect back to the sensor is used to determine the distance to the object, based on the speed of light and the time of flight. This range finder is accurate to millimeters; however, accuracy at the sub-millimeter level is substandard.

Table 7: DME Product Features

Sensing range min ... max (reflector mode):    0.1 ... 130 m
Sensing range min ... max (proximity mode):    100 ... 2047 mm
Light source:                                  Laser diode
Type of light:                                 Laser, red light
Laser protection class:                        2 (IEC 60825-1/EN 60825-1)

Table 8 shows the technical specifications for the DME 2000 range finder.

Table 8: Technical Data [18]

Dimensions (W x H x D):          54 x 105 x 138 mm
Supply voltage min ... max:      DC 18 ... 30 V
Output current:                  <= 100 mA
Light spot diameter:             Reflector mode: approx. 250 mm at 130 m; proximity mode: approx. 3 mm at 2 m
Data interface:                  RS-232
Resolution:                      Reflector / proximity mode: 1 mm
Accuracy (proximity mode):       ±5 mm (90% remission), ±11 mm (>18% remission), ±65 mm (6% remission)

PAGE 38

Table 8 (Continued)

Reproducibility:                 Reflector mode: ±5 / 20 mm; proximity mode: 1 mm (90% remission), 3 mm (>18% remission), 25 mm (6% remission)
Switching outputs:               PNP, Q1, Q2, Qp, Qs (Q1 and Q2 invertible)
Analogue output min ... max:     0 ... 20 mA or 4 ... 20 mA
Ripple:                          < 5 Vss
Measured value output:           Reflector mode: 100 ms; proximity mode: 29 ms
Reverse polarity protection:     Yes
Short-circuit protection:        Yes
Overload protected:              Yes
Connection type:                 Connector, M16, 12-pin
Enclosure rating:                IP 65
Ambient operating temperature:   -10 °C ... +45 °C
Ambient storage temperature:     -25 °C ... +75 °C

A dimensioned drawing of the DME 2000 is shown in Figure 14. The dimensions are given in mm. This particular laser range finder has a mass of approximately 2 pounds.

PAGE 39

Figure 14: DME2000 Dimension Drawing

3.2.4 BarrettHand End-effector

The BarrettHand offers a viable solution to the problems of common end-effectors. The BarrettHand is a four-degree-of-freedom robot end-effector with advanced gripping capabilities. It is an adaptive end-effector, meaning the force of the end-effector can be adjusted. The BarrettHand robot end-effector is pictured in Figure 15.

PAGE 40

Figure 15: BarrettHand End-effector

Figure 16 shows that the three fingers on this end-effector curl at a joint to allow for optimal gripping of complex-shaped objects. The end-effector is able to grasp balls, handles, door knobs, and even cell phones. The hand's three fingers have the ability to spread, step open, and step closed.

Figure 16: BarrettHand's Grasping Abilities

The BarrettHand grasps objects by closing the fingers around an object. Once a certain force is reached in the bottom portion of a finger, the top portion of the finger curls around to allow for proper grasping of the object of interest.

PAGE 41

The fingers can also be controlled independently of each other. This makes it more difficult for an object to slip out of the grasp and gives the end-effector even more versatility. The control of individual fingers allows the user to draw an object in close without the use of all three fingers. This is extremely useful when trying to center or close in on an object quickly and accurately. This is shown in Figure 17.

Figure 17: Independent Finger Movement

The robot hand used in this project is the BarrettHand by Barrett Technologies Inc. The specifications are shown in Table 9. The BarrettHand is used because it is the most effective end-effector, both in terms of price and ease of operation. The BarrettHand costs approximately $20,000, which is cheaper than a single year of dependent caretaking. The BarrettHand comes with preloaded GCL commands. It is also equipped with an ADR interface card to allow for port communication.

PAGE 42

Table 9: BarrettHand Specifications [19]

Total fingers:                                3
Fingers that spread:                          Fingers 1 and 2
Joints per finger:                            2
Motors per finger:                            1
Motors for spread:                            1
Total number of motors:                       4
Range of motion for base joint:               140°
Range of motion for fingertip:                45°
Range of motion for finger spread:            180°
Time from fully open to fully closed:         1.0 s
Time for full 180° finger spread:             0.5 s
Position sensing:                             Optical incremental encoder, 0.008° at the finger base joint; 17,500 encoder counts from fully open to fully closed
Hand weight:                                  1.18 kg (2.60 lb)
Payload:                                      6.0 kg (13.2 lb) total; 2.0 kg (4.4 lb) per finger
Active finger force (at tip):                 15 N (3.3 lb)
Passive finger force (at tip):                20 N (4.4 lb)
Motor type:                                   Samarium-cobalt, brushless, DC servo motors
Mechanisms:                                   Worm drives integrated with patented cable drive and breakaway clutch
Cycles per month:                             10,000
Power requirements:                           Typical AC electrical outlet
Load:                                         600 W
Phases:                                       Single
Voltage:                                      120/240 VAC
Frequency:                                    50/60 Hz
Power supply size:                            H, W, D: 200 x 200 x 300 mm (7.5 x 7.5 x 12 in)
Power supply weight:                          5 kg (11 lb)
Cable to hand:                                Single 3 m continuous-flex cable, 8 mm diameter

PAGE 43

Chapter Four
System Integration

4.1 End-effector Integration

When running the BarrettHand program through the provided interface, the interface is capable of producing C++ code. The problem with this automatically generated code, shown in Appendix A, section A.1, is that it calls a C-function library that handles the port communication. In order to bypass this C-function library, which would have cost an additional $2,000, the appropriate port communication needed to be established. The first program experimented with was one written by Ontrack.com and modified to work with the BarrettHand. This code was very long, slow, and proved to be unnecessary.

The next step was to write the code in Appendix A, section A.2. The code's sequence opens the port of communication and sets the baud rate, time-out, and other parameters. The BarrettHand has on-board microprocessors and is preprogrammed to recognize the end-effector control language (GCL). The common GCL commands are listed below in Table 10.

PAGE 44

Table 10: Common GCL Commands

Command   Name                        Purpose
C         Close                       Commands the motor to the closed position
HI        Hand Initialize             Initializes the GCL commands in the hand
HOME      Home                        Moves the motors to the home position
IO        Incremental Open            Steps the motors open by a user-specified increment
IC        Incremental Close           Steps the motors closed by a user-specified increment
LOOP      Loop                        Enters real-time mode
M         Move                        Moves the motors to a user-specified position
O         Open                        Opens the selected motor
T         Terminate Power             Turns selected motors off
HSG       Highest Strain Gage Value   Sets the highest strain gage value

The HI command is always run first, prior to all other commands, in order to initialize the GCL commands. Since there are no safety triggers built into the C++ code, the programmed commands are first run in the BarrettHand software package prior to writing the code. This is done to test the sequence of commands and to make sure that the BarrettHand is not forced into an unstable position or a position that would cause irreparable damage. An example of a series of commands created this way is shown in Appendix A, section A.3. To insert the GCL commands into the code, the user edits the port write command. It looks like: port.write("123C\r", 5); where 123C refers to the GCL command (the 123 prefix selects motors 1, 2, and 3) and the number five following the comma corresponds to the number of characters the program should read in the command.

PAGE 45

The BarrettHand will be mounted on the Puma robot arm. The versatility of this end-effector allows it to easily grasp most objects required for daily living. Using the BarrettHand in MATLAB is very simple because the hand can still be sent commands using the GCL language. The commands are sent as shown in Figure 18.

Figure 18: MATLAB Command Flowchart

The first two steps handle port communication, where s1 is assigned to port one, and the command serial('COM1') opens port one. The baud rate is set to 9600 bits per second, the parity is none, and the remaining parameters are set in accordance with the operating manual. The third command creates a temporary file, where the %5s indicates the length of the data to be sent to the hand's interface. The s1 variable is the port that is assigned to the BarrettHand, and fprintf actually writes the command to the port.
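A minimal MATLAB sketch of the command path that Figure 18 describes is shown below. It assumes the hand is on COM1, that each command is terminated with a carriage return, and it uses the HI, C, and O commands from Table 10; the pause lengths are arbitrary, and the thesis's own code in Appendix B.3 should be treated as authoritative (the flowchart itself uses a fixed-width '%5s' format string rather than '%s').

% Minimal sketch: send GCL commands to the BarrettHand over a serial port
s1 = serial('COM1');                                   % port object for the hand's RS-232 link
set(s1, 'BaudRate', 9600, 'Parity', 'none', 'Terminator', 'CR');
fopen(s1);                                             % open the port

fprintf(s1, '%s', 'HI');                               % initialize the hand (always the first command)
pause(10);                                             % allow the initialization to finish
fprintf(s1, '%s', '123C');                             % close motors 1, 2, and 3
pause(2);
fprintf(s1, '%s', '123O');                             % re-open the fingers

fclose(s1); delete(s1); clear s1                       % release the port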

PAGE 46

With these commands, it is possible to open and operate the BarrettHand in MATLAB or C++, depending on the platform that the robot manipulator operates on. The MATLAB program used to control the BarrettHand is shown in Appendix B, section B.3.

4.2 SICK Laser Range Finder

A SICK laser was also used in this experiment. The purpose of using it is to detect distance, the z coordinate. This is used for vision applications as well as for sending a signal to close the end-effector. This laser was used because it was already available; however, other units may be a better choice in order to avoid eye damage and perhaps reduce the size of the unit. A picture of the laser range finder is shown in Figure 19.

Figure 19: Laser Range Finder

The range finder was programmed using a port-writing C++ program similar to that of the BarrettHand. This code is shown in Appendix A, section A.4. When running this program, the numbers from the laser range finder are simply sent to the computer and displayed on a separate screen.
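For reference, a minimal MATLAB sketch of this kind of serial read-out is shown below. It assumes the range finder is on COM2 and that each reading arrives as an ASCII string terminated by a carriage return; the port name, baud rate, and message format are assumptions for illustration and should be matched to the DME 2000 manual and to the thesis code in Appendix B.1. Here str2double performs in one call the character-to-number conversion that the next paragraph describes doing digit by digit.

% Minimal sketch: read one distance value from the laser range finder
s2 = serial('COM2');                                   % assumed port for the DME 2000 RS-232 link
set(s2, 'BaudRate', 1200, 'Terminator', 'CR');         % slower rate, as discussed below
fopen(s2);

reply = fscanf(s2);                                    % read one ASCII reading from the sensor
if ~isempty(reply)                                     % guard against a null reading
    distance_mm = str2double(reply);                   % convert the character string to a number
    fprintf('Distance: %.0f mm\n', distance_mm);
end

fclose(s2); delete(s2); clear s2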

PAGE 47

Appendix A, section A.5 shows the integration of the BarrettHand commands with the laser range finder. This program runs, and once an object is within five inches of the BarrettHand, the hand closes. This value can be adjusted as necessary and will be experimentally tested, as the mounting of the laser on the arm can change.

The laser range finder's programming in MATLAB was a challenge because, although the port opening is similar to that of the BarrettHand (a baud rate of 9600 bits per second), data was determined to be lost. The baud rate was slowed down to 1200 to reduce the frequency of data loss. In the MATLAB program, the values read from the device had to be put through an if-then check to ensure that an actual value was received rather than a null value. Calculating the output value was different in MATLAB because the data is returned as character codes, which must be converted: the digits 0 through 9 are represented by the values 48 through 57, and the characters A through F by 65 through 70. Once the value of each digit is known, each is multiplied by the appropriate place value (1, 10, 100, ... or 1, 16, ...) and the results are added. This is shown in Appendix B, section B.1. The integration of the BarrettHand with the laser range finder is shown in Appendix B, section B.4.

4.3 Camera Systems

Four cameras were tested and used at different stages of the experimental process. The Bumblebee camera was eliminated the quickest, followed by the Logitech camera and then the Sony camera, leaving the Creative web camera. The vision aspect of this project is perhaps the hardest to understand. The initial

PAGE 48

tests were run with the Bumblebee stereovision camera because it is an inexpensive system that would be able to return three-dimensional coordinates if needed. The stereovision camera chosen for this application was the Bumblebee because of its light weight.

The general procedure taken was to collect many bottles and start with the camera very close, to obtain the point cloud of a bottle. Then it was necessary to write a simple algorithm for recognizing the object in the point clouds when the bottles are further away. Once that phase is accomplished, the camera can be mounted to a robot arm where it will obtain the three-dimensional coordinates and send them to the robot arm. Once the arm moves into position, a command is sent to the BarrettHand to grasp the object, return to the wheelchair or some fixed reference point, and finally release it.

Although this approach requires only one camera, it was determined to be expensive and labor intensive. When trying to retrieve the three-dimensional coordinates of an object, the camera had limited capabilities with regard to adjusting to the lighting and separating clear objects from noise. This prompted the need to set up a different vision system.

The second vision setup was less expensive and has two components: a range finder and a web camera. It was necessary to determine a transformation matrix for the laser range finder mounted on the robot arm. The range finder will be pointed at an object of reference, in this case a red object, and will return the distance from the range finder to the object of interest. The range finder will be stationary with respect to the robot arm and will

PAGE 49

be part of the robot arm's workspace. It will need a constant transformation from the range finder to the robot arm so that the proper final position can be obtained. This would work in a similar manner if the range finder were mounted on a wheelchair with a robot arm attached to it.

The camera will be attached above the BarrettHand on the arm and will take pictures before the z coordinate is reached. Using MATLAB's image acquisition and image processing toolboxes, the x and y coordinates will be obtained from known images such as a bottle or a door knob. The cameras used for these applications are the Logitech, Sony, and Creative cameras. Any Windows-based web camera can be used for this application.

The Logitech camera was used for the early stages of the vision algorithm development with the MATLAB image processing toolbox. This camera was utilized because it was the easiest to use and the resolution was perfect for this application. The Logitech also came with automatic light-adjusting capabilities. This camera was later replaced because of its face-tracking capabilities. The face tracking made it impossible to hold the camera steady. The face-tracking feature would capture a large image and then would follow it regardless of its movement. The camera would not stay focused in one direction, and the coding responsible for the motion of the camera is highly proprietary.

The next camera used was the Sony camera. This is a high-quality camera with excellent resolution. It required a frame grabber card which could not be located, so a video card was sufficient for the

PAGE 50

experimental purpose. The experiments included the entire processing plan as well as the calibration, which is shown in Table 11.

Table 11: Calibration Data for Sony Camera

Distance (in)   Height (in)   Width (in)   Area (in^2)   Width / Height
   1.18             2.0          1.3           2.6          0.65
   1.96             3.0          2.0           6.0          0.666667
   2.75             4.0          2.7          10.8          0.675
   3.54             4.5          3.1          13.95         0.688889
   4.33             5.5          4.0          22.0          0.727273
   5.11             6.3          4.5          28.35         0.714286
   5.90             7.2          5.2          37.44         0.722222
   6.69             8.0          5.5          44.0          0.6875
  44.68504         54           41          2214            0.759259
  42.87402         51           36          1836            0.705882
  40.3937          47           34          1598            0.723404
  37.87402         45           33          1485            0.733333
  35.74803         43           32          1376            0.744186
  33.34646         40           30          1200            0.75
  30.15748         38           28          1064            0.736842
  29.44882         36           27           972            0.75
  27.12598         33           24           792            0.727273
  24.84252         31           23           713            0.741935
  22.83465         29           22           638            0.758621
  20.90551         27           20           540            0.740741
  19.68504         25           18           450            0.72
  18.38583         23           16           368            0.695652
  16.10236         21           14           294            0.666667
  14.92126         18.5         13           240.5          0.702703

The calibration was done by adjusting the distance from a plain white wall to the camera. This distance was recorded, and the height and width of the displayed screen on the computer were measured on the white wall using a tape measure. This was done for the anticipated range of use for the camera, as all cameras have different characteristics which govern their field of view.
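Given data such as Table 11, the linear relationship between distance and the visible field of view can be recovered with a simple least-squares fit, which is what the trend lines in Figures 20 and 21 below represent. The MATLAB sketch below illustrates only the fitting step; the handful of sample points shown were copied from the first rows of Table 11 rather than re-measured, and the example prediction distance is arbitrary.

% Minimal sketch: fit field-of-view height and width as linear functions of distance
dist   = [1.18 1.96 2.75 3.54 4.33 5.11 5.90 6.69];   % distance to the wall (in), from Table 11
height = [2.0  3.0  4.0  4.5  5.5  6.3  7.2  8.0];    % observed display height on the wall (in)
width  = [1.3  2.0  2.7  3.1  4.0  4.5  5.2  5.5];    % observed display width on the wall (in)

ph = polyfit(dist, height, 1);    % ph(1) = slope, ph(2) = intercept of the height trend line
pw = polyfit(dist, width, 1);     % same for the width trend line

d_new  = 30;                      % a new range reading (in)
h_pred = polyval(ph, d_new);      % predicted visible height at that distance
w_pred = polyval(pw, d_new);      % predicted visible width at that distance
fprintf('At %.1f in the view is about %.1f in high by %.1f in wide\n', d_new, h_pred, w_pred);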

PAGE 51

Figures 20 and 21 show the calibration charts with a trend line which predicts the width and height of the camera view based on the range finder values.

Figure 20: Range Finder Distance Versus Height for Sony
(Chart of observed width and height in inches against distance from the camera in inches, with linear trend lines y = 1.1277x + 2.634 and y = 0.852x + 1.1639.)

This camera was later scrapped because the resolution made it difficult for the MATLAB image processing toolbox to differentiate between colors, and it often provided inconsistent results. This camera may have performed better with an adequate frame grabber card, as the card used for this was a simple video card with no special functions. This camera also had to be mounted over the BarrettHand, and the fingers of the BarrettHand would often obstruct the view of the camera.

PAGE 52

The final camera chosen for the application is the Creative Web Camera for Notebooks. This camera combines the best features of the Sony camera (size and versatility) with those of the Logitech camera (ease of use and dependability). The calibration is shown below in Table 12 and was conducted in exactly the same manner as for the Sony camera. The calibration equations are displayed below in Figure 21.

Table 12: Creative Camera Calibration

Distance (in)   Range Finder Distance (in)   Width     Height    Height / Width
     1                  12.5                  1.0625    0.75        0.705882
     2                  13.5                  2         1.5         0.75
     3                  14.5                  2.625     2           0.761905
     4                  15.5                  3.375     2.375       0.703704
     6                  17.5                  5         3.6         0.72
     8                  19.5                  6.375     4.625       0.72549
    10                  21.5                  8.875     6.5         0.732394
    12                  23.5                  9.625     7           0.727273
    15                  26.5                 12.5       9           0.72
    18                  29.5                 15.125    11.625       0.768595
    21                  32.5                 17.625    13           0.737589
    24                  35.5                 21        15           0.714286
    30                  41.5                 26        19           0.730769
    36                  47.5                 29.5      22.5         0.762712
    42                  53.5                 36        28.5         0.791667
    48                  59.5                 40        31           0.775

PAGE 53

Figure 21: Range Finder Distance Versus Height and Width of a Camera Display
(Chart of camera-displayed height and width in metres against range finder distance in metres, with linear trend lines y = 0.6521x - 0.1978 (R² = 0.9971) for height and y = 0.8407x - 0.2435 (R² = 0.9987) for width.)

4.4 Image Processing

In order to locate a specific object of interest, the object must first be defined. In this study, objects of interest were defined simply as objects colored red. This was done because red is not a common household color and it allows for the greatest amount of contrast with other colors. In order to locate red objects, the image acquisition and image processing toolboxes from MATLAB version 7.1 were used during these experiments.

Once the image was obtained using the image acquisition toolbox, the image processing toolbox was used. MATLAB has a built-in application called the L*a*b* color space. This color space is pre-programmed with a pixel library for blue, green, red, magenta, yellow, and cyan. It distinguishes these colors by

comparing the pixel spaces at the boundaries of an object. By utilizing the L*A*B color space, the program can distinguish which colors are present by comparing the edge pixel distances. Based on this comparison, it is able to determine which colors are present and separate them from the other colors. This MATLAB code is shown in Appendix B, section 2.

The general procedure for the image recognition is as follows: the colors are segmented and then filtered only for red. Once the red is located, all other colors turn black and the picture is converted to grey scale. From here, the holes are filled by the color next to them; a hole was defined as an object less than thirty pixels in diameter. The next step is the boundary trace function in MATLAB, in which the boundaries are traced along the white-to-black threshold. Once the boundaries are determined, the centroids are calculated through the same toolbox. The program calculates the area, in pixels, that the object occupies and then places the centroid at the center of the object. While this does not necessarily represent the mass center, it provides a good basis for the grasping application. When running the program, the camera should be positioned so that only one red object is present.

The following four pictures outline the image processing steps. Step 1, shown in Figure 22, is where the original image is acquired.
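As a reference while following the step figures, the sketch below condenses the procedure just described into a single MATLAB function. It is a simplification of the full program in Appendix B, section 2: the hypothetical function name findRedCentroid and the simple RGB thresholds stand in for the L*A*B segmentation actually used, while the thirty-pixel hole size follows the text.

function [xcen, ycen] = findRedCentroid(rgbImage)
% Condensed sketch of the section 4.4 pipeline (see Appendix B.2 for the
% full version used in the experiments).
rgb = double(rgbImage);
r = rgb(:,:,1); g = rgb(:,:,2); b = rgb(:,:,3);

% 1. Keep only the red pixels (assumed thresholds, not the L*A*B method).
redMask = r > 120 & r > 1.5*g & r > 1.5*b;

% 2. Remove objects smaller than thirty pixels and fill the holes.
redMask = bwareaopen(redMask, 30);
redMask = imfill(redMask, 'holes');

% 3. Trace the boundaries and compute the centroid from the pixel area.
[B, labels] = bwboundaries(redMask, 'noholes');
stats = regionprops(labels, 'Centroid');
centroids = cat(1, stats.Centroid);

xcen = centroids(1,1);   % column (x) of the first red object
ycen = centroids(1,2);   % row (y) of the first red object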

Figure 22: Original Image

As can be seen, there are many colors present in Figure 22. The object of interest is going to be the red ruler. Figure 23 shows the image once it has been filtered for red and then converted to grey scale.

Figure 23: Filtered Image

The image is converted to grey scale to allow the holes to be filled in the next step without a color bias. In the next step, the holes in the image are filled: the background becomes completely black with no white marks, and the ruler appears as a solid rectangle. This is shown in Figure 24.

Figure 24: Black and White Image

The next step is to use MATLAB's boundary trace function. With this function, the edges are traced and the centroid is then calculated based on the pixel area. This is shown in Figure 25, with the green mark representing the calculated centroid of the ruler.

Figure 25: Ruler Centroid

As can be seen, this process is completed with relatively little error. The closer the camera gets to the object, the more accurate the calculations become.

4.5 Integration Platform

The integration of these programs is done in MATLAB. The programs follow the flow shown in Figure 26.

Figure 26: Program Flow Chart

The main program is broken into three sub-programs: Centroid, Laser with BarrettHand, and BarrettHand Open. The Centroid sub-program operates by calling the vision algorithm and laser range finder programs shown in Appendix B, section 2. The vision program runs and finds the centroid's location, after which it calls the laser range finder and feeds its value into the calibration equation. This returns the x, y and z coordinates of the centroid's location with respect to the camera's center. The integration of the BarrettHand, camera, and laser range finder is applied to a Puma arm as shown in Figure 27.
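The conversion performed by the Centroid sub-program can be sketched as follows. The linear calibration coefficients are the ones shown in Figure 21, and the 320 x 240 image size matches the center point (160, 120) used in the Appendix B code; the function name pixelToMetric and the sign conventions are illustrative only, and the exact scaling used in the experiments is in Appendix B, section 5.

function [x, y, z] = pixelToMetric(xcenPix, ycenPix, rangeMeters)
% Sketch of the coordinate conversion used by the Centroid sub-program.
% Calibration lines from Figure 21 (Creative camera, meters):
viewHeight = 0.6521*rangeMeters - 0.1978;   % height of the visible scene
viewWidth  = 0.8407*rangeMeters - 0.2435;   % width of the visible scene

% The image is 320 x 240 pixels, so (160, 120) is the camera center.
x = (xcenPix - 160)/320 * viewWidth;    % meters right of the camera axis
y = (ycenPix - 120)/240 * viewHeight;   % meters below the camera axis
z = rangeMeters;                        % range finder reading, straight ahead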

Figure 27: Puma Arm Kinematic Setup

The transformation matrices were shown in full detail in chapter three. These transformations are necessary for calculating the inverse kinematics of the Puma arm. This suite did not utilize that information, but the groundwork is being laid for future testing of the output results from the MATLAB programming. All of the following matrices are needed in order to obtain the joint angles the robot arm must move to. From chapter three, equation 3.16 states:

$^0_6T = \; ^0_1T \; ^1_2T \; ^2_3T \; ^3_4T \; ^4_5T \; ^5_6T$   (3.16)

In order to obtain the joint angles, the following paradigm has to be followed. First, a nonmoving base frame must be defined. Equation 4.1 shows the transformation matrix of the Puma arm base frame with respect to frame zero. This is illustrated in Figure 28.

$^0_BT = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0.672 \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.1)

Figure 28: Translation of O Frame to Base

Equation 4.1 contains only a translation component, in the z direction, for this transformation. Secondly, the workstation frame must be known with respect to the base frame. There is only translation in this transformation matrix, shown in equation 4.2.

$^B_ST = \begin{bmatrix} 1 & 0 & 0 & 0.68 \\ 0 & 1 & 0 & 0.23 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.2)

This is also illustrated in Figure 29. The orange lines in Figure 29 represent the translation in the x and y directions of the transformation matrix; since the workstation was at the same height as the base frame, no translation along the z axis occurs.

Figure 29: Translation from Base Frame to Workstation Frame

The next step is to find the transformation matrix of the camera with respect to the wrist frame, which was assigned to be frame number six in chapter

three. This transformation matrix is shown in equation 4.3 and contains translation along the y and z axes only. Frame 6 is shown in Figure 32.

$^6_CT = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0.02 \\ 0 & 0 & 1 & 0.2 \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.3)

The goal point, called frame P in this case, was described with respect to the workstation frame. This is demonstrated in equation 4.4 and illustrated in Figure 30.

$^S_PT = \begin{bmatrix} 1 & 0 & 0 & 0.038 \\ 0 & 1 & 0 & 0.41 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.4)

Figure 30: Part with Respect to Workstation

Figure 30 shows the translation portion of the transformation matrix in yellow; again, there is no rotation to take into consideration.
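To make the frame bookkeeping concrete, the sketch below enters the transformations of equations 4.1 through 4.4 as homogeneous matrices in MATLAB and composes them along the workstation route. The helper trans and the composition shown simply mirror the equations above; the sketch is not part of the thesis code.

% Sketch only: homogeneous transforms for equations 4.1 through 4.4.
trans = @(x, y, z) [1 0 0 x; 0 1 0 y; 0 0 1 z; 0 0 0 1];   % pure translation

T_0_B = trans(0,     0,    0.672);   % base frame w.r.t. frame zero   (4.1)
T_B_S = trans(0.68,  0.23, 0);       % workstation w.r.t. base        (4.2)
T_6_C = trans(0,     0.02, 0.2);     % camera w.r.t. wrist frame 6    (4.3)
T_S_P = trans(0.038, 0.41, 0);       % part w.r.t. workstation        (4.4)

% Composing along the workstation route expresses the part in frame zero.
T_0_P = T_0_B * T_B_S * T_S_P;
disp(T_0_P(1:3,4)');   % position of the part in the fixed frame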

Equation 4.5 shows the transformation matrix of the part frame with respect to the view of the camera. In this equation, the fourth column holds the results of the centroid computation. This is shown in Figure 31.

$^C_PT = \begin{bmatrix} 1 & 0 & 0 & X_{centroid} \\ 0 & 1 & 0 & Y_{centroid} \\ 0 & 0 & 1 & Z_{centroid} \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.5)

Figure 31: Part with Respect to Camera

Figure 31 shows the change in orientation of the y and z axes and the translation along the x, y and z coordinates, shown in green. The last frame that must be defined is the end-effector frame. The end-effector frame is defined with respect to joint 6. This is shown in equation 4.6 and illustrated in Figure 32.

$^6_GT = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0.33 \\ 0 & 0 & 0 & 1 \end{bmatrix}$   (4.6)

Figure 32: End-effector Frame with Respect to Frame 6

In Figure 32, the red arrow represents translation along the z axis, and there is no rotation in this transformation. In order to obtain the necessary joint angles for the Puma arm configuration, equation 4.7 must be satisfied. This provides a check point to ensure everything is operational and accurate. Figure 33 shows the overall transformation of the Puma arm.

Figure 33: Transformations of the Puma Arm

In Figure 33, the burgundy line represents the transformation of frame 6 with respect to frame 0. The dark green line represents the transformation of the part with respect to frame 6. The product of these two transformations is equal to the product of the following transformations:

- The transformation of the base with respect to the origin, shown in blue.
- The transformation of the workstation with respect to the base, shown in light green.
- The transformation of the part with respect to the base, shown in teal.
- The inverse of the transformation of the part with respect to the camera, shown in army green.

- The transformation of the camera with respect to frame 6, shown in yellow.

These transformations are combined in equation 4.7.

$^0_6T \; ^6_CT = \; ^0_BT \; ^B_ST \; ^S_PT \; \left(^C_PT\right)^{-1}$   (4.7)

Equation 4.7 holds in the current position. This allows the joint angles for the current position to be calculated, and these results can be used as input for the inverse kinematics solver in future applications. For the arm to reach the joint angles of the desired location, the end-effector frame and the part frame should have zero relative translation, meaning they should be coincident. Equations 4.8 and 4.9 show this, where $^0_6T_d$ represents the transformation matrix of the desired end-effector location with respect to the origin.

$^0_6T_d \; ^6_GT = \; ^0_6T \; ^6_CT \; ^C_PT$   (4.8)

This is also equal to equation 4.9.

$^0_6T_d \; ^6_GT = \; ^0_BT \; ^B_ST \; ^S_PT$   (4.9)

Equation 4.9 serves as a check for equation 4.8. These last two equations are used to determine what joint angles the Puma arm must move to in order to place the end-effector at the part's location.

4.6 Operating Procedures

The sensory suite was shown to be very user friendly and is suitable to be integrated into any robot arm capable of Cartesian movement, with one exception: the transformation matrix from the camera to the laser range finder must be changed every time the camera or the laser range finder is moved to a different location. However, as long as the displacement between the two

objects remains along a single coordinate, in this case the z-coordinate, the change is simply a matter of subtracting the offset value in the MATLAB program seen in Appendix B.

Once mounted on a wheelchair, the camera must be aimed at an object, which means that some user input is necessary. Once an object of interest is within the camera's viewpoint, the image processing program must be run. Once a positive centroid is calculated, the results need to be sent to the arm; this step is dependent on the robot arm and must be integrated accordingly. Once the results are sent to the robot arm, it starts to move. When this happens, the automatic gripping program should be run. Once the robot arm is within a certain distance of the object of interest, the hand closes automatically. Once closed, the arm can be moved back to the user's workspace.
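A minimal sketch of the automatic gripping step described above is shown below. It condenses the logic of the laser range finder and BarrettHand program in Appendix B, section 4; the helper readLaserMeters and the 5 cm closing threshold are assumptions for illustration, not the values used in the experiments.

% Sketch only: automatic gripping loop (condensed from Appendix B.4).
% Assumes the serial ports for the laser range finder (s5) and the
% BarrettHand (s1) have already been opened as in Appendix B.
closeThreshold = 0.05;                       % meters; illustrative value only
while true
    range = readLaserMeters(s5);             % hypothetical helper: one range reading
    if range <= closeThreshold
        fprintf(s1, sprintf('%5s', '123C')); % GCL command: close all three fingers
        break;
    end
end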

Chapter Five

Analysis of Results and Conclusions

5.1 Results

The final set-up appears in Figure 34.

Figure 34: Final Puma Arm Configuration

This figure shows the workstation, the arm, the BarrettHand, the camera in the center of the hand, and the laser range finder. The camera in the center of the hand is shown in Figure 35.

Figure 35: Camera in Hand

In order to properly determine how well the integration works, several experiments were run. The first experiment placed objects of interest in different locations with different backgrounds to see how closely the program's calculated centroid matched the actual centroid, as shown in Table 13.

Table 13: Before and After Image Processing

Original Image | Centroid Location

[Table 13 consists of side-by-side image pairs: the original image and the corresponding image with the calculated centroid marked.]

Table 13: Continued

In the next test, four different objects were tested. The image processing program was run and the values of the centroid were displayed. Ten readings were taken for each object; these readings were averaged together and the results compared. The four objects were a ratcheting tie-down, a ball, a ruler, and a box of staples, all of which are red. These results are quantified in Table 14.

Table 14: Actual vs. Calculated Centroid in x, y, z Direction

                 Computer Value   Actual Measured   Difference
                 Average (m)      Distance (m)      (m)
Object A    X    -0.005           -0.006            0.001
            Y    -0.0079          -0.009            0.0011
            Z     0.2018           0.205            0.0032
Object B    X    -0.009           -0.007            0.002
            Y    -0.008           -0.007            0.001
            Z     0.267            0.268            0.001
Object C    X    -0.009           -0.010            0.001
            Y    -0.008           -0.008            0.000
            Z     0.2345           0.234            0.0005
Object D    X    -0.007           -0.005            0.001
            Y    -0.009           -0.008            0.001
            Z     0.212            0.210            0.002

As can be seen from Table 14, the greatest error is about 3 millimeters in the z direction, 2 millimeters in x, and 1 millimeter in y. This means the calculated x, y, and z coordinates are essentially equal to the actual locations; even though there is a difference, it falls within the margin of error of physically measuring the actual centroid. There was also some error found in the camera calibration of the Creative web camera, which is shown in Table 15.

Table 15: Creative Camera Calibration Error

Distance (in)  Range Finder Distance (in)  Width (in)  Height (in)  H/W       Expected Ratio  Error %
1              12.5                        1.0625      0.75         0.705882  0.75            5.882353
2              13.5                        2           1.5          0.75      0.75            0
3              14.5                        2.625       2            0.761905  0.75            -1.5873
4              15.5                        3.375       2.375        0.703704  0.75            6.17284
6              17.5                        5           3.6          0.72      0.75            4
8              19.5                        6.375       4.625        0.72549   0.75            3.267974
10             21.5                        8.875       6.5          0.732394  0.75            2.347418
12             23.5                        9.625       7            0.727273  0.75            3.030303
15             26.5                        12.5        9            0.72      0.75            4
18             29.5                        15.125      11.625       0.768595  0.75            -2.47934
21             32.5                        17.625      13           0.737589  0.75            1.654846
24             35.5                        21          15           0.714286  0.75            4.761905
30             41.5                        26          19           0.730769  0.75            2.564103
36             47.5                        29.5        22.5         0.762712  0.75            -1.69492
42             53.5                        36          28.5         0.791667  0.75            -5.55556
48             59.5                        40          31           0.775     0.75            -3.33333

The measured ratio should match the expected ratio; the maximum error seen in this calibration was about 6.2%, which can contribute to the error in the calculated x and y directions.

The final results are shown in the following sequence of pictures, with each step illustrated in turn.

Step 1: Acquire the original image.

Figure 36: Original Test Image

Step 2: Locate the image centroid.

Figure 37: Centroid Test Image

The location of the centroid is x = -0.008, y = -0.005, z = 0.218, with all dimensions in meters.

Step 3: The arm moves into place, shown in Figure 38.

Figure 38: Puma Arm Moving

Step 4: The BarrettHand automatically closes around the center of the object, shown in Figure 39.

Figure 39: Grasping Test

Step 5: The object is returned to the person's workspace, in this case the blue cup, shown in Figure 40.

Figure 40: Returning to Workspace

5.2 Discussion

There are several possible sources of error. The greatest source of error in the results is the camera calibration readout. When calibrating the camera, a simple ruler was used; this is not nearly as accurate as the laser range finder, so a certain amount of loss is accepted in this process.

Another source of error was encounter ed with the image processing. If the room is too dark, the program will not work. It is also essential that there only be one object of interest in the camera fiel d of view. If this does not happen, the arm receives two output coordinates and cannot distinguish between the two coordinates; therefore, the a rm does not know where to go. This type of error is a fatal error and will result in the program malfunctioning. 5.3 Conclusion In conclusion, the end-e ffector sensory suite meet s all the objectives as defined in chapter one. An end-e ffector sensor suite to a ssist in the control of a rehabilitation robot end-effector was developed. This included: Programming and integration of the BarrettHand end effector to the Puma 560 robot Selection and integrat ion of a laser range finder with the manipulator Selection and integr ation of a vision system with the manipulator Experiments were also performed with t he sensory suite integrated with the manipulator on the arm. This included: Using the end-effector sensor suite s assistance in finding the centroid of an object Analyzing object centroid location dat a to ensure accurate results were obtained Developed algorithms for moving t he robot hand to grasp the object 65

5.4 Recommendations

The operating platform for this experiment would be greatly improved by complete Cartesian control of the Puma arm, which would allow for a fully functional robot system. A better arm would also provide a better platform for the sensory suite; the wheelchair arm created by Kevin Edwards, shown in Figure 32, would provide the perfect mobile platform.

Other improvements can be made to the vision algorithms. If object recognition were utilized instead of color recognition, there would be less interference from the background. Object recognition requires high resolution in order to properly identify the object, so it would work best with the Sony camera tested. In the future, combining the Sony camera with a proper frame grabber card could lead to more stable and accurate results.

References

[1] US Census Bureau. Facts for Features. July 26, 2002. http://www.census.gov/Press-Release/www/2002/cb02ff11.html. January 2006.

[2] Wikipedia. Disability. 13 November 2005. http://en.wikipedia.org/wiki/Disability. 15 Nov. 2005.

[3] United Nations Statistics Division. Washington Group on Disability Statistics. 2006. http://unstats.un.org/unsd/methods/citygroup/ashington.htm. May 2006.

[4] National Center for Health Statistics. National Health Interview Survey on Disabilities. 1994. http://www.cdc.gov/nchs/about/major/nhis_dis/ad292tb1.htm. May 2006.

[5] Lancioni, G. E.; O'Reilly, M. F.; Basili, G. "An overview of technological resources used in rehabilitation research with people with severe/profound and multiple disabilities." Disability and Rehabilitation 23, no. 12 (2001): 501-508.

[6] Dallaway, JL; Jackson, RD; Timmers, PHA (1995). "Rehabilitation Robotics in Europe." IEEE Transactions on Rehabilitation Engineering 3: 35-45.

[7] Bien, Zeungnam; Chung, Myung-Jin; et al. "Integration of a Rehabilitation Robotic System (KARES II) with Human-Friendly Man-Machine Interaction Units." Autonomous Robots 16, no. 2 (2004): 165-191.

[8] Edwards, K. "Design, Construction and Testing of a Wheelchair Mounted Robotic Arm." Master's Thesis. University of South Florida. July 2005.

[9] Fritz, B. "Development and Testing of Telerobotic System for Enhancing Manipulation Capabilities of Persons with Disabilities." Master's Thesis. University of South Florida. July 2002.

[10] Jurczyk, M. U. "Shape Based Stereovision Assistance in Rehabilitation Robotics." Master's Thesis. University of South Florida. March 2005.

[11] Martin, M. "Genetic Programming for Robot Vision." From Animals to Animats 7 (SAB 2002), Edinburgh, UK, 2002.

[12] Kragic, D.; et al. "Vision for robotic object manipulation in domestic settings." Robotics and Autonomous Systems, 2005.

[13] Allen, P.; et al. "Integration of Vision, Force and Tactile Sensing for Grasping." Department of Computer Science, Columbia University, New York, NY 10027.

[14] Corke, Peter. Puma Goodies. http://www.cat.csiro.au/. June 2006.

[15] Craig, John J. Introduction to Robotics: Mechanics and Control, Third Edition. Pearson Prentice Hall, Upper Saddle River, NJ, 2005, pp. 80-100.

[16] Logitech Webcams. QuickCam Orbit MP. 2006. http://www.logitech.com/. May 2006.

[17] Best Buy. Creative Labs WebCamera for Notebooks. 2006. http://www.bestbuy.com/site/olspage.jsp?skuId=6572675&type=product&id=1077630651844. June 2006.

[18] SICK Sensor Intelligence. DME 2000 and DT60. 2006. http://www.sick.com/gus/products/product_catalogs/industrial_sensors/en.html. March 2006.

[19] Barrett Technology. BarrettHand. 1997. http://www.barrett.com/robot/products/hand/handfram.htm. July 2005.

[20] Mathworks. Image Processing Toolbox 5.2. http://www.mathworks.com/products/image/. 2006.

Appendices

Appendix A: C++ Code

C++ is used throughout this project to control a robot end-effector (the BarrettHand) and a laser range finder. The C++ code used in this thesis was designed to control Acquisition Data Report (ADR) interfaces. When connected to a serial port, the ADR interface boards allow control of analog and digital inputs and outputs using American Standard Code for Information Interchange (ASCII) control commands. ADR interfaces are easy to use with Visual Basic, Basic, C, or other high-level languages that allow access to a serial port. The C++ language protocol and definitions are set by the American National Standards Institute (ANSI). The common commands and definitions used for this project are listed in Table 16.

Table 16: C++ Code Definitions

Command                    Definition
iostream.h                 Library that provides functionality to perform input and output operations with a stream
BUF_SIZE 80                Sets the buffer size to 80
COMPort port("COM1")       Calls to ADR interface port 1
comport.h                  Library that provides the functionality to perform port communication
conio.h                    Loads DOS-specific commands; it is not a standard ANSI library
define                     Defines a variable in the program
if                         The beginning of a common if-then statement
include                    Initiates the library in the program
int                        Initiates a command
port.setBitRate            Sets the bit rate, the data rate expressed in bits per second. This is similar to baud, but the latter is more applicable to channels with more than two states.
port.setDataBits           Sets the data bit rate through the port
port.setParity             Sets the parity, which is used in the error detection procedure
port.setStopBits           Sets the stop bits, the extra "1" bits that follow the data and any parity bit and mark the end of a unit of transmission
port.write                 Writes commands to the device connected to the port
printf                     Prints display in a command window
sleep                      Delays the program a specified amount of time
stdio.h                    Library that provides the ANSI I/O functions that allow reading and writing to files and devices
time.h                     Library that provides time data

Appendix A: (Continued) A.1 Automatic C++Code The following code is the automatically generated C++ code generated by the BarrettHand program. This code is useful if the C function library for the hand is installed. In this research, that library was not available, but an example of the code is as follows. #include #include #include #include #include "BHand.h" BHand bh; // Handles all hand communication int value; // Hand parameter obtained with Get int result; // Retu rn value (error) of all BHand calls void Error(void) { printf( "ERROR: %d\n%s\n", resu lt, bh.ErrorMessage(result) ); exit(0); } void Initialize(void) { if( result=bh.InitSoftware(1,THR EAD_PRIORITY_TIME_CRITICAL) ) Error(); if( result=bh.ComSetTi meouts(0,100,15000,100,5000) ) Error(); if( result=bh.Baud(9600) ) Error(); if( result=bh.InitHand("") ) Error(); } // Execute commands, return 1 if interrupted with a key int Execute(void) printf( "Press Any Key to Abort..." ); if( result=bh.GoToHome() ) Error(); if( _kbhit() ) { _getch(); return 1; } if( result=bh.Close( "123" ) ) Error(); if( _kbhit() ) { _getch(); return 1; } 74

Appendix A: (Continued) if( result=bh.Delay( 10000 ) ) Error(); if( _kbhit() ) { _getch(); return 1; } if( result=bh.GoToHome() ) Error(); if( _kbhit() ) { _getch(); return 1; } return 0; } // Main function initialize, execute void main(void) { printf( "Initialization..." ); Initialize(); printf( Done\n" ); printf( "Executing ); Execute(); printf( Done\n" ); } A.2: BarrettHand Initialization C++ Code This code was created for this pr oject to open communication with the BarrettHand Port and to initialize the hand. Whenever using the BarrettHand in C++, this code must be run prior to any other code or else the GCL library will not be activated. #include #include #include #include "comport.h" #include #include #define BUF_SIZE 80 //global variables int mStartFlagG = 0; // prototypes void sleep( clock_t wait ); int main() { COMPort port("COM1"); //char buffer[BUF_SIZE]; //int bytesRead; 75

Appendix A: (Continued) int i = 0; port.setBitRate(COMPort::br9600); port.setParity(COMPort::None); port.setDataBits(COMPort::db8); port.setStopBits(COMPort::sb1); //Command lines being sent to the hand /****************************************************************************************** HI only needs to be run when first power ing up the BHand, otherwise it can be commented out, also is HSG is set to low, the fingers will not close. The timer also needs to be places in-betw een each command to allow, otherwise the command is bypassed ********************************* *****************************************************/ if (mStartFlagG == 0) { printf("Hi Starting\n"); port.write("HI\r",3); sleep( (clock_t)5 CLOCKS_PER_SEC ); port.write("123FSET HSG 350\r",16); } sleep( (clock_t)1 CLOCKS_PER_SEC ); sleep( (clock_t)7 CLOCKS_PER_SEC ); return 0; } /* Pauses for a specified number of milliseconds. */ void sleep( clock_t wait ) { clock_t goal; goal = wait + clock(); while( goal > clock() ) ; } A.3: BarrettHand Demo C++ Code This code was created to serve as a template for writing different commands to the BarrettHand. Basicall y every closed loop GCL command is tested in this program. This code is great for demonstrating the capabilities of the BarrettHand. #include #include #include #include "comport.h" #include #include #define BUF_SIZE 80 76

Appendix A: (Continued) int mStartFlagG = 0; // prototypes void sleep( clock_t wait ); int main() { COMPort port("COM1"); //char buffer[BUF_SIZE]; //int bytesRead; int i = 0; //int MtimeClose, MtimeOpen; //port.setBitRate(COMPort::br19200); port.setBitRate(COMPort::br9600); port.setParity(COMPort::None); port.setDataBits(COMPort::db8); port.setStopBits(COMPort::sb1); /*printf("Enter seconds to stay close: "); scanf("%i",&MtimeClose); printf("\nEnter seconds to stay open: "); scanf("%i",&MtimeOpen); //while(port.read() != '+'); while(1) { bytesRead = port.read(buffer, 5); //sleep(5000); sleep( (clock_t)MtimeClose CLOCKS_PER_SEC ); //getchar(); buffer[bytesRead-1] = '\0'; cout << "Read << bytesRead << by tes. Message was:" << endl; cout << buffer << endl; }*/ /*if (mStartFlagG == 0) { printf("Hi Starting\n"); port.write("HI\r",3); sleep( (clock_t)5 CLOCKS_PER_SEC ); port.write("123FSET HSG 350\r",16); } sleep( (clock_t)5 CLOCKS_PER_SEC );*/ sleep( (clock_t)7 CLOCKS_PER_SEC ); port.write("123O\r",5); sleep( (clock_t)2 CLOCKS_PER_SEC ); port.write("123C\r",5); sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("123O\r",5); 77

Appendix A: (Continued) sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("SC\r",3); sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("SO\r",3); sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("123C\r",5); sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("123O\r",5); sleep( (clock_t)5 CLOCKS_PER_SEC ); port.write("12C\r",4); sleep( (clock_t)6 CLOCKS_PER_SEC ); port.write("12O\r",4); sleep( (clock_t)2 CLOCKS_PER_SEC ); port.write("123SC\r",6); sleep( (clock_t)1 CLOCKS_PER_SEC ); port.write("123O\r",5); sleep( (clock_t)3 CLOCKS_PER_SEC ); port.write("123SO\r",6); sleep( (clock_t)14 CLOCKS_PER_SEC ); port.write("123C\r",5); sleep( (clock_t)22 CLOCKS_PER_SEC ); port.write("123O\r",5); printf("T starting\n"); port.write("T\r",2); mStartFlagG = 1; return 0; } /* Pauses for a specified number of milliseconds. */ void sleep( clock_t wait ) {clock_t goal; goal = wait + clock(); while( goal > clock() ) ; } A.4: Laser Range Finder C++ Code The following code operates the la ser range finder. The port is opened in the same way as the BarrettHand, but only values are read. The port automatically converts the bites the actual values, so no parsing is needed in C. #include "comport.h" #include #include #include 78

Appendix A: (Continued) #include #define BUF_SIZE 80 int main() {COMPort port("COM2"); char buffer[BUF_SIZE]; int bytesRead, i; int NReadings = 5; port.setBitRate(COMPort::br9600); port.setParity(COMPort::Even); port.setDataBits(COMPort::db7); port.setStopBits(COMPort::sb1); float avg = 0.0; while(port.read() != '+'); i= 0; while(i <= NReadings) { bytesRead = port.read(buffer, 5); buffer[bytesRead-1] = '\0'; //cout << "Read << bytesRead << by tes. Message was:" << endl; cout << buffer << endl; i++; avg = avg + atof(buffer); } avg = avg/(NReadings+1)*0.0393700787; printf("distance in inches is = %4.6f\n",avg); return 0; } A.5: Laser Range Finder with BarrettHand C++ Code This code combines the BarrettHand with the laser range finder. It does this with an if statement that can be adjusted as needed. Basically, when a specific reading of the laser range finder occurs, the hand will close. /*#include "comport.h" #include #include #include #include #include #define BUF_SIZE 80 int main() {COMPort port("COM2"); char buffer[BUF_SIZE]; int bytesRead; 79

Appendix A: (Continued) int n; int avg; int NReadings; int i; port.setBitRate(COMPort::br9600); port.setParity(COMPort::Even); port.setDataBits(COMPort::db7); port.setStopBits(COMPort::sb1); while(port.read() != '+'); { bytesRead = port.read(buffer, 5); buffer[bytesRead-1] = '\0'; //cout << "Read << bytesRead << by tes. Message was:" << endl; cout << buffer << endl; n = atof(buffer); } return 0; */ #include #include #include #include "comport.h" #include #include #include #define BUF_SIZE 80 // prototypes void sleep( clock_t wait ); int main() { int NReadings = 10; char buffer[BUF_SIZE]; int bytesRead, i; //create port1: will be assigned to BHand COMPort port1("COM1"); port1.setBitRate(COMPort::br9600); port1.setParity(COMPort::None); port1.setDataBits(COMPort::db8); port1.setStopBits(COMPort::sb1); //create port2: will be assigned to Laser COMPort port2("COM2"); port2.setBitRate(COMPort::br9600); port2.setParity(COMPort::Even); port2.setDataBits(COMPort::db7); 80

Appendix A: (Continued) port2.setStopBits(COMPort::sb1); float avg = 0.0; while(port2.read() != '+'); i= 0; //while (i <= NReadings) for(;;) { i=0; i++; bytesRead = port2.read(buffer, 5); buffer[bytesRead-1] = '\0'; //cout << "Read << bytesRead << by tes. Message was:" << endl; cout << buffer << endl; avg = 0.0; avg = atof(buffer); //avg = avg + atof(buffer); avg = avg 0.0393700787; if(avg <= 5.0){ port1.write("123C\r",5); //sleep( (clock_t)1 CLOCKS_PER_SEC ); port1.write("123c\r",5); sleep( (clock_t)100 CLOCKS_PER_SEC ); //port1.write("123T\r",5); port1.write("123o\r",5); avg = 100.0;} } return 0; } /* Pauses for a specified number of milliseconds. */ void sleep( clock_t wait ) { clock_t goal; goal = wait + clock();while( goal > clock() ) ; } 81

Appendix B: MATLAB Code

MATLAB's image acquisition and image processing toolboxes were utilized during this project. MATLAB was chosen because it can be easily integrated with the robot arm manipulation calculations. The toolboxes can be used with over-the-counter cameras, such as those from Logitech, Sony, and other basic web cameras.

The image acquisition toolbox provides functions for acquiring and displaying images from a camera. It interfaces with Windows-compatible video-capture devices, such as USB and FireWire (IEEE-1394) scientific video cameras, as well as web cameras, capture boards, and DV camcorders. Only a few main commands from this toolbox are used, as shown in Table 17.

Table 17: MATLAB Image Acquisition Commands

Command                                Definition
data = getsnapshot(vid);               Gets an instant snapshot from the video device
delete(vid);                           Deletes the video object to free extra memory
preview(vid);                          Previews the video using a frame rate of 30 frames per second
set(vid.source,'Brightness',100);      Sets the brightness for the previewed video
vid = videoinput('winvideo', 1);       Accesses a Windows-ready imaging device, such as a web camera

Once the image has been acquired, it is processed using MATLAB's image processing toolbox. Table 18 shows the common commands and definitions for the MATLAB image processing toolbox [19].
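A minimal usage example of the Table 17 commands is sketched below; the device index, pause length, brightness value, and output file name are illustrative and depend on the camera that is attached.

% Sketch only: acquire one frame from a Windows web camera.
vid = videoinput('winvideo', 1);          % first winvideo device (assumed index)
set(vid.source, 'Brightness', 100);       % example brightness setting
preview(vid);                             % live preview window
pause(2);                                 % give the camera time to settle
data = getsnapshot(vid);                  % grab one RGB frame
imwrite(data, 'snapshot.png', 'png');     % save it for the processing step
delete(vid);                              % release the device
clear vid;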

Table 18: Common MATLAB Image Processing Commands

Command        Definition
rgb2gray       Converts an RGB image or colormap to grayscale
bwareaopen     Removes all objects below a specified pixel requirement
imfill         Fills in holes below a specified pixel requirement
bwboundaries   Traces the boundaries
centroid       Calculates the centroid

B.1 Laser Range Finder MATLAB Code

This code is used for the laser range finder. It first opens the port and then obtains the data. Afterwards the data is parsed, meaning that the bytes read are converted into values. When converting bytes to numerical values, the ASCII codes for the characters 0 through 9 are 48 through 57, so the value of a numeric character equals (ASCII code - 48); in hex, the characters A through F (ASCII 65 through 70) would be (ASCII code - 55). Once the value of each digit is found, each is multiplied by the appropriate place value (1, 10, 100) or (1, 16), and the results are added.

function avg1 = test_serial()
% To construct a serial port object:
clear
clear all
s1 = serial('COM1', 'BaudRate', 9600);
% To connect the serial port object to the serial port:
fopen(s1)
% To query the device.
A = fread(s1,7)
% To disconnect the serial port object from the serial port.
fclose(s1);
delete(s1);
%A1(1) = 53;
%A1(2) = 48;
%A1(3) = 48;
%R = 0.0;
%R = (char(A(3)) - 48)*100 + (char(A(4)) - 48)*10 + (char(A(5)) - 48)
% Parsing of Sick data set
% 1

Appendix B: (Continued) if A(2) == 177 A(2) = 48+1; end; if A(3) == 177 A(3) = 48+1; end; if A(4) == 177 A(4) = 48+1; end; if A(5) == 177 A(5) = 48+1; end; % 2 if A(2) == 178 A(2) = 48+2; end; if A(3) == 178 A(3) = 48+2; end; if A(4) == 178 A(4) = 48+2; end; if A(5) == 178 A(5) = 48+2; end; if A(2) == 179 A(2) = 48+3; end; if A(3) == 179 A(3) = 48+3; end; if A(4) == 179 A(4) = 48+3; end; if A(5) == 179 A(5) = 48+3; end; if A(2) == 180 A(2) = 48+4; end; if A(3) == 180 A(3) = 48+4; end; if A(4) == 180 84

Appendix B: (Continued) A(4) = 48+4; end; if A(5) == 180 A(5) = 48+4; end; % 5 if A(2) == 181 A(2) = 48+5; end; if A(3) == 181 A(3) = 48+5; end; if A(4) == 181 A(4) = 48+5; end; if A(5) == 181 A(5) = 48+5; end; % 6 if A(2) == 182 A(2) = 48+6; end; if A(3) == 182 A(3) = 48+6; end; if A(4) == 182 A(4) = 48+6; end; if A(5) == 182 A(5) = 48+6; end; % 7 if A(2) == 183 A(2) = 48+7; end; if A(3) == 183 A(3) = 48+7; end; if A(4) == 183 A(4) = 48+7; end; if A(5) == 183 A(5) = 48+7; end; 85

Appendix B: (Continued) % 8 if A(2) == 184 A(2) = 48+8; end; if A(3) == 184 A(3) = 48+8; end; if A(4) == 184 A(4) = 48+8; end; if A(5) == 184 A(5) = 48+8; end; % 9 if A(2) == 185 A(2) = 48+9; end; if A(3) == 185 A(3) = 48+9; end; if A(4) == 185 A(4) = 48+9; end; if A(5) == 185 A(5) = 48+9; end; R = [A(1) A(2) A(3) A(4) A(5)]; % current measurement is contained in A (elements 2, 3,4, and 5) Rx = char(R); avg1 = Rx; display(avg1); B.2: Centroid MATLAB Code This code handles the main image proc essing in MATLAB. It is explained in full detail in Chapter 4. function [xcen, ycen] = mycentroid() vidobj = videoinput(' winvideo', 2); preview(vidobj); pause(15); fabric = getsnapshot(vidobj); imwrite(fabric,'fabric2.png','png'); delete(vidobj) clear vidobj; 86

Appendix B: (Continued) fabric = imread('fabric2.png'); %Figure(1), imshow(fabric), title('fabric'); load regioncoordinates; nColors = 6; sample_regions = false([size(fabr ic,1) size(fabric,2) nColors]); for count = 1:nColors sample_regions(:,:,count) = roipoly(f abric,region_coordinates(:,1,count),... r egion_coordinates(:,2,count)); end %imshow(sample_regions(:,:,2)),titl e('sample region for red'); cform = makecform('srgb2lab'); lab_fabric = applycform(fabric,cform); a = lab_fabric(:,:,2); b = lab_fabric(:,:,3); color_markers = repmat (0, [nColors, 2]); for count = 1:nColors color_markers(count,1) = mean2(a(sample_regions(:,:,count))); color_markers(count,2) = mean2(b(sample_regions(:,:,count))); end disp(sprintf('[%0.3f,%0.3f]',color_markers(2,1),color_markers(2,2))); color_labels = 0:nColors-1; a = double(a); b = double(b); distance = repmat(0,[si ze(a), nColors]); for count = 1:nColors distance(:,:,count) = ( (a colo r_markers(coun t,1)).^2 + ... (b co lor_markers(count,2)).^2 ).^0.5; end [value, label] = min(distance,[],3); label = color_labels(label); clear value distance; rgb_label = repmat(label,[1 1 3]); segmented_images = repmat(uint8(0 ),[size(fabric), nColors]); for count = 1:nColors color = fabric; color(rgb_label ~= color_labels(count)) = 0; segmented_images(:,: ,:,count) = color; end %imshow(segmented_images(:,:,:,2 )), title('red objects'); red1=segmented_images(:,:,:,2); I = rgb2gray(red1); threshold = graythresh(I); bw = im2bw(I,threshold); %imshow(bw) 87

Appendix B: (Continued) %% Step 3: Remove the noise % Using morphology functions, remove pixels which do not belong to the % objects of interest. % remove all object contai ning fewer than 30 pixels bw = bwareaopen(bw,30); % fill a gap in the pen's cap se = strel('disk',2); bw = imclose(bw,se); % fill any holes, so that regionprops can be used to estimate % the area enclosed by each of the boundaries bw = imfill(bw,'holes'); %imshow(bw) %% Step 4: Find the boundaries % Concentrate only on the exterior boundaries. Option 'noholes' will % accelerate the processing by prevent ing |bwboundaries| from searching % for inner contours. [B,L] = bwboundaries(bw,'noholes'); % Display the label matrix and draw each boundary %imshow(label2rgb(L, @j et, [.5 .5 .5])) %pause a=label2rgb(L, @jet, [.5 .5 .5]); %imshow(a); L = rgb2gray(a); threshold = graythresh(L); bw = im2bw(L,threshold); %imshow(bw) %pause bw2 = imfill(bw,'holes'); L = bwlabel(bw2); s = regionprops(L, 'centroid'); centroids = cat(1, s.Centroid); %Display original image I and superimpose centroids. imtool(I) hold(imgca,'on') plot(imgca,centroids(:,1), centroids(:,2), 'g*'); plot(imgca, 160, 120, 'r*'); hold(imgca,'off') xcen = centroids(1,1); ycen = centroids(1,2); B.3: BarrettHand Demo This code opens the port for the Ba rrettHand and demonstrates many of the functions capable by the hand. 88

Appendix B: (Continued) close all; try fclose(instrfind); %c lose serial comm, if any end; %create serial communication with the hand s1 = serial('COM1'); set(s1,'BaudRate',9600,'Parity','none', 'StopBits', 1, 'DataBits',8,... 'terminator',13); out1 = get(s1); out2 = get(s1,{'BaudRate','DataBits'}); get(s1,'Parity'); fopen(s1); %initialize hand once s1.FlowControl = 'hardware'; temp = sprintf('%3s','HI'); fprintf(s1,temp); pause(5); temp = sprintf('%3s','1C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','2C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','3C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','1O'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','2O') fprintf(s1,temp); pause(1.0) temp = sprintf('%3s','3O'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','SC'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','SO'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','SC'); fprintf(s1,temp); pause(1.0); 89

Appendix B: (Continued) temp = sprintf('%3s','1C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','1O'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','2C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','2O') fprintf(s1,temp); pause(1.0);; temp = sprintf('%3s','3C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%3s','3O'); fprintf(s1,temp); pause(1.0); temp = sprintf('%5s','123C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%5s','123O'); fprintf(s1,temp); pause(1.0) temp = sprintf('%3s','SO'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','12IC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','13IC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','23IC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SOC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','12OC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','13OC 5000'); fprintf(s1,temp); 90

Appendix B: (Continued) pause(1.0); temp = sprintf('%10s','23OC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SOC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','12IC 1000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','13IC 1000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','23IC 1000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 1000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIO 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','12IC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','13IO 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%10s','23IO 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIO 5000'); fprintf(s1,temp); pause(1.0); temp = sprintf('%5s','123C'); fprintf(s1,temp); pause(1.0); temp = sprintf('%5s','123O'); 91

Appendix B: (Continued) fprintf(s1,temp); pause(1.0); temp = sprintf('%5s','HOME'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 300'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 300'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','SIC 300'); fprintf(s1,temp); pause(1.0); temp = sprintf('%9s','S0C 1000'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','1C'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','1O'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','2C'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','2O'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','3C'); fprintf(s1,temp); pause(1.1); temp = sprintf('%3s','3O'); fprintf(s1,temp); pause(1.1); temp = sprintf('%5s','123C'); fprintf(s1,temp); pause(1.1); temp = sprintf('%5s','123O'); fprintf(s1,temp); pause(1.1); 92

Appendix B: (Continued) B.4 BarrettHand and Laser Code This code combines the BarrettHand software with the laser range finder. It calls up the laser range finder as a function, so all the parsing does not show up. This program handles port communication for both devices function testHand() try fclose(instrfind); %close serial comm, if any end; %create serial communication with the hand s1 = serial('COM1'); set(s1,'BaudRate',9600,'Parity','none', 'StopBits', 1, 'DataBits',8,... 'terminator',13); out1 = get(s1); out2 = get(s1,{'BaudRate','DataBits'}); get(s1,'Parity'); fopen(s1); temp = sprintf('%5s','123o'); %123o = open 3 fingers fprintf(s1,temp); %initialize hand once s1.FlowControl = 'hardware'; %if mdone == 1 % temp = sprintf('%3s','HI') % fprintf(s1,temp); % pause(20); %end; %mdone = 0; %create serial communication with Laser Ranger s5 = serial('COM5', 'BaudRate', 1200); %connect the serial port object to the serial port (COM5) fopen(s5); temp = sprintf('%5s','home'); %123o = open 3 fingers fprintf(s1,temp); %simulation dt = 0.03; %sampling period in seconds t=0;tm=0;nt=0;done=0;tic; timeFinal = 2.1500; avg1 = 300; %while (t < timeFinal) while(1) while tm
Appendix B: (Continued) A = ParsingLaser(A); R = [A(1) A(2) A(3) A(4) A(5)]; % current measurement is contained in A (elements 2, 3,4, and 5) Rx = char(R); Rx = str2num(Rx); avg1 = double(Rx); %end; if (isempty(avg1) == 1) %do nothing %fprintf('wrong reading'); end; if ((isempty(avg1) == 0) && (avg1 >= 199)) mok = 0; if (avg1 <=405) fprintf('%d',avg1); fprintf('Close the hand'); %close hand temp = sprintf('%5s','123c'); %123c = close 3 fingers fprintf(s1,temp); % pause (45) % temp = sprintf('%5s','123o'); %123o = open 3 fingers % fprintf(s1,temp); break; fclose(s1);delete(s1); fclose(s5);delete(s5) end; end; B.5 Final Integration This is the final code used for running all the programs. It calls to the laser range finder code and the BarrettHand c ode with laser range finder to ease computation time. function [finalx,finaly, avg1] = newcamera() clear all; close all; try fclose(instrfind); %close serial comm, if any end; [xcen, ycen] = mycentroid(); %create serial communication with Laser Ranger s5 = serial('COM5', 'BaudRate', 1200); %connect the serial port object to the serial port (COM5) 94

Appendix B: (Continued) fopen(s5); %simulation dt = 0.1; %sampling period in seconds t=0;tm=0;nt=0;done=0;tic; timeFinal = 2.1500; avg1 = 300; %while (t < timeFinal) while tm
%%avg1 = z coordinate and xx1 and yy1 are the x and y coordinates of a part
%%from the left side of the screen (this can be changed). Once this is
%%completed the arm will be given some time to respond by being sent a
%%command to move incrementally closer to the location. Another picture
%%will be taken.