
Bep/cutlery grasp #1339

Draft
wants to merge 111 commits into base: master
Changes from 14 commits
Commits (111)
7204cc7
add template for aruco visual servoing
PetervDooren Jul 21, 2023
3aff2eb
add examples of motion
PetervDooren Jul 21, 2023
ed142f0
add example and fix stuff
PetervDooren Aug 8, 2023
9a81529
update docstring
PetervDooren Aug 8, 2023
821a34c
Merge branch 'master' into feature/aruco_visual_servoing
PetervDooren Jan 6, 2024
db81e29
rename files
PetervDooren Jan 6, 2024
00b4075
rename comments and classes
PetervDooren Jan 6, 2024
5165229
fix bugs
PetervDooren Jan 6, 2024
c4ed7df
add force sensor example
PetervDooren Jan 6, 2024
0436f11
add yolo segmentor and fix force sensor edge up
PetervDooren Jan 6, 2024
251e98b
set cutlery ids
PetervDooren Jan 6, 2024
937f1e4
remove prints
PetervDooren Jan 6, 2024
c3f9df3
change image topic to hand cam
PetervDooren Jan 6, 2024
5e38e44
update cultery grasp colors
PetervDooren Feb 20, 2024
9b3acac
add example joint control
PetervDooren Mar 6, 2024
c21cf22
add entity input
PetervDooren Mar 22, 2024
b26f2ff
get transform from palm to gripper
PetervDooren Mar 22, 2024
9a3bc84
Added arm joint trajectories, made a start
LucetteAlewijns Mar 26, 2024
c328c00
Merge branch 'BEP/cutlery_grasp' of https://github.com/tue-robotics/t…
LucetteAlewijns Mar 26, 2024
6bc0414
added closing of gripper and fixed some running errors
LucetteAlewijns Apr 4, 2024
dbedd26
small error fix
LucetteAlewijns Apr 4, 2024
89aa597
Split up pre-grasp movement and changed grasp joint coordinates
LucetteAlewijns Apr 9, 2024
c3520fe
trying to fix error of not going into while loop
LucetteAlewijns Apr 9, 2024
bffbc88
implemented option to close the gripper with more force
LucetteAlewijns Apr 16, 2024
e83f2eb
Made a start on creating a tf2 listener for the gripper coordinates
LucetteAlewijns Apr 17, 2024
98ce727
improvement on the listener
LucetteAlewijns Apr 17, 2024
dcc7f80
fixing errors in the listener
LucetteAlewijns Apr 21, 2024
03922e2
trying another error fix
LucetteAlewijns Apr 21, 2024
5770fca
error fix
LucetteAlewijns Apr 21, 2024
a56788b
added time initialization
LucetteAlewijns Apr 21, 2024
3547850
error fix of not initialized time
LucetteAlewijns Apr 21, 2024
e3554f5
error fix missing argument
LucetteAlewijns Apr 21, 2024
a525d7e
error fix
LucetteAlewijns Apr 21, 2024
64aa0c6
trying another error fix
LucetteAlewijns Apr 21, 2024
8efe5a1
small changes
LucetteAlewijns Apr 22, 2024
e9aeeab
get robot model
LucetteAlewijns Apr 23, 2024
8529aef
gripper coordinates listener should work now
LucetteAlewijns Apr 28, 2024
d519ea7
center point included as output from detection
LucetteAlewijns Apr 29, 2024
13727c2
gripper coordinate listener gives output for each time step
LucetteAlewijns Apr 30, 2024
a767206
fix gripper coordinates
LucetteAlewijns May 5, 2024
39df5f4
Working and tested detection with opencv, this was rewritten into the…
LucetteAlewijns May 6, 2024
87e35fb
cutlery_detector was finalized
LucetteAlewijns May 12, 2024
e961e8e
Obtaining the direction of the cutlery was implemented(upwards or dow…
LucetteAlewijns May 17, 2024
f21b49a
detection will only process the image if it is cutlery, top grasp cod…
LucetteAlewijns May 24, 2024
5175d29
sleep was changed to a while-loop
LucetteAlewijns May 27, 2024
bc365bb
start was made on rewriting frames. direction calculation was corrected
LucetteAlewijns May 27, 2024
131c0f3
camera frame rewritten into hand palm link frame
LucetteAlewijns May 28, 2024
ab9a9eb
segmentation is also published when something different than cutlery …
LucetteAlewijns Jun 4, 2024
87cf536
base movement corrected and object's orientation/slope rewritten into …
LucetteAlewijns Jun 6, 2024
861aa79
orientation gripper was corrected and wrist roll joint position now a…
LucetteAlewijns Jun 6, 2024
0caa028
grasp detection included, which finishes the first grasp
LucetteAlewijns Jun 6, 2024
d04f852
wrist rotation after grasp was implement to be in line with the edge …
LucetteAlewijns Jun 7, 2024
96cb2e9
backwards movement until table edge was implemented
LucetteAlewijns Jun 10, 2024
fa69162
small changes + part of last grasp
LucetteAlewijns Jun 12, 2024
0c02f1f
small fixes
LucetteAlewijns Jun 13, 2024
3b39f97
small fixes that were determined by testing and implementation of the…
LucetteAlewijns Jun 16, 2024
5cbbb51
small correction
LucetteAlewijns Jun 17, 2024
e0c93e1
code for testing
LucetteAlewijns Jun 17, 2024
0b620c0
small fix
LucetteAlewijns Jun 17, 2024
31108bc
other test option
LucetteAlewijns Jun 17, 2024
949033d
testing
LucetteAlewijns Jun 17, 2024
46081a9
testing
LucetteAlewijns Jun 17, 2024
df1edda
testing
LucetteAlewijns Jun 17, 2024
7736fa0
move downwards until force detection finalized
LucetteAlewijns Jun 17, 2024
e4bd53d
testing
LucetteAlewijns Jun 17, 2024
3f1a4e9
testing
LucetteAlewijns Jun 17, 2024
f117003
small fix
LucetteAlewijns Jun 17, 2024
90dc45f
testing
LucetteAlewijns Jun 17, 2024
4c48c93
small fix: wait until gripper is closed for grasp detection
LucetteAlewijns Jun 17, 2024
88c7254
typing mistake
LucetteAlewijns Jun 17, 2024
fb14388
small fix: signs weren't correct
LucetteAlewijns Jun 17, 2024
e401930
testing
LucetteAlewijns Jun 17, 2024
7ef6cae
testing
LucetteAlewijns Jun 17, 2024
758c814
testing
LucetteAlewijns Jun 17, 2024
d5bcdf7
fix wrist roll joint value
LucetteAlewijns Jun 17, 2024
3b9ae57
sleep implemented
LucetteAlewijns Jun 17, 2024
8344839
fixes to trajectories
LucetteAlewijns Jun 17, 2024
e6ef2a9
Correction on side grasp and on direction calculation
LucetteAlewijns Jun 20, 2024
5d2fdce
Detection now separates cutlery data from other objects, such that al…
LucetteAlewijns Jun 23, 2024
fa9d97a
Redone focal length calculations
LucetteAlewijns Jun 23, 2024
1c7b507
testing
LucetteAlewijns Jun 24, 2024
7c17897
testing
LucetteAlewijns Jun 24, 2024
1434b4a
testing
LucetteAlewijns Jun 24, 2024
129250e
testing
LucetteAlewijns Jun 24, 2024
a276e9f
testing
LucetteAlewijns Jun 24, 2024
fbdda29
small fix on direction
LucetteAlewijns Jun 24, 2024
9624fc1
testing
LucetteAlewijns Jun 25, 2024
5b908f2
testing
LucetteAlewijns Jun 25, 2024
366f701
testing
LucetteAlewijns Jun 25, 2024
2511be4
testing
LucetteAlewijns Jun 25, 2024
1a3bd4c
small fix
LucetteAlewijns Jun 25, 2024
480db2f
testing
LucetteAlewijns Jun 25, 2024
e356b5b
testing
LucetteAlewijns Jun 25, 2024
f926dfd
testing
LucetteAlewijns Jun 25, 2024
92943af
testing
LucetteAlewijns Jun 25, 2024
5d0e984
testing
LucetteAlewijns Jun 25, 2024
6b0dd53
testing
LucetteAlewijns Jun 26, 2024
eb46d62
clearing up the focal length calculation
LucetteAlewijns Jun 26, 2024
c584145
testing
LucetteAlewijns Jun 26, 2024
f900aab
testing
LucetteAlewijns Jun 26, 2024
5f8e01a
testing
LucetteAlewijns Jun 26, 2024
c3b8418
testing
LucetteAlewijns Jun 26, 2024
618e4de
test
LucetteAlewijns Jun 26, 2024
77fb59c
change on rpy second grasp
LucetteAlewijns Jun 26, 2024
3d27cbc
correction angle
LucetteAlewijns Jun 26, 2024
16a436e
testing
LucetteAlewijns Jun 26, 2024
e84db9e
testing
LucetteAlewijns Jun 26, 2024
12a207d
testing
LucetteAlewijns Jun 26, 2024
166d1b4
small correction
LucetteAlewijns Jun 26, 2024
7fee73c
testing
LucetteAlewijns Jun 26, 2024
8f6a0e2
Code finalization
LucetteAlewijns Jun 30, 2024
46 changes: 46 additions & 0 deletions robot_smach_states/examples/manipulation/example_top_grasp.py
@@ -0,0 +1,46 @@
# System
import argparse

# ROS
import PyKDL as kdl
import rospy

# TU/e Robotics
from ed.entity import Entity
from ed.shape import RightPrism
from pykdl_ros import FrameStamped
from robot_skills.get_robot import get_robot

# Robot Smach States
import robot_smach_states.util.designators as ds
from robot_smach_states.manipulation.top_grasp import TopGrab


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Put an imaginary object in the world model and grasp it using the "
                                                 "'topgrasp' smach state")
    parser.add_argument("x", type=float, help="x-coordinate (in map) of the imaginary object")
    parser.add_argument("y", type=float, help="y-coordinate (in map) of the imaginary object")
    parser.add_argument("z", type=float, help="z-coordinate (in map) of the imaginary object")
    parser.add_argument("--robot", default="hero", help="Robot name (amigo, hero, sergio)")
    args = parser.parse_args()

    rospy.init_node("test_top_grasping")

    robot = get_robot(args.robot)

    entity_id = "test_item"
    pose = FrameStamped(frame=kdl.Frame(kdl.Rotation.RPY(0.0, 0.0, 0.0), kdl.Vector(args.x, args.y, args.z)),
                        stamp=rospy.Time.now(),
                        frame_id="map")
    robot.ed.update_entity(uuid=entity_id, frame_stamped=pose)
    shape = RightPrism([kdl.Vector(0, 0, 0), kdl.Vector(0, 0.05, 0), kdl.Vector(0.05, 0.05, 0), kdl.Vector(0.05, 0, 0)], -0.1, 0.1)
    item = Entity(entity_id, "test_type", pose.header.frame_id, pose.frame, shape, None, None, rospy.Time.now())

    item = ds.Designator(item)

    arm = ds.UnoccupiedArmDesignator(robot).lockable()

    grab_state = TopGrab(robot, item, arm)
    grab_state.execute()
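The example above registers a dummy entity whose RightPrism base is a 5 cm × 5 cm square. As a quick sanity check of that footprint, independent of PyKDL and ROS, the polygon area can be computed with the shoelace formula; `shoelace_area` is a hypothetical helper for illustration, not part of the repository:

```python
def shoelace_area(points):
    """Area of a simple polygon given as a list of (x, y) vertices (shoelace formula)."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0


# Same base vertices as the RightPrism in the example (x, y only)
footprint = [(0, 0), (0, 0.05), (0.05, 0.05), (0.05, 0)]
print(round(shoelace_area(footprint), 6))  # 0.0025 (m^2)
```

A non-degenerate footprint (area > 0) is what makes the entity graspable by the downstream grasp-point determination.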
@@ -0,0 +1,64 @@
import os

import rospy
import cv2
import numpy as np
from ultralytics import YOLO
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class YoloSegmentor:
    def __init__(self) -> None:
        model_path = os.path.expanduser("~/MEGA/developers/Donal/yolov8x-seg.pt")  # expanduser so the "~" resolves
(Review comment from a maintainer on this line: "Please use ~/data/pytorch_models as folder.")
        device = "cuda"
        self.model = YOLO(model_path).to(device)
        self.class_ids = [42, 43, 44]  # COCO class ids for fork (42), knife (43) and spoon (44); cf. dining table = 60, person = 0
        self.active = False

        self.publisher = rospy.Publisher('/hero/segmented_image', Image, queue_size=10)
        self.subscriber = rospy.Subscriber('/hero/hand_camera/image_raw', Image, self.callback)

    def start(self):
        self.active = True

    def stop(self):
        self.active = False

    @staticmethod
    def detect(model, frame):
        results = model(frame)
        result = results[0]
        if result.masks is None:  # no segmentation masks in this frame
            return np.array([], dtype="int"), []
        segmentation_contours_idx = [np.array(seg, dtype=np.int32) for seg in result.masks.xy]
        class_ids = np.array(result.boxes.cls.cpu(), dtype="int")
        return class_ids, segmentation_contours_idx

    def extract_table_segment(self, image, class_ids, segmentations):
        table_mask = np.zeros_like(image, dtype=np.uint8)
        for class_id, seg in zip(class_ids, segmentations):
            if class_id in self.class_ids:
                cv2.fillPoly(table_mask, [seg], color=(255, 0, 255))  # cutlery: magenta (BGR)
            else:
                cv2.fillPoly(table_mask, [seg], color=(255, 0, 0))  # other objects: blue (BGR)
        return table_mask

    def callback(self, data):
        if not self.active:
            return
        bridge = CvBridge()
        cv_image = bridge.imgmsg_to_cv2(data, desired_encoding='passthrough')
        cv_image = cv2.cvtColor(cv_image, cv2.COLOR_RGB2BGR)

        classes, segmentations = self.detect(self.model, cv_image)
        table_segment = self.extract_table_segment(cv_image, classes, segmentations)

        # Publish the table segment as a colored mask
        table_message = bridge.cv2_to_imgmsg(table_segment, encoding="passthrough")
        self.publisher.publish(table_message)


if __name__ == '__main__':
    rospy.init_node("cutlery_detector")
    ts = YoloSegmentor()
    ts.start()
    rospy.spin()
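The per-segment color decision inside `extract_table_segment` can be isolated as a pure-Python stand-in (no cv2 or ROS needed): cutlery classes are filled magenta, everything else blue, both in BGR order. `mask_color` is a hypothetical helper for illustration, not part of the detector:

```python
CUTLERY_IDS = (42, 43, 44)  # COCO class ids: fork, knife, spoon


def mask_color(class_id, cutlery_ids=CUTLERY_IDS):
    """BGR fill color used for a detected segment of the given class."""
    return (255, 0, 255) if class_id in cutlery_ids else (255, 0, 0)


print(mask_color(43))  # knife -> (255, 0, 255)
print(mask_color(60))  # dining table -> (255, 0, 0)
```

Keeping non-cutlery segments in a distinct color (rather than dropping them) is what lets later commits publish the segmentation even when something other than cutlery is detected.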
216 changes: 216 additions & 0 deletions robot_smach_states/src/robot_smach_states/manipulation/top_grasp.py
@@ -0,0 +1,216 @@
from __future__ import absolute_import

# ROS
import PyKDL as kdl
from pykdl_ros import VectorStamped, FrameStamped
import rospy
import smach
import tf2_ros
from geometry_msgs.msg import Twist

# TU/e Robotics
from robot_skills.robot import Robot
from robot_skills.arm.arms import PublicArm, GripperTypes
from ..utility import check_arm_requirements, ResolveArm
from ..util.designators import check_type
from ..navigation.navigate_to_grasp import NavigateToGrasp
from ..manipulation.grasp_point_determination import GraspPointDeterminant
from ..util.designators.arm import ArmDesignator
from ..util.designators.core import Designator

from robot_smach_states.manipulation.cutlery_detector import YoloSegmentor
from robot_skills.util.exceptions import TimeOutException


class PrepareGrasp(smach.State):
    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING],
                               "required_trajectories": ["prepare_grasp"], }

    def __init__(self, robot: Robot, arm: ArmDesignator) -> None:
        """
        Set the arm in the appropriate position before actually grabbing

        :param robot: robot to execute state with
        :param arm: Designator that resolves to the arm to grasp with
        """
        smach.State.__init__(self, outcomes=['succeeded', 'failed'])

        # Assign member variables
        self.robot = robot
        self.arm_designator = arm

    def execute(self, userdata=None) -> str:
        arm = self.arm_designator.resolve()
        if not arm:
            rospy.logerr("Could not resolve arm")
            return "failed"

        # Move the arm to position in a safe way
        arm.send_joint_trajectory("prepare_grasp")
        arm.wait_for_motion_done()

        # Open the gripper
        arm.gripper.send_goal('open', timeout=0.0)
        arm.wait_for_motion_done()
        return 'succeeded'


class TopGrasp(smach.State):
    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING],
                               "required_goals": ["carrying_pose"], }

    def __init__(self, robot: Robot, arm: ArmDesignator) -> None:
        """
        Grasp an item from above with the given arm

        :param robot: robot to execute this state with
        :param arm: Designator that resolves to the arm to grasp with
        """
        smach.State.__init__(self, outcomes=['succeeded', 'failed'])

        # Assign member variables
        self.robot = robot
        self.arm_designator = arm

        assert self.robot.get_arm(**self.REQUIRED_ARM_PROPERTIES) is not None, \
            "None of the available arms meets all this class's requirements: {}".format(self.REQUIRED_ARM_PROPERTIES)

        self.yolo_segmentor = YoloSegmentor()

    def execute(self, userdata=None) -> str:
        arm = self.arm_designator.resolve()
        if not arm:
            rospy.logerr("Could not resolve arm")
            return "failed"

        grasp_succeeded = False
        rate = rospy.Rate(10)  # loop rate in Hz

        base_to_gripper = self.frame_from_xyzrpy(0.5,   # x: distance in front of the robot
                                                 0.08,  # y: distance off center from the robot (fixed if rpy=0)
                                                 0.7,   # z: height of the gripper
                                                 0, 0, 0)  # roll, pitch, yaw; 0, 0, 0 for a horizontal gripper
        # Start segmentation
        self.yolo_segmentor.start()

        move_arm = True
        while not grasp_succeeded:
            # Control loop

            # TODO: get grasp pose wrt wrist

            # TODO: the force sensor does not provide a good interface for this
            try:
                arm._arm.force_sensor.wait_for_edge_up(1.0)  # wait 1 second for a force detection
            except TimeOutException:
                rospy.loginfo("No edge up detected within timeout")

            # Example base command
            v = Twist()
            v.linear.x = 0   # forward velocity
            v.linear.y = 0   # velocity to the left
            v.angular.z = 0  # rotation speed to the left
            self.robot.base._cmd_vel.publish(v)  # send the command to the robot

            # Example arm pose command
            if move_arm:
                pose_goal = FrameStamped(base_to_gripper,
                                         rospy.Time.now(),  # timestamp when this pose was created
                                         "base_link")  # frame in which the pose is expressed; base_link lies at the center of the robot at floor height
                arm.send_goal(pose_goal)  # send the command to the robot
                arm.wait_for_motion_done()  # wait until the motion is complete
                move_arm = False  # clear the flag so the arm command is sent only once
                continue  # don't wait for the rest of the loop
            # TODO: get the base-gripper transform

            # Check if done
            if False:  # TODO: implement a check whether the grasp has succeeded
                grasp_succeeded = True

            rospy.loginfo("TopGrasp control loop is still running")
            rate.sleep()

        # Stop segmentation to preserve computation power
        self.yolo_segmentor.stop()

        return "succeeded"

    @staticmethod
    def frame_from_xyzrpy(x, y, z, roll, pitch, yaw):
        """
        Helper function to create a kdl frame based on an x, y, z position and an RPY rotation
        """
        return kdl.Frame(kdl.Rotation.RPY(roll, pitch, yaw), kdl.Vector(x, y, z))


class ResetOnFailure(smach.State):
    """Class to reset the robot after a grab has failed"""

    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING], }

    def __init__(self, robot, arm):
        """
        Constructor

        :param robot: robot object
        :param arm: arm designator
        """
        smach.State.__init__(self, outcomes=['done'])

        self._robot = robot
        self.arm_designator = arm

    def execute(self, userdata=None):
        """Execute hook"""
        arm = self.arm_designator.resolve()

        if self._robot.robot_name == "amigo":
            self._robot.torso.reset()  # Move up to make resetting of the arm safer
        if arm is not None:
            arm.gripper.send_goal('open')
        self._robot.head.reset()  # Sends a goal
        self._robot.head.cancel_goal()  # And cancels it...
        if arm is not None:
            arm.reset()
        self._robot.torso.reset()
        return 'done'


class TopGrab(smach.StateMachine):
    def __init__(self, robot: Robot, item: Designator, arm: ArmDesignator):
        """
        Let the given robot move to an entity and grab that entity using some arm

        :param robot: Robot to use
        :param item: Designator that resolves to the item to grab. E.g. EntityByIdDesignator
        :param arm: Designator that resolves to the arm to use for grabbing. E.g. UnoccupiedArmDesignator
        """
        smach.StateMachine.__init__(self, outcomes=['done', 'failed'])

        # Check types or designator resolve types
        check_type(arm, PublicArm)

        with self:
            smach.StateMachine.add('RESOLVE_ARM', ResolveArm(arm, self),
                                   transitions={'succeeded': 'NAVIGATE_TO_GRAB',
                                                'failed': 'failed'})

            smach.StateMachine.add('NAVIGATE_TO_GRAB', NavigateToGrasp(robot, arm, item),
                                   transitions={'unreachable': 'RESET_FAILURE',
                                                'goal_not_defined': 'RESET_FAILURE',
                                                'arrived': 'PREPARE_GRASP'})

            smach.StateMachine.add('PREPARE_GRASP', PrepareGrasp(robot, arm),
                                   transitions={'succeeded': 'GRAB',
                                                'failed': 'RESET_FAILURE'})

            smach.StateMachine.add('GRAB', TopGrasp(robot, arm),
                                   transitions={'succeeded': 'done',
                                                'failed': 'RESET_FAILURE'})

            smach.StateMachine.add("RESET_FAILURE", ResetOnFailure(robot, arm),
                                   transitions={'done': 'failed'})

        check_arm_requirements(self, robot)
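The TopGrab state machine wires five states to the two terminal outcomes 'done' and 'failed'. That wiring can be checked for consistency in plain Python (no smach needed): every transition target must be either another registered state or a terminal outcome. The dictionaries below simply transcribe the `StateMachine.add` calls above:

```python
TRANSITIONS = {
    "RESOLVE_ARM": {"succeeded": "NAVIGATE_TO_GRAB", "failed": "failed"},
    "NAVIGATE_TO_GRAB": {"unreachable": "RESET_FAILURE",
                         "goal_not_defined": "RESET_FAILURE",
                         "arrived": "PREPARE_GRASP"},
    "PREPARE_GRASP": {"succeeded": "GRAB", "failed": "RESET_FAILURE"},
    "GRAB": {"succeeded": "done", "failed": "RESET_FAILURE"},
    "RESET_FAILURE": {"done": "failed"},
}
OUTCOMES = {"done", "failed"}

# Every target must be a known state or a terminal outcome
valid = all(target in TRANSITIONS or target in OUTCOMES
            for transitions in TRANSITIONS.values()
            for target in transitions.values())
print(valid)  # True
```

Note that RESET_FAILURE always maps back to the machine's 'failed' outcome, so any error path through navigation, preparation, or grasping ends the whole state machine with 'failed' after the robot has been reset.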