Bep/cutlery grasp #1339 (Draft)
PetervDooren wants to merge 111 commits into master from BEP/cutlery_grasp
Changes from 14 commits are shown below; the branch contains 111 commits in total.

Commits:
7204cc7 (PetervDooren) add template for aruco visual servoing
3aff2eb (PetervDooren) add examples of motion
ed142f0 (PetervDooren) add example and fix stuff
9a81529 (PetervDooren) update docstring
821a34c (PetervDooren) Merge branch 'master' into feature/aruco_visual_servoing
db81e29 (PetervDooren) rename files
00b4075 (PetervDooren) rename comments and classes
5165229 (PetervDooren) fix bugs
c4ed7df (PetervDooren) add force sensor example
0436f11 (PetervDooren) add yolo segmentor and fix force sensor edge up
251e98b (PetervDooren) set cutlery ids
937f1e4 (PetervDooren) remove prints
c3f9df3 (PetervDooren) change image topic to hand cam
5e38e44 (PetervDooren) update cultery grasp colors
9b3acac (PetervDooren) add example joint control
c21cf22 (PetervDooren) add entity input
b26f2ff (PetervDooren) get transform from palm to gripper
9a3bc84 (LucetteAlewijns) Added arm joint trajectories, made a start
c328c00 (LucetteAlewijns) Merge branch 'BEP/cutlery_grasp' of https://github.com/tue-robotics/t…
6bc0414 (LucetteAlewijns) added closing of gripper and fixed some running errors
dbedd26 (LucetteAlewijns) small error fix
89aa597 (LucetteAlewijns) Split up pre-grasp movement and changed grasp joint coordinates
c3520fe (LucetteAlewijns) trying to fix error of not going into while loop
bffbc88 (LucetteAlewijns) implemented option to close the gripper with more force
e83f2eb (LucetteAlewijns) Made a start on creating a tf2 listener for the gripper coordinates
98ce727 (LucetteAlewijns) improvement on the listener
dcc7f80 (LucetteAlewijns) fixing errors in the listener
03922e2 (LucetteAlewijns) trying another error fix
5770fca (LucetteAlewijns) error fix
a56788b (LucetteAlewijns) added time initialization
3547850 (LucetteAlewijns) error fix of not initialized time
e3554f5 (LucetteAlewijns) error fix missing argument
a525d7e (LucetteAlewijns) error fix
64aa0c6 (LucetteAlewijns) trying another error fix
8efe5a1 (LucetteAlewijns) small changes
e9aeeab (LucetteAlewijns) get robot model
8529aef (LucetteAlewijns) gripper coordinates listener should work now
d519ea7 (LucetteAlewijns) center point included as output from detection
13727c2 (LucetteAlewijns) gripper coordinate listener gives output for each time step
a767206 (LucetteAlewijns) fix gripper coordinates
39df5f4 (LucetteAlewijns) Working and tested detection with opencv, this was rewritten into the…
87e35fb (LucetteAlewijns) cutlery_detector was finalized
e961e8e (LucetteAlewijns) Obtaining the direction of the cutlery was implemented (upwards or dow…
f21b49a (LucetteAlewijns) detection will only process the image if it is cutlery, top grasp cod…
5175d29 (LucetteAlewijns) sleep was changed to a while-loop
bc365bb (LucetteAlewijns) start was made on rewriting frames. direction calculation was corrected
131c0f3 (LucetteAlewijns) camera frame rewritten into hand palm link frame
ab9a9eb (LucetteAlewijns) segmentation is also published when something different than cutlery …
87cf536 (LucetteAlewijns) base movement corrected and object's orientation/slope rewritten into …
861aa79 (LucetteAlewijns) orientation gripper was corrected and wrist roll joint position now a…
0caa028 (LucetteAlewijns) grasp detection included, which finishes the first grasp
d04f852 (LucetteAlewijns) wrist rotation after grasp was implement to be in line with the edge …
96cb2e9 (LucetteAlewijns) backwards movement until table edge was implemented
fa69162 (LucetteAlewijns) small changes + part of last grasp
0c02f1f (LucetteAlewijns) small fixes
3b39f97 (LucetteAlewijns) small fixes that were determined by testing and implementation of the…
5cbbb51 (LucetteAlewijns) small correction
e0c93e1 (LucetteAlewijns) code for testing
0b620c0 (LucetteAlewijns) small fix
31108bc (LucetteAlewijns) other test option
949033d (LucetteAlewijns) testing
46081a9 (LucetteAlewijns) testing
df1edda (LucetteAlewijns) testing
7736fa0 (LucetteAlewijns) move downwards until force detection finalized
e4bd53d (LucetteAlewijns) testing
3f1a4e9 (LucetteAlewijns) testing
f117003 (LucetteAlewijns) small fix
90dc45f (LucetteAlewijns) testing
4c48c93 (LucetteAlewijns) small fix: wait until gripper is closed for grasp detection
88c7254 (LucetteAlewijns) typing mistake
fb14388 (LucetteAlewijns) small fix: signs weren't correct
e401930 (LucetteAlewijns) testing
7ef6cae (LucetteAlewijns) testing
758c814 (LucetteAlewijns) testing
d5bcdf7 (LucetteAlewijns) fix wrist roll joint value
3b9ae57 (LucetteAlewijns) sleep implemented
8344839 (LucetteAlewijns) fixes to trajectories
e6ef2a9 (LucetteAlewijns) Correction on side grasp and on direction calculation
5d2fdce (LucetteAlewijns) Detection now separates cutlery data from other objects, such that al…
fa9d97a (LucetteAlewijns) Redone focal length calculations
1c7b507 (LucetteAlewijns) testing
7c17897 (LucetteAlewijns) testing
1434b4a (LucetteAlewijns) testing
129250e (LucetteAlewijns) testing
a276e9f (LucetteAlewijns) testing
fbdda29 (LucetteAlewijns) small fix on direction
9624fc1 (LucetteAlewijns) testing
5b908f2 (LucetteAlewijns) testing
366f701 (LucetteAlewijns) testing
2511be4 (LucetteAlewijns) testing
1a3bd4c (LucetteAlewijns) small fix
480db2f (LucetteAlewijns) testing
e356b5b (LucetteAlewijns) testing
f926dfd (LucetteAlewijns) testing
92943af (LucetteAlewijns) testing
5d0e984 (LucetteAlewijns) testing
6b0dd53 (LucetteAlewijns) testing
eb46d62 (LucetteAlewijns) clearing up the focal length calculation
c584145 (LucetteAlewijns) testing
f900aab (LucetteAlewijns) testing
5f8e01a (LucetteAlewijns) testing
c3b8418 (LucetteAlewijns) testing
618e4de (LucetteAlewijns) test
77fb59c (LucetteAlewijns) change on rpy second grasp
3d27cbc (LucetteAlewijns) correction angle
16a436e (LucetteAlewijns) testing
e84db9e (LucetteAlewijns) testing
12a207d (LucetteAlewijns) testing
166d1b4 (LucetteAlewijns) small correction
7fee73c (LucetteAlewijns) testing
8f6a0e2 (LucetteAlewijns) Code finalization
robot_smach_states/examples/manipulation/example_top_grasp.py (46 additions, 0 deletions)
```python
# System
import argparse

# ROS
import PyKDL as kdl
import rospy

# TU/e Robotics
from ed.entity import Entity
from ed.shape import RightPrism
from pykdl_ros import FrameStamped
from robot_skills.get_robot import get_robot

# Robot Smach States
import robot_smach_states.util.designators as ds
from robot_smach_states.manipulation.top_grasp import TopGrab


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Put an imaginary object in the world model and grasp it using the "
                                                 "'topgrasp' smach state")
    parser.add_argument("x", type=float, help="x-coordinate (in map) of the imaginary object")
    parser.add_argument("y", type=float, help="y-coordinate (in map) of the imaginary object")
    parser.add_argument("z", type=float, help="z-coordinate (in map) of the imaginary object")
    parser.add_argument("--robot", default="hero", help="Robot name (amigo, hero, sergio)")
    args = parser.parse_args()

    rospy.init_node("test_top_grasping")

    robot = get_robot(args.robot)

    entity_id = "test_item"
    pose = FrameStamped(frame=kdl.Frame(kdl.Rotation.RPY(0.0, 0.0, 0.0), kdl.Vector(args.x, args.y, args.z)),
                        stamp=rospy.Time.now(),
                        frame_id="map")
    robot.ed.update_entity(uuid=entity_id, frame_stamped=pose)
    shape = RightPrism([kdl.Vector(0, 0, 0), kdl.Vector(0, 0.05, 0), kdl.Vector(0.05, 0.05, 0), kdl.Vector(0.05, 0, 0)], -0.1, 0.1)
    item = Entity(entity_id, "test_type", pose.header.frame_id, pose.frame, shape, None, None, rospy.Time.now())

    item = ds.Designator(item)

    arm = ds.UnoccupiedArmDesignator(robot).lockable()

    grab_state = TopGrab(robot, item, arm)
    grab_state.execute()
```
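The command-line interface above can be exercised without a robot or a ROS master. A minimal, ROS-free sketch of the same argparse layout; the coordinate values passed in are made-up examples, not values from the PR:

```python
import argparse

# Same argument layout as example_top_grasp.py, without the ROS side.
parser = argparse.ArgumentParser(description="Parse an imaginary object pose")
parser.add_argument("x", type=float, help="x-coordinate (in map)")
parser.add_argument("y", type=float, help="y-coordinate (in map)")
parser.add_argument("z", type=float, help="z-coordinate (in map)")
parser.add_argument("--robot", default="hero", help="Robot name (amigo, hero, sergio)")

# Passing an explicit argv list stands in for a real command line such as
#   python example_top_grasp.py 2.0 0.5 0.8
args = parser.parse_args(["2.0", "0.5", "0.8"])
print(args.x, args.y, args.robot)  # 2.0 0.5 hero
```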
robot_smach_states/src/robot_smach_states/manipulation/cutlery_detector.py (64 additions, 0 deletions)
```python
import rospy
import cv2
import numpy as np
from ultralytics import YOLO
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class YoloSegmentor:
    def __init__(self) -> None:
        model_path = "~/MEGA/developers/Donal/yolov8x-seg.pt"
        device = "cuda"
        self.model = YOLO(model_path).to(device)
        self.class_ids = [42, 43, 44]  # See the COCO dataset for class id to label info (table=60, person=0)
        self.active = False

        self.publisher = rospy.Publisher('/hero/segmented_image', Image, queue_size=10)
        self.subscriber = rospy.Subscriber('/hero/hand_camera/image_raw', Image, self.callback)

    def start(self):
        self.active = True

    def stop(self):
        self.active = False

    @staticmethod
    def detect(model, frame):
        results = model(frame)
        result = results[0]
        segmentation_contours_idx = [np.array(seg, dtype=np.int32) for seg in result.masks.xy]
        class_ids = np.array(result.boxes.cls.cpu(), dtype="int")
        return class_ids, segmentation_contours_idx

    def extract_table_segment(self, image, class_ids, segmentations):
        table_mask = np.zeros_like(image, dtype=np.uint8)
        for class_id, seg in zip(class_ids, segmentations):
            if class_id in self.class_ids:
                cv2.fillPoly(table_mask, [seg], color=(255, 0, 255))
            else:
                cv2.fillPoly(table_mask, [seg], color=(255, 0, 0))
        return table_mask

    def callback(self, data):
        if not self.active:
            return
        bridge = CvBridge()
        cv_image = bridge.imgmsg_to_cv2(data, desired_encoding='passthrough')
        cv_image = cv2.cvtColor(cv_image, cv2.COLOR_RGB2BGR)

        classes, segmentations = self.detect(self.model, cv_image)
        table_segment = self.extract_table_segment(cv_image, classes, segmentations)

        # Publish the table segment as a binary mask
        table_message = bridge.cv2_to_imgmsg(table_segment, encoding="passthrough")
        self.publisher.publish(table_message)


if __name__ == '__main__':
    rospy.init_node("cutlery_detector")
    ts = YoloSegmentor()
    ts.start()
    rospy.spin()
```
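In the 80-class COCO indexing used by YOLOv8 (person = 0, as the code comment notes), the ids 42, 43 and 44 correspond to fork, knife and spoon. A minimal, ROS-free sketch of the class-id filter that `extract_table_segment` applies; the helper name and the fake detection data are illustrative, not part of the PR:

```python
import numpy as np

# 80-class COCO indexing as used by YOLOv8: 42=fork, 43=knife, 44=spoon.
CUTLERY_IDS = [42, 43, 44]

def split_cutlery(class_ids, segmentations):
    """Mimic the branch in extract_table_segment: cutlery vs. everything else."""
    cutlery, other = [], []
    for class_id, seg in zip(class_ids, segmentations):
        (cutlery if class_id in CUTLERY_IDS else other).append(seg)
    return cutlery, other

# Fake detections: a knife, a cup (COCO id 41) and a spoon.
ids = np.array([43, 41, 44])
segs = ["knife_contour", "cup_contour", "spoon_contour"]
cutlery, other = split_cutlery(ids, segs)
print(cutlery)  # ['knife_contour', 'spoon_contour']
print(other)    # ['cup_contour']
```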
robot_smach_states/src/robot_smach_states/manipulation/top_grasp.py (216 additions, 0 deletions)
```python
from __future__ import absolute_import

# ROS
import PyKDL as kdl
from pykdl_ros import VectorStamped, FrameStamped
import rospy
import smach
import tf2_ros
from geometry_msgs.msg import Twist

# TU/e Robotics
from robot_skills.robot import Robot
from robot_skills.arm.arms import PublicArm, GripperTypes
from ..utility import check_arm_requirements, ResolveArm
from ..util.designators import check_type
from ..navigation.navigate_to_grasp import NavigateToGrasp
from ..manipulation.grasp_point_determination import GraspPointDeterminant
from ..util.designators.arm import ArmDesignator
from ..util.designators.core import Designator

from robot_smach_states.manipulation.cutlery_detector import YoloSegmentor
from robot_skills.util.exceptions import TimeOutException


class PrepareGrasp(smach.State):
    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING],
                               "required_trajectories": ["prepare_grasp"], }

    def __init__(self, robot: Robot, arm: ArmDesignator) -> None:
        """
        Set the arm in the appropriate position before actually grabbing

        :param robot: robot to execute state with
        :param arm: Designator that resolves to the arm to grasp with
        """
        smach.State.__init__(self, outcomes=['succeeded', 'failed'])

        # Assign member variables
        self.robot = robot
        self.arm_designator = arm

    def execute(self, userdata=None) -> str:
        arm = self.arm_designator.resolve()
        if not arm:
            rospy.logerr("Could not resolve arm")
            return "failed"

        # Arm to position in a safe way
        arm.send_joint_trajectory("prepare_grasp")
        arm.wait_for_motion_done()

        # Open gripper
        arm.gripper.send_goal('open', timeout=0.0)
        arm.wait_for_motion_done()
        return 'succeeded'


class TopGrasp(smach.State):
    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING],
                               "required_goals": ["carrying_pose"], }

    def __init__(self, robot: Robot, arm: ArmDesignator) -> None:
        """
        Pick up an item given an arm and an entity to be picked up

        :param robot: robot to execute this state with
        :param arm: Designator that resolves to the arm to grasp with
        """
        smach.State.__init__(self, outcomes=['succeeded', 'failed'])

        # Assign member variables
        self.robot = robot
        self.arm_designator = arm

        assert self.robot.get_arm(**self.REQUIRED_ARM_PROPERTIES) is not None, \
            "None of the available arms meets all this class's requirements: {}".format(self.REQUIRED_ARM_PROPERTIES)

        self.yolo_segmentor = YoloSegmentor()

    def execute(self, userdata=None) -> str:
        arm = self.arm_designator.resolve()
        if not arm:
            rospy.logerr("Could not resolve arm")
            return "failed"

        grasp_succeeded = False
        rate = rospy.Rate(10)  # loop rate in hz

        base_to_gripper = self.frame_from_xyzrpy(0.5,   # x distance to the robot
                                                 0.08,  # y distance off center from the robot (fixed if rpy=0)
                                                 0.7,   # z height of the gripper
                                                 0, 0, 0)  # Roll pitch yaw. 0,0,0 for a horizontal gripper.
        # start segmentation
        self.yolo_segmentor.start()

        move_arm = True
        while not grasp_succeeded:
            # control loop

            # TODO get grasp pose wrt wrist

            # TODO force sensor does not provide a good interface for this.
            try:
                arm._arm.force_sensor.wait_for_edge_up(1.0)  # wait 1 second for a force detection
            except TimeOutException:
                rospy.loginfo("No edge up detected within timeout")

            # example base command
            v = Twist()
            v.linear.x = 0   # forward
            v.linear.y = 0   # linear left
            v.angular.z = 0  # rotation speed to the left
            self.robot.base._cmd_vel.publish(v)  # send command to the robot

            # example arm pose command
            if move_arm:
                pose_goal = FrameStamped(base_to_gripper,
                                         rospy.Time.now(),  # timestamp when this pose was created
                                         "base_link"  # the frame in which the pose is expressed. base_link lies in the center of the robot at the height of the floor.
                                         )
                arm.send_goal(pose_goal)     # send the command to the robot.
                arm.wait_for_motion_done()   # wait until the motion is complete
                move_arm = False             # reset flag to move the arm.
                continue                     # don't wait for the rest of the loop.
            # TODO get base-gripper transform

            # check if done
            if False:  # check if the grasp has succeeded
                grasp_succeeded = True

            rospy.loginfo("print a message to show we are still running.")
            rate.sleep()
        # stop segmentation to preserve computation power
        self.yolo_segmentor.stop()

        return "succeeded"

    @staticmethod
    def frame_from_xyzrpy(x, y, z, roll, pitch, yaw):
        """
        Helper function to create a kdl frame based on an x,y,z position and a RPY rotation
        """
        return kdl.Frame(kdl.Rotation.RPY(roll, pitch, yaw), kdl.Vector(x, y, z))


class ResetOnFailure(smach.State):
    """ Class to reset the robot after a grab has failed """

    REQUIRED_ARM_PROPERTIES = {"required_gripper_types": [GripperTypes.GRASPING], }

    def __init__(self, robot, arm):
        """
        Constructor

        :param robot: robot object
        :param arm: arm designator
        """
        smach.State.__init__(self, outcomes=['done'])

        self._robot = robot
        self.arm_designator = arm

    def execute(self, userdata=None):
        """ Execute hook """
        arm = self.arm_designator.resolve()
        arm.reset()

        if self._robot.robot_name == "amigo":
            self._robot.torso.reset()  # Move up to make resetting of the arm safer.
        if arm is not None:
            arm.gripper.send_goal('open')
        self._robot.head.reset()  # Sends a goal
        self._robot.head.cancel_goal()  # And cancels it...
        if arm is not None:
            arm.reset()
        self._robot.torso.reset()
        return 'done'


class TopGrab(smach.StateMachine):
    def __init__(self, robot: Robot, item: Designator, arm: ArmDesignator):
        """
        Let the given robot move to an entity and grab that entity using some arm

        :param robot: Robot to use
        :param item: Designator that resolves to the item to grab. E.g. EntityByIdDesignator
        :param arm: Designator that resolves to the arm to use for grabbing. E.g. UnoccupiedArmDesignator
        """
        smach.StateMachine.__init__(self, outcomes=['done', 'failed'])

        # Check types or designator resolve types
        check_type(arm, PublicArm)

        with self:
            smach.StateMachine.add('RESOLVE_ARM', ResolveArm(arm, self),
                                   transitions={'succeeded': 'NAVIGATE_TO_GRAB',
                                                'failed': 'failed'})

            smach.StateMachine.add('NAVIGATE_TO_GRAB', NavigateToGrasp(robot, arm, item),
                                   transitions={'unreachable': 'RESET_FAILURE',
                                                'goal_not_defined': 'RESET_FAILURE',
                                                'arrived': 'PREPARE_GRASP'})

            smach.StateMachine.add('PREPARE_GRASP', PrepareGrasp(robot, arm),
                                   transitions={'succeeded': 'GRAB',
                                                'failed': 'RESET_FAILURE'})

            smach.StateMachine.add('GRAB', TopGrasp(robot, arm),
                                   transitions={'succeeded': 'done',
                                                'failed': 'RESET_FAILURE'})

            smach.StateMachine.add("RESET_FAILURE", ResetOnFailure(robot, arm),
                                   transitions={'done': 'failed'})

        check_arm_requirements(self, robot)
```
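`frame_from_xyzrpy` builds a KDL frame from a position and fixed-axis roll-pitch-yaw angles. A standalone numpy sketch of the same construction, assuming KDL's `Rotation.RPY` convention Rz(yaw)·Ry(pitch)·Rx(roll); this helper is for illustration and is not part of the PR:

```python
import math
import numpy as np

def rpy_matrix(roll, pitch, yaw):
    """Rotation matrix equivalent of kdl.Rotation.RPY: Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def frame_from_xyzrpy(x, y, z, roll, pitch, yaw):
    """4x4 homogeneous transform mirroring the helper in top_grasp.py."""
    frame = np.eye(4)
    frame[:3, :3] = rpy_matrix(roll, pitch, yaw)
    frame[:3, 3] = [x, y, z]
    return frame

# The grasp pose used in TopGrasp.execute: gripper 0.5 m ahead of base_link,
# 0.08 m off center, 0.7 m up, zero rotation (horizontal gripper).
base_to_gripper = frame_from_xyzrpy(0.5, 0.08, 0.7, 0, 0, 0)
print(base_to_gripper[:3, 3])  # translation part: x, y, z
```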
Review comment: Please use `~/data/pytorch_models` as folder.
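A sketch of how the suggested folder could be wired in: `~` is not expanded automatically by most file APIs, so expanding it explicitly before handing the path to the model loader avoids a literal `~` directory. The model filename is taken from the PR's code; everything else here is an assumption:

```python
import os

# Reviewer-suggested location instead of a personal MEGA folder.
model_dir = os.path.expanduser("~/data/pytorch_models")  # expands "~" to the home directory
model_path = os.path.join(model_dir, "yolov8x-seg.pt")   # filename from cutlery_detector.py
print(model_path)
```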