Draft: Bep/cutlery grasp #1339
PetervDooren wants to merge 111 commits into master from BEP/cutlery_grasp
Commits (111)
7204cc7 add template for aruco visual servoing (PetervDooren)
3aff2eb add examples of motion (PetervDooren)
ed142f0 add example and fix stuff (PetervDooren)
9a81529 update docstring (PetervDooren)
821a34c Merge branch 'master' into feature/aruco_visual_servoing (PetervDooren)
db81e29 rename files (PetervDooren)
00b4075 rename comments and classes (PetervDooren)
5165229 fix bugs (PetervDooren)
c4ed7df add force sensor example (PetervDooren)
0436f11 add yolo segmentor and fix force sensor edge up (PetervDooren)
251e98b set cutlery ids (PetervDooren)
937f1e4 remove prints (PetervDooren)
c3f9df3 change image topic to hand cam (PetervDooren)
5e38e44 update cultery grasp colors (PetervDooren)
9b3acac add example joint control (PetervDooren)
c21cf22 add entity input (PetervDooren)
b26f2ff get transform from palm to gripper (PetervDooren)
9a3bc84 Added arm joint trajectories, made a start (LucetteAlewijns)
c328c00 Merge branch 'BEP/cutlery_grasp' of https://github.com/tue-robotics/t… (LucetteAlewijns)
6bc0414 added closing of gripper and fixed some running errors (LucetteAlewijns)
dbedd26 small error fix (LucetteAlewijns)
89aa597 Split up pre-grasp movement and changed grasp joint coordinates (LucetteAlewijns)
c3520fe trying to fix error of not going into while loop (LucetteAlewijns)
bffbc88 implemented option to close the gripper with more force (LucetteAlewijns)
e83f2eb Made a start on creating a tf2 listener for the gripper coordinates (LucetteAlewijns)
98ce727 improvement on the listener (LucetteAlewijns)
dcc7f80 fixing errors in the listener (LucetteAlewijns)
03922e2 trying another error fix (LucetteAlewijns)
5770fca error fix (LucetteAlewijns)
a56788b added time initialization (LucetteAlewijns)
3547850 error fix of not initialized time (LucetteAlewijns)
e3554f5 error fix missing argument (LucetteAlewijns)
a525d7e error fix (LucetteAlewijns)
64aa0c6 trying another error fix (LucetteAlewijns)
8efe5a1 small changes (LucetteAlewijns)
e9aeeab get robot model (LucetteAlewijns)
8529aef gripper coordinates listener should work now (LucetteAlewijns)
d519ea7 center point included as output from detection (LucetteAlewijns)
13727c2 gripper coordinate listener gives output for each time step (LucetteAlewijns)
a767206 fix gripper coordinates (LucetteAlewijns)
39df5f4 Working and tested detection with opencv, this was rewritten into the… (LucetteAlewijns)
87e35fb cutlery_detector was finalized (LucetteAlewijns)
e961e8e Obtaining the direction of the cutlery was implemented(upwards or dow… (LucetteAlewijns)
f21b49a detection will only process the image if it is cutlery, top grasp cod… (LucetteAlewijns)
5175d29 sleep was changed to a while-loop (LucetteAlewijns)
bc365bb start was made on rewriting frames. direction calculation was corrected (LucetteAlewijns)
131c0f3 camera frame rewritten into hand palm link frame (LucetteAlewijns)
ab9a9eb segmentation is also published when something different than cutlery … (LucetteAlewijns)
87cf536 base movement corrected andobject's orientation/slope rewritten into … (LucetteAlewijns)
861aa79 orientation gripper was corrected and wrist roll joint position now a… (LucetteAlewijns)
0caa028 grasp detection included, which finishes the first grasp (LucetteAlewijns)
d04f852 wrist rotation after grasp was implement to be in line with the edge … (LucetteAlewijns)
96cb2e9 backwards movement until table edge was implemented (LucetteAlewijns)
fa69162 small changes + part of last grasp (LucetteAlewijns)
0c02f1f small fixes (LucetteAlewijns)
3b39f97 small fixes that were determined by testing and implementation of the… (LucetteAlewijns)
5cbbb51 small correction (LucetteAlewijns)
e0c93e1 code for testing (LucetteAlewijns)
0b620c0 small fix (LucetteAlewijns)
31108bc other test option (LucetteAlewijns)
949033d testing (LucetteAlewijns)
46081a9 testing (LucetteAlewijns)
df1edda testing (LucetteAlewijns)
7736fa0 move downwards until force detection finalized (LucetteAlewijns)
e4bd53d testing (LucetteAlewijns)
3f1a4e9 testing (LucetteAlewijns)
f117003 small fix (LucetteAlewijns)
90dc45f testing (LucetteAlewijns)
4c48c93 small fix: wait until gripper is closed for grasp detection (LucetteAlewijns)
88c7254 typing mistake (LucetteAlewijns)
fb14388 small fix: signs weren't correct (LucetteAlewijns)
e401930 testing (LucetteAlewijns)
7ef6cae testing (LucetteAlewijns)
758c814 testing (LucetteAlewijns)
d5bcdf7 fix wrist roll joint value (LucetteAlewijns)
3b9ae57 sleep implemented (LucetteAlewijns)
8344839 fixes to trajectories (LucetteAlewijns)
e6ef2a9 Correction on side grasp and on direction calculation (LucetteAlewijns)
5d2fdce Detection now separates cutlery data from other objects, such that al… (LucetteAlewijns)
fa9d97a Redone focal length calculations (LucetteAlewijns)
1c7b507 testing (LucetteAlewijns)
7c17897 testing (LucetteAlewijns)
1434b4a testing (LucetteAlewijns)
129250e testing (LucetteAlewijns)
a276e9f testing (LucetteAlewijns)
fbdda29 small fix on direction (LucetteAlewijns)
9624fc1 testing (LucetteAlewijns)
5b908f2 testing (LucetteAlewijns)
366f701 testing (LucetteAlewijns)
2511be4 testing (LucetteAlewijns)
1a3bd4c small fix (LucetteAlewijns)
480db2f testing (LucetteAlewijns)
e356b5b testing (LucetteAlewijns)
f926dfd testing (LucetteAlewijns)
92943af testing (LucetteAlewijns)
5d0e984 testing (LucetteAlewijns)
6b0dd53 testing (LucetteAlewijns)
eb46d62 clearing up the focal length calculation (LucetteAlewijns)
c584145 testing (LucetteAlewijns)
f900aab testing (LucetteAlewijns)
5f8e01a testing (LucetteAlewijns)
c3b8418 testing (LucetteAlewijns)
618e4de test (LucetteAlewijns)
77fb59c change on rpy second grasp (LucetteAlewijns)
3d27cbc correction angle (LucetteAlewijns)
16a436e testing (LucetteAlewijns)
e84db9e testing (LucetteAlewijns)
12a207d testing (LucetteAlewijns)
166d1b4 small correction (LucetteAlewijns)
7fee73c testing (LucetteAlewijns)
8f6a0e2 Code finalization (LucetteAlewijns)
robot_smach_states/examples/manipulation/example_top_grasp.py (46 additions, 0 deletions)
@@ -0,0 +1,46 @@
# System
import argparse

# ROS
import PyKDL as kdl
import rospy

# TU/e Robotics
from ed.entity import Entity
from ed.shape import RightPrism
from pykdl_ros import FrameStamped
from robot_skills.get_robot import get_robot

# Robot Smach States
import robot_smach_states.util.designators as ds
from robot_smach_states.manipulation.top_grasp import TopGrab


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Put an imaginary object in the world model and grasp it using the "
                                                 "'topgrasp' smach state")
    parser.add_argument("x", type=float, help="x-coordinate (in map) of the imaginary object")
    parser.add_argument("y", type=float, help="y-coordinate (in map) of the imaginary object")
    parser.add_argument("z", type=float, help="z-coordinate (in map) of the imaginary object")
    parser.add_argument("--robot", default="hero", help="Robot name (amigo, hero, sergio)")
    args = parser.parse_args()

    rospy.init_node("test_top_grasping")

    robot = get_robot(args.robot)

    entity_id = "test_item"
    pose = FrameStamped(frame=kdl.Frame(kdl.Rotation.RPY(0.0, 0.0, 0.0), kdl.Vector(args.x, args.y, args.z)),
                        stamp=rospy.Time.now(),
                        frame_id="map")
    robot.ed.update_entity(uuid=entity_id, frame_stamped=pose)
    shape = RightPrism([kdl.Vector(0, 0, 0), kdl.Vector(0, 0.05, 0), kdl.Vector(0.05, 0.05, 0), kdl.Vector(0.05, 0, 0)], -0.1, 0.1)
    item = Entity(entity_id, "test_type", pose.header.frame_id, pose.frame, shape, None, None, rospy.Time.now())

    item = ds.Designator(item)

    arm = ds.UnoccupiedArmDesignator(robot).lockable()

    grab_state = TopGrab(robot, item, arm)
    grab_state.execute()
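Usage note: with the robot (or a simulator) running, the example can be invoked directly, for instance

    python example_top_grasp.py 0.5 0.0 0.8 --robot hero

where the coordinates are illustrative; they simply place the imaginary object in the map frame before the grasp state is executed.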
robot_smach_states/src/robot_smach_states/manipulation/cutlery_detector.py (245 additions, 0 deletions)
@@ -0,0 +1,245 @@
import rospy
import cv2
import math
import numpy as np
from ultralytics import YOLO
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class YoloSegmentor:
    def __init__(self) -> None:
        model_path = "~/MEGA/developers/Donal/yolov8x-seg.pt"
        device = "cuda"
        self.model = YOLO(model_path).to(device)
        self.class_ids = [42, 43, 44]  # See the COCO dataset for class id to label info (fork = 42, knife = 43, spoon = 44)
        self.active = False

        self.intercept = 0
        self.slope = 0
        self.length = 0

        self.publisher = rospy.Publisher('/hero/segmented_image', Image, queue_size=10)
        self.subscriber = rospy.Subscriber('/hero/hand_camera/image_raw', Image, self.callback)

        # Initialize the attributes that store the latest detection values
        self.x_center = None
        self.y_center = None
        self.slope = None
        self.upwards = None
        self.class_id = None
        self.filtered_masks = None
        self.time = rospy.Time.now()

    def start(self):
        self.active = True

    def stop(self):
        self.active = False

    @staticmethod
    def detect(model, frame):
        results = model(frame)
        global result  # the latest YOLO result is shared with the other methods via a module-level global
        result = results[0]

        segmentation_contours_idx = [np.array(seg, dtype=np.int32) for seg in result.masks.xy]
        class_ids = np.array(result.boxes.cls.cpu(), dtype="int")

        return class_ids, segmentation_contours_idx, result

    def extract_table_segment(self, image, class_ids, segmentation_contours_idx):
        table_segment = np.zeros_like(image, dtype=np.uint8)
        global width  # the image width is reused by calculate_slope_intercept
        height, width, channels = image.shape  # obtain data on image size

        for class_id, seg in zip(class_ids, segmentation_contours_idx):
            self.class_id = class_id
            if class_id in self.class_ids:
                cv2.fillPoly(table_segment, [seg], color=(255, 0, 255))  # for a fork, knife or spoon, a pink mask will be created
            else:
                cv2.fillPoly(table_segment, [seg], color=(255, 0, 0))  # for any other object from the COCO dataset, a blue mask will be created

        return table_segment

    def coordinates_center_point(self):
        # Separate boxes and masks of cutlery from those of other objects
        cutlery_boxes = []
        filtered_masks = []
        for i, box in enumerate(result.boxes):
            if int(box.cls) in self.class_ids:
                cutlery_boxes.append(box)
                filtered_masks.append(result.masks.xy[i])
        self.filtered_masks = filtered_masks
        x_center = None
        y_center = None

        # Check if cutlery_boxes is empty
        if not cutlery_boxes:
            coordinates_box = None  # No bounding boxes found
        else:
            # Access the xywh attribute of the first box in cutlery_boxes
            coordinates_box_tensor = cutlery_boxes[0].xywh
            if coordinates_box_tensor.numel() == 0:  # make sure that the tensor is not empty before accessing its elements
                coordinates_box = None
            else:
                coordinates_box = coordinates_box_tensor[0].tolist()
                x_center, y_center = coordinates_box[0], coordinates_box[1]  # center of the bounding box = center of the object

        print(f"x_center, y_center = {x_center, y_center}")  # print center coordinates (None if no cutlery box was found)

        # Store the values
        self.x_center = x_center
        self.y_center = y_center

        return x_center, y_center

    def visualize_center_point(self, x_center, y_center, table_segment):
        cv2.circle(table_segment, (int(x_center), int(y_center)), 5, (0, 255, 255), -1)  # Draw a yellow dot at the center point
        return table_segment

    def calculate_slope_intercept(self):
        inner_array = self.filtered_masks[0]  # result.masks.xy is an array in an array, so it has to be unpacked first
        x = inner_array[:, 0]  # first column = x coordinates of all points of the segmentation mask
        y = inner_array[:, 1]  # second column = y coordinates of all points of the segmentation mask

        mean_x = np.mean(x)
        mean_y = np.mean(y)
        min_x = int(min(x))
        max_x = int(max(x))
        min_y = int(min(y))
        max_y = int(max(y))

        length = math.sqrt((max_x - min_x) ** 2 + (max_y - min_y) ** 2)
        self.length = length

        # For nearly vertical cases: if the object takes up a small width (less than 1/5 of the
        # image) it is oriented (nearly) vertically in the camera frame, so regress on y instead
        # of x to avoid a near-infinite slope
        if max_x - min_x < (1 / 5 * width):
            numerator = np.sum((y - mean_y) ** 2)
            denominator = np.sum((x - mean_x) * (y - mean_y))
            slope = numerator / denominator
        # For all other orientations: ordinary least squares on x
        else:
            numerator = np.sum((x - mean_x) * (y - mean_y))
            denominator = np.sum((x - mean_x) ** 2)
            slope = numerator / denominator
        intercept = mean_y - slope * mean_x

        print(f"Calculated slope: {slope}")
        print(f"Calculated intercept: {intercept}")

        # Store the slope value
        self.slope = slope

        return slope, intercept, min_x, max_x

    def predict(self, slope, intercept, min_x, max_x):
        y_minx = slope * min_x + intercept  # y-coordinate corresponding to min_x
        y_maxx = slope * max_x + intercept  # y-coordinate corresponding to max_x
        rospy.loginfo("predict")
        return y_minx, y_maxx

    def visualize_orientation(self, min_x, max_x, y_minx, y_maxx, table_segment):
        cv2.line(table_segment, (min_x, int(y_minx)), (max_x, int(y_maxx)), (0, 255, 0), 2)  # Draw the line found by the least squares method in green
        return table_segment

    def object_direction(self, x_center, y_center, slope):
        # Split the mask outline along the line through the center point that is
        # perpendicular to the object axis; the half containing more outline points
        # determines whether the cutlery points upwards
        perpendicular_slope = -1 / slope
        perpendicular_intercept = y_center - perpendicular_slope * x_center

        inner_array = self.filtered_masks[0]
        x = inner_array[:, 0]
        y = inner_array[:, 1]
        coordinates_upper = 0
        coordinates_lower = 0
        if 0 <= perpendicular_slope < math.inf:
            for i in range(len(x)):
                if y[i] <= perpendicular_slope * x[i] + perpendicular_intercept:
                    coordinates_upper += 1
                else:
                    coordinates_lower += 1
        if -math.inf < perpendicular_slope < 0:
            for i in range(len(x)):
                if y[i] >= perpendicular_slope * x[i] + perpendicular_intercept:
                    coordinates_upper += 1
                else:
                    coordinates_lower += 1
        print("Size outline upper half of the mask:", coordinates_upper)
        print("Size outline lower half of the mask:", coordinates_lower)

        if coordinates_upper >= coordinates_lower:
            upwards = True
        else:
            upwards = False
        self.time = rospy.Time.now()
        self.upwards = upwards

        return upwards

    def callback(self, data):
        # rospy.loginfo("Received image data")

        if not self.active:
            # rospy.loginfo("Callback inactive")
            return
        bridge = CvBridge()
        cv_image = bridge.imgmsg_to_cv2(data, desired_encoding='passthrough')
        cv_image = cv2.cvtColor(cv_image, cv2.COLOR_RGB2BGR)
        rospy.loginfo("Image converted")

        class_ids, segmentation_contours_idx, result = self.detect(self.model, cv_image)
        rospy.loginfo("Object detected")

        table_segment = self.extract_table_segment(cv_image, class_ids, segmentation_contours_idx)

        # Make sure that even when no cutlery is detected, the segmentation of objects is visualized
        table_message = bridge.cv2_to_imgmsg(table_segment, encoding="passthrough")
        self.publisher.publish(table_message)

        if any(class_id in self.class_ids for class_id in class_ids):
            rospy.loginfo("Cutlery detected")

            x_center, y_center = self.coordinates_center_point()

            # Visualize center point
            table_segment_with_center = self.visualize_center_point(x_center, y_center, table_segment)
            rospy.loginfo("Center point visualized")

            # Calculate slope and intercept
            slope, intercept, min_x, max_x = self.calculate_slope_intercept()
            rospy.loginfo("Slope and intercept calculated")

            y_minx, y_maxx = self.predict(slope, intercept, min_x, max_x)

            # Visualize orientation
            table_segment_with_orientation = self.visualize_orientation(min_x, max_x, y_minx, y_maxx, table_segment_with_center)
            rospy.loginfo("Orientation visualized")

            # Publish the table segment as a binary mask
            table_message = bridge.cv2_to_imgmsg(table_segment_with_orientation, encoding="passthrough")
            self.publisher.publish(table_message)
            rospy.loginfo("Segmented image with orientation published")

            upwards = self.object_direction(x_center, y_center, slope)
            rospy.loginfo("Direction determined")

        else:
            rospy.loginfo("No cutlery detected")

    # Method to be able to access the latest values in the top_grasp code
    def data_object(self):
        return self.x_center, self.y_center, self.length, self.slope, self.upwards, self.time


if __name__ == '__main__':
    rospy.init_node("cutlery_detector")
    ts = YoloSegmentor()
    ts.start()

    rospy.spin()
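As a usage sketch, another node could poll the detector's stored estimates through data_object(). Only YoloSegmentor, start() and data_object() come from the file above; the node name, poll rate, and staleness threshold below are illustrative assumptions.

    import rospy
    from robot_smach_states.manipulation.cutlery_detector import YoloSegmentor

    if __name__ == '__main__':
        rospy.init_node("cutlery_consumer")  # hypothetical consumer node
        segmentor = YoloSegmentor()
        segmentor.start()

        rate = rospy.Rate(5)
        while not rospy.is_shutdown():
            x_center, y_center, length, slope, upwards, stamp = segmentor.data_object()
            # x_center stays None until the first cutlery detection; the timestamp is only
            # refreshed by object_direction(), so also skip stale data
            if x_center is not None and (rospy.Time.now() - stamp).to_sec() < 1.0:
                rospy.loginfo(f"Cutlery at ({x_center:.0f}, {y_center:.0f}) px, upwards={upwards}")
            rate.sleep()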
robot_smach_states/src/robot_smach_states/manipulation/gripper_tf2_listener.py (39 additions, 0 deletions)
@@ -0,0 +1,39 @@
import rospy
import tf2_ros
import geometry_msgs.msg
from pykdl_ros import FrameStamped

if __name__ == '__main__':
    rospy.init_node('tf2_gripper_listener')

    tfBuffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tfBuffer)

    gripper_coordinates = rospy.Publisher('/gripper_coordinates', geometry_msgs.msg.PoseStamped, queue_size=1)

    rate = rospy.Rate(10.0)

    while not rospy.is_shutdown():
        try:
            gripper_id = FrameStamped.from_xyz_rpy(0, 0, 0, 0, 0, 0, rospy.Time(), frame_id="hand_palm_link")  # generates the frame of the gripper
            gripper_bl = tfBuffer.lookup_transform("base_link", "hand_palm_link", rospy.Time())  # calculates the transform between the origin of the base frame and that of the gripper frame

            # PoseStamped coordinates:
            # the position and orientation of the gripper expressed in the robot's base frame
            msg = geometry_msgs.msg.PoseStamped()
            msg.header = gripper_bl.header
            msg.pose.position.x = gripper_bl.transform.translation.x
            msg.pose.position.y = gripper_bl.transform.translation.y
            msg.pose.position.z = gripper_bl.transform.translation.z
            msg.pose.orientation = gripper_bl.transform.rotation

            # publish the coordinates
            gripper_coordinates.publish(msg)

        except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
            rate.sleep()
            continue

        rate.sleep()
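Design note: tf2's Python Buffer.lookup_transform also accepts a timeout argument, which blocks until the transform becomes available instead of relying solely on the retry-on-exception loop above. A minimal variant of the lookup, assuming a one-second wait is acceptable:

            gripper_bl = tfBuffer.lookup_transform("base_link", "hand_palm_link",
                                                   rospy.Time(), rospy.Duration(1.0))  # wait up to 1 s

It still raises the same tf2 exceptions when the timeout expires, so the except clause remains useful.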
Review comment (on the hardcoded model_path in cutlery_detector.py):
Please use ~/data/pytorch_models as folder.
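One way to act on this comment would be to read the model location from a ROS parameter, defaulting to the requested folder instead of the hardcoded user-specific path; the parameter name ~model_path below is an assumption, and the weights filename is taken from the diff above.

    import os
    import rospy
    from ultralytics import YOLO

    # Hypothetical sketch: resolve the YOLO weights from a private parameter,
    # defaulting to the folder requested in the review
    model_path = os.path.expanduser(
        rospy.get_param("~model_path", "~/data/pytorch_models/yolov8x-seg.pt"))
    model = YOLO(model_path).to("cuda")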