Do I need to remake the dataset using your method? #20

yys-abu opened this issue Jun 28, 2024 · 12 comments

yys-abu commented Jun 28, 2024

I want to train my dexterous hand to grasp objects, and my own dexterous hand is the INSPIRE-ROBOT dexterous hand. Can I use your dataset directly? You mention in the abstract of your paper: "In this work, we present a large-scale robotic dexterous grasp dataset, DexGraspNet, generated by our proposed highly efficient synthesis method that can be generally applied to any dexterous hand."
If I must remake the dataset for my own dexterous hand, can I use the code directly?
Looking forward to your reply, thank you!

mzhmxzh (Collaborator) commented Jun 30, 2024

Yes, it is recommended to regenerate the dataset for your own dexterous hand rather than retargeting the current dataset from the Shadow Hand to your chosen hand. You can follow these steps to customize the codebase for new hands:

  1. Approximate the collision mesh of each finger bone with a capsule, since we only provide accelerated computations for capsules. Approximate the collision mesh of the palm with a box. I think these approximations are reasonable for the INSPIRE-ROBOT hand.
  2. Manually sample some contact candidates on the surface of the collision mesh. These points should lie in the regions where the hand is supposed to make contact with the object during grasping.
  3. Manually assign some penetration key points inside each finger bone and determine their radius.
  4. Determine a canonical hand pose (wrist translation, wrist rotation, joint angles) for initialization.
  5. Run the data generation script and data validation script.

As a reference, it took me roughly 3 hours to write a version for the Allegro hand. You can check it out in the allegro branch of this repo and compare it with the main branch to see what sections need to be modified. Use the visualization scripts in the grasp_generation/tests folder to visualize the contact candidates, penetration key points, initialization, and generated grasps.
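The steps above could be sketched as a per-hand configuration. A minimal illustration, assuming a hypothetical dict-based schema — the field names, link names, and numeric values here are invented for illustration and are not the repo's actual format:

```python
import numpy as np

# Hypothetical per-hand configuration (illustrative schema, not the repo's).
hand_config = {
    "link_names": ["palm", "thumb_distal", "index_distal"],
    # Step 2: contact candidates sampled on each link's collision surface,
    # expressed in that link's local frame.
    "contact_candidates": {
        "thumb_distal": np.array([[0.0, 0.005, 0.015],
                                  [0.0, 0.005, 0.020]]),
        "index_distal": np.array([[0.0, -0.005, 0.015]]),
    },
    # Step 3: penetration keypoints as (center, radius) spheres placed
    # inside each finger bone.
    "penetration_keypoints": {
        "thumb_distal": [(np.array([0.0, 0.0, 0.010]), 0.006)],
        "index_distal": [(np.array([0.0, 0.0, 0.010]), 0.006)],
    },
    # Step 4: canonical pose for initialization — wrist translation,
    # wrist rotation, and per-joint angles (6 joints assumed here).
    "init_translation": np.zeros(3),
    "init_rotation": np.eye(3),
    "init_joint_angles": np.zeros(6),
}

# Sanity check: every link that has contact candidates is a known link.
for link in hand_config["contact_candidates"]:
    assert link in hand_config["link_names"]
```

Visualizing each of these pieces (as suggested with the scripts in grasp_generation/tests) before running generation makes mistakes in frames or units much easier to catch.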

yys-abu (Author) commented Jul 1, 2024

> Yes, it is recommended to regenerate the dataset using your own dexterous hand than to retarget the current dataset from Shadow Hand to your chosen hand. […]

I am very pleased to receive your reply. I now have a URDF of the INSPIRE-ROBOT dexterous hand, exported from SolidWorks, but I am not sure whether it can be used. If possible, may I send you the INSPIRE-ROBOT hand's URDF? And can you tell me how to generate contact key points and penetration key points?
Looking forward to your reply, thank you!

jianguo-cmd commented

I changed the hand model and set contact points and penetration points. However, the generated grasping poses are always incorrect and there is penetration. What could be the possible reasons for this? Thank you!
[screenshots: 20240704-141813, 20240704-141808, 20240704-141848]

yys-abu (Author) commented Jul 4, 2024

> I changed the hand model, set contact points and penetration points. However, the generated grasping poses are always incorrect and there is a phenomenon of penetration. […]

Your problem may be caused by an incorrect initialization of the hand pose. My own hand model has not been imported successfully yet. Can you share your hand URDF, or how you generate contact_points?
Looking forward to your reply, thank you!

jianguo-cmd commented

> Your problem may be caused by incorrect initialization of hand posture. My hand model has not been successfully imported yet. Can you share your hand URDF or how to generate contact_points? […]

If you have a 3D model of a hand, you can export a URDF model from SolidWorks, and optionally convert the URDF into XML (I use MuJoCo for this conversion). The contact points are manually selected using 3D sketches within the 3D model of the hand, and their coordinate positions are relative to each finger's coordinate system. Currently, this is the method I'm using, but there might be more efficient ways to do it. I hope this information is helpful to you.
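As a rough sketch of what "coordinate positions relative to each finger's coordinate system" means in practice: once forward kinematics gives each link's pose, the locally sketched contact points can be mapped into the world frame with a homogeneous transform. The point values and the link pose below are made up for illustration:

```python
import numpy as np

def transform_points(T, points):
    """Apply a 4x4 homogeneous transform T to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (T @ homogeneous.T).T[:, :3]

# Contact points picked in a finger link's local frame, e.g. via a 3D sketch
# in SolidWorks (values are illustrative).
local_contacts = np.array([[0.0, 0.005, 0.015],
                           [0.0, 0.005, 0.020]])

# Example link pose from forward kinematics: 90-degree rotation about x,
# then a translation.
T_link = np.eye(4)
T_link[:3, :3] = np.array([[1.0, 0.0, 0.0],
                           [0.0, 0.0, -1.0],
                           [0.0, 1.0, 0.0]])
T_link[:3, 3] = [0.1, 0.0, 0.2]

world_contacts = transform_points(T_link, local_contacts)
```

Plotting `world_contacts` against the object mesh is a quick way to confirm the points really lie on the intended surfaces of the collision geometry.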

jianguo-cmd commented

> Your problem may be caused by incorrect initialization of hand posture. My hand model has not been successfully imported yet. Can you share your hand URDF or how to generate contact_points? […]

Regarding finger initialization, I've adopted the approach of starting with all fingers fully extended at zero degrees as the initial position. Upon comparing the initialization codes (initializations.py) of Allegro and Shadow robotic hands, I've noticed differences in the rotation_hand initialization as well. Are there any specific strategies to consider for these initialization methods? How should I determine the appropriate initialization based on my hand model? I look forward to your response and greatly appreciate your assistance.

yys-abu (Author) commented Jul 6, 2024

> Regarding finger initialization, I've adopted the approach of starting with all fingers fully extended at zero degrees as the initial position. […] How should I determine the appropriate initialization based on my hand model? […]

I think the hand should face the object during initialization, but I haven't found the specific code that controls this yet. Have you found a solution?
Looking forward to your reply, thank you!
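One generic way to build an initial wrist rotation that faces the object — a sketch of the idea, not the repo's actual initialization code — is a shortest-arc rotation taking the palm axis onto the hand-to-object direction (Rodrigues' formula):

```python
import numpy as np

def look_at_rotation(hand_pos, object_pos, palm_axis=np.array([0.0, 0.0, 1.0])):
    """Wrist rotation whose palm axis points from hand_pos toward object_pos.

    Generic 'face the object' construction; palm_axis is the palm normal in
    the hand's local frame (an assumption, hand-dependent in practice).
    """
    direction = object_pos - hand_pos
    direction = direction / np.linalg.norm(direction)
    v = np.cross(palm_axis, direction)         # rotation axis (unnormalized)
    c = float(np.dot(palm_axis, direction))    # cosine of the rotation angle
    if np.isclose(c, -1.0):
        # Opposite vectors: 180-degree flip (valid for the default z palm axis).
        return np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    # Rodrigues shortest-arc form: R = I + [v]x + [v]x^2 / (1 + c)
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

# Hand 30 cm along +x from an object at the origin: palm turns toward -x.
R = look_at_rotation(np.array([0.3, 0.0, 0.0]), np.zeros(3))
```

Adding a random roll about the approach direction on top of this keeps the initialization diverse while still facing the object.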

jianguo-cmd commented

[screenshot]
Perhaps I should modify this part? I made some changes, and the success rate of the generated grasping poses has improved slightly, but there are still instances of fingers penetrating the target object. I'm not sure what is causing this.

yys-abu (Author) commented Jul 9, 2024

> Perhaps I should modify this part? I made some changes, and the success rate of the generated grasping poses has improved slightly, but there are still instances of fingers penetrating through the target object. […]

I think that because the initial position of the hand is random, as many approach directions around the object as possible should be covered with data. From a bad position the hand can only force a grasp, and the penetration may be caused by the hand's mechanical structure combined with that bad position.

yys-abu (Author) commented Jul 10, 2024

> Perhaps I should modify this part? I made some changes, and the success rate of the generated grasping poses has improved slightly, but there are still instances of fingers penetrating through the target object. […]

Have you solved the penetration problem? I have found that for objects with thin bottoms the penetration is very serious; perhaps we should increase the weight of this part of the energy function.
Looking forward to your reply, thank you!
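Upweighting the penetration term could look roughly like this. A hedged sketch, assuming penetration keypoints are spheres and the object provides a signed distance (positive outside) at each sphere center; the function name and the weight value are illustrative, not the repo's actual energy implementation:

```python
import numpy as np

def penetration_energy(radii, sdf_values, weight=100.0):
    """Penalize keypoint spheres that dip below the object surface.

    radii:      (N,) sphere radius per penetration keypoint
    sdf_values: (N,) signed distance from each sphere center to the object
                surface, positive outside the object
    weight:     the knob one might raise for thin objects (illustrative value)
    """
    # Penetration depth of each sphere; zero when the sphere clears the surface.
    depth = np.maximum(radii - sdf_values, 0.0)
    return weight * float(np.sum(depth ** 2))

radii = np.array([0.006, 0.006])
# First keypoint is safely outside; the second penetrates by 2 mm.
sdf = np.array([0.010, 0.004])
energy = penetration_energy(radii, sdf)
```

A squared penalty like this grows smoothly with depth, so raising `weight` trades off penetration against the other energy terms rather than hard-rejecting grasps.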

jianguo-cmd commented

> I don't know if you have solved the penetration problem? I have found that for finer bottoms, the penetration problem is very serious […]

I haven't solved this problem. I've tried different initial poses, and the grasp results vary. Because each target object is different, setting a consistent initial orientation facing the target object doesn't work for all objects. As a result, the grasping pose isn't optimal for many objects, and there are still occurrences of penetration.

CodingCatMountain commented Aug 5, 2024

@yys-abu Hey, have you successfully used the Inspire hand in this project?
