
Questions about camera settings #229

Open
wagh311 opened this issue Jul 12, 2024 · 4 comments
Labels
question Further information is requested

Comments


wagh311 commented Jul 12, 2024

Hello @JacopoPan, I have three questions for you about the camera:

1. If I want to achieve obstacle avoidance with a drone and I only have a forward-looking camera, does that mean the drone's velocity direction must be consistent with the camera's direction? Otherwise, if the drone moves backwards, it cannot obtain the image behind it, so will it be unable to avoid obstacles?
2. I used depth images in the simulation, so I was wondering whether a Crazyflie drone would be able to carry a depth camera in a real application. I looked up relevant information on the Internet and saw that the Crazyflie only has a matching grayscale camera. Do you have more information about this?
3. About ActionType.VEL: are its first three elements expressed in the world coordinate system or the body coordinate system?

Looking forward to your reply~~

@JacopoPan (Member)

Hi @wagh311

1. The rotation matrix of the on-board images can be changed on this line (just above the `#### Set target point, camera view and projection matrices` comment):
   `rot_mat = np.array(p.getMatrixFromQuaternion(self.quat[nth_drone, :])).reshape(3, 3)`
2. I agree that the best available vision deck for the Crazyflie is the AI deck; alternatively you have the Multi-ranger, but it only works in 4 directions. In general, also considering the limited memory/computational budget, it would be difficult to move vision-based applications to a real CF (there are many more physical constraints than those captured by this simulation).
3. Should be in the world frame, but you can modify it here as needed:
   `target_vel=self.SPEED_LIMIT * np.abs(target_v[3]) * v_unit_vector  # target the desired velocity vector`
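For context, here is a minimal numpy sketch of what that quaternion-to-rotation-matrix line computes, and how composing an extra yaw rotation re-points the camera target. `quat_to_rot` is a hand-rolled stand-in for `p.getMatrixFromQuaternion` (PyBullet's x, y, z, w convention), not part of gym-pybullet-drones:

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from an (x, y, z, w) quaternion,
    matching PyBullet's getMatrixFromQuaternion convention."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# A forward-looking camera targets a point far along the body +x axis.
forward = np.array([1000.0, 0.0, 0.0])

hover_quat = np.array([0.0, 0.0, 0.0, 1.0])       # level hover, identity rotation
front_target = quat_to_rot(hover_quat) @ forward  # points along world +x

# To look backwards instead, compose a 180-degree rotation about z (yaw).
yaw_180 = quat_to_rot(np.array([0.0, 0.0, 1.0, 0.0]))
rear_target = quat_to_rot(hover_quat) @ yaw_180 @ forward  # points along world -x
```

Feeding `rear_target` (offset by the drone's position) into the camera view-matrix computation would render the rear view, which is what changing `rot_mat` achieves.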

@JacopoPan added the `question` (Further information is requested) label Jul 31, 2024

wagh311 commented Jul 31, 2024

Hi @JacopoPan, thank you for your reply. I still have the following questions:
1. In my task, the drone is only equipped with a forward-looking camera. Therefore, although the rear image can be obtained by changing rot_mat in gym-pybullet-drones, in a real scene I think the rear image information cannot be obtained when only a front-looking camera exists. Is my understanding correct?
2. If the Crazyflie can't carry a depth camera, can a policy trained on the Crazyflie be transferred to other drones that can carry depth cameras in real environments? Will different types of drones affect sim2real?

@JacopoPan (Member)

1. Yes, if you don't have a gimbaled camera, you want to use yaw control.
2. On most DIY drones, you are probably running the control policy through an autopilot like PX4, ArduPilot, or Betaflight, so you are not trying to learn motor commands but rather something like collective thrust and body-rate references.
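Tying this back to the earlier point that the first three elements of the ActionType.VEL target should be in the world frame: a hedged sketch of mapping between world and body frames with a yaw-only rotation. The helper name `world_to_body_vel` is illustrative, not part of the library, and it assumes small roll/pitch:

```python
import math
import numpy as np

def world_to_body_vel(v_world, yaw):
    """Rotate a world-frame velocity into the body (yaw-only) frame.

    Illustrative helper: if a velocity action is expected in the world
    frame, a body-frame command [forward, left, up] can be converted the
    other way with the transpose. Assumes small roll/pitch.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    R_yaw = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])  # body -> world rotation for pure yaw
    return R_yaw.T @ np.asarray(v_world, dtype=float)

# Drone yawed 90 degrees left: a world +y velocity is "straight ahead" in body frame.
v_body = world_to_body_vel([0.0, 1.0, 0.0], yaw=math.pi / 2)
```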


wagh311 commented Jul 31, 2024

Hi @JacopoPan, thank you for your reply.
I want to use velocity input to control the UAV. Is there any command in gym-pybullet-drones that can keep the UAV's yaw angle always consistent with its velocity direction? If not, can you give me some ideas for an implementation? Thank you!

In my mission, I want to use a camera-equipped UAV to track a moving target via velocity inputs, and there are obstacles in the environment. Therefore, I am worried that when the camera's direction is inconsistent with the drone's velocity direction, the obstacle information may not be obtained, resulting in a collision. In AirSim, I solved this problem with DrivetrainType.ForwardOnly. So I am still confused: how do I solve this problem in gym-pybullet-drones? In other words, how do I make sure the drone always gets correct information about the obstacles?
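One common way to approximate AirSim's ForwardOnly behavior is to derive the yaw setpoint from the commanded velocity, so the forward-looking camera faces the direction of travel. A minimal sketch, where `yaw_from_velocity` and the `min_speed` hover band are illustrative choices, not part of the gym-pybullet-drones API:

```python
import math

def yaw_from_velocity(vx, vy, current_yaw, min_speed=0.05):
    """Yaw setpoint (rad) that faces the horizontal velocity direction.

    Below min_speed (m/s) the heading is ill-defined, so hold the
    current yaw instead of chasing numerical noise. Illustrative
    helper, not part of the gym-pybullet-drones API.
    """
    if math.hypot(vx, vy) < min_speed:
        return current_yaw
    return math.atan2(vy, vx)

# Flying along +y: the camera should face +y, i.e. yaw = pi/2.
yaw = yaw_from_velocity(0.0, 1.0, current_yaw=0.0)

# Near-hover: keep the current heading.
hold = yaw_from_velocity(0.001, 0.0, current_yaw=1.0)
```

The resulting yaw could then be passed as the yaw component of the target orientation in whatever position/velocity controller you use, so heading and velocity stay aligned as in ForwardOnly.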
