Environment
- Gazebo version: binary Gazebo Garden from official apt (also tested Harmonic and Fortress)
- Hardware: CPU with 12 threads / GPU NVIDIA RTX 3060
- Rendering engine: Ogre2
- Running on real hardware, on a single-GPU machine
Description
I run a simulation using the Fuel collection world together with the Fuel X1 robot model. Initially I added the diff-drive plugin for keyboard control, and the robot drove with no problems. Next I opened the image viewer for the two cameras in the Gazebo GUI, and then subscribed to the two camera topics and the IMU topic with the `gz` CLI to track the data frequency. After subscribing to the cameras, the overall RTF started to decrease, and the actual camera rate fell below the update rate set in the SDF file (for example, 5-6 fps instead of the configured 20 fps). I tried other versions (Fortress and Harmonic); they all behave the same way, except that Fortress runs faster under the same conditions.
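For reference, the camera rate I mean is the sensor's `<update_rate>` in the SDF file. A minimal sketch of such a sensor block (the sensor name, topic, and image settings here are placeholders, not the actual X1 model values):

```xml
<sensor name="front_camera" type="camera">
  <!-- Requested frame rate in Hz. With two cameras subscribed, the rate
       actually achieved drops to 5-6 Hz even though 20 is requested here. -->
  <update_rate>20</update_rate>
  <topic>front_camera/image</topic>
  <camera>
    <horizontal_fov>1.047</horizontal_fov>
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>100</far>
    </clip>
  </camera>
</sensor>
```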
The problem is that in every version, even when I increase the camera fps in the SDF file, CPU usage never rises above about 30 percent (across all threads).
The same happens with the GPU: the card has 12 GB of VRAM, but only 2.3-2.5 GB is ever used, in any scenario.
I want CPU/GPU utilization to scale with the demands of the task, for example when I increase the sensor fps or add new algorithms. For me this does not happen: usage always stays at about 30 percent CPU and about 30 percent GPU. I understand that I can remove shadows, reduce collision checking, and otherwise optimize the simulation itself, and that does work, but I don't want to optimize the simulation; I want it to use more of the available resources.
So how can this be done?
I haven't found much information on this topic, apart from tuning the physics parameters; I'm attaching those below.
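For completeness, these are the kinds of physics parameters I mean, as they appear in a world SDF file (the values shown are the SDF defaults, not necessarily what my world uses):

```xml
<physics name="default_physics" type="ode">
  <!-- Simulation step size in seconds; smaller steps cost more CPU per simulated second. -->
  <max_step_size>0.001</max_step_size>
  <!-- Target real-time factor; 1.0 means the server aims for wall-clock speed. -->
  <real_time_factor>1.0</real_time_factor>
</physics>
```

As far as I can tell, the server update rate can also be overridden at launch (e.g. `gz sim -z <Hz>`, if your version supports that flag), but that only changes how often steps are attempted, not how much hardware a single step uses.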