Low Frame Rate Issue with Concurrent Lidar and Camera #15493
Comments
> lidar frame rate dropping from the original 10 frames to only 2 or 3 frames

I'm not sure what the problem is. To start, you can add logs and check whether the lidar module frequency is 10 Hz; then we can narrow down the problem. By the way, you can collect each module's time cost with `PERF_BLOCK("filter_bank")`. It saves all the logs to a file called "PERF.info.log", which you can then analyze with https://github.com/ApolloAuto/apollo/blob/master/modules/tools/aperf/README.md
@daohu527 The issue only occurs when I launch camera_detection_multi_stage_yolox3d.dag and lidar_detection.dag from the same launch file. If I launch them individually at the same time with two separate launch files, lidar_detection takes about 100 ms. But if I want to use fusion, the issue seems unavoidable, because I need to launch them in the same file.
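For reference, a standalone launch file for the lidar side might look roughly like this (a sketch only; the exact dag path, module name, and launch schema can differ between Apollo versions):

```xml
<!-- lidar_perception.launch: runs lidar detection in its own mainboard process -->
<cyber>
  <module>
    <name>lidar_detection</name>
    <!-- path is illustrative; use the dag file from your Apollo tree -->
    <dag_conf>/apollo/modules/perception/lidar_detection/dag/lidar_detection.dag</dag_conf>
    <process_name>lidar_detection</process_name>
  </module>
</cyber>
```

Because each launch file starts its own process, running the camera dag from a second launch file keeps the two pipelines from sharing one process, which matches the observation that the frame rate stays normal in that setup.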
Which Apollo version do you use? I checked the code: perf.log is on by default. Line 43 in ba32187
The reason this is used is that it records the time of each module, so I suggest you use aperf to analyze!
@daohu527
@HandsomeAIccx You can refer to that to modify and add what you need.
Version: Apollo 9.0
Hardware: X86 and Orin
Issue Details: I am experiencing an issue where launching lidar and camera perception using the same launch file results in the lidar frame rate dropping from the original 10 frames to only 2 or 3 frames. However, if I use two separate launch files to start the lidar and camera simultaneously, the frame rate remains normal.
Does this indicate that running camera perception and lidar perception in the same process leads to GPU resource contention, whereas separate processes do not encounter this issue?
Currently, I want to perform multi-sensor fusion, for example with perception_all.launch, which starts lidar perception and camera perception within the same process. I wish to separate the two. But because the channels they publish are internal channels, they cannot be seen in cyber_monitor. Therefore, to start multi_sensor_fusion, it seems I have to run everything from the same launch file, and then the frame rate is poor. What should I do? Is it possible to use msg_adapter to do something helpful? Are there any examples available?