Add benchmark tests for TMVA Sofie #239

Open — wants to merge 35 commits into base: master

Commits (35)
e56ae76
CMake module for ONNXRuntime and first inference benchmark with Linea…
fsossai Sep 7, 2021
a6933f5
Removed unintentional debug line
fsossai Sep 7, 2021
21a873c
Fixed typo
fsossai Sep 7, 2021
f4226b3
GBenchmarks generated automatically from ONNX models
fsossai Sep 7, 2021
31b4ff6
Shrinked ONNX gbenchs into a single function
fsossai Sep 9, 2021
f5e1d00
Removed debug prints
fsossai Sep 9, 2021
90cb831
Added precompiled models
fsossai Sep 13, 2021
8712318
Automated SOFIEInference, removed profiler inference
fsossai Sep 30, 2021
11880fe
Removed useless file
fsossai Oct 7, 2021
80ab813
Removed SOFIEInferenceProfiler.cxx
fsossai Oct 19, 2021
027c1d0
Draft modification for benchmarking sofie
lmoneta Oct 7, 2021
4b2a625
fix for new google benchmark version
lmoneta Nov 8, 2021
12782a2
Add new SOFIE benchmark tests based on the Session.
lmoneta Nov 8, 2021
29f8c03
Add new tests for GEMM with SOFIEINference
lmoneta Nov 9, 2021
601b05e
Add resnet to benchmarks
lmoneta Nov 20, 2021
90b97e4
Add RDF benchmark
lmoneta Nov 23, 2021
66a5dca
Improve test reporting for RDF benchmarks
lmoneta Nov 24, 2021
36aa3c6
Add emitFromONNX to generate automatically headers and weight file
lmoneta Nov 24, 2021
4523f65
Improve inference tests by reporting time/event
lmoneta Nov 24, 2021
48c94b0
revert change in root/io/io/TBufferMergerBenchmarks.cxx for correct u…
lmoneta Nov 25, 2021
072d46e
Add benchmarks for recurent models and 3d convolution
lmoneta Dec 17, 2021
bf3df8b
Add tests for Generator models from the fast sim project
lmoneta Mar 8, 2022
1577bb9
Add LWTNN inference test and Fast simulation model from G4
lmoneta Mar 9, 2022
01831b5
Add new version of SOFIEFunctor for RDF using variadic templates
lmoneta Mar 9, 2022
52cf35e
Add benchmark on CMS onnx file (DDB_B1.onnx) used with batch size=1
lmoneta Apr 3, 2022
cda8978
remove unneeeded include for lwtnn and ONNX benchmarks
lmoneta Apr 3, 2022
2d82b54
- add optimization vectorization options for SOFIE tests
lmoneta Jul 1, 2022
5457d8f
fix APPLE case
lmoneta Oct 18, 2022
9cb47ca
add conv2DTranspose model
lmoneta Oct 19, 2022
6d9c75f
add support for writing output result for debugging
lmoneta Oct 24, 2022
1233851
Add some new COnvTranspose models for benchmarking
lmoneta Nov 4, 2022
3ec1884
add ubuntu20 results
lmoneta Nov 4, 2022
c6c6f5a
update for new ONNXruntime version (13)
lmoneta Nov 23, 2022
0f948b0
add benchmark for SOfieReader
lmoneta Feb 1, 2023
2c342a8
[tmva][sofie] Apply fixes for new SOFIE in 6.32
lmoneta Jun 11, 2024
47 changes: 47 additions & 0 deletions cmake/modules/FindLWTNN.cmake
@@ -0,0 +1,47 @@
# Copyright (C) 1995-2019, Rene Brun and Fons Rademakers.
# All rights reserved.
#
# For the licensing terms see $ROOTSYS/LICENSE.
# For the list of contributors see $ROOTSYS/README/CREDITS.

# Find the LWTNN includes and library.
#
# This module defines
# LWTNN_INCLUDE_DIR, where to locate the lwtnn headers (e.g. lwtnn/Graph.hh)
# LWTNN_LIBRARIES, the libraries to link against to use LWTNN
# LWTNN_FOUND. If false, you cannot build anything that requires LWTNN.
# LWTNN_LIBRARY, where to find the libLWTNN library.

set(LWTNN_FOUND 0)
if(LWTNN_LIBRARY AND LWTNN_INCLUDE_DIR)
set(LWTNN_FIND_QUIETLY TRUE)
endif()

find_path(LWTNN_INCLUDE_DIR lwtnn/Graph.hh
$ENV{LWTNN_DIR}/include
$ENV{LWTNN} $ENV{LWTNN}/include
/usr/local/include
/usr/include
DOC "Specify the directory containing lwtnn/Graph.hh"
)

find_library(LWTNN_LIBRARY NAMES lwtnn PATHS
$ENV{LWTNN_DIR}/lib
$ENV{LWTNN} $ENV{LWTNN}/lib $ENV{LWTNN}/.libs
/usr/local/lib
/usr/lib
/opt/LWTNN/lib
DOC "Specify the lwtnn library here."
)

if(LWTNN_INCLUDE_DIR AND LWTNN_LIBRARY)
set(LWTNN_FOUND 1 )
if(NOT LWTNN_FIND_QUIETLY)
message(STATUS "Found LWTNN includes at ${LWTNN_INCLUDE_DIR}")
message(STATUS "Found LWTNN library at ${LWTNN_LIBRARY}")
endif()
endif()

set(LWTNN_LIBRARIES ${LWTNN_LIBRARY})

mark_as_advanced(LWTNN_FOUND LWTNN_LIBRARY LWTNN_INCLUDE_DIR)
47 changes: 47 additions & 0 deletions cmake/modules/FindONNXRuntime.cmake
@@ -0,0 +1,47 @@
# Copyright (C) 1995-2019, Rene Brun and Fons Rademakers.
# All rights reserved.
#
# For the licensing terms see $ROOTSYS/LICENSE.
# For the list of contributors see $ROOTSYS/README/CREDITS.

# Find the ONNXRuntime includes and library.
#
# This module defines
# ONNXRuntime_INCLUDE_DIR, where to locate the ONNXRuntime headers (e.g. onnxruntime_cxx_api.h)
# ONNXRuntime_LIBRARIES, the libraries to link against to use ONNXRuntime
# ONNXRuntime_FOUND. If false, you cannot build anything that requires ONNXRuntime.
# ONNXRuntime_LIBRARY, where to find the libONNXRuntime library.

set(ONNXRuntime_FOUND 0)
if(ONNXRuntime_LIBRARY AND ONNXRuntime_INCLUDE_DIR)
set(ONNXRuntime_FIND_QUIETLY TRUE)
endif()

find_path(ONNXRuntime_INCLUDE_DIR onnxruntime_cxx_api.h
$ENV{ONNXRuntime_DIR}/include
$ENV{ONNXRuntime} $ENV{ONNXRuntime}/include
/usr/local/include
/usr/include
DOC "Specify the directory containing onnxruntime_cxx_api.h"
)

find_library(ONNXRuntime_LIBRARY NAMES onnxruntime PATHS
$ENV{ONNXRuntime_DIR}/lib
$ENV{ONNXRuntime} $ENV{ONNXRuntime}/lib $ENV{ONNXRuntime}/.libs
/usr/local/lib
/usr/lib
/opt/ONNXRuntime/lib
DOC "Specify the ONNXRuntime library here."
)

if(ONNXRuntime_INCLUDE_DIR AND ONNXRuntime_LIBRARY)
set(ONNXRuntime_FOUND 1 )
if(NOT ONNXRuntime_FIND_QUIETLY)
message(STATUS "Found ONNXRuntime includes at ${ONNXRuntime_INCLUDE_DIR}")
message(STATUS "Found ONNXRuntime library at ${ONNXRuntime_LIBRARY}")
endif()
endif()

set(ONNXRuntime_LIBRARIES ${ONNXRuntime_LIBRARY})

mark_as_advanced(ONNXRuntime_FOUND ONNXRuntime_LIBRARY ONNXRuntime_INCLUDE_DIR)
1 change: 1 addition & 0 deletions root/tmva/CMakeLists.txt
@@ -1 +1,2 @@
add_subdirectory(tmva)
add_subdirectory(sofie)
245 changes: 245 additions & 0 deletions root/tmva/sofie/CMakeLists.txt
@@ -0,0 +1,245 @@
# @author Federico Sossai (fsossai)


# Checking that all required models exist
if (NOT ONNX_MODELS_DIR)
set(ONNX_MODELS_DIR input_models)
endif()
file(GLOB ONNX_MODELS "${ONNX_MODELS_DIR}/*.onnx")

# Copying every ONNX model in the input directory to the build directory.
set(out_dir ${CMAKE_CURRENT_BINARY_DIR}/${ONNX_MODELS_DIR})
file(MAKE_DIRECTORY ${out_dir})
foreach(model ${ONNX_MODELS})
get_filename_component(fname ${model} NAME)
configure_file(${model} ${out_dir}/${fname} COPYONLY)
endforeach()

# Looking for ONNXRuntime
## To locate ONNXRuntime, configure CMake with
## -DONNXRuntime_INCLUDE_DIR=<onnxruntime location>/include -DONNXRuntime_LIBRARY=<onnxruntime location>/lib
find_package(ONNXRuntime)
if(ONNXRuntime_FOUND)
message(STATUS "Found ONNXRuntime (library is ${ONNXRuntime_LIBRARY}, libraries ${ONNXRuntime_LIBRARIES})")

# Configuring ONNXRuntimeInference_Template.cxx.in
set(FUNC_NAME "BM_ONNXRuntime_Inference")
set(CAPTURE_STR "BENCHMARK_CAPTURE(${FUNC_NAME}, @1,\t@2)@3")
set(HEAD_COMMENT "Automatically configured by CMake")
set(ALL_CAPTURES "")
foreach(model ${ONNX_MODELS})
get_filename_component(fname ${model} NAME)
get_filename_component(fname_we ${model} NAME_WE)
string(REPLACE "@1" ${fname_we} cap ${CAPTURE_STR})
string(REPLACE "@2" "\"${ONNX_MODELS_DIR}/${fname}\"" cap ${cap})
list(APPEND ALL_CAPTURES ${cap})
endforeach()
string(REPLACE ";" "\n" BENCHMARK_CAPTURES "${ALL_CAPTURES}") # String[] -> String
string(REPLACE "@3" "->Unit(benchmark::kMillisecond);" BENCHMARK_CAPTURES "${BENCHMARK_CAPTURES}") # Adding unit suffix and semicolon
configure_file(ONNXRuntimeInference_Template.cxx.in ONNXRuntimeInference.cxx @ONLY)

RB_ADD_GBENCHMARK(ONNXRuntimeInference
ONNXRuntimeInference.cxx
LABEL short
LIBRARIES TMVA ${ONNXRuntime_LIBRARIES}
)
target_link_directories(ONNXRuntimeInference PRIVATE ${ONNXRuntime_LIBRARIES})
target_include_directories(ONNXRuntimeInference PRIVATE ${ONNXRuntime_INCLUDE_DIR})

else()
message(STATUS "ONNXRuntime not found")
endif()



#--- TMVA/SOFIE
if(ROOT_tmva_FOUND AND ROOT_tmva-sofie_FOUND)


### this template-based path is currently not used
if (Use_SOFIE_TEMPLATE)

# Configuring SOFIEInference_Template.cxx.in
set(FUNC_NAME "BM_SOFIE_Inference")
set(CAPTURE_STR "BENCHMARK_CAPTURE(${FUNC_NAME}, @1,\t@2)@3")
set(INCLUDES_STR "#include @1")
set(FUNCS_STR "\t\t{ @1,\t{@2,\t@3} }")
set(HEAD_COMMENT "Automatically configured by CMake")
set(ALL_CAPTURES "")
set(ALL_INCLUDES "")
set(ALL_FUNCS "")
set(COMPILED_MODELS_DIR ${ONNX_MODELS_DIR}/compiled)
file(GLOB COMPILED_MODELS "${COMPILED_MODELS_DIR}/*.hxx")
set(inc "")
set(cap "")
set(funcs "")
foreach(model ${COMPILED_MODELS})
get_filename_component(fname ${model} NAME)
get_filename_component(fname_we ${model} NAME_WE)
# Fixing the string for the include headers
string(REPLACE "@1" "\"${COMPILED_MODELS_DIR}/${fname}\"" inc ${INCLUDES_STR})
list(APPEND ALL_INCLUDES ${inc})
# Fixing the string for the GBenchmark captures
string(REPLACE "@1" ${fname_we} cap ${CAPTURE_STR})
string(REPLACE "@2" "\"${fname_we}\"" cap ${cap})
list(APPEND ALL_CAPTURES ${cap})
# Fixing the string for the actual infer function that each capture will call
string(REPLACE "@1" "\"${fname_we}\"" funcs ${FUNCS_STR})
string(REPLACE "@2" "TMVA_SOFIE_${fname_we}::infer" funcs ${funcs})
string(REPLACE "@3" "0" funcs ${funcs})
list(APPEND ALL_FUNCS ${funcs})
endforeach()

# Transforming list of strings into a single multi-line string
string(REPLACE ";" "\n" BENCHMARK_CAPTURES "${ALL_CAPTURES}") # String[] -> String
string(REPLACE "@3" ";" BENCHMARK_CAPTURES "${BENCHMARK_CAPTURES}") # Adding semicolon
string(REPLACE ";" "\n" INCLUDE_HEADERS "${ALL_INCLUDES}") # String[] -> String
string(REPLACE ";" ",\n" FUNC_TUPLES "${ALL_FUNCS}") # String[] -> String
configure_file(SOFIEInference_Template.cxx.in SOFIEInference.cxx @ONLY)


endif()


# configure_file(input_models/compiled/Linear_event.hxx Linear_event.hxx COPYONLY)
# configure_file(input_models/compiled/Linear_event.dat Linear_event.dat COPYONLY)

# configure_file(input_models/compiled/Linear_16.hxx Linear_16.hxx COPYONLY)
# configure_file(input_models/compiled/Linear_16.dat Linear_16.dat COPYONLY)

# configure_file(input_models/compiled/Linear_32.hxx Linear_32.hxx COPYONLY)
# configure_file(input_models/compiled/Linear_32.dat Linear_32.dat COPYONLY)

# configure_file(input_models/compiled/Linear_64.hxx Linear_64.hxx COPYONLY)
# configure_file(input_models/compiled/Linear_64.dat Linear_64.dat COPYONLY)


# configure_file(input_models/compiled/Conv_d100_L1_B1.hxx Conv_d100_L1_B1.hxx COPYONLY)
# configure_file(input_models/compiled/Conv_d100_L1_B1.dat Conv_d100_L1_B1.dat COPYONLY)

# configure_file(input_models/compiled/Conv_d100_L14_B1.hxx Conv_d100_L14_B1.hxx COPYONLY)
# configure_file(input_models/compiled/Conv_d100_L14_B1.dat Conv_d100_L14_B1.dat COPYONLY)
# configure_file(input_models/compiled/Conv_d100_L14_B32.hxx Conv_d100_L14_B32.hxx COPYONLY)
# #use file B1 as B32 for weights : it is the same
# configure_file(input_models/compiled/Conv_d100_L14_B32.dat Conv_d100_L14_B32.dat COPYONLY)

# configure_file(input_models/compiled/resnet18v1.hxx resnet18v1.hxx COPYONLY)
# configure_file(input_models/compiled/resnet18v1.dat resnet18v1.dat COPYONLY)


add_executable(emitFromONNX
EmitFromONNX.cxx
)
#target_include_directories(emitFromONNX PRIVATE )
target_link_libraries(emitFromONNX ${Protobuf_LIBRARIES} Core ROOTTMVASofie ROOTTMVASofieParser)
set_target_properties(emitFromONNX PROPERTIES POSITION_INDEPENDENT_CODE TRUE)

if (NOT ONNX_MODELS_DIR)
set(ONNX_MODELS_DIR input_models)
endif()

add_custom_target(SofieCompileModels)
add_dependencies(SofieCompileModels emitFromONNX)


file(GLOB ONNX_FILES "${ONNX_MODELS_DIR}/*.onnx")
foreach(onnx_file ${ONNX_FILES})
#get_filename_component(fname ${onnx_file} NAME_WE)
#get_filename_component(fdir ${onnx_file} DIRECTORY)
add_custom_command(TARGET SofieCompileModels POST_BUILD
COMMAND ./emitFromONNX ${onnx_file}
USES_TERMINAL
)
endforeach()

find_package(BLAS)
if(BLAS_FOUND)
message(STATUS "Found BLAS ( libraries ${BLAS_LIBRARIES})")

#set(SOFIE_BLAS_LIBS /home/moneta/intel/mkl/lib/intel64/libmkl_intel_lp64.so /home/moneta/intel/mkl/lib/intel64/libmkl_sequential.so /home/moneta/intel/mkl/lib/intel64/libmkl_core.so -lpthread)
#set(SOFIE_BLAS_LIBS /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/System/Library/Frameworks/Accelerate.framework)

#
# To select a specific BLAS implementation, configure with cmake -DBLA_VENDOR=OpenBLAS
# (or Intel10_64lp_seq, Intel10_64lp). For Intel MKL the MKLROOT environment variable
# must also be set (see the CMake FindBLAS documentation), e.g. by sourcing
# . $dir/intel/mkl/bin/mklvars.sh intel64

set(SOFIE_BLAS_LIBS ${BLAS_LIBRARIES})


# Benchmark for models emitted by SOFIE
RB_ADD_GBENCHMARK(SOFIEInference
SOFIEInference.cxx
LABEL short
LIBRARIES TMVA ROOTTMVASofie ${SOFIE_BLAS_LIBS}
)

add_dependencies(SOFIEInference SofieCompileModels)

RB_ADD_GBENCHMARK(RDF_SOFIE_Inference
RDF_SOFIE_Inference.cxx
LABEL short
LIBRARIES Core Hist Imt RIO Tree TreePlayer ROOTDataFrame ROOTVecOps TMVA ROOTTMVASofie ${SOFIE_BLAS_LIBS}
)

add_dependencies(RDF_SOFIE_Inference SofieCompileModels)

RB_ADD_GBENCHMARK(SOFIEInference_Reader
SOFIEInference_Reader.cxx
LABEL short
LIBRARIES Core Cling TMVA ROOTTMVASofie ${SOFIE_BLAS_LIBS}
)

add_dependencies(SOFIEInference_Reader SofieCompileModels)

#
# add optimization flags for best performance (a factor of ~3 on the simple Conv1 test)
#
#if (ROOT_PLATFORM MATCHES "linux|macosx" AND CMAKE_SYSTEM_PROCESSOR MATCHES x86_64 AND CMAKE_CXX_COMPILER_ID MATCHES "GNU|Clang")
## assume we run only on Linux/macOS with GCC or Clang
set(gnu-flags $<$<CXX_COMPILER_ID:GNU>:-fno-signaling-nans>)
if (APPLE)
target_compile_options(SOFIEInference PRIVATE ${gnu-flags} -ffast-math -fno-trapping-math -O3)
target_compile_options(RDF_SOFIE_Inference PRIVATE ${gnu-flags} -ffast-math -fno-trapping-math -O3)
else()
target_compile_options(SOFIEInference PRIVATE ${gnu-flags} -march=native -ffast-math -fno-trapping-math -O3)
target_compile_options(RDF_SOFIE_Inference PRIVATE ${gnu-flags} -march=native -ffast-math -fno-trapping-math -O3)
endif()

endif() # endif blas
endif() # endif TMVA/SOFIE

find_package(LWTNN QUIET)
if (LWTNN_FOUND)

message(STATUS "Found LWTNN (library is ${LWTNN_LIBRARY}, libraries ${LWTNN_LIBRARIES})")
configure_file(input_models/higgs_model_dense.json higgs_model_dense.json COPYONLY)
configure_file(input_models/Generator.json.gz Generator.json.gz COPYONLY)
execute_process(COMMAND gunzip -f ${CMAKE_CURRENT_BINARY_DIR}/Generator.json.gz)
# set(LWTNN_INCLUDE_DIR /home/moneta/cernbox/root/tests/tmva/sofie/lwtnn-build/include)
# set(LWTNN_LIBS /home/moneta/cernbox/root/tests/tmva/sofie/lwtnn-build/lib/liblwtnn.so)
RB_ADD_GBENCHMARK(LWTNNInference
LWTNNInference.cxx
LABEL short
LIBRARIES Core Hist Imt RIO Tree TreePlayer ROOTDataFrame ROOTVecOps TMVA ROOTTMVASofie ${LWTNN_LIBRARY})
target_include_directories(LWTNNInference PRIVATE ${LWTNN_INCLUDE_DIR})

RB_ADD_GBENCHMARK(RDF_lwtnn_Inference
RDF_lwtnn_Inference.cxx
LABEL short
LIBRARIES Core Hist Imt RIO Tree TreePlayer ROOTDataFrame ROOTVecOps TMVA ROOTTMVASofie ${LWTNN_LIBRARY})
target_include_directories(RDF_lwtnn_Inference PRIVATE ${LWTNN_INCLUDE_DIR})
else()
message(STATUS "LWTNN not found")
endif()

if (ONNXRuntime_FOUND)
configure_file(input_models/higgs_model_dense.onnx higgs_model_dense.onnx COPYONLY)
RB_ADD_GBENCHMARK(RDF_ONNXRuntime_Inference
RDF_ONNXRuntime_Inference.cxx
LABEL short
LIBRARIES Core Hist Imt RIO Tree TreePlayer ROOTDataFrame ROOTVecOps TMVA ROOTTMVASofie ${ONNXRuntime_LIBRARIES}
)
target_link_directories(RDF_ONNXRuntime_Inference PRIVATE ${ONNXRuntime_LIBRARIES})
target_include_directories(RDF_ONNXRuntime_Inference PRIVATE ${ONNXRuntime_INCLUDE_DIR})
endif()
Binary file added root/tmva/sofie/Conv_d100_L14_B32.onnx
Binary file not shown.
29 changes: 29 additions & 0 deletions root/tmva/sofie/EmitFromONNX.cxx
@@ -0,0 +1,29 @@
// Author: Federico Sossai
// Last modified: 2021/07/30
// Description:
// SOFIE command line compiler.
// This program is automatically run when the corresponding test target is built.
// Usage example: $./EmitFromONNX indir/mymodel.onnx outdir/myname.hxx

#include <iostream>

#include "TMVA/RModel.hxx"
#include "TMVA/RModelParser_ONNX.hxx"

using namespace TMVA::Experimental::SOFIE;

int main(int argc, char *argv[]){
if (argc < 2) {
std::cerr << "ERROR: missing input file\n";
return -1;
}

std::string outname = (argc > 2) ? argv[2] : "";
RModelParser_ONNX parser;
std::cout << "Parsing file " << argv[1] << std::endl;
RModel model = parser.Parse(argv[1]);
model.Generate(Options::kDefault, 1);
model.PrintRequiredInputTensors();
model.OutputGenerated(outname);
return 0;
}