# experiment_files

To use these experiment files, replace the board, remote, and image placeholders, then run crosperf on them.

Further information about crosperf: https://sites.google.com/a/google.com/chromeos-toolchain-team-home2/home/team-tools-and-scripts/crosperf-cros-image-performance-comparison-tool

The final experiment file should look something like the following (but with different actual values for the fields):

```
board: lumpy
remote: 123.45.67.089

# Add images you want to test:

my_image {
  chromeos_image: /usr/local/chromeos/src/build/images/lumpy/chromiumos_test_image.bin
}

vanilla_image {
  chromeos_root: /usr/local/chromeos
  build: lumpy-release/R35-5672.0.0
}

# Paste experiment benchmarks here. For example, `page_cycler_v2.morejs`:

# This experiment just runs a short autotest which measures the performance
# of Telemetry's `page_cycler_v2.morejs`. In addition, it profiles cycles.

perf_args: record -e cycles

benchmark: page_cycler_v2.morejs {
  suite: telemetry_Crosperf
  iterations: 1
}
```
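
Once the placeholders are filled in, pass the experiment file to the `crosperf` launcher in this directory. A minimal sketch, assuming the file above was saved as `morejs_experiment.exp` (the filename is hypothetical):

```
# Run crosperf on the experiment file; progress for each benchmark run
# is printed, and a report is produced when the runs complete.
./crosperf morejs_experiment.exp
```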

# default_remotes

This is the list of machines allocated for the toolchain team. It should be kept in sync with: https://chromeos-swarming.appspot.com/botlist?c=id&c=task&c=label-board&c=label-pool&c=os&c=status&d=asc&f=label-pool%3Atoolchain&k=label-pool&s=id
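
For illustration only, each line of the file maps a board to the machines reserved for it. The entry below is a hypothetical sketch of that `board : machine ...` layout; the board name and hostnames are made up, not the real allocation:

```
# <board> : <machine> [<machine> ...]
lumpy : machine1.example.cros machine2.example.cros
```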