README.md
experiment_files

To use these experiment files, replace the board, remote, and image placeholders, then run crosperf on them.

Further information about crosperf: https://sites.google.com/a/google.com/chromeos-toolchain-team-home2/home/team-tools-and-scripts/crosperf-cros-image-performance-comparison-tool

The final experiment file should look something like the following (but with different actual values for the fields):
    board: lumpy
    remote: 123.45.67.089

    # Add images you want to test:
    my_image {
      chromeos_image: /usr/local/chromeos/src/build/images/lumpy/chromiumos_test_image.bin
    }

    vanilla_image {
      chromeos_root: /usr/local/chromeos
      build: lumpy-release/R35-5672.0.0
    }

    # Paste the experiment benchmarks here. For example, pasting
    # `page_cycler_v2.morejs` gives an experiment that runs a short
    # autotest measuring the performance of Telemetry's
    # `page_cycler_v2.morejs`. In addition, it profiles cycles.
    perf_args: record -e cycles

    benchmark: page_cycler_v2.morejs {
      suite: telemetry_Crosperf
      iterations: 1
    }
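The experiment file above uses a simple `key: value` and `name { ... }` brace-block syntax. As a rough illustration of how such a file nests (this is only a sketch, not crosperf's actual parser in `experiment_file.py`, which handles more cases), it could be read into a dictionary like this:

```python
def parse_experiment(text):
    """Sketch: parse 'key: value' lines and 'name { ... }' blocks into nested dicts."""
    root = {}
    stack = [root]  # innermost open block is at the top of the stack
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        if line.endswith("{"):
            # Open a new named block, e.g. "my_image {"
            name = line[:-1].strip()
            block = {}
            stack[-1][name] = block
            stack.append(block)
        elif line == "}":
            stack.pop()  # close the current block
        else:
            key, _, value = line.partition(":")
            stack[-1][key.strip()] = value.strip()
    return root

example = """
board: lumpy
remote: 123.45.67.089
my_image {
  chromeos_image: /path/to/image.bin
}
"""
config = parse_experiment(example)
print(config["board"])  # lumpy
```

Top-level settings such as `board` and `perf_args` apply to the whole experiment, while each brace block (an image or a benchmark) carries its own settings.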
default_remotes

This is the list of machines allocated for the toolchain team. It should be kept in sync with: https://chromeos-swarming.appspot.com/botlist?c=id&c=task&c=label-board&c=label-pool&c=os&c=status&d=asc&f=label-pool%3Atoolchain&k=label-pool&s=id