Testing Epic #5

Open · 3 of 6 tasks
MTCam opened this issue May 27, 2020 · 0 comments

Overview

This issue summarizes our discussions about our testing needs, our current capabilities, and any gaps between the two.

For testing emirge, we need the following high-level capabilities (please add):

  • System for adding, building, and running tests during the development cycle
  • Automated build and test on remote HPC platforms
  • Testing dashboard
  • Test coverage
  • Performance data collection & analysis
  • Performance/benchmarking over time

Automated build and test on remote HPC platforms & dashboard with TEESD

What can we currently do and what do we need?
TEESD currently uses CTest and CDash to provide the following capabilities:

  • Letting developers (quickly) run selected tests
  • CI-type tests: detect changes in the repository, pull those changes, then configure, build, and test the project on all remote HPC platforms we care about. Those platforms are currently Quartz, Lassen, and the Illinois-local MachineShop (dunkle, et al.). A minimal sketch of this loop follows the list.
  • Regular, nightly tests on each platform
  • Publication of test results to a project-public testing dashboard
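For concreteness, here is a minimal sketch of that detect/pull/configure/build/test cycle. The repository handling, branch name, and the "nightly.cmake" dashboard script are illustrative assumptions, not TEESD's actual driver:

```python
# Hypothetical sketch of a CI-type polling loop. The branch name and
# "nightly.cmake" dashboard script are illustrative assumptions.
import subprocess

def git(repo, *args):
    return subprocess.run(["git", "-C", repo, *args],
                          capture_output=True, text=True,
                          check=True).stdout.strip()

def poll_and_test(repo, branch="main"):
    git(repo, "fetch", "origin")
    if git(repo, "rev-parse", branch) == git(repo, "rev-parse", f"origin/{branch}"):
        return  # no new commits; nothing to do
    git(repo, "merge", "--ff-only", f"origin/{branch}")
    # Configure, build, test, and submit results in CTest dashboard-script mode
    subprocess.run(["ctest", "-S", "nightly.cmake"], cwd=repo, check=True)
```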

TEESD uses ABaTe constructs to provide the following:

  • Monitoring of multiple projects/branches
  • Environment management (works with Conda, pip, modules, Spack)
  • Grouping of tests into testing suites
  • Platform-specific batch-system navigation
  • Inclusion of raw and image data in test results

Test Coverage

What do we want our coverage tools to do at the top/emirge level? What are the capabilities we currently have and what are our options?
Our current coverage strategy uses gcov: it covers C, C++, and F90, but not Python, and submits coverage reports to the dashboard.
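One way to close the Python gap would be coverage.py. A minimal sketch, assuming pytest is the Python test runner and the package under test is importable as emirge (both assumptions); the result is a Cobertura-style XML report of the kind dashboards such as CDash can typically parse:

```python
# Sketch only: Python-side coverage with coverage.py and pytest.
# The package name "emirge" and the "test/" directory are assumptions.
import coverage
import pytest

cov = coverage.Coverage(source=["emirge"])
cov.start()
pytest.main(["test/"])          # run the Python test suite under coverage
cov.stop()
cov.save()
cov.xml_report(outfile="python_coverage.xml")  # Cobertura-style XML report
```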

Performance data collection & analysis

What do we need and what do we have?
We have an in-house Profiler Python package. It is integrated with TEESD and does the following simple things:

  • Times user-defined code constructs (in a naive way; see the sketch after this list)
  • Generates timing reports in gnuplot-ready text format
  • Collects MPI-parallel timings with or without barriers
  • Plots performance results with gnuplot (through customized tests) and submits them to the dashboard
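To make "naive" concrete, here is a minimal sketch of user-delimited timing in the general shape the package provides. The names here are illustrative, not Profiler's actual API:

```python
# Illustrative sketch of naive, user-delimited timing; this is NOT the
# actual Profiler API, just the general shape of the capability.
import time
from contextlib import contextmanager

_timings = {}

@contextmanager
def timed(name):
    t0 = time.perf_counter()
    try:
        yield
    finally:
        _timings.setdefault(name, []).append(time.perf_counter() - t0)

def write_gnuplot_report(path="timings.dat"):
    # One "label min max mean" row per construct, plottable with gnuplot
    with open(path, "w") as f:
        for name, ts in _timings.items():
            f.write(f"{name} {min(ts)} {max(ts)} {sum(ts) / len(ts)}\n")
```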

What else is needed?

  • Better timing/profiling for PyOpenCL, or other asynchronous execution
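For the PyOpenCL case, one likely building block is OpenCL's built-in event profiling, which timestamps kernels on the device and therefore remains correct under asynchronous execution. A minimal sketch (the kernel is a trivial placeholder):

```python
# Sketch: timing a PyOpenCL kernel via OpenCL event profiling.
# The kernel itself is a trivial placeholder.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(
    ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)

prg = cl.Program(ctx, """
    __kernel void scale(__global float *x) { x[get_global_id(0)] *= 2.0f; }
""").build()

x = np.arange(1024, dtype=np.float32)
buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                hostbuf=x)

evt = prg.scale(queue, x.shape, None, buf)
evt.wait()
# Device-side timestamps are reported in nanoseconds
elapsed_s = (evt.profile.end - evt.profile.start) * 1e-9
print(f"kernel time: {elapsed_s:.3e} s")
```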

Benchmarking results over time

What do we have, and what do we need?
Airspeed Velocity (asv) is set up for use in TEESD, but we have added no emirge-specific benchmarks, and it is not integrated with the dashboard. What benchmarks should we have?
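As a starting point for anyone adding benchmarks: asv discovers them by naming convention (time_*, mem_*, etc.). A minimal sketch with a placeholder workload rather than anything emirge-specific:

```python
# benchmarks/benchmarks.py -- asv picks up time_*/mem_* members by name.
# The workload below is a stand-in; real benchmarks would exercise emirge.
import numpy as np

class TimeSuite:
    def setup(self):
        # setup() runs before each timing and is excluded from the result
        self.a = np.random.rand(1000, 1000)

    def time_matmul(self):
        self.a @ self.a

class MemSuite:
    def mem_array(self):
        # mem_* benchmarks report the size of the returned object
        return np.zeros(10**6)
```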
Needed:

  • Project-public results summary