3.6. Python Unit Testing¶
This document is currently just a PROPOSED testing standard. It is derived from the LSST document in the hopes of leveraging a common test framework.
This document was derived from version 6.0 of the LSST/DM Python Testing document (https://github.com/lsst-dm/dm_dev_guide/blob/master/python/testing.rst). External documents referenced in the original LSST/DM document have been partially imported as needed for clarity, or else now reference similarly modified Data Lab documents.
This page provides technical guidance to developers writing unit tests for
Data Lab’s Python code base. Tests should be written using the
unittest framework, with default test discovery, and should support
being run using the pytest test runner as well as from the command line.
3.6.1. Introduction to unittest¶
This document will not attempt to explain the full details of how to use
unittest, but instead shows common scenarios encountered in the codebase.
A minimal unittest example is shown below:
import unittest
import math


class DemoTestCase1(unittest.TestCase):
    """Demo test case 1."""

    def testDemo(self):
        self.assertGreater(10, 5)
        with self.assertRaises(TypeError):
            1 + "2"


class DemoTestCase2(unittest.TestCase):
    """Demo test case 2."""

    def testDemo1(self):
        self.assertNotEqual("string1", "string2")

    def testDemo2(self):
        self.assertAlmostEqual(3.14, math.pi, places=2)


if __name__ == "__main__":
    unittest.main()
The important things to note in this example are:

- Test file names must begin with test_ to allow pytest to automatically detect them without requiring an explicit test list, which can be hard to maintain and can lead to missed tests.
- If the test file is executed directly with python from the command line, the unittest.main() call performs the test discovery and executes the tests, setting the exit status to non-zero if any of the tests fail.
- Test classes are executed in the order in which they appear in the test file. In this case the tests in DemoTestCase1 will be executed before those in DemoTestCase2.
- Test classes must, ultimately, inherit from unittest.TestCase in order to be discovered, and the tests themselves must be methods of the test class with names that begin with test. All other methods and classes will be ignored by the test system but can be used by tests.
- Specific test asserts, such as assertIn(), should be used wherever possible. It is always better to use a specific assert because the error message will contain more useful detail and the intent is more obvious to someone reading the code. Only use assertTrue() or assertFalse() if you are checking a boolean value or a complex statement that is unsupported by other asserts.
- When testing that an exception is raised, always use assertRaises() as a context manager, as shown in the first test case of the example above.
- If a test method completes, the test passes; if it throws an uncaught exception, the test fails.
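As a brief illustration of the points above, the following sketch (the test case and its contents are invented for this example) contrasts a specific assert with the generic assertTrue(), and uses assertRaises() as a context manager:

```python
import unittest


class AssertStyleTestCase(unittest.TestCase):
    """Hypothetical test case illustrating specific asserts."""

    def testSpecificAsserts(self):
        names = ["alice", "bob"]
        # Preferred: on failure, assertIn reports the missing item
        # and the contents of the container.
        self.assertIn("alice", names)
        # Avoid: self.assertTrue("alice" in names) would only report
        # "False is not true" on failure.

    def testRaisesContextManager(self):
        # The context-manager form pinpoints exactly which statement
        # is expected to raise.
        with self.assertRaises(ZeroDivisionError):
            1 / 0
```

When run via pytest, both test methods are discovered and executed automatically.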
3.6.2. Supporting Pytest¶
pytest provides a rich execution and reporting environment for tests and can be used to run multiple test files together.
The pytest scheme for discovering tests inside Python modules is much more
flexible than that provided by unittest, but test files should not take
advantage of that flexibility, as it can lead to inconsistency in test
reports that depend on the specific test runner; it is also required that an
individual test file can be executed by running it directly with python.
In particular, care must be taken not to have free functions that use a
test prefix, or non-test classes that are named with a Test prefix, in the
test files.
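To illustrate the naming guidance, here is a minimal sketch (the helper names are invented for this example) of a test file whose helpers deliberately avoid the test/Test prefixes, so that pytest collects only the real test case:

```python
import unittest


class DataHelper:
    """Helper class; not named TestDataHelper, so pytest ignores it."""

    def make_value(self):
        return 42


def build_fixture():
    # Free helper function; not named test_fixture, so pytest does not
    # collect it as a test.
    return DataHelper()


class HelperUsageTestCase(unittest.TestCase):
    """The only class pytest should collect from this file."""

    def testHelper(self):
        self.assertEqual(build_fixture().make_value(), 42)
```

Both pytest and direct unittest discovery will then report exactly one test from this file, regardless of how many helpers it contains.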
3.6.3. Testing Flask Applications¶
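One common approach is to exercise a Flask application through its built-in test client from inside a unittest.TestCase, so no real HTTP server is needed. The sketch below assumes a small application factory; the create_app function and the /status route are invented for this example:

```python
import unittest

from flask import Flask


def create_app():
    """Hypothetical application factory for this example."""
    app = Flask(__name__)

    @app.route("/status")
    def status():
        # Flask converts a returned dict into a JSON response.
        return {"status": "ok"}

    return app


class StatusEndpointTestCase(unittest.TestCase):
    def setUp(self):
        app = create_app()
        app.testing = True  # propagate exceptions to the test
        self.client = app.test_client()

    def testStatus(self):
        resp = self.client.get("/status")
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.get_json(), {"status": "ok"})
```

In a real package the factory would live in the application module and be imported by the test file, keeping the test focused on request/response behavior.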
3.6.4. Common Issues¶
This section describes some common problems that are encountered when using pytest.
3.6.4.1. Testing global state¶
pytest can be invoked to run all files in the tests directory named
test_*.py. To ensure that the order of test execution does not matter, it is
sometimes useful to run the tests in reverse order by listing the test files
manually:
$ pytest `ls -r tests/test_*.py`
pytest plugins are usually all enabled by default.
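As a sketch of the kind of order dependence this guards against (the module-level _CACHE is invented for this example), a test that mutates shared state can silently affect later tests unless each test resets that state:

```python
import unittest

# Hypothetical module-level state shared by all tests in this file.
_CACHE = {}


class GlobalStateTestCase(unittest.TestCase):
    def setUp(self):
        # Reset shared state so every test starts from a known baseline,
        # making the tests pass in any execution order.
        _CACHE.clear()

    def testEmpty(self):
        self.assertEqual(_CACHE, {})

    def testPopulate(self):
        _CACHE["key"] = 1
        self.assertEqual(_CACHE, {"key": 1})
```

Without the setUp() reset, running testPopulate before testEmpty (for example, when the files are listed in reverse order) would make testEmpty fail.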
3.6.4.2. Test Skipping and Expected Failures¶
When writing tests it is important that tests are skipped using the proper
mechanisms rather than returning from the test early. unittest supports
skipping of individual tests and entire classes using decorators or skip
exceptions. It is also possible to indicate that a particular test is
expected to fail, with the expected failure being reported as an error if
the test unexpectedly passes. Expected failures can be used to write test
code that triggers a reported bug before the fix to the bug has been
implemented, without causing the continuous integration system to fail. One
of the primary advantages of using a modern test runner such as pytest is
that it is very easy to generate machine-readable pass/fail/skip/xfail
statistics to see how the system is evolving over time, and it is also easy
to enable code coverage.
Jenkins now provides test result information.
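The skip and expected-failure decorators mentioned above can be sketched as follows (the skip conditions and the deliberately failing assert are contrived for this example):

```python
import sys
import unittest


class SkipDemoTestCase(unittest.TestCase):
    @unittest.skip("demonstrating unconditional skipping")
    def testSkip(self):
        self.fail("this is never executed")

    @unittest.skipUnless(sys.platform.startswith("linux"),
                         "requires Linux")
    def testLinuxOnly(self):
        # Runs only on Linux; reported as skipped elsewhere.
        self.assertIn("linux", sys.platform)

    @unittest.expectedFailure
    def testKnownBug(self):
        # Placeholder triggering a reported bug; remove the decorator
        # once the fix has been implemented.
        self.assertEqual(1, 2)
```

pytest reports these as skipped and xfailed rather than as failures, so continuous integration stays green while still recording the outstanding bug.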
3.6.5. Enabling additional Pytest options: flake8¶
As described in Code MAY be validated with flake8, Python modules can be
configured using the setup.cfg file. This configuration is supported by
pytest and can be used to enable additional testing or tuning on a
per-package basis. pytest reads the [tool:pytest] section in the
configuration file. To enable automatic flake8 testing as part of the normal
test execution, the following can be added to setup.cfg:

[tool:pytest]
addopts = --flake8
flake8-ignore = E133 E211 E221 E223 E226 E228 N802 N803 N806 W504

The addopts parameter adds additional command-line options to the pytest
command when it is run from the command line.
A wrinkle with the configuration of the pytest-flake8 plugin is that it
inherits the exclude settings from the [flake8] section of setup.cfg, but
you are required to explicitly list the codes to ignore when running within
pytest by using the flake8-ignore parameter. One advantage of this approach
is that you can ignore error codes from specific files such that the unit
tests will pass, but running flake8 from the command line will remind you
there is an outstanding issue. This feature should be used sparingly, but
can be useful when you wish to enable code linting for the bulk of the
project but have some issues preventing full compliance.
With this configuration each Python file tested by pytest will have flake8 run on it.