django-smoketest
================
[Build Status](https://github.com/ccnmtl/django-smoketest/actions)
[Coverage Status](https://coveralls.io/github/ccnmtl/django-smoketest?branch=master)
Motivation
----------
Smoke test framework for Django.
Smoke tests are tests run against a production environment to
quickly detect major systemic problems. E.g., after you run a deploy,
you want to quickly check that everything is running properly so you
can roll back promptly if there are problems. Too often, this
just means visiting the site and manually clicking through a
few links (at best).
You probably already have unit tests verifying the correctness of low
level parts of your code, and integration and acceptance tests running
on a staging server or CI system. Maybe you've even got automatic
configuration management ensuring that your staging server is
configured as an exact replica of production. So logically, if your
code passes all the tests on the staging server and the production
server is configured the same, everything *must* work right in
production. Right? Wouldn't it be wonderful if the world were so
simple? Of course we know that it's not. That's why we want smoke
tests: to verify that at least the major components of the
system are basically functional and able to talk to each other, and
that we didn't do something stupid like write code that depends on a
new environment variable that hasn't yet been set to the correct value
in production.
You probably don't want to run your unit tests or integration tests
in production with production settings in effect. Who knows what kind
of insanity would result? Test data sprayed all through your
production database, deleting user data from the file system, the sun
rising in the west and setting in the east?
This is what smoke tests are for. Smoke tests should be *safe* to run
in production. They verify that the application can connect to the
database, that whatever filesystem mounts are expected are in place,
etc., bridging that last gap between existing test coverage and the
wilderness of production, all while stepping carefully around the
production data.
I also find myself frequently writing small views to support ad-hoc
monitoring. E.g., if an application relies on an NFS mount for some
infrequent operation and that mount has a tendency to go stale, a cron
job that runs every few minutes (or Nagios or some other monitoring
application) and has the application try to read a file off the mount
can help ensure that we are alerted to the stale mount before users
encounter it.
Getting Started
---------------
Install django-smoketest
    $ pip install django-smoketest
Add `smoketest` to your `INSTALLED_APPS`.
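For example, in your `settings.py`:

    # settings.py
    INSTALLED_APPS = [
        # ... your existing apps ...
        'smoketest',
    ]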
In each application of yours that you want to define smoke tests for,
make a `smoke.py` file or a `smoke` directory with an
`__init__.py` and one or more Python files with your tests.
In your `urls.py`, add something like:

    path('smoketest/', include('smoketest.urls')),

to your `urlpatterns`.
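For context, a minimal `urls.py` might look like this (a sketch; the
other routes are placeholders):

    # urls.py
    from django.urls import include, path

    urlpatterns = [
        # ... your other routes ...
        path('smoketest/', include('smoketest.urls')),
    ]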
In your `smoke.py` (or module), you put something like this:
    from smoketest import SmokeTest
    from myapp.models import FooModel


    class DemoTest(SmokeTest):
        def test_foomodel_reads(self):
            """ just make sure we can read data from the db """
            cnt = FooModel.objects.all().count()
            self.assertTrue(cnt > 0)

        def test_foomodel_writes(self):
            """ make sure we can also write to the database
            but do not leave any test detritus around. Smoketests
            are automatically rolled back.
            """
            FooModel.objects.create()
Now, if you make a `GET` request to `http://yourapp/smoketest/`,
django-smoketest will go through your code, find any `smoke`
modules, and run the tests you have defined (if you've used unittest
or nose, you get the idea):
    PASS
    test classes: 1
    tests run: 3
    tests passed: 3
    tests failed: 0
    tests errored: 0
    time: 1200.307861328ms
So you can just check the result for `PASS` if you are calling it from
a monitoring script or as part of an automated deploy.
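For example, a deploy or monitoring script might just check the first
line of the plain-text report. Here is a minimal sketch using only the
standard library (`yourapp` is the placeholder hostname from the
examples above):

    # check_smoke.py -- exit 0 on PASS, non-zero otherwise
    import sys
    from urllib.request import urlopen

    def deploy_ok(url="http://yourapp/smoketest/"):
        body = urlopen(url, timeout=30).read().decode()
        # the first line of the plain-text report is PASS or FAIL
        return body.splitlines()[0].strip() == "PASS"

    if __name__ == "__main__":
        sys.exit(0 if deploy_ok() else 1)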
If tests fail or error out, you instead get something like:
    FAIL
    test classes: 1
    tests run: 8
    tests passed: 5
    tests failed: 2
    tests errored: 1
    time: 3300.07861328ms
    module1.smoke.DemoTest.test_foo failed
    module1.smoke.DemoTest.test_bar failed
    module1.smoke.DemoTest.test_baz errored
If your HTTP client makes the request with `application/json` in the
`Accept:` header, the response will be a JSON object with the same
information in a more easily parseable form:
    $ curl -H "Accept: application/json" http://yourapp/smoketest/
    {"status": "FAIL", "tests_failed": 2,
     "errored_tests": ["module1.smoke.DemoTest.test_baz"],
     "tests_run": 8, "test_classes": 1, "tests_passed": 5,
     "failed_tests": ["module1.smoke.DemoTest.test_foo",
     "module1.smoke.DemoTest.test_bar"], "tests_errored": 1,
     "time": 1.6458759307861328}
QUESTION: I'm thinking about keeping the output simple to parse
automatically, but maybe we ought to just stick with unittest's
existing output format instead?
API
---
The main class is `smoketest.SmokeTest`, which should be thought of
as equivalent to `unittest.TestCase`. It behaves largely the same,
running `setUp` and `tearDown` methods and supporting the usual array
of `assertEqual`, `assertRaises`, `assertTrue`, etc. methods.
All smoke tests are wrapped in a database transaction which is then
rolled back after running. This frees you up to do potentially
destructive things and let the database clean up for you. The usual
caveats apply: make sure you are using a database that supports
transactions, and remember that the rollback covers only database
operations, not other side effects.
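For instance, a smoke test can safely exercise deletes, since the
wrapping transaction is rolled back afterwards. A sketch reusing the
hypothetical `FooModel` from the example above:

    from smoketest import SmokeTest
    from myapp.models import FooModel


    class DestructiveTest(SmokeTest):
        def test_deletes_work(self):
            FooModel.objects.all().delete()  # destructive, but...
            self.assertEqual(FooModel.objects.count(), 0)
            # ...the wrapping transaction is rolled back after the
            # run, so production rows are untouched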
By default, django-smoketest searches through all apps listed in
your `INSTALLED_APPS`, looking for smoke tests. If you define a
`SMOKETEST_SKIP_APPS` setting with a list of apps, django-smoketest
will skip any apps listed there.
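For example (the app names here are placeholders):

    # settings.py
    SMOKETEST_SKIP_APPS = [
        'third_party_app',
        'legacy_app',
    ]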
Asserts supported (so far):
* assertEqual(a, b)
* assertNotEqual(a, b)
* assertTrue(t)
* assertFalse(x)
* assertIs(a, b)
* assertIsNot(a, b)
* assertIsNone(x)
* assertIsNotNone(x)
* assertIn(a, b)
* assertNotIn(a, b)
* assertIsInstance(a, b)
* assertNotIsInstance(a, b)
* assertRaises(exception, function)
* assertLess(a, b)
* assertLessEqual(a, b)
* assertGreater(a, b)
* assertGreaterEqual(a, b)
* assertAlmostEqual(a, b)
* assertNotAlmostEqual(a, b)
Each of these accepts a custom message as an optional final parameter
(`msg`), just like the assert methods in the `unittest` library.
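For example, the read test from the demo above could supply a clearer
failure message:

    self.assertTrue(cnt > 0, "expected at least one FooModel row")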
Open Questions
--------------
What other unittest/nose flags, conventions, etc. should we support?
`--failfast`? Output verbosity? The ability to target or skip specific
tests in certain cases? Automatic timeouts (many smoke tests involve
trying to connect to an external service and failing if it takes more
than a specified period of time)?
Progress
--------
TODO:
* I think it only handles `smoke.py` files or `smoke/__init__.py` and
won't yet find subclasses in submodules like `smoke/foo.py`.
* setUpClass/tearDownClass
* extended assert* methods (listed in `smoketest/__init__.py`)
DONE:
* walk `INSTALLED_APPS` and find/run smoke tests
* report numbers in simple text format
* run setUp and tearDown methods
* when tests fail/error, report which ones failed/errored
* proper `module.class.method` info on test failures/errors report
* support the basic expected set of assert* methods from unittest
* JSON output
* time test runs and include in output
* run tests in a rolled back transaction
* report additional info (exception/tracebacks) on errors (Kristijan Mitrovic <kmitrovic>)
* support messages on asserts (Kristijan Mitrovic <kmitrovic>)
* `SMOKETEST_SKIP_APPS`