In order to properly define datasets, the notion of a sample should be introduced first. A sample is defined as a polymorphic tuple. The size of this tuple is, by definition, the arity of the sample itself.
A dataset is a collection of samples that has a size, which in turn can be infinite. Hence a dataset implements the notion of a sequence.
The descriptive power of the datasets in the Unit Test Framework comes from the operations provided for combining and generating them, and from their interoperability with standard collections (STL containers, C arrays).
Tip: Only "monomorphic" datasets are supported, which means that all samples in a dataset have the same type and the same arity [2].
As we will see in the next sections, datasets representing collections of different types may be combined together (e.g. zip or grid). These operations result in new datasets, in which the samples are of an augmented type.
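To make this concrete, here is a minimal sketch, not taken from the Boost documentation: the test names and values are invented, but the operators are the documented dataset operations of boost::unit_test::data (operator ^ zips two datasets element by element, operator * forms their grid, i.e. the Cartesian product). In both cases the resulting samples have arity 2, so BOOST_DATA_TEST_CASE receives two named parameters.

```cpp
#define BOOST_TEST_MODULE dataset_combination_sketch
#include <boost/test/included/unit_test.hpp>
#include <boost/test/data/test_case.hpp>
#include <boost/test/data/monomorphic.hpp>

namespace bdata = boost::unit_test::data;

// zip (^): pairs the elements position by position; both operands have
// size 3, so the zipped dataset also has size 3 and its samples have arity 2
BOOST_DATA_TEST_CASE(
    zip_sketch,
    bdata::make( { 1, 2, 3 } ) ^ bdata::make( { 2, 4, 6 } ),
    input, expected)
{
    BOOST_TEST(input * 2 == expected);
}

// grid (*): Cartesian product; 3 x 2 = 6 samples of arity 2
BOOST_DATA_TEST_CASE(
    grid_sketch,
    bdata::make( { 1, 2, 3 } ) * bdata::make( { 10, 20 } ),
    x, y)
{
    BOOST_TEST(x < y);
}
```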
The interface of a dataset class should provide the following functions/fields:

- iterator begin(), where iterator is a forward iterator,
- boost::unit_test::data::size_t size() const, which indicates the size of the dataset. The returned type is a dedicated class size_t that can represent an infinite dataset size,
- arity, indicating the arity of the samples returned by the dataset.
Once a dataset class D is declared, it should be registered with the framework by specializing the class boost::unit_test::data::monomorphic::is_dataset such that boost::unit_test::data::monomorphic::is_dataset<D>::value evaluates to true.
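As a minimal sketch of this interface and of the registration step (the class and test names below are hypothetical, assumed for illustration and not taken from the Boost documentation), a finite dataset could look as follows; the complete example from the documentation, shown afterwards, does the same for an infinite sequence.

```cpp
// Hypothetical sketch: a finite custom dataset exposing the required
// interface (sample, arity, begin, size) and the is_dataset registration.
#define BOOST_TEST_MODULE finite_dataset_sketch
#include <boost/test/included/unit_test.hpp>
#include <boost/test/data/test_case.hpp>
#include <boost/test/data/monomorphic.hpp>
#include <cstddef>

namespace bdata = boost::unit_test::data;

class multiples_of_three {        // hypothetical dataset class
public:
    using sample = int;           // all samples have the same type...
    enum { arity = 1 };           // ...and the same arity

    explicit multiples_of_three(std::size_t count) : m_count(count) {}

    // minimal forward iterator: dereference and pre-increment are enough
    struct iterator {
        iterator() : value(3) {}
        int  operator*() const { return value; }
        void operator++()      { value += 3; }
    private:
        int value;
    };

    // finite size, known in advance
    bdata::size_t size() const  { return bdata::size_t(m_count); }
    iterator      begin() const { return iterator(); }

private:
    std::size_t m_count;
};

// registration: is_dataset<multiples_of_three>::value must evaluate to true
namespace boost { namespace unit_test { namespace data { namespace monomorphic {
  template <>
  struct is_dataset<multiples_of_three> : boost::mpl::true_ {};
}}}}

// each of the four samples (3, 6, 9, 12) is divisible by 3
BOOST_DATA_TEST_CASE(test_multiples, multiples_of_three(4), m)
{
    BOOST_TEST(m % 3 == 0);
}
```

Returning a finite bdata::size_t is, in spirit, the only difference from the Fibonacci dataset below, which reports BOOST_TEST_DS_INFINITE_SIZE instead.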
The following example implements a custom dataset generating a Fibonacci sequence.
Code:

```cpp
#define BOOST_TEST_MODULE dataset_example68
#include <boost/test/included/unit_test.hpp>
#include <boost/test/data/test_case.hpp>
#include <boost/test/data/monomorphic.hpp>
#include <sstream>

namespace bdata = boost::unit_test::data;

// Dataset generating a Fibonacci sequence
class fibonacci_dataset {
public:
    // Samples type is int
    using sample = int;
    enum { arity = 1 };

    struct iterator {
        iterator() : a(1), b(1) {}

        int operator*() const { return b; }
        void operator++()
        {
            a = a + b;
            std::swap(a, b);
        }
    private:
        int a;
        int b; // b is the output
    };

    fibonacci_dataset() {}

    // size is infinite
    bdata::size_t size() const { return bdata::BOOST_TEST_DS_INFINITE_SIZE; }

    // iterator
    iterator begin() const { return iterator(); }
};

namespace boost { namespace unit_test { namespace data { namespace monomorphic {
  // registering fibonacci_dataset as a proper dataset
  template <>
  struct is_dataset<fibonacci_dataset> : boost::mpl::true_ {};
}}}}

// Creating a test-driven dataset
BOOST_DATA_TEST_CASE(
    test1,
    fibonacci_dataset() ^ bdata::make( { 1, 2, 3, 5, 8, 13, 21, 35, 56 } ),
    fib_sample, exp)
{
    BOOST_TEST(fib_sample == exp);
}
```
Output:

```
> example68
Running 9 test cases...
test.cpp(60): error: in "test1/_7": check fib_sample == exp has failed [34 != 35]
Failure occurred in a following context:
    fib_sample = 34; exp = 35;
test.cpp(60): error: in "test1/_8": check fib_sample == exp has failed [55 != 56]
Failure occurred in a following context:
    fib_sample = 55; exp = 56;

*** 2 failures are detected in the test module "dataset_example68"
```

Zipping the infinite Fibonacci dataset with the nine expected values bounds the run to nine test cases; the last two expected values (35 and 56) differ from the actual Fibonacci numbers (34 and 55), which produces the two reported failures.
[2] Polymorphic datasets will be considered in the future. The need for them is mainly driven by the replacement of typed parametrized test cases with the dataset-like API.