Sunday, November 4, 2012

Acceptance Testing: JBehave

Acceptance testing is one of the crucial phases in product testing, as it determines whether the system operates according to the specifications set. It ensures that the system functions as expected, integrating with numerous components/services to provide accurate results. Such automated testing of the product as a whole, based on a pre-decided set of scenarios (mainly from testers), ensures that we catch faults before manual testing takes over. It not only saves time for both testers and developers but also boosts developer confidence when making crucial changes in legacy code. There are various approaches to writing acceptance tests. Either the data needed for the test is created from scratch in a regular or in-memory database beforehand and deleted once the test is completed (in the case of the regular database), or a static database for acceptance tests is used/maintained where the data required for the test is essentially always present.
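The first approach, creating the data before the test and removing it afterwards, can be sketched as follows. This is a minimal, self-contained illustration using an in-memory map as a stand-in for the test database; all class and method names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the create-before / delete-after approach. The map stands in
// for the test database; in a real suite these methods would talk to a
// regular or in-memory database (all names here are hypothetical).
public class AcceptanceDataFixture {
    private final Map<String, String> database = new HashMap<>();

    // Runs before the test: create the data the scenarios need from scratch.
    public void setUp() {
        database.put("customer:John", "pending");
    }

    // Runs after the test: remove everything that was created.
    public void tearDown() {
        database.clear();
    }

    public boolean hasCustomer(String name) {
        return database.containsKey("customer:" + name);
    }

    public static void main(String[] args) {
        AcceptanceDataFixture fixture = new AcceptanceDataFixture();
        fixture.setUp();
        System.out.println(fixture.hasCustomer("John")); // prints "true"
        fixture.tearDown();
        System.out.println(fixture.hasCustomer("John")); // prints "false"
    }
}
```

The static-database approach trades this setup/teardown cost for the maintenance burden of keeping the shared data intact across test runs.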

  There are 5 core principles for writing acceptance tests, listed below:
  1. Acceptance tests should be isolated and external to the application under test.
  2. Acceptance tests should be executed against the live application.
  3. Acceptance tests should be independent of any development environments.
  4. Acceptance tests should always be executed against actual data.
  5. Acceptance tests should imitate the manual verification criteria.
  With all said about the advantages of acceptance testing, there are two major Java frameworks supporting it: JBehave and Cucumber. Both share the basic idea of writing stories which contain various test scenarios, using Given-When-Then clauses. All the scenarios are executed by mapping the Given-When-Then clauses to corresponding methods and executing the mapped methods in the order they appear in the story. Upon completion of the execution, a report is generated based on the story, providing the execution results. But JBehave and Cucumber differ in some aspects of their workings: while JBehave requires the story to be tightly coupled with its Java implementation class, Cucumber only requires such coupling based on the scenarios in the story, irrespective of the implementation class. Let's dive in to have a closer look at each of the frameworks.

JBehave has been around for quite a while as an acceptance testing framework and has most of the basic set of features, such as reusing scenarios, skipping scenarios, HTML/JSON/XML/text reporting, running multiple stories, a Jenkins plugin, etc.

In the Maven world, JBehave can be configured by adding a dependency in the pom.xml for "jbehave-core" (version 3.6.8) in the group "org.jbehave". Also, in order to execute all the stories using Maven (mvn integration-test), a plugin entry must be added in the plugins section as follows:

    <plugin>
        <groupId>org.jbehave</groupId>
        <artifactId>jbehave-maven-plugin</artifactId>
        <version>3.6.8</version>
        <executions>
            <execution>
                <id>run-stories-as-embeddables</id>
                <phase>integration-test</phase>
                <configuration>
                    <includes>
                        <include>${embeddables}</include>            <!-- include all stories -->
                    </includes>
                    <excludes />
                    <metaFilters>
                        <metaFilter>-skip</metaFilter>               <!-- specify annotation to filter and skip the scenario -->
                    </metaFilters>
                    <ignoreFailureInStories>false</ignoreFailureInStories>
                    <ignoreFailureInView>false</ignoreFailureInView>
                </configuration>
                <goals>
                    <goal>run-stories-as-embeddables</goal>
                </goals>
            </execution>
        </executions>
    </plugin>

Once Maven is configured and ready, we can write story scenarios and their implementation. Now, in order to invoke all the stories from Eclipse, a Java class inheriting from JUnitStories is implemented, which specifies a configuration similar to the Maven plugin above.

    public class AllStories extends JUnitStories {

        public AllStories() {
            configuredEmbedder().embedderControls()
                .doGenerateViewAfterStories(true)       // generate the report view after the stories run
                .doIgnoreFailureInStories(false)        // stop rest of the scenarios if any scenario fails
                .doIgnoreFailureInView(false)           // fail the run if the generated view reports failures
                .useThreads(1)                          // specify number of threads to use
                .useStoryTimeoutInSecs(300);            // story execution timeout in seconds
            // specify annotation to filter and skip the scenario
            configuredEmbedder().useMetaFilters(Arrays.asList("-skip"));
        }

        @Override
        public Configuration configuration() {
            Class<? extends Embeddable> embeddableClass = this.getClass();
            // Enables decorating and formatting of non-HTML reports
            Properties viewResources = new Properties();
            viewResources.put("decorateNonHtml", "true");
            // Start from default ParameterConverters instance
            ParameterConverters parameterConverters = new ParameterConverters();
            // allows parameter conversion, e.g. of dates in the given format (used by StoryParser too)
            parameterConverters.addConverters(new DateConverter(new SimpleDateFormat("yyyy-MM-dd")));
            return new MostUsefulConfiguration()
                .useStoryControls(new StoryControls().doDryRun(false).doSkipScenariosAfterFailure(true))
                .useStoryLoader(new LoadFromClasspath(embeddableClass))
                .useStoryPathResolver(new UnderscoredCamelCaseResolver())
                .useStoryReporterBuilder(new StoryReporterBuilder()
                    .withCodeLocation(codeLocationFromClass(embeddableClass))
                    .withPathResolver(new ResolveToPackagedName())
                    .withViewResources(viewResources)
                    // generates report in the following formats
                    .withFormats(CONSOLE, TXT, HTML, XML)
                    // displays full exception stacktrace in the generated report
                    .withFailureTrace(true))
                .useParameterConverters(parameterConverters);
        }

        // Specify the class which implements the methods mapped to the scenarios
        @Override
        public InjectableStepsFactory stepsFactory() {
            return new InstanceStepsFactory(configuration(), new Object[] { new StorySteps() });
        }

        // Specify the relative path to the stories, with the stories to include and exclude
        @Override
        protected List<String> storyPaths() {
            String codeLocation = codeLocationFromClass(this.getClass()).getFile();
            return new StoryFinder().findPaths(codeLocation,
                    Arrays.asList("**/**/*.story"), Arrays.asList("**/excluded*.story"));
        }
    }
The configuration above enables report generation by setting doGenerateViewAfterStories to true. It sets JBehave to stop the execution of scenarios in case any scenario fails. The execution is configured to run on a single thread in order to show an accurate execution duration in the report. Also, it prints the scenario statements one by one with the debug results, providing a clear understanding of the run. In order to prevent a timeout of the story due to long service calls, the timeout is set to a comfortable 5 minutes. Also, a "-skip" meta filter is added in order to skip the scenarios in the story annotated with "@skip".
    In the configuration() method we specify the properties used to format non-HTML reports. Reports are generated as text, HTML and XML, and in the Eclipse/command console. The stack trace is included in the report on failure of a scenario using the withFailureTrace() method.
    After the configuration and story loader class is ready, we write the actual story with scenarios as follows:

Story: A customer with the name "John" needs to set up an account.

Scenario: Customer account is created with default settings.
Given a customer with the name "John" with
| a | b | c |
| 1 | 0 | 1 |
| 2 | 6 | 4 |
When customer tries to create an account
Then get an customer account id which is not null and greater than 0
Scenario: ......
Meta: @skip
Given ......

Note that in the last scenario above we use the Meta information, providing the property with the name "skip" but no value. Meta matchers can also be used as name-value pairs, such as "@ignore true". The order of the scenarios in the story is the order in which they are executed.
   Moving forward, we write the corresponding implementation for the scenarios in a class which is referenced in the stepsFactory() method of the AllStories class. Annotations such as @Given, @When and @Then are used to bind the methods to the corresponding scenario's Given, When and Then clauses. The @BeforeStory and @AfterStory annotations are used to initialize the story and clean up after its execution, respectively. It is important to know that the statements following Given-When-Then in the story should match the ones in the annotations in order for the methods to bind to the corresponding statements. Further, we use quotes to highlight the parameters and "$" to identify the parameters for parsing. The identifying character for the parameters can be changed, to "%" for example, using the following statement in the configuration of the AllStories class:
return new MostUsefulConfiguration().useStepPatternParser(new RegexPrefixCapturingPatternParser("%"));

Further, the parameters parsed using "$" from the variables in the story are assigned to the method's parameters of type String, Integer or ExamplesTable. Any text in the appropriate position is converted to a String, while a number is converted to an Integer. A table is converted to an ExamplesTable, one of JBehave's object types. An ExamplesTable is essentially a list of Maps, with the values of each row assigned to the header names, which act as the keys of the Map.
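To illustrate that list-of-maps shape, here is a small plain-Java sketch (not JBehave's actual implementation) that parses a pipe-delimited table into the same structure that ExamplesTable.getRows() exposes:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TableSketch {

    // Parse a pipe-delimited table into a list of rows keyed by the header
    // names, mirroring the structure ExamplesTable.getRows() returns.
    static List<Map<String, String>> parse(String table) {
        String[] lines = table.trim().split("\n");
        String[] headers = splitRow(lines[0]);
        List<Map<String, String>> rows = new ArrayList<>();
        for (int i = 1; i < lines.length; i++) {
            String[] values = splitRow(lines[i]);
            Map<String, String> row = new LinkedHashMap<>();
            for (int j = 0; j < headers.length; j++) {
                row.put(headers[j], values[j]);
            }
            rows.add(row);
        }
        return rows;
    }

    // Drop the leading and trailing '|', split on the separators, trim cells.
    private static String[] splitRow(String line) {
        String trimmed = line.trim();
        trimmed = trimmed.substring(1, trimmed.length() - 1);
        String[] cells = trimmed.split("\\|");
        for (int i = 0; i < cells.length; i++) {
            cells[i] = cells[i].trim();
        }
        return cells;
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows =
                parse("| a | b | c |\n| 1 | 0 | 1 |\n| 2 | 6 | 4 |");
        System.out.println(rows.get(0).get("a")); // prints "1"
        System.out.println(rows.get(1).get("b")); // prints "6"
    }
}
```

Run against the table from the story above, each data row becomes one Map whose keys are the headers "a", "b" and "c", which is exactly how the step method below iterates over it.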

public class StorySteps {

    @BeforeStory
    public void initialize() { .... }

    @Given("a customer with the name \"$customerName\" with $someTable")
    public void setupOrg(String customerName, ExamplesTable someTable) throws Exception {
        List<Map<String, String>> rows = someTable.getRows();
        for (Map<String, String> row : rows) {
            String a = row.get("a");
            ...
        }
    }

    @When("customer tries to create an account")
    public void whenCustomerTriesToCreateAnAccount() throws Exception { ... }

    @Then("get an customer account id which is not null and greater than $number")
    public void thenGetAnCustomerAccountIdWhichIsNotNullAndGreaterThan(Integer number) { ... }

    @AfterStory
    public void cleanUp() throws Exception { ... }
}

The reports generated by JBehave are impressive, providing a list of all stories along with their execution time, total scenarios, successes, failures, etc. Each story then provides the details of its scenarios as Given-When-Then, colored green for success and red, with a stack trace, for failure. The only odd thing in the report is the Given section: when it contains a table in the story, it gets converted to a chunk of text without indentation and spacing. Even using the "{trim=false}" property before the table doesn't preserve the spacing of the columns in the report.

JBehave also provides a plugin for Jenkins, the continuous integration system, which parses the report generated in XML format to provide test statistics similar to JUnit's. The configuration is simple: in the "Post-build Actions", add "Publish testing tools result report", then add "JBehave-3.x". Usually the pattern "**/jbehave/*" works, but a more specific pattern such as "**/jbehave/stories.*.xml" certainly does. The Jenkins report for JBehave is nothing fancy, just a list of all the scenarios executed or failed, and the current testing trend.

Next, we will continue the discussion with the Cucumber-JVM framework.
