The official Zebrunner JUnit 4 agent provides reporting and smart reruns functionality. There are a few special configuration steps that should be performed in order to enable the agent's functionality.
The agent comes bundled with JUnit 4.13, so you should comment out or exclude your own JUnit dependency from the project. If you are using a version of JUnit below 4.13, we cannot guarantee the correct functionality of the agent.
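If JUnit also arrives transitively through another dependency, you can exclude it there. Below is a minimal Gradle sketch, assuming a hypothetical library some.group:some-lib that pulls in an older JUnit:
dependencies {
    testImplementation('some.group:some-lib:1.0') {
        // rely on the JUnit 4.13 bundled with the Zebrunner agent instead
        exclude group: 'junit', module: 'junit'
    }
}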
Including the agent in your project is easy - just add the dependency to your build descriptor.
dependencies {
    testImplementation 'com.zebrunner:agent-junit:1.0.0'
}
<dependency>
    <groupId>com.zebrunner</groupId>
    <artifactId>agent-junit</artifactId>
    <version>1.0.0</version>
    <scope>test</scope>
</dependency>
The JUnit 4 framework requires additional configuration to catch test lifecycle events. The Zebrunner agent comes with a bespoke Java instrumentation agent that listens to the test lifecycle events and reports them to Zebrunner.
Technically speaking, you need to add a VM argument referencing the Zebrunner agent jar file. This can be done in several ways: using a build tool (Maven or Gradle) or directly from the IDE.
The maven-surefire-plugin provides an ability to add VM arguments in a convenient way. You only need to provide the absolute path to the jar file with the Zebrunner agent.
The maven-dependency-plugin can be used to obtain the absolute path to a project's dependency. The properties goal of this plugin supplies a set of properties with paths to all project dependencies. If your project is already using the maven-dependency-plugin, this is the best way to go.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>3.1.2</version>
    <executions>
        <execution>
            <goals>
                <goal>properties</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <argLine>-javaagent:${com.zebrunner:agent-junit:jar}</argLine>
    </configuration>
</plugin>
The ${com.zebrunner:agent-junit:jar} property is generated by the maven-dependency-plugin during the initialization phase. Maven automatically sets the generated value when the maven-surefire-plugin launches tests.
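For reference, the properties goal generates one such property per project dependency, named after its coordinates in the form groupId:artifactId:type. A hypothetical resolution (the local path is a placeholder) looks like this:
${com.zebrunner:agent-junit:jar}
  -> /home/user/.m2/repository/com/zebrunner/agent-junit/1.0.0/agent-junit-1.0.0.jar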
Alternatively, the maven-antrun-plugin can be used to obtain the absolute path to a project dependency and pass it to the maven-surefire-plugin. If your project is already using the maven-antrun-plugin, this is the best way to go.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <phase>initialize</phase>
            <configuration>
                <exportAntProperties>true</exportAntProperties>
                <tasks>
                    <basename file="${maven.dependency.com.zebrunner.agent-junit.jar.path}" property="com.zebrunner:agent-junit:jar"/>
                </tasks>
            </configuration>
            <goals>
                <goal>run</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <argLine>-javaagent:${com.zebrunner:agent-junit:jar}</argLine>
    </configuration>
</plugin>
The ${com.zebrunner:agent-junit:jar} property is generated by the maven-antrun-plugin during the initialization phase. Maven automatically sets the generated value when the maven-surefire-plugin launches tests.
Gradle provides support for adding a VM argument out of the box. The only thing you need to do is add the jvmArgs property to the test task. The value of this property must point to the local path of the Zebrunner agent jar.
The following code snippet shows the content of the build.gradle file.
dependencies {
    // some project dependencies
    testImplementation 'junit:junit:4.13'
    testImplementation 'org.hamcrest:hamcrest-library:1.3'
    testImplementation 'com.zebrunner:agent-junit:1.0.0'
}
def junitAgentArtifact = configurations.testRuntimeClasspath.resolvedConfiguration.resolvedArtifacts.find { it.name == 'agent-junit' }

test.doFirst {
    jvmArgs "-javaagent:${junitAgentArtifact.file}"
}
Most modern IDEs provide an ability to run tests locally and to specify environment and/or VM arguments, which makes it possible to run tests locally on dev machines and report the results to Zebrunner.
We strongly recommend that you do not run tests locally via IDE support along with the Zebrunner agent, but use the build tool support instead.
To add a VM argument for a run via the IDE, open the preferred IDE's settings, find the VM arguments setting, and append the following line to the property value.
-javaagent:<path-to-junit-agent-jar>
The value you append to the VM arguments setting must contain a valid path to your local junit-agent jar file. In most cases, this jar has already been downloaded by your build tool and saved in its local repository (the .m2 folder for Maven or the .gradle/caches/modules-2/files-2.1 folder for Gradle).
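For example, with Maven on a Linux machine, the appended argument might look like the following (the path and version are placeholders - point it at the actual jar in your local repository):
-javaagent:/home/user/.m2/repository/com/zebrunner/agent-junit/1.0.0/agent-junit-1.0.0.jar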
Even when the agent is available on the classpath of your test project, it is not enabled automatically. A valid configuration must be provided.
It is currently possible to provide the configuration via:
- Environment variables
- Program arguments
- YAML file
- Properties file
The configuration lookup will be performed in the order listed above, meaning the environment configuration will always take precedence over YAML and so on. It is also possible to override configuration parameters by passing them through a configuration provider with higher precedence.
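For instance, here is a minimal sketch of an override, assuming your agent.yaml already sets a build number; the environment variable set in the shell wins for that run:
# agent.yaml sets: reporting.run.build: 1.12.1.96-SNAPSHOT
# the environment variable below takes precedence for this run
export REPORTING_RUN_BUILD=1.12.1.97-SNAPSHOT
./gradlew test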
Once the configuration is set up, the agent is ready to track your test run events, with no additional configuration required.
The following configuration parameters are recognized by the agent:
- REPORTING_ENABLED - enables or disables reporting. The default value is false. If disabled, the agent will use no-op component implementations that will simply log output for tracing purposes with the trace level;
- REPORTING_SERVER_HOSTNAME - mandatory if reporting is enabled. Zebrunner server hostname. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Service URL' section;
- REPORTING_SERVER_ACCESS_TOKEN - mandatory if reporting is enabled. Access token must be used to perform API calls. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Token' section;
- REPORTING_PROJECT_KEY - optional value. The project that the test run belongs to. The default value is UNKNOWN. You can manage projects in Zebrunner in the appropriate section;
- REPORTING_RUN_DISPLAY_NAME - optional value. The display name of the test run. The default value is Default Suite;
- REPORTING_RUN_BUILD - optional value. The build number that is associated with the test run. It can depict either the test build number, or the application build number;
- REPORTING_RUN_ENVIRONMENT - optional value. The environment in which the tests will run.
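A minimal shell sketch using these variables (the hostname and token values are placeholders):
export REPORTING_ENABLED=true
export REPORTING_SERVER_HOSTNAME=mycompany.zebrunner.com
export REPORTING_SERVER_ACCESS_TOKEN=<token>
export REPORTING_RUN_DISPLAY_NAME="Nightly Regression Suite"
./gradlew test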
The following configuration parameters are recognized by the agent:
- reporting.enabled - enables or disables reporting. The default value is false. If disabled, the agent will use no-op component implementations that will simply log output for tracing purposes with the trace level;
- reporting.server.hostname - mandatory if reporting is enabled. Zebrunner server hostname. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Service URL' section;
- reporting.server.accessToken - mandatory if reporting is enabled. Access token must be used to perform API calls. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Token' section;
- reporting.projectKey - optional value. The project that the test run belongs to. The default value is UNKNOWN. You can manage projects in Zebrunner in the appropriate section;
- reporting.run.displayName - optional value. The display name of the test run. The default value is Default Suite;
- reporting.run.build - optional value. The build number that is associated with the test run. It can depict either the test build number, or the application build number;
- reporting.run.environment - optional value. The environment in which the tests will run.
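Assuming these parameters are passed to the test JVM as system properties (the usual mechanism for program arguments), a Maven run might look like this (hostname and token are placeholders):
mvn test -Dreporting.enabled=true \
         -Dreporting.server.hostname=mycompany.zebrunner.com \
         -Dreporting.server.accessToken=<token> \
         -Dreporting.run.displayName="Nightly Regression Suite"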
The agent recognizes an agent.yaml or agent.yml file in the resources root folder. It is currently not possible to configure an alternative file location.
Below is a sample configuration file:
reporting:
  enabled: true
  project-key: UNKNOWN
  server:
    hostname: localhost:8080
    access-token: <token>
  run:
    display-name: Nightly Regression Suite
    build: 1.12.1.96-SNAPSHOT
    environment: TEST-1
- reporting.enabled - enables or disables reporting. The default value is false. If disabled, the agent will use no-op component implementations that will simply log output for tracing purposes with the trace level;
- reporting.server.hostname - mandatory if reporting is enabled. Zebrunner server hostname. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Service URL' section;
- reporting.server.access-token - mandatory if reporting is enabled. Access token must be used to perform API calls. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Token' section;
- reporting.project-key - optional value. The project that the test run belongs to. The default value is UNKNOWN. You can manage projects in Zebrunner in the appropriate section;
- reporting.run.display-name - optional value. The display name of the test run. The default value is Default Suite;
- reporting.run.build - optional value. The build number that is associated with the test run. It can depict either the test build number, or the application build number;
- reporting.run.environment - optional value. The environment in which the tests will run.
The agent recognizes only the agent.properties file in the resources root folder. It is currently not possible to configure an alternative file location.
Below is a sample configuration file:
reporting.enabled=true
reporting.project-key=UNKNOWN
reporting.server.hostname=localhost:8080
reporting.server.access-token=<token>
reporting.run.display-name=Nightly Regression Suite
reporting.run.build=1.12.1.96-SNAPSHOT
reporting.run.environment=TEST-1
- reporting.enabled - enables or disables reporting. The default value is false. If disabled, the agent will use no-op component implementations that will simply log output for tracing purposes with the trace level;
- reporting.server.hostname - mandatory if reporting is enabled. Zebrunner server hostname. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Service URL' section;
- reporting.server.access-token - mandatory if reporting is enabled. Access token must be used to perform API calls. Can be obtained in Zebrunner on the 'Account & profile' page under the 'Token' section;
- reporting.project-key - optional value. The project that the test run belongs to. The default value is UNKNOWN. You can manage projects in Zebrunner in the appropriate section;
- reporting.run.display-name - optional value. The display name of the test run. The default value is Default Suite;
- reporting.run.build - optional value. The build number that is associated with the test run. It can depict either the test build number, or the application build number;
- reporting.run.environment - optional value. The environment in which the tests will run.
It is possible to improve your tracking experience by configuring additional reporting capabilities.
It is also possible to enable log collection for your tests. Currently, three logging frameworks are supported out of the box: logback, log4j, and log4j2. We recommend using slf4j (Simple Logging Facade for Java), which provides an abstraction over logging libraries. All you have to do to enable logging is register the reporting appender in your logging framework's configuration file.
Add logback (and, optionally, slf4j) dependencies to your build descriptor.
dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.30'
    implementation 'ch.qos.logback:logback-core:1.2.3'
    implementation 'ch.qos.logback:logback-classic:1.2.3'
}
<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.30</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>1.2.3</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>1.2.3</version>
    </dependency>
</dependencies>
Add the logging appender to the logback.xml file. Feel free to customize the logging pattern according to your needs:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="ZebrunnerAppender" class="com.zebrunner.agent.core.appender.logback.ReportingAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%t] %-5level - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="info">
        <appender-ref ref="ZebrunnerAppender" />
    </root>
</configuration>
Add log4j (and, optionally, slf4j) dependency to your build descriptor.
dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.30'
    implementation 'log4j:log4j:1.2.17'
}
<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.30</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
    </dependency>
</dependencies>
Add the logging appender to the log4j.properties file. Feel free to customize the logging pattern according to your needs:
log4j.rootLogger = INFO, zebrunner
log4j.appender.zebrunner=com.zebrunner.agent.core.appender.log4j.ReportingAppender
log4j.appender.zebrunner.layout=org.apache.log4j.PatternLayout
log4j.appender.zebrunner.layout.ConversionPattern=[%d{HH:mm:ss}] %-5p (%F:%L) - %m%n
Add log4j2 (and, optionally, slf4j) dependency to your build descriptor:
dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.30'
    implementation 'org.apache.logging.log4j:log4j-api:2.13.3'
    implementation 'org.apache.logging.log4j:log4j-core:2.13.3'
}
<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.30</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>2.13.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.13.3</version>
    </dependency>
</dependencies>
Add the logging appender to the log4j2.xml file. Feel free to customize the logging pattern according to your needs:
<?xml version="1.0" encoding="UTF-8"?>
<configuration packages="com.zebrunner.agent.core.appender.log4j2">
    <properties>
        <property name="pattern">[%d{HH:mm:ss}] %-5p (%F:%L) - %m%n</property>
    </properties>
    <appenders>
        <ZebrunnerAppender name="ZebrunnerAppender">
            <PatternLayout pattern="${pattern}" />
        </ZebrunnerAppender>
    </appenders>
    <loggers>
        <root level="info">
            <appender-ref ref="ZebrunnerAppender"/>
        </root>
    </loggers>
</configuration>
No additional steps are required to collect test logs and track them in Zebrunner.
import java.lang.invoke.MethodHandles;

import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AwesomeTests {

    private static final Logger LOGGER = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    @Test
    public void awesomeTest() {
        LOGGER.info("Test info");
    }
}
In case you are using JUnit 4 as a UI testing framework, it may be useful to track captured screenshots in scope of Zebrunner reporting. The agent comes with a Java API allowing you to send your screenshots to Zebrunner, so they will be attached to the test.
Below is a sample code of test sending a screenshot to Zebrunner:
import com.zebrunner.agent.core.registrar.Screenshot;

import org.junit.Test;

public class AwesomeTests {

    @Test
    public void myAwesomeTest() {
        // capture a screenshot using your UI tooling (captureScreenshot() is a hypothetical helper)
        byte[] screenshotBytes = captureScreenshot();
        long capturedAtMillis = System.currentTimeMillis();

        Screenshot.upload(screenshotBytes, capturedAtMillis);
        // meaningful assertions
    }
}
A screenshot should be passed as a byte array along with a unix timestamp in milliseconds corresponding to the moment when the screenshot was captured.
If null is supplied instead of a timestamp, it will be generated automatically. However, it is recommended to use an accurate timestamp in order to get accurate tracking.
The uploaded screenshot will appear among test logs. The actual position depends on the provided (or generated) timestamp.
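As a concrete illustration, here is a minimal capture-and-upload helper, assuming Selenium WebDriver as the UI tooling (any library that yields screenshot bytes works equally well):
import com.zebrunner.agent.core.registrar.Screenshot;

import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;

public class ScreenshotUtil {

    // Captures a screenshot of the current browser state and sends it to Zebrunner
    public static void captureAndUpload(WebDriver driver) {
        byte[] screenshotBytes = ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES);
        Screenshot.upload(screenshotBytes, System.currentTimeMillis());
    }
}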
In case your tests produce artifacts, it may be useful to track them in Zebrunner. The agent comes with a few convenient methods for uploading artifacts to Zebrunner and linking them to the currently running test.
Artifacts can be uploaded using the Artifact class. This class has 4 static methods to upload artifacts represented by various Java types associated with files. Together with an artifact, you must provide the artifact name. This name must contain the file extension that reflects the actual content of the file. If the file extension is incorrect, the file will not be saved in Zebrunner.
Here is a sample test:
import java.io.File;
import java.io.InputStream;
import java.nio.file.Path;

import com.zebrunner.agent.core.registrar.Artifact;
import org.junit.Test;

public class AwesomeTests {

    @Test
    public void awesomeTest() {
        // some code here

        // these must be initialized with the artifacts produced by your test
        InputStream inputStream;
        byte[] byteArray;
        File file;
        Path path;

        Artifact.upload(inputStream, "file.docx");
        Artifact.upload(byteArray, "image.png");
        Artifact.upload(file, "application.apk");
        Artifact.upload(path, "test-log.txt");
        // meaningful assertions
    }
}
The artifact upload process is performed in the background, so it will not affect test execution. The uploaded artifacts will appear under the test name in the run results in Zebrunner.
It is also possible to attach links to external artifacts. Any kind of external resource can be used. In order to attach an external artifact, you should use a static method of the ArtifactReference class.
Here is an example:
import com.zebrunner.agent.core.registrar.ArtifactReference;

import org.junit.Test;

public class AwesomeTests {

    @Test
    public void awesomeTest() {
        // some code here
        ArtifactReference.attach("Zebrunner", "https://zebrunner.com/");
        // meaningful assertions
    }
}
The example above adds a link to zebrunner.com to the list of test artifacts.
In some cases, it may be useful to attach some meta information related to a test - its Jira id, its priority, or any other useful data.
The agent comes with the concept of a label. A label is a key-value pair associated with a test. The key is represented by a String, and the label value accepts a vararg of Strings.
There is a bunch of annotations that can be used to attach a label to a test. All the annotations can be used on both the class and method levels. It is also possible to override a class-level label on the method level. There is one generic annotation and a few bespoke ones that don't require a label name:
- @Priority
- @JiraReference
- @TestLabel - the generic one.
There is also a Java API to attach labels during test execution. The Label class has a static method that can be used to attach a label.
Here is a sample:
import com.zebrunner.agent.core.annotation.JiraReference;
import com.zebrunner.agent.core.annotation.Priority;
import com.zebrunner.agent.core.annotation.TestLabel;
import com.zebrunner.agent.core.registrar.Label;

import org.junit.Test;

public class AwesomeTests {

    @Test
    @Priority(Priority.P1)
    @JiraReference("ZBR-1231")
    @TestLabel(name = "app", value = {"reporting-service:v1.0", "reporting-service:v1.1"})
    public void awesomeTest() {
        // some code here
        Label.attach("Chrome", "85.0");
        // meaningful assertions
    }
}
The test from the sample above attaches 5 labels: 1 priority, 1 jira-reference, 2 app, 1 Chrome label.
The values of attached labels will be displayed in Zebrunner under the name of the corresponding test. The values of the @JiraReference annotation will be displayed in blue pills to the right of the test name.
You may want to add transparency to the process of automation maintenance by having an engineer responsible for the evolution of specific tests or test classes.
Zebrunner comes with the concept of a maintainer - a person who can be assigned to maintain tests. In order to keep track of those, the agent comes with the @Maintainer annotation.
This annotation can be placed on both test class and method. It is also possible to override a class-level maintainer on a method-level. If a base test class is marked with this annotation, all child classes will inherit the annotation unless they have an explicitly specified one.
See a sample test class below:
import com.zebrunner.agent.core.reporting.Maintainer;

import org.junit.Test;

@Maintainer("kenobi")
public class AwesomeTests {

    @Test
    @Maintainer("skywalker")
    public void awesomeTest() {
        // meaningful assertions
    }

    @Test
    public void anotherAwesomeTest() {
        // meaningful assertions
    }
}
In the example above, kenobi will be reported as a maintainer of anotherAwesomeTest (the class-level value is taken into account), while skywalker will be reported as a maintainer of awesomeTest.
The maintainer username should be a valid Zebrunner username, otherwise it will be set to anonymous.
To check out the project and build it from source, run the following:
git clone git://github.com/zebrunner/java-agent-junit4.git
cd java-agent-junit4
./gradlew build
Zebrunner Reporting service is released under version 2.0 of the Apache License.