
GitLab CI/CD Integration with QA Sphere

Automatically upload test results from your GitLab CI/CD pipelines to QA Sphere using the QAS CLI tool. This integration eliminates manual result entry and provides instant visibility into your automated test results.

What You'll Achieve

With this integration, every time your GitLab pipeline runs:

  • Test results automatically upload to QA Sphere
  • New test runs are created with pipeline information
  • Tests are matched to existing QA Sphere test cases
  • Pass/fail status, execution time, and screenshots are recorded
  • Test history and trends are tracked over time

Prerequisites

Before starting, ensure you have:

  • A GitLab project with automated tests (Playwright, Cypress, Jest, etc.)
  • Tests configured to generate JUnit XML format results
  • A QA Sphere account with Test Runner role or higher
  • Test cases in QA Sphere with markers (e.g., BD-001, PRJ-123)

How It Works

GitLab CI/CD Integration Diagram

  1. Your pipeline runs automated tests
  2. Tests generate JUnit XML results file
  3. QAS CLI tool reads the XML file
  4. CLI matches tests to QA Sphere cases using markers
  5. Results are uploaded and appear in QA Sphere
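
For reference, the JUnit XML produced in step 2 looks roughly like this (a minimal hand-written example; the exact attributes vary by framework and reporter):

```xml
<testsuites>
  <testsuite name="Login Flow" tests="2" failures="1" time="4.2">
    <testcase name="BD-001: User can login with valid credentials" time="2.1"/>
    <testcase name="BD-002: User sees error with invalid credentials" time="2.1">
      <failure message="expected error banner to be visible"/>
    </testcase>
  </testsuite>
</testsuites>
```

The BD-001/BD-002 prefixes in the name attributes are what the CLI uses in step 4 to match results to QA Sphere test cases.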

Setup Steps

Step 1: Create QA Sphere API Key

  1. Log into your QA Sphere account
  2. Click the gear icon ⚙️ in the top right → Settings
  3. Navigate to API Keys
  4. Click Create API Key
  5. Copy and save the key - you won't see it again!

Your API key format: t123.ak456.abc789xyz

Step 2: Configure GitLab Variables

Add these secrets to your GitLab project:

  1. Go to your GitLab project
  2. Navigate to Settings → CI/CD → Variables
  3. Click Add variable and create:
Key         Value                                                        Flags
QAS_TOKEN   Your API key (e.g., t123.ak456.abc789xyz)                    Protected, Masked
QAS_URL     Your QA Sphere URL (e.g., https://company.eu1.qasphere.com)  Protected
  4. Click Add variable to save
Security

Never commit API keys to your repository. Always use GitLab CI/CD variables.

Step 3: Add Test Case Markers

Ensure your test names include QA Sphere markers in the PROJECT-SEQUENCE format. Each test case's marker is shown in the QA Sphere interface.

Playwright Example:

test('BD-001: User can login with valid credentials', async ({ page }) => {
  await page.goto('https://example.com/login');
  await page.fill('#username', '[email protected]');
  await page.fill('#password', 'password123');
  await page.click('#login-button');
  await expect(page).toHaveURL('/dashboard');
});

test('BD-002: User sees error with invalid credentials', async ({ page }) => {
  // test implementation
});

Cypress Example:

describe('Login Flow', () => {
  it('BD-001: should login successfully with valid credentials', () => {
    cy.visit('/login');
    cy.get('#username').type('[email protected]');
    cy.get('#password').type('password123');
    cy.get('#login-button').click();
    cy.url().should('include', '/dashboard');
  });
});

Jest Example:

describe('API Tests', () => {
  test('BD-015: GET /users returns user list', async () => {
    const response = await fetch('/api/users');
    expect(response.status).toBe(200);
    const data = await response.json();
    expect(data).toHaveLength(5);
  });
});
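
In all three frameworks the marker is simply a prefix in the test title. A small sketch of how such a prefix can be pulled out with a regular expression (an illustration only; qas-cli's actual matching rules are not shown here):

```javascript
// Extract a QA Sphere-style marker (PROJECT-SEQUENCE) from a test title.
// The pattern is an illustration; qas-cli's real matching may differ.
function extractMarker(title) {
  const match = title.match(/^([A-Z]+-\d+)\b/);
  return match ? match[1] : null;
}

console.log(extractMarker('BD-001: User can login with valid credentials')); // "BD-001"
console.log(extractMarker('User can login'));                                // null
```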

Step 4: Configure Test Framework

Configure your test framework to generate JUnit XML output:

Playwright Configuration

// playwright.config.js
const { defineConfig } = require('@playwright/test');

module.exports = defineConfig({
  testDir: './tests',
  timeout: 30000,

  // JUnit reporter for CI/CD
  reporter: [
    ['list'], // Console output
    ['junit', { outputFile: 'junit-results/results.xml' }] // For QA Sphere
  ],

  use: {
    headless: true,
    screenshot: 'only-on-failure',
    video: 'retain-on-failure',
  },

  projects: [
    { name: 'chromium', use: { browserName: 'chromium' } },
    { name: 'firefox', use: { browserName: 'firefox' } },
    { name: 'webkit', use: { browserName: 'webkit' } },
  ],
});

Cypress Configuration

// cypress.config.js
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    reporter: 'cypress-multi-reporters',
    reporterOptions: {
      configFile: 'reporter-config.json'
    }
  }
});

// reporter-config.json
{
  "reporterEnabled": "spec, mocha-junit-reporter",
  "mochaJunitReporterReporterOptions": {
    "mochaFile": "junit-results/results.xml"
  }
}

Jest Configuration

// jest.config.js
module.exports = {
  reporters: [
    'default',
    ['jest-junit', {
      outputDirectory: './junit-results',
      outputName: 'results.xml',
      classNameTemplate: '{classname}',
      titleTemplate: '{title}'
    }]
  ]
};

Step 5: Create GitLab Pipeline

Create or update .gitlab-ci.yml in your repository root:

For Playwright Projects

stages:
  - test
  - report

variables:
  # Disable Husky git hooks in CI
  HUSKY: 0

# Run Playwright tests
test:
  stage: test
  # IMPORTANT: Match image version to your @playwright/test package version
  image: mcr.microsoft.com/playwright:v1.51.1-jammy
  script:
    - npm ci
    - npx playwright test
  artifacts:
    when: always # Upload artifacts even if tests fail
    paths:
      - junit-results/
      - test-results/
      - playwright-report/
    reports:
      junit: junit-results/results.xml
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

# Upload results to QA Sphere
upload-to-qasphere:
  stage: report
  image: node:18
  needs:
    - job: test
      artifacts: true
  before_script:
    - npm install -g qas-cli
  script:
    # Upload test results (automatically creates new run)
    - qasphere junit-upload ./junit-results/results.xml
    - echo "✅ Test results uploaded to QA Sphere"
  when: always
  allow_failure: true
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

For Cypress Projects

stages:
  - test
  - report

test:
  stage: test
  image: cypress/browsers:node18.12.0-chrome107
  script:
    - npm ci
    - npx cypress run
  artifacts:
    when: always
    paths:
      - junit-results/
      - cypress/videos/
      - cypress/screenshots/
    reports:
      junit: junit-results/results.xml
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

upload-to-qasphere:
  stage: report
  image: node:18
  needs:
    - job: test
      artifacts: true
  before_script:
    - npm install -g qas-cli
  script:
    - qasphere junit-upload ./junit-results/results.xml
    - echo "✅ Results uploaded to QA Sphere"
  when: always
  allow_failure: true
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

For Jest Projects

stages:
  - test
  - report

test:
  stage: test
  image: node:18
  script:
    - npm ci
    - npm test
  artifacts:
    when: always
    paths:
      - junit-results/
      - coverage/
    reports:
      junit: junit-results/results.xml
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

upload-to-qasphere:
  stage: report
  image: node:18
  needs:
    - job: test
      artifacts: true
  before_script:
    - npm install -g qas-cli
  script:
    - qasphere junit-upload ./junit-results/results.xml
    - echo "✅ Results uploaded to QA Sphere"
  when: always
  allow_failure: true
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

Step 6: Push and Verify

  1. Commit your changes:
git add .gitlab-ci.yml playwright.config.js  # or your config files
git commit -m "Add GitLab CI/CD with QA Sphere integration"
git push origin main
  2. Monitor the pipeline:

    • Go to GitLab → Build → Pipelines
    • Watch your pipeline execute
    • Check both test and upload-to-qasphere stages

    Check job statuses

  3. Verify in QA Sphere:

    • Log into QA Sphere
    • Navigate to your project → Test Runs
    • See the new run with your test results

Advanced Usage

Available CLI Options

The QAS CLI junit-upload command creates a new test run within a QA Sphere project from your JUnit XML files or uploads results to an existing run.

qasphere junit-upload [options] <path-to-junit-xml>

Options:

  • -r, --run-url <url> - Upload to an existing test run (otherwise creates a new run)
  • --run-name <template> - Name template for creating new test runs (only used when --run-url is not specified)
  • --attachments - Detect and upload attachments (screenshots, videos, logs)
  • --force - Ignore API request errors, invalid test cases, or attachment issues
  • -h, --help - Show command help

Run Name Template Placeholders

The --run-name option supports the following placeholders:

Environment Variables:

  • {env:VARIABLE_NAME} - Any environment variable (e.g., {env:CI_PIPELINE_ID}, {env:CI_COMMIT_SHA})

Date Placeholders:

  • {YYYY} - 4-digit year (e.g., 2025)
  • {YY} - 2-digit year (e.g., 25)
  • {MMM} - 3-letter month (e.g., Jan, Feb, Mar)
  • {MM} - 2-digit month (e.g., 01, 02, 12)
  • {DD} - 2-digit day (e.g., 01, 15, 31)

Time Placeholders:

  • {HH} - 2-digit hour in 24-hour format (e.g., 00, 13, 23)
  • {hh} - 2-digit hour in 12-hour format (e.g., 01, 12)
  • {mm} - 2-digit minute (e.g., 00, 30, 59)
  • {ss} - 2-digit second (e.g., 00, 30, 59)
  • {AMPM} - AM/PM indicator

Default Template: If --run-name is not specified, the default template is:

Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}

Example Output:

  • Automated test run - Jan 15, 2025, 02:30:45 PM
Note

The --run-name option is only used when creating new test runs (i.e., when --run-url is not specified).
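
For intuition, the default template renders the same fields as this `date` invocation (an illustration of the format only, using the system clock; the CLI formats the name itself):

```shell
# Render "Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}"
# with the system clock (not a qas-cli command)
date '+Automated test run - %b %d, %Y, %I:%M:%S %p'
```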

Usage Examples:

# Create new run with default name template
qasphere junit-upload ./junit-results/results.xml

# Upload to existing run (--run-name is ignored)
qasphere junit-upload -r https://company.eu1.qasphere.com/project/BD/run/42 ./junit-results/results.xml

# Simple static name
qasphere junit-upload --run-name "v1.4.4-rc5" ./junit-results/results.xml

# With environment variables
qasphere junit-upload --run-name "Pipeline #{env:CI_PIPELINE_ID} - {env:CI_COMMIT_REF_NAME}" ./junit-results/results.xml
# Output: "Pipeline #12345 - main"

# With date placeholders
qasphere junit-upload --run-name "Release {YYYY}-{MM}-{DD}" ./junit-results/results.xml
# Output: "Release 2025-01-15"

# With date and time placeholders
qasphere junit-upload --run-name "Nightly Tests {MMM} {DD}, {YYYY} at {HH}:{mm}" ./junit-results/results.xml
# Output: "Nightly Tests Jan 15, 2025 at 22:34"

# Complex template with multiple placeholders
qasphere junit-upload --run-name "Build {env:BUILD_NUMBER} - {YYYY}/{MM}/{DD} {hh}:{mm} {AMPM}" ./junit-results/results.xml
# Output: "Build v1.4.4-rc5 - 2025/01/15 10:34 PM"

# With attachments
qasphere junit-upload --attachments ./junit-results/results.xml

# Multiple files
qasphere junit-upload ./junit-results/*.xml

# Force upload on errors
qasphere junit-upload --force ./junit-results/results.xml

Upload to Existing Test Run

To update a specific test run instead of creating a new one:

upload-to-qasphere:
  script:
    - |
      RUN_ID=42 # Your run ID
      qasphere junit-upload \
        -r ${QAS_URL}/project/BD/run/${RUN_ID} \
        ./junit-results/results.xml

Upload with Attachments

Include screenshots and logs with your results:

upload-to-qasphere:
  script:
    - qasphere junit-upload --attachments ./junit-results/results.xml

The CLI automatically detects and uploads:

  • Screenshots from test failures
  • Video recordings
  • Log files
  • Any files referenced in the JUnit XML
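
How a file ends up "referenced in the JUnit XML" depends on your reporter. One common convention in JUnit tooling is an attachment path embedded in system-out (illustrative only; check what your reporter emits and what qas-cli actually detects):

```xml
<testcase name="BD-002: User sees error with invalid credentials" time="2.1">
  <failure message="expected error banner to be visible"/>
  <system-out>[[ATTACHMENT|test-results/bd-002-failure.png]]</system-out>
</testcase>
```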

Upload Multiple XML Files

If you have multiple test suites generating separate XML files:

upload-to-qasphere:
  script:
    - qasphere junit-upload ./junit-results/*.xml

Branch-Specific Runs

Create different runs for different branches:

upload-to-qasphere:
  script:
    - |
      if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
        # Upload to production run
        qasphere junit-upload -r ${QAS_URL}/project/BD/run/100 ./junit-results/results.xml
      elif [ "$CI_COMMIT_REF_NAME" = "develop" ]; then
        # Upload to development run
        qasphere junit-upload -r ${QAS_URL}/project/BD/run/101 ./junit-results/results.xml
      else
        # Create new run for feature branches
        qasphere junit-upload ./junit-results/results.xml
      fi

Add Pipeline Metadata

Use the --run-name option to include GitLab pipeline information in test run titles:

upload-to-qasphere:
  script:
    # Include pipeline ID and branch name
    - |
      qasphere junit-upload \
        --run-name "Pipeline #{env:CI_PIPELINE_ID} - {env:CI_COMMIT_REF_NAME}" \
        ./junit-results/results.xml
    # Output: "Pipeline #12345 - main"

Common GitLab Variables:

  • {env:CI_PIPELINE_ID} - Pipeline ID number
  • {env:CI_COMMIT_REF_NAME} - Branch or tag name
  • {env:CI_COMMIT_SHORT_SHA} - Short commit SHA
  • {env:CI_COMMIT_SHA} - Full commit SHA
  • {env:GITLAB_USER_NAME} - User who triggered the pipeline
  • {env:CI_JOB_NAME} - Current job name

Examples:

# Pipeline with date and time
- qasphere junit-upload --run-name "Pipeline #{env:CI_PIPELINE_ID} - {YYYY}-{MM}-{DD} {HH}:{mm}" ./junit-results/results.xml

# Branch and commit info
- qasphere junit-upload --run-name "{env:CI_COMMIT_REF_NAME} - {env:CI_COMMIT_SHORT_SHA}" ./junit-results/results.xml

# Complete metadata
- qasphere junit-upload --run-name "Build #{env:CI_PIPELINE_ID} ({env:CI_COMMIT_REF_NAME}) - {MMM} {DD}, {hh}:{mm} {AMPM}" ./junit-results/results.xml

Force Upload on Errors

Continue uploading even if some tests can't be matched:

upload-to-qasphere:
  script:
    - qasphere junit-upload --force ./junit-results/results.xml

Common Scenarios

Scenario 1: Nightly Test Runs

Run tests on a schedule and upload results with descriptive names:

test:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule" # Only run on scheduled pipelines
    - if: $CI_COMMIT_BRANCH == "main"

upload-to-qasphere:
  script:
    # Create run with date in the name
    - qasphere junit-upload --run-name "Nightly Tests - {YYYY}-{MM}-{DD}" ./junit-results/results.xml
    # Output: "Nightly Tests - 2025-01-15"

    # Or with time included
    - qasphere junit-upload --run-name "Nightly {MMM} {DD}, {YYYY} at {HH}:{mm}" ./junit-results/results.xml
    # Output: "Nightly Jan 15, 2025 at 22:30"

Create the schedule in GitLab: CI/CD → Schedules → New schedule

Scenario 2: Parallel Test Execution

Run tests in parallel and upload all results:

test-unit:
  stage: test
  script:
    - npm run test:unit
  artifacts:
    paths:
      - junit-results/unit-results.xml

test-integration:
  stage: test
  script:
    - npm run test:integration
  artifacts:
    paths:
      - junit-results/integration-results.xml

upload-to-qasphere:
  stage: report
  needs:
    - job: test-unit
      artifacts: true
    - job: test-integration
      artifacts: true
  script:
    - npm install -g qas-cli
    - qasphere junit-upload ./junit-results/*.xml

Scenario 3: Multi-Environment Testing

Test against different environments:

test-staging:
  variables:
    TEST_ENV: "staging"
    BASE_URL: "https://staging.example.com"
  script:
    - npm test
  artifacts:
    paths:
      - junit-results/staging-results.xml

test-production:
  variables:
    TEST_ENV: "production"
    BASE_URL: "https://example.com"
  script:
    - npm test
  artifacts:
    paths:
      - junit-results/production-results.xml

upload-to-qasphere:
  script:
    - npm install -g qas-cli
    - qasphere junit-upload ./junit-results/*.xml

Scenario 4: Version/Release Tagging

Tag test runs with version numbers or release names:

variables:
  VERSION: "v1.4.5"

upload-to-qasphere:
  script:
    # Simple version tag
    - qasphere junit-upload --run-name "Release {env:VERSION}" ./junit-results/results.xml
    # Output: "Release v1.4.5"

    # Version with date
    - qasphere junit-upload --run-name "Release {env:VERSION} - {YYYY}-{MM}-{DD}" ./junit-results/results.xml
    # Output: "Release v1.4.5 - 2025-01-15"

    # Pre-release/RC builds
    - qasphere junit-upload --run-name "{env:VERSION}-rc{env:CI_PIPELINE_ID}" ./junit-results/results.xml
    # Output: "v1.4.5-rc12345"

For tagged releases in GitLab:

upload-to-qasphere:
  script:
    - |
      if [ -n "$CI_COMMIT_TAG" ]; then
        # For git tags, use tag name
        qasphere junit-upload --run-name "Release {env:CI_COMMIT_TAG}" ./junit-results/results.xml
      else
        # For regular commits, use branch and commit
        qasphere junit-upload --run-name "{env:CI_COMMIT_REF_NAME} - {env:CI_COMMIT_SHORT_SHA}" ./junit-results/results.xml
      fi
  rules:
    - if: $CI_COMMIT_TAG
    - if: $CI_COMMIT_BRANCH == "main"

Troubleshooting

Issue: Tests Not Appearing in QA Sphere

Symptoms:

  • Upload succeeds but no results in QA Sphere
  • "Test case not found" warnings in logs

Solutions:

  1. Ensure test cases exist in QA Sphere:

    • Check that BD-001, BD-002, etc. exist in your QA Sphere project
    • Verify the project code matches (BD, PRJ, etc.)
  2. Check marker format:

    • Must be PROJECT-NUMBER format
    • Examples: BD-001, PRJ-123, TEST-456
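
A quick local check for titles missing markers: extract the name attributes from your results file and filter out anything that already starts with a PROJECT-NUMBER prefix (a sketch; it assumes simple name="..." attributes, one per line):

```shell
# Write a tiny sample results file (stand-in for your real junit-results/results.xml)
cat > /tmp/results.xml <<'EOF'
<testsuite>
  <testcase name="BD-001: User can login"/>
  <testcase name="User sees error"/>
</testsuite>
EOF

# Print test case names that lack a PROJECT-NUMBER marker
grep -o 'name="[^"]*"' /tmp/results.xml \
  | sed 's/^name="//; s/"$//' \
  | grep -Ev '^[A-Z]+-[0-9]+'
```

Here only the second, unmarked title is printed; point the same pipeline at your real results file to find tests that will not match.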

Issue: Authentication Failed (401 Error)

Symptoms:

Error: Authentication failed (401)

Solutions:

  1. Verify API key is correct:

    • Go to QA Sphere → Settings → API Keys
    • Check the key hasn't been deleted
    • Regenerate if needed
  2. Check GitLab variables:

    • Settings → CI/CD → Variables
    • Verify QAS_TOKEN is set correctly
    • Ensure no extra spaces or line breaks
  3. Verify key permissions:

    • API key must have Test Runner role or higher
    • Check user permissions in QA Sphere

Issue: JUnit XML File Not Found

Symptoms:

Error: File ./junit-results/results.xml does not exist

Solutions:

  1. Check test job artifacts:
test:
  artifacts:
    paths:
      - junit-results/ # Make sure this matches your output path
  2. Verify test framework configuration:

    • Playwright: Check playwright.config.js reporter
    • Cypress: Check reporter-config.json
    • Jest: Check jest.config.js reporters
  3. Add debug output:

upload-to-qasphere:
  script:
    - ls -la junit-results/ # List files
    - cat junit-results/results.xml # Show content
    - qasphere junit-upload ./junit-results/results.xml

Issue: Playwright Version Mismatch

Symptoms:

Error: Executable doesn't exist at /ms-playwright/chromium...
║ - current: mcr.microsoft.com/playwright:v1.40.0-jammy
║ - required: mcr.microsoft.com/playwright:v1.51.1-jammy

Solution:

Match Docker image version to your Playwright package version:

# Check your Playwright version
npm list @playwright/test
# Output: @playwright/test@1.51.1

# Update .gitlab-ci.yml
test:
  image: mcr.microsoft.com/playwright:v1.51.1-jammy # Match the version

Issue: Pipeline Fails But Tests Pass

Symptoms:

  • Tests execute successfully
  • Artifacts are uploaded
  • Job still marked as failed

Solution:

Force job success while preserving test results:

test:
  script:
    - npm ci
    - |
      set +e
      npx playwright test
      TEST_EXIT=$?
      set -e
      echo "Tests completed with exit code: $TEST_EXIT"
      exit 0 # Force success
  artifacts:
    when: always
    paths:
      - junit-results/

Issue: Upload Job Doesn't Run

Symptoms:

  • Test job completes
  • Upload job never starts

Solutions:

  1. Check needs configuration:
upload-to-qasphere:
  needs:
    - job: test # Must match test job name exactly
      artifacts: true
  2. Verify branch rules:
upload-to-qasphere:
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  3. Check when clause:
upload-to-qasphere:
  when: always # Run even if test job fails

Best Practices

1. Always Use Markers

Include QA Sphere markers in all automated tests:

// ✅ Good
test('BD-001: User can login successfully', async ({ page }) => {});

// ❌ Bad - no marker
test('User can login successfully', async ({ page }) => {});

2. Upload on Every Pipeline Run

Configure upload to run even when tests fail:

upload-to-qasphere:
  when: always
  allow_failure: true

This ensures you track both passing and failing test results.

3. Use Descriptive Run Names

Use the --run-name option to create meaningful test run titles:

upload-to-qasphere:
  script:
    # Include pipeline and branch information
    - |
      qasphere junit-upload \
        --run-name "Pipeline #{env:CI_PIPELINE_ID} - {env:CI_COMMIT_REF_NAME}" \
        ./junit-results/results.xml

For branch-specific runs, you can also upload to existing runs:

script:
  - |
    if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
      # Upload to production run
      qasphere junit-upload -r ${QAS_URL}/project/BD/run/100 ./junit-results/results.xml
    elif [ "$CI_COMMIT_REF_NAME" = "develop" ]; then
      # Upload to development run
      qasphere junit-upload -r ${QAS_URL}/project/BD/run/101 ./junit-results/results.xml
    else
      # Create new run for feature branches
      qasphere junit-upload --run-name "{env:CI_COMMIT_REF_NAME} - {env:CI_COMMIT_SHORT_SHA}" ./junit-results/results.xml
    fi

4. Secure Your API Keys

  • Store in GitLab CI/CD variables
  • Mark as Protected and Masked
  • Rotate keys periodically
  • Never commit to repository
  • Never log or print in pipeline

5. Upload Attachments for Failures

Help debug failures by including screenshots:

script:
  - qasphere junit-upload --attachments ./junit-results/results.xml

6. Match Playwright Versions

Always keep Docker image version in sync with npm package:

// package.json
{
  "devDependencies": {
    "@playwright/test": "1.51.1"
  }
}

# .gitlab-ci.yml
image: mcr.microsoft.com/playwright:v1.51.1-jammy

7. Test Locally First

Before pushing to GitLab, test the integration locally:

# Set environment variables
export QAS_TOKEN=your.api.key
export QAS_URL=https://company.eu1.qasphere.com

# Run tests
npm test

# Upload results
npx qas-cli junit-upload ./junit-results/results.xml

8. Monitor Upload Success

Add logging to track upload status:

upload-to-qasphere:
  script:
    - npm install -g qas-cli
    - |
      if qasphere junit-upload ./junit-results/results.xml; then
        echo "✅ Successfully uploaded results to QA Sphere"
      else
        echo "❌ Failed to upload results to QA Sphere"
        exit 1
      fi

Complete Working Example

Here's a complete, production-ready configuration:

# .gitlab-ci.yml
stages:
  - test
  - report

variables:
  HUSKY: 0
  PLAYWRIGHT_VERSION: "1.51.1"

# Cache node modules for faster builds
cache:
  paths:
    - node_modules/

# Run Playwright tests
test:
  stage: test
  image: mcr.microsoft.com/playwright:v${PLAYWRIGHT_VERSION}-jammy
  before_script:
    - npm ci
  script:
    - npx playwright test
  artifacts:
    when: always
    paths:
      - junit-results/
      - test-results/
      - playwright-report/
    reports:
      junit: junit-results/results.xml
    expire_in: 1 week
  retry:
    max: 1
    when:
      - runner_system_failure
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  tags:
    - docker

# Upload results to QA Sphere
upload-to-qasphere:
  stage: report
  image: node:18
  needs:
    - job: test
      artifacts: true
  before_script:
    - npm install -g qas-cli
  script:
    # Verify file exists
    - test -f junit-results/results.xml || (echo "Results file not found" && exit 1)

    # Upload with descriptive run name and attachments
    - |
      qasphere junit-upload \
        --run-name "Pipeline #{env:CI_PIPELINE_ID} - {env:CI_COMMIT_REF_NAME} ({env:CI_COMMIT_SHORT_SHA})" \
        --attachments \
        ./junit-results/results.xml

    - echo "✅ Test results uploaded to QA Sphere"
    - echo "View at: ${QAS_URL}/project/BD/runs"
  when: always
  allow_failure: true
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  tags:
    - docker

Next Steps

Once you have the basic integration working:

  1. Add More Tests - Expand your test coverage with proper markers
  2. Set Up Schedules - Run tests nightly using GitLab's scheduled pipelines
  3. Create Dashboards - Use QA Sphere reports to track quality trends
  4. Configure Notifications - Get alerts for test failures
  5. Integrate with Jira - Link test failures to bug tickets

Additional Resources

Getting Help

If you encounter issues:

  1. Check the Troubleshooting section above
  2. Review pipeline logs in GitLab
  3. Test CLI locally with same configuration
  4. Contact QA Sphere support: [email protected]

Summary: You now have everything you need to integrate QA Sphere with GitLab CI/CD. The QAS CLI tool automatically handles test result uploads, making test management seamless and automated. Every pipeline run will now update QA Sphere with the latest test results.