Contributing to Verity

Welcome to the Verity project! We're excited that you're interested in contributing to an open-source platform for real-time data verification and event-driven integrity checks. Whether you're fixing a bug, adding a feature, or improving documentation, your contributions make Verity better for everyone.

First time contributing?

We recommend starting with issues labeled good first issue to get familiar with the codebase and our development workflow.


Development Setup

Prerequisites

Before you begin, make sure you have the following tools installed:

Tool Minimum Version Purpose
Python 3.9+ Backend services, CLI tooling
Node.js 18+ Frontend, documentation tooling
Docker 24+ Local development, integration tests
Helm 3.12+ Kubernetes deployment manifests
Git 2.40+ Version control

Clone and Install

git clone https://github.com/verity-platform/verity.git
cd verity

Set up your environment and install the dependencies:

# Create a Python virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install Python dependencies
pip install -e ".[dev]"

# Install Node.js dependencies
npm ci

# Install pre-commit hooks
pre-commit install
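The hooks that pre-commit installs are defined in the repository's .pre-commit-config.yaml. As an illustration only (the revision and hook set shown here are assumptions, not the project's actual configuration), a minimal config wiring in Ruff might look like:

```yaml
# Illustrative sketch — consult the repository's actual .pre-commit-config.yaml.
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4   # placeholder revision; pin to the version the repo uses
    hooks:
      - id: ruff          # lint
      - id: ruff-format   # format
```

With hooks installed, the checks run automatically on every `git commit`; run `pre-commit run --all-files` to apply them to the whole tree.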

Verify Your Setup

Run the quick validation suite to confirm everything is working:

# Run unit tests
pytest tests/unit/ -q

# Run linters
ruff check .
mypy services/

# Build documentation
mkdocs build --strict

Ready to go

If all commands pass without errors, your development environment is set up correctly.


Project Structure

Verity follows a monorepo layout. Here's how the repository is organized:

verity/
├── services/           # Backend microservices
│   ├── gateway/        # API gateway and request routing
│   ├── verifier/       # Core verification engine
│   ├── notifier/       # Event notification service
│   └── scheduler/      # Task scheduling and orchestration
├── libs/               # Shared libraries and utilities
│   ├── common/         # Cross-service utilities, logging, config
│   ├── models/         # Shared data models and schemas
│   └── clients/        # Internal service client SDKs
├── tests/              # Test suites
│   ├── unit/           # Fast, isolated unit tests
│   ├── integration/    # Cross-service integration tests
│   ├── contract/       # Consumer-driven contract tests
│   └── load/           # Performance and load test scenarios
├── docs/               # Documentation source (MkDocs)
├── infra/              # Infrastructure-as-code
│   ├── helm/           # Helm charts for Kubernetes
│   ├── terraform/      # Cloud provisioning
│   └── docker/         # Dockerfiles and compose configs
├── mkdocs.yml          # Documentation site configuration
├── pyproject.toml      # Python project metadata
└── package.json        # Node.js project metadata

Service independence

Each service under services/ is independently deployable and has its own Dockerfile, configuration, and test coverage. Shared logic lives in libs/ to avoid duplication.


Code Standards

Python

We enforce consistent Python code quality with the following tools:

Tool Purpose Config File
Ruff Linting and formatting pyproject.toml
mypy Static type checking pyproject.toml
pytest Test runner pyproject.toml

# Lint and auto-fix
ruff check . --fix
ruff format .

# Type checking
mypy services/ libs/

Type annotations required

All public functions and methods must include type annotations. The CI pipeline enforces mypy --strict on all service code.
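For illustration, here is a hypothetical function (not an actual Verity API) annotated to the level `mypy --strict` expects of public code:

```python
# Hypothetical example showing the annotation style mypy --strict enforces:
# parameter types, return type, and a docstring on the public function.
from __future__ import annotations


def batch_ids(ids: list[str], size: int) -> list[list[str]]:
    """Split a list of record IDs into batches of at most `size` items."""
    if size <= 0:
        raise ValueError("size must be positive")
    return [ids[i : i + size] for i in range(0, len(ids), size)]
```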

TypeScript

For any TypeScript code (tooling, documentation plugins):

Tool Purpose Config File
ESLint Linting .eslintrc.json
Prettier Formatting .prettierrc

# Lint
npx eslint . --fix

# Format
npx prettier --write .

Commit Conventions

We follow the Conventional Commits specification. Every commit message must follow this format:

<type>(<scope>): <description>

[optional body]

[optional footer(s)]

Allowed types:

Type Description
feat A new feature
fix A bug fix
docs Documentation changes
style Code style changes (formatting, no logic change)
refactor Code refactoring
perf Performance improvements
test Adding or updating tests
ci CI/CD pipeline changes
chore Build process or tooling changes

Examples:

feat(verifier): add support for batch verification requests
fix(gateway): resolve timeout on large payload ingestion
docs(contributing): update local development instructions
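A complete message with an optional body and footer follows the same shape (this example is illustrative; the issue number is a placeholder):

```
feat(verifier): add support for batch verification requests

Accept multiple records per request and verify them concurrently,
reducing round trips for bulk ingestion clients.

Closes #123
```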

Testing

Verity maintains a comprehensive, layered testing strategy. All tests must pass before a pull request can be merged.

Unit Tests

Fast, isolated tests that validate individual functions and classes.

# Run all unit tests
pytest tests/unit/ -v

# Run tests for a specific service
pytest tests/unit/verifier/ -v

# Run with coverage report
pytest tests/unit/ --cov=services --cov-report=html

Coverage threshold

We maintain a minimum of 80% line coverage across all services. New code should include corresponding tests.
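As a sketch of the expected style, here is a self-contained unit test module; `checksum` is a hypothetical stand-in for a real Verity function, not part of the codebase:

```python
# Hypothetical unit test module in the style pytest discovers under tests/unit/.
# `checksum` stands in for a real function under test.
import hashlib


def checksum(payload: bytes) -> str:
    """Return the hex SHA-256 digest of a payload."""
    return hashlib.sha256(payload).hexdigest()


def test_checksum_is_deterministic() -> None:
    assert checksum(b"verity") == checksum(b"verity")


def test_checksum_distinguishes_payloads() -> None:
    assert checksum(b"a") != checksum(b"b")
```

pytest collects any `test_*` function in a `test_*.py` file automatically; no registration is needed.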

Integration Tests

Tests that validate interactions between services using Docker Compose.

# Start dependencies
docker compose -f infra/docker/docker-compose.test.yml up -d

# Run integration tests
pytest tests/integration/ -v --timeout=120

# Tear down
docker compose -f infra/docker/docker-compose.test.yml down -v
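Integration tests typically need to wait for the compose services to come up before making requests. One common pattern, shown here as a hedged sketch rather than the project's actual helper, is a TCP readiness probe:

```python
# Sketch of a readiness helper for integration tests: block until a service's
# TCP port accepts connections, or give up when the deadline passes.
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Return True once (host, port) accepts TCP connections, else False."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)  # service not up yet; retry shortly
    return False
```

A pytest fixture could call this once per session before any test touches the stack started by docker compose.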

Contract Tests

Consumer-driven contract tests ensure API compatibility between services.

pytest tests/contract/ -v
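The core idea of a consumer-driven contract test is that the consumer pins down the response shape it depends on, and the provider is tested against that expectation. The response fields below are hypothetical, purely to illustrate the pattern; the real contracts live in tests/contract/:

```python
# Sketch of a consumer-driven contract check. The field names and types here
# are assumptions for illustration, not Verity's actual /verify response.
from typing import Any

EXPECTED_FIELDS: dict[str, type] = {
    "request_id": str,
    "verified": bool,
    "latency_ms": int,
}


def satisfies_contract(response: dict[str, Any]) -> bool:
    """Check that a response carries every agreed field with the agreed type."""
    return all(
        field in response and isinstance(response[field], expected)
        for field, expected in EXPECTED_FIELDS.items()
    )
```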

Load Tests

Performance tests written with k6 validate throughput and latency under load.

# Run a specific load test scenario
k6 run tests/load/verify_throughput.js

# Run with custom options
k6 run --vus 50 --duration 2m tests/load/verify_throughput.js

Documentation

Our documentation is built with MkDocs Material and published automatically on every merge to main.

Local Preview

# Install documentation dependencies
pip install mkdocs-material mkdocs-awesome-pages-plugin

# Start the local preview server
mkdocs serve

Then open http://127.0.0.1:8000 in your browser.

Writing Standards

  • Use sentence case for headings (e.g., "Getting started" not "Getting Started").
  • Keep paragraphs concise — aim for 3–5 sentences maximum.
  • Use admonitions (!!! note, !!! warning, !!! tip) to highlight important information.
  • Include code examples wherever possible — they're more helpful than prose.
  • Add alt text to all images for accessibility.
  • Place new pages in the appropriate section and update mkdocs.yml navigation.
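For reference, an MkDocs Material admonition is a marker line followed by a four-space-indented body (the wording here is an invented example):

```markdown
!!! warning
    Deleting a verification record cannot be undone.
```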

Pull Request Process

Branch Naming

Create a branch from main using the following convention:

<type>/<short-description>

Examples:

feat/batch-verification
fix/gateway-timeout
docs/update-contributing-guide

Submitting a Pull Request

  1. Create your branch from main and make your changes.
  2. Write or update tests to cover your changes.
  3. Run the full validation suite locally:
    ruff check . && mypy services/ && pytest tests/unit/ -q
    
  4. Push your branch and open a pull request against main.
  5. Fill out the PR template completely — incomplete PRs will not be reviewed.

Review Process

  • All PRs require at least one approving review from a maintainer.
  • Reviewers may request changes — please address feedback promptly.
  • Keep PRs focused and small. Large PRs are harder to review and more likely to have issues.

CI Checks

The following checks run automatically on every pull request:

Check Tool Must Pass
Linting Ruff, ESLint Yes
Type checking mypy Yes
Unit tests pytest Yes
Integration tests pytest + Docker Yes
Build Docker build Yes
Docs build MkDocs Yes

Merging blocked

Pull requests cannot be merged until all CI checks pass and at least one maintainer has approved the changes.


Reporting Issues

Bug Reports

Found a bug? Please open an issue with the following details:

  • Summary — A clear, one-line description of the problem.
  • Steps to reproduce — Minimal steps to trigger the issue.
  • Expected behavior — What you expected to happen.
  • Actual behavior — What actually happened.
  • Environment — OS, Python version, Docker version, relevant config.

Feature Requests

Have an idea? Open a feature request and describe:

  • Problem statement — What problem does this solve?
  • Proposed solution — How do you envision this working?
  • Alternatives considered — What other approaches did you think about?

Code of Conduct

All contributors are expected to follow our Code of Conduct. We are committed to providing a welcoming, inclusive, and harassment-free experience for everyone.

Please report any unacceptable behavior to conduct@verity.dev.


License

Verity is licensed under the Apache License 2.0. By contributing, you agree that your contributions will be licensed under the same terms.

Copyright 2024 Verity Contributors

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Thank you for contributing to Verity. Every improvement — no matter how small — helps build a more reliable platform for everyone.