Containerised toolkit for managing non‑functional requirements across services. Provides a React UI and API, generates Markdown views from YAML (by domain, service, team, release, operation), and can publish NFR documentation to Confluence per‑repo via Docker Compose.

NHSDigital/nfrs-toolkit

NFRs Toolkit (embedded)

This folder contains the shared NFR tooling that currently lives inside this repository, and this document describes how to use it. The intent is to lift the tooling into its own repo later without changing how it is used.

Components

  • UI/API: nfrs-toolkit/src/nfrs-maintenance – React maintenance app + Express API
  • NFR scripts: scripts/nfr – generators, validation, migration helpers
  • Jira helpers: scripts/jira – coverage and mapping utilities
  • Confluence publisher: scripts/confluence – markdown → Confluence

Pointing the toolkit at a project

The NFR API server now honours the NFR_BASE_PATH environment variable.

  • Default: requirements/nfrs in this repo (as before)
  • Override to point at another project:
export NFR_BASE_PATH=/path/to/other/repo/requirements/nfrs
cd nfrs-toolkit/src/nfrs-maintenance
npm install
npm run dev

The UI reads and writes NFR YAML files under NFR_BASE_PATH, so you can use the toolkit against any repo that follows the same requirements/nfrs structure.
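Before pointing the toolkit at another repo, it can be worth confirming the target actually has the expected layout. A minimal sketch (the `check_nfr_repo` helper name is hypothetical; the `requirements/nfrs` path is the one this README describes):

```shell
# Sanity-check that a repo follows the requirements/nfrs structure the
# toolkit expects. Prints "ok" and the path on success, fails otherwise.
check_nfr_repo() {
  repo="${1:?usage: check_nfr_repo /path/to/repo}"
  if [ -d "$repo/requirements/nfrs" ]; then
    echo "ok: $repo/requirements/nfrs"
  else
    echo "missing: $repo/requirements/nfrs" >&2
    return 1
  fi
}
```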

Alternatively, you can use the helper script in this repo:

./nfrs-toolkit/nfrs-cli.sh serve                # uses this repo's requirements/nfrs
./nfrs-toolkit/nfrs-cli.sh serve --nfr-root /path/to/other/requirements/nfrs
./nfrs-toolkit/nfrs-cli.sh generate             # build all NFR markdown pages
export CONFLUENCE_BASE_URL=...
export CONFLUENCE_SPACE=...
export CONFLUENCE_AUTH=bearer
export CONFLUENCE_USER=...
export CONFLUENCE_TOKEN=...
export JIRA_BASE_URL=...
export JIRA_AUTH=bearer
export JIRA_USER=...
export JIRA_TOKEN=...
./nfrs-toolkit/nfrs-cli.sh publish-confluence   # publish to Confluence

Next steps

  • Extract this folder (plus the referenced services/ and scripts/ paths) into a standalone nfrs-toolkit repo.
  • Add a thin CLI wrapper (e.g. nfrs serve, nfrs generate, nfrs publish-confluence) that calls into the existing scripts with the correct environment.
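The proposed thin wrapper could dispatch straight to the existing nfrs-cli.sh. A sketch, assuming the subcommands shown earlier (the `nfrs` function name, the `NFRS_TOOLKIT_DIR` variable, and the `DRY_RUN` flag are illustrative, not part of the toolkit):

```shell
# Hypothetical `nfrs` CLI wrapper around nfrs-cli.sh.
nfrs() {
  toolkit="${NFRS_TOOLKIT_DIR:-./nfrs-toolkit}"
  case "${1:-}" in
    serve|generate|publish-confluence)
      cmd="$1"; shift
      if [ "${DRY_RUN:-0}" = "1" ]; then
        # Print the command that would run instead of executing it.
        echo "$toolkit/nfrs-cli.sh $cmd $*"
      else
        "$toolkit/nfrs-cli.sh" "$cmd" "$@"
      fi
      ;;
    *)
      echo "usage: nfrs {serve|generate|publish-confluence} [args...]" >&2
      return 2
      ;;
  esac
}
```

Extra arguments (such as --nfr-root) pass through to the underlying script unchanged.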

Running in Docker

You can run the toolkit in a container without installing Node/Python locally.

  1. Build the image from the nfrs-toolkit directory:

    docker build -t nfrs-toolkit .
  2. Run the maintenance UI/API (serve), mounting the NFR repo you want to edit:

    docker run --rm \
      -p 5174:5174 \
      -p 3001:3001 \
      -e NFR_BASE_PATH=/data/requirements/nfrs \
      -v /path/to/your/repo/requirements/nfrs:/data/requirements/nfrs \
      nfrs-toolkit serve
    • The UI will be available at http://localhost:5174
    • The API will be available at http://localhost:3001
    • Adjust the host path in the -v mount to point at the requirements/nfrs directory of the project you want to manage.
  3. Generate NFR markdown/pages inside the container (generate):

    docker run --rm \
      -e NFR_BASE_PATH=/data/requirements/nfrs \
      -v /path/to/your/repo:/data \
      nfrs-toolkit generate
    • Expects the mounted repo to contain requirements/nfrs and docs/nfrs as per this toolkit.
  4. Publish generated markdown to Confluence (publish-confluence):

    docker run --rm \
      -e NFR_BASE_PATH=/data/requirements/nfrs \
      -e CONFLUENCE_BASE_URL=... \
      -e CONFLUENCE_SPACE=... \
      -e CONFLUENCE_USER=... \
      -e CONFLUENCE_TOKEN=... \
      -v /path/to/your/repo:/data \
      nfrs-toolkit publish-confluence
    • The exact environment variables depend on your existing scripts/nfr/confluence_env.sh and scripts/nfr/confluence_secrets.sh setup; replicate those values as -e flags when running the container.
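One way to avoid repeating many -e flags is Docker's standard --env-file option. A sketch, with placeholder values (the file name confluence.env is arbitrary; the variable names mirror the exports earlier in this README):

```shell
# Collect the publish variables in one env file (values are placeholders).
cat > confluence.env <<'EOF'
NFR_BASE_PATH=/data/requirements/nfrs
CONFLUENCE_BASE_URL=https://confluence.example.org
CONFLUENCE_SPACE=NFR
CONFLUENCE_USER=ci-bot
CONFLUENCE_TOKEN=replace-me
EOF

# Then pass the whole file to docker run:
#   docker run --rm --env-file confluence.env \
#     -v /path/to/your/repo:/data \
#     nfrs-toolkit publish-confluence
```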

Using docker compose

You can also run the toolkit with Docker Compose using the provided docker-compose.yml.

  1. Set the host repo path (the repo that contains requirements/nfrs):

    export NFR_HOST_REPO=/absolute/path/to/your/repo
    # Inside the container the repo is mounted at /data and the
    # NFRs live under /data/requirements/nfrs by default.
    export NFR_BASE_PATH=/data/requirements/nfrs

    or create a .env file alongside docker-compose.yml:

    echo "NFR_HOST_REPO=/absolute/path/to/your/repo" > .env
    echo "NFR_BASE_PATH=/data/requirements/nfrs" >> .env
  2. Start the maintenance UI/API:

    docker compose up
  3. Run one-off commands (init / generate / publish-confluence):

    # Initialise baseline NFRs into the mounted repo
    docker compose run --rm nfrs-toolkit init --target /data/requirements/nfrs
    
    # Generate markdown/pages
    docker compose run --rm nfrs-toolkit generate
    
    # Publish to Confluence
    docker compose run --rm nfrs-toolkit publish-confluence
  4. Convenience Make targets

    You can use the provided Makefile as a thin wrapper around the docker compose commands:

    # Initialise baseline NFRs into the mounted repo
    make init
    
    # Build/update the toolkit image (pick up script changes)
    docker compose build nfrs-toolkit
    
    # Preview what would be published (no changes made)
    make publish-confluence-dry-run
    
    # Publish markdown to Confluence
    make publish-confluence

    These targets assume your .env (or shell) exports NFR_HOST_REPO, NFR_BASE_PATH, and the required CONFLUENCE_* / JIRA_* variables.

  5. Automated solution assurance (experimental)

    There is an experimental helper to run "solution assurance" on a single Jira ticket using an LLM (for example GitHub Copilot / GitHub Models via an OpenAI-compatible endpoint). It fetches the issue JSON from Jira, summarises the key fields and attachments, then asks the model to return a JSON verdict (score, issues, recommendations).

    From the repo root, after configuring Jira and LLM env vars:

    # Jira configuration (can also come from .env)
    export JIRA_BASE_URL=https://nhsd-jira.digital.nhs.uk
    export JIRA_AUTH=basic
    export JIRA_USER=your.user
    export JIRA_TOKEN=your-token
    
    # LLM configuration (point at your Copilot / model endpoint)
    export ASSURE_LLM_ENDPOINT=https://your-openai-compatible-endpoint
    export ASSURE_LLM_API_KEY=your-api-key
    export ASSURE_LLM_MODEL=gpt-4.1-mini
    
    # Run assurance for a single ticket
    .venv/bin/python -m scripts.jira.assure_issue FTRS-1234 --format markdown
    
    # Or just print the constructed prompt (no LLM call):
    .venv/bin/python -m scripts.jira.assure_issue FTRS-1234 --dry-run --format markdown
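To assure several tickets in one go, a small loop over the same module works. A sketch (the `assure_batch` name and the `ASSURE_CMD` override are illustrative; the default command is the invocation shown above):

```shell
# Run assurance over a list of tickets, stopping on the first failure.
# ASSURE_CMD lets you swap in a different runner (e.g. for dry testing).
assure_batch() {
  runner="${ASSURE_CMD:-.venv/bin/python -m scripts.jira.assure_issue}"
  for ticket in "$@"; do
    echo "== $ticket =="
    $runner "$ticket" --format markdown || return 1
  done
}

# Usage: assure_batch FTRS-1234 FTRS-1235
```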
