This folder describes how to use the shared NFR tooling that currently lives inside this repository. The intent is to lift this into its own repo later without changing how it is used.
- UI/API: `nfrs-toolkit/src/nfrs-maintenance` – React maintenance app + Express API
- NFR scripts: `scripts/nfr` – generators, validation, migration helpers
- Jira helpers: `scripts/jira` – coverage and mapping utilities
- Confluence publisher: `scripts/confluence` – markdown → Confluence
The NFR API server now honours the NFR_BASE_PATH environment variable.
- Default: `requirements/nfrs` in this repo (as before)
- Override to point at another project:

```sh
export NFR_BASE_PATH=/path/to/other/repo/requirements/nfrs
cd nfrs-toolkit/src/nfrs-maintenance
npm install
npm run dev
```

The UI will read and write NFR YAMLs under `NFR_BASE_PATH`, so you can use this toolkit against any repo that follows the same `requirements/nfrs` structure.
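The lookup the API server performs can be sketched as follows. This is illustrative Python only (the real server is an Express app), and `resolve_nfr_base_path` is a hypothetical name, not part of the toolkit:

```python
import os
from pathlib import Path

# Hypothetical helper mirroring the server's NFR_BASE_PATH lookup;
# the real server is an Express app, so this is only an illustration.
def resolve_nfr_base_path(repo_root: str = ".") -> Path:
    """Use NFR_BASE_PATH when set, otherwise fall back to
    requirements/nfrs inside the given repository root."""
    override = os.environ.get("NFR_BASE_PATH")
    if override:
        return Path(override)
    return Path(repo_root) / "requirements" / "nfrs"

# With the override exported, any repo following the same layout is targeted.
os.environ["NFR_BASE_PATH"] = "/path/to/other/repo/requirements/nfrs"
print(resolve_nfr_base_path())
```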
Alternatively, you can use the helper script in this repo:
```sh
./nfrs-toolkit/nfrs-cli.sh serve        # uses this repo's requirements/nfrs
./nfrs-toolkit/nfrs-cli.sh serve --nfr-root /path/to/other/requirements/nfrs
./nfrs-toolkit/nfrs-cli.sh generate     # build all NFR markdown pages
```
```sh
export CONFLUENCE_BASE_URL=...
export CONFLUENCE_SPACE=...
export CONFLUENCE_AUTH=bearer
export CONFLUENCE_USER=...
export CONFLUENCE_TOKEN=...
export JIRA_BASE_URL=...
export JIRA_AUTH=bearer
export JIRA_USER=...
export JIRA_TOKEN=...
```
```sh
./nfrs-toolkit/nfrs-cli.sh publish-confluence   # publish to Confluence
```

- Extract this folder (plus the referenced `services/` and `scripts/` paths) into a standalone `nfrs-toolkit` repo.
- Add a thin CLI wrapper (e.g. `nfrs serve`, `nfrs generate`, `nfrs publish-confluence`) that calls into the existing scripts with the correct environment.
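A thin wrapper of that shape could be sketched like this. The subcommand names come from the text above, but the dispatch table and `dispatch` function are assumptions, not existing toolkit code:

```python
# Hypothetical mapping from CLI subcommand to the existing script invocation.
COMMANDS = {
    "serve": ["./nfrs-toolkit/nfrs-cli.sh", "serve"],
    "generate": ["./nfrs-toolkit/nfrs-cli.sh", "generate"],
    "publish-confluence": ["./nfrs-toolkit/nfrs-cli.sh", "publish-confluence"],
}

def dispatch(argv):
    """Resolve a subcommand to the underlying script call,
    passing any remaining arguments straight through."""
    if not argv or argv[0] not in COMMANDS:
        raise SystemExit(f"usage: nfrs {{{'|'.join(COMMANDS)}}} [args...]")
    return COMMANDS[argv[0]] + argv[1:]

# A real wrapper would then exec the resolved command, e.g. via subprocess.run().
print(dispatch(["serve", "--nfr-root", "/path/to/other/requirements/nfrs"]))
```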
You can run the toolkit in a container without installing Node/Python locally.
- Build the image from the `nfrs-toolkit` directory:

  ```sh
  docker build -t nfrs-toolkit .
  ```

- Run the maintenance UI/API (`serve`), mounting the NFR repo you want to edit:

  ```sh
  docker run --rm \
    -p 5174:5174 \
    -p 3001:3001 \
    -e NFR_BASE_PATH=/data/requirements/nfrs \
    -v /path/to/your/repo/requirements/nfrs:/data/requirements/nfrs \
    nfrs-toolkit serve
  ```
- The UI will be available at http://localhost:5174
- The API will be available at http://localhost:3001
  - Adjust the host path in the `-v` mount to point at the `requirements/nfrs` directory of the project you want to manage.
- Generate NFR markdown/pages inside the container (`generate`):

  ```sh
  docker run --rm \
    -e NFR_BASE_PATH=/data/requirements/nfrs \
    -v /path/to/your/repo:/data \
    nfrs-toolkit generate
  ```

  - Expects the mounted repo to contain `requirements/nfrs` and `docs/nfrs` as per this toolkit.
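A quick pre-flight check for that expected layout might look like this (illustrative Python; `check_repo_layout` is a hypothetical helper, not part of the toolkit):

```python
from pathlib import Path

def check_repo_layout(mount: str) -> list:
    """Return the expected directories that are missing from the
    mounted repo (mounted at /data in the container examples above)."""
    expected = ["requirements/nfrs", "docs/nfrs"]
    root = Path(mount)
    return [d for d in expected if not (root / d).is_dir()]

# Example: warn before running `generate` against an incomplete mount.
missing = check_repo_layout("/data")
if missing:
    print("mounted repo is missing:", ", ".join(missing))
```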
- Publish generated markdown to Confluence (`publish-confluence`):

  ```sh
  docker run --rm \
    -e NFR_BASE_PATH=/data/requirements/nfrs \
    -e CONFLUENCE_BASE_URL=... \
    -e CONFLUENCE_SPACE=... \
    -e CONFLUENCE_USER=... \
    -e CONFLUENCE_TOKEN=... \
    -v /path/to/your/repo:/data \
    nfrs-toolkit publish-confluence
  ```

  - The exact environment variables depend on your existing `scripts/nfr/confluence_env.sh` and `scripts/nfr/confluence_secrets.sh` setup; replicate those values as `-e` flags when running the container.
You can also run the toolkit via Docker Compose using `docker-compose.yml`.
- Set the host repo path (the repo that contains `requirements/nfrs`):

  ```sh
  export NFR_HOST_REPO=/absolute/path/to/your/repo

  # Inside the container the repo is mounted at /data and the
  # NFRs live under /data/requirements/nfrs by default.
  export NFR_BASE_PATH=/data/requirements/nfrs
  ```

  or create a `.env` file alongside `docker-compose.yml`:

  ```sh
  echo "NFR_HOST_REPO=/absolute/path/to/your/repo" > .env
  echo "NFR_BASE_PATH=/data/requirements/nfrs" >> .env
  ```
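If you ever need to read such a `.env` file from your own tooling, a minimal reader can be sketched as below. This is illustrative only; Docker Compose parses `.env` itself, and `load_dotenv` is a hypothetical helper:

```python
from pathlib import Path

def load_dotenv(path: str = ".env") -> dict:
    """Minimal .env reader: KEY=VALUE lines; blank lines, '#' comments,
    and lines without '=' are skipped. Surrounding double quotes are stripped."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env
```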
- Start the maintenance UI/API:

  ```sh
  docker compose up
  ```
- Run one-off commands (init / generate / publish-confluence):

  ```sh
  # Initialise baseline NFRs into the mounted repo
  docker compose run --rm nfrs-toolkit init --target /data/requirements/nfrs

  # Generate markdown/pages
  docker compose run --rm nfrs-toolkit generate

  # Publish to Confluence
  docker compose run --rm nfrs-toolkit publish-confluence
  ```
- Convenience Make targets

  You can use the provided Makefile as a thin wrapper around the docker compose commands:

  ```sh
  # Initialise baseline NFRs into the mounted repo
  make init

  # Build/update the toolkit image (pick up script changes)
  docker compose build nfrs-toolkit

  # Preview what would be published (no changes made)
  make publish-confluence-dry-run

  # Publish markdown to Confluence
  make publish-confluence
  ```
  These targets assume your `.env` (or shell) exports `NFR_HOST_REPO`, `NFR_BASE_PATH`, and the required `CONFLUENCE_*`/`JIRA_*` variables.

- Automated solution assurance (experimental)
There is an experimental helper to run "solution assurance" on a single Jira ticket using an LLM (for example GitHub Copilot / GitHub Models via an OpenAI-compatible endpoint). It fetches the issue JSON from Jira, summarises the key fields and attachments, then asks the model to return a JSON verdict (score, issues, recommendations).
From the repo root, after configuring Jira and LLM env vars:
```sh
# Jira configuration (can also come from .env)
export JIRA_BASE_URL=https://nhsd-jira.digital.nhs.uk
export JIRA_AUTH=basic
export JIRA_USER=your.user
export JIRA_TOKEN=your-token

# LLM configuration (point at your Copilot / model endpoint)
export ASSURE_LLM_ENDPOINT=https://your-openai-compatible-endpoint
export ASSURE_LLM_API_KEY=your-api-key
export ASSURE_LLM_MODEL=gpt-4.1-mini

# Run assurance for a single ticket
.venv/bin/python -m scripts.jira.assure_issue FTRS-1234 --format markdown

# Or just print the constructed prompt (no LLM call):
.venv/bin/python -m scripts.jira.assure_issue FTRS-1234 --dry-run --format markdown
```
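The verdict-handling step can be sketched as follows. This is an illustrative sketch, not the code in `scripts/jira/assure_issue`: `parse_verdict` is a hypothetical function, and the assumption that models sometimes wrap the JSON verdict in prose is the reason for the lenient extraction:

```python
import json

def parse_verdict(model_output: str) -> dict:
    """Extract the JSON verdict (score, issues, recommendations) from a
    model reply, tolerating surrounding prose around the JSON object."""
    start = model_output.find("{")
    end = model_output.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in model output")
    verdict = json.loads(model_output[start:end + 1])
    # Ensure all expected verdict fields are present, defaulting to None.
    for key in ("score", "issues", "recommendations"):
        verdict.setdefault(key, None)
    return verdict

reply = 'Here is my assessment:\n{"score": 7, "issues": ["no acceptance criteria"]}'
print(parse_verdict(reply)["score"])  # prints 7
```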