Architecture Decision Records (ADRs) solve a specific remote work problem: when you make a technical decision asynchronously, the reasoning evaporates unless it's written down. Eight months later, nobody remembers why you chose Kafka over RabbitMQ. This guide compares three practical approaches for async ADR workflows in remote teams.
## Option 1: Log4brains (In-Repo, Browsable)
Log4brains stores ADRs as Markdown files in your repository and generates a browsable web interface. The decision history is version-controlled alongside the code that implements it.
Setup:

```shell
# Install
npm install -g @thomvaill/log4brains

# Initialize in your repo
cd your-project
log4brains init

# This creates:
#   docs/adr/
#   docs/adr/template.md
#   .log4brains.yml
```
Configuration:

```yaml
# .log4brains.yml
project:
  name: "API Platform"
  basePath: /adr
  packages:
    - name: API Platform
      basePath: .
      adrFolder: docs/adr
```
Create a new ADR:

```shell
log4brains adr new "Use PostgreSQL over DynamoDB for user data"

# Creates: docs/adr/YYYYMMDD-use-postgresql-over-dynamodb-for-user-data.md
```
ADR template (Log4brains default + async additions):

```markdown
# Use PostgreSQL over DynamoDB for user data

- Status: [proposed | accepted | deprecated | superseded]
- Date: 2026-03-22
- Deciders: @alice, @bob, @carol
- Reviewed-by: @david (async review, closed 2026-03-25)

## Context and Problem Statement

We need a primary data store for user profiles, preferences, and session data.
The team is evaluating PostgreSQL (relational) vs DynamoDB (NoSQL key-value).

## Decision Drivers

- Query flexibility: we need complex queries for analytics and admin tools
- Team expertise: all 6 engineers have PostgreSQL experience; 2 have DynamoDB
- Cost: projected 100k users, moderate write volume
- Operational burden: managed vs self-managed

## Considered Options

- PostgreSQL (AWS RDS or self-managed)
- DynamoDB

## Decision Outcome

**Chosen option: PostgreSQL on RDS**

Primary reason: team expertise and query flexibility outweigh DynamoDB's
operational simplicity at our scale. DynamoDB's query model would require
significant data modeling effort for our analytics requirements.

### Positive Consequences

- All engineers can contribute to DB-related code from day one
- Complex joins for admin reports work without additional tooling
- Standard SQL tooling for migrations (Alembic)

### Negative Consequences

- We manage connection pooling (PgBouncer) ourselves
- Horizontal sharding requires more effort if we exceed ~10M users
- Cost will be higher than DynamoDB at very high scale

## Async Review Notes

Review period: 2026-03-22 to 2026-03-25

**@david (2026-03-23):** Agree on PostgreSQL. One question: have we evaluated
Aurora Serverless to reduce operational overhead on RDS?

**@alice response:** Good point. Aurora Serverless v2 is compatible — we can
migrate to it if connection management becomes a burden. Adding as a follow-up item.

**@bob:** Voted: approve ✅

**@carol:** Voted: approve ✅. Note: document connection limit as known constraint.
```
Build and view:

```shell
# Local preview
log4brains preview

# Build static site for hosting
log4brains build

# Deploy to GitHub Pages (add to CI)
# Output: out/ directory
```
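The "add to CI" step can be a small workflow. A sketch assuming GitHub Actions with the community `peaceiris/actions-gh-pages` action (the workflow filename and the `/your-project` base path are placeholders; when hosting under `user.github.io/<repo>`, the base path must match the repository name):

```yaml
# .github/workflows/adr.yml (hypothetical filename)
name: Publish ADRs
on:
  push:
    branches: [main]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install -g @thomvaill/log4brains
      # basePath must match the repo name for GitHub Pages subpath hosting
      - run: log4brains build --basePath /your-project
      - uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./out
```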
**Strengths:** version history matches code history; searchable; PR-based review works naturally. **Weaknesses:** engineers need to know the CLI; no WYSIWYG editing.
## Option 2: GitHub Discussions
GitHub Discussions provides a structured forum directly in your repository without extra tooling.
Setup:

1. Repo Settings → Features → Discussions → Enable
2. Create a category: "Architecture Decisions"
   - Description: "Proposed and accepted ADRs for [project name]"
   - Format: Announcement (only maintainers can create; others reply)
ADR as a Discussion:

```markdown
**Title:** [ADR-042] Use OpenTelemetry over Datadog native SDK
**Status:** Proposed → Under Review → Accepted

**Context:**
We're adding distributed tracing. We can use Datadog's native tracing SDK
or OpenTelemetry with Datadog as an exporter.

**Decision:**
OpenTelemetry (vendor-neutral) with Datadog as the current exporter.

Rationale:
- If we switch APM vendors, we change the exporter, not the instrumentation code
- OpenTelemetry is the CNCF standard; engineers joining from other companies know it
- Datadog's OTel support is production-ready as of their 2025 agent update

**Dissent / Alternatives considered:**
@marcus raised that the Datadog native SDK has richer auto-instrumentation
for Python than OTel currently. We accept this tradeoff.

**Review:** Open until 2026-03-29. Comment with your vote or concerns.
```
Teammates vote with emoji reactions (👍 to approve, 😕 to flag concerns, since GitHub's fixed reaction set has no 🤔) and comment in threads.
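Those reactions can be tallied automatically: GitHub's GraphQL API exposes them as `reactionGroups` on a Discussion. A sketch of just the counting step (the sample payload mirrors the API's shape rather than a live response; fetching the data is out of scope here):

```python
# Summarize ADR votes from GitHub's reactionGroups data
# (shape follows the GraphQL Discussion.reactionGroups field).

def summarize_votes(reaction_groups):
    """Map GitHub reaction counts to approve/concern tallies."""
    counts = {g["content"]: g["reactors"]["totalCount"] for g in reaction_groups}
    return {
        "approve": counts.get("THUMBS_UP", 0),   # 👍
        "concerns": counts.get("CONFUSED", 0),   # 😕
    }

# Illustrative payload, not a live API response
sample = [
    {"content": "THUMBS_UP", "reactors": {"totalCount": 4}},
    {"content": "CONFUSED", "reactors": {"totalCount": 1}},
]
print(summarize_votes(sample))  # {'approve': 4, 'concerns': 1}
```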
**Strengths:** zero setup; everyone already uses GitHub; easy to link from PRs. **Weaknesses:** no structured browsing (only search); no static site output; easy to lose decisions in a long Discussions list.
## Option 3: Notion
Notion works best when your team already uses it for documentation and wants ADRs integrated with other knowledge.
Database setup:

Create a new database in Notion titled "Architecture Decisions" with these properties:

- Title (title property)
- Status (select): Proposed | Under Review | Accepted | Deprecated | Superseded
- Date (date): decision date
- Deciders (person): allow multiple
- Category (select): Data | Infrastructure | API | Frontend | Security
- ADR Number (number): incremented manually or via a script
- Supersedes (relation): links to the deprecated ADR
Views to create:
- All ADRs (table view, sorted by Date descending)
- By Category (board view, grouped by Category)
- Active Decisions (filtered: Status = Accepted)
- Review Queue (filtered: Status = Proposed or Under Review)
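Notion has no built-in auto-increment, so the ADR Number property is easiest to fill from a small script. A sketch of just the numbering logic (the function name is illustrative, and `existing_numbers` stands in for values you would fetch from the database via the Notion API):

```python
# Compute the next ADR number and format the page title to match
# the [ADR-NNN] convention used in the template below.

def format_adr_title(existing_numbers, title):
    """Return a zero-padded '[ADR-NNN] Title' string; numbering starts at 1."""
    n = max(existing_numbers, default=0) + 1
    return f"[ADR-{n:03d}] {title}"

print(format_adr_title([41, 42], "Use Redis for session storage"))
# [ADR-043] Use Redis for session storage
```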
Notion ADR template:

```markdown
## [ADR-NNN] Title

**Status:** Proposed
**Date:** 2026-03-22
**Deciders:** @person, @person
**Review Deadline:** 2026-03-29

---

### Context
[What is the situation and why is a decision needed?]

### Options Considered
Option A: ...
Option B: ...

### Decision
**Chosen:** [option]
**Rationale:** [why]

### Consequences
[What becomes easier? What becomes harder?]

---

### Async Review Thread
Use the comments below. Vote with 👍 or 👎 and close with "Approved ✅" or "Concerns 🤔".
Reminder: all engineers review by [deadline]. @mention someone if you need their specific input.
```
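Pages from this template can also be created programmatically: Notion's API accepts a JSON body on `POST /v1/pages`. A sketch that only builds that request body (the `database_id` is a placeholder, the property names must match your database exactly, and no request is sent here):

```python
# Build the request body for creating an ADR page via Notion's API.
# Property value shapes follow the Notion API: title, select, and date.

def build_adr_payload(database_id, title, status, date):
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Title": {"title": [{"text": {"content": title}}]},
            "Status": {"select": {"name": status}},
            "Date": {"date": {"start": date}},
        },
    }

payload = build_adr_payload(
    "db-id-placeholder",                         # replace with your database ID
    "[ADR-044] Use Redis for session storage",
    "Proposed",
    "2026-03-22",
)
```

Send `payload` with an authorized HTTP client (e.g. `requests.post` with your integration token) to create the page.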
**Strengths:** rich formatting; linked with other docs; non-engineers can read and comment easily. **Weaknesses:** not version-controlled; editable after acceptance (can lose history); costs money.
## Comparison
| Factor | Log4brains | GitHub Discussions | Notion |
|---|---|---|---|
| Version history | In git | PR history | Notion page history |
| Setup effort | Medium | Low | Medium |
| Search quality | Good | GitHub search | Notion search |
| Non-engineer access | Build required | GitHub account | Notion account |
| Cost | Free | Free | $8+/user/mo |
| Code linkage | Best (same repo) | Good (same repo) | Poor (external link) |
| Async review | PR comments | Discussion comments | Page comments |
## The Right Choice
Use Log4brains if: your team is engineering-heavy, ADRs should live with code, and you want version-controlled decisions.
Use GitHub Discussions if: you want zero-setup, your team is already in GitHub all day, and you don’t need structured browsing.
Use Notion if: non-technical stakeholders need to read or comment on ADRs, or your team already uses Notion for all documentation.
## Running the Async Review Process
Whichever tool you pick, the async review process matters more than the tooling. A well-run ADR review prevents the common failure mode where decisions happen in Slack threads and the ADR is written after the fact to document what was already decided — at which point nobody challenges it because the decision already happened.
A practical async review workflow:
**Step 1:** The author publishes the ADR as "Proposed" and posts in Slack with a clear deadline: "ADR-043: Use Redis for session storage. Review open until Friday EOD. Comments on the Notion page / GitHub Discussion / PR."

**Step 2:** Set a 3-5 day review window. Fewer than 3 days doesn't give team members across time zones a fair chance to review; more than 5 days causes context loss.

**Step 3:** Require explicit votes, not just silence. Default approval (no objection = approved) works poorly in remote teams where people miss notifications. Ask each named reviewer to comment with an explicit vote, and use the "Deciders" field to track who still needs to respond.

**Step 4:** Resolve dissent in writing. If a reviewer raises a concern, the author responds in the ADR document itself (not Slack), updating the "alternatives considered" section if the dissent surfaces a new option. This keeps the decision reasoning in one place.

**Step 5:** The author changes the status to "Accepted" once the review period closes and all deciders have voted. Link the implementing PR from the ADR.
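The acceptance gate in steps 3 and 5 is mechanical enough to sketch as code (function name, vote labels, and status strings are illustrative, not part of any tool above):

```python
from datetime import date

def review_status(deciders, votes, deadline, today):
    """Decide whether an ADR can move to 'Accepted'.

    votes maps reviewer -> 'approve' | 'concern'. Acceptance requires:
    every decider voted, no unresolved concerns, and the window closed.
    """
    missing = [d for d in deciders if d not in votes]
    if missing:
        return "waiting on: " + ", ".join(missing)
    if any(v == "concern" for v in votes.values()):
        return "resolve dissent in the ADR before accepting"
    if today < deadline:
        return "review window still open"
    return "Accepted"

print(review_status(
    deciders=["alice", "bob", "carol"],
    votes={"alice": "approve", "bob": "approve", "carol": "approve"},
    deadline=date(2026, 3, 29),
    today=date(2026, 3, 30),
))  # Accepted
```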
The most common gap in ADR processes is step 4 — dissent gets handled in Slack and the ADR stays unchanged. Over time this creates a false picture where every decision looks consensus-based and easy. Write disagreements and minority positions into the document explicitly, so engineers joining the team six months later understand what tradeoffs were consciously accepted and what concerns were noted but overruled. An ADR without documented dissent is often an incomplete record of the actual decision.
## Related Reading
- ADR Tools for Remote Engineering Teams
- Async Decision Making with RFC Documents for Engineering Teams
- Async Engineering Proposal Process Using GitHub Discussions
- How to Create Remote Team Architecture Decision Record