The Goal
A lot of authentication tutorials stop too early.
They show a login form, return a token, and call the job finished.
Real systems are messier than that.
A browser client needs route protection, session recovery, logout handling, and CSRF protection. An API needs token issuance, scope checks, rate limiting, and a place to persist credentials. A production deployment needs clear service boundaries and a repeatable local environment.
This article walks through a practical pattern for building that system with:
- a React single-page application using React Router
- an Express API written in TypeScript
- Redis for short-lived token lookups
- MongoDB as the durable source of truth
- Docker Compose to run the full stack together
The design here is based on a real production implementation, but the details are intentionally generalized. The point is to explain the architecture, not to publish a live system blueprint.
The High-Level Shape
At a high level, the system has five concerns:
- the browser application
- the authentication and business API
- durable user and token persistence
- fast token lookup
- reverse proxy and environment orchestration
That often ends up looking like this:
Browser
  -> React SPA
  -> reverse proxy / ingress
  -> Express API
       -> MongoDB
       -> Redis
That is not a microservices story. It is a disciplined monolith with supporting infrastructure.
The frontend is responsible for:
- collecting credentials or handling provider callbacks
- remembering whether the user appears logged in
- bootstrapping the current user on page load
- guarding private routes
The backend is responsible for:
- verifying credentials or provider identity
- issuing local access tokens
- enforcing scopes on protected endpoints
- converting browser sessions into safe cookie-based auth
- supporting header-based auth for non-browser clients when needed
MongoDB stores the durable records:
- users
- OAuth clients
- issued tokens
- password reset data
- account state
Redis acts as the hot path for token resolution, so authenticated requests do not have to hit the database first.
Why Use an OAuth Library Internally
There is an important distinction between:
- using OAuth to integrate with third-party identity providers
- using an OAuth2 server library internally to standardize token issuance and scope validation
This architecture does both.
For local username/password login, an OAuth2 server library handles:
- password grant mechanics
- token issuance
- expiration semantics
- scope validation during authentication
The application still owns the storage model. That means the code decides how to:
- load users
- validate passwords
- load clients
- persist tokens
- resolve access tokens on later requests
This is a good fit when business-specific authorization rules are more complex than a hosted identity tutorial expects.
The Browser Login Flow
The browser flow starts in the React application.
A typical pattern looks like this:
- the user submits username and password
- the frontend asks the API which scopes the user should receive
- the frontend requests a token from the OAuth endpoint using those scopes
- the backend verifies the credentials and issues a local access token
- the backend stores that token durably and also sets an httpOnly auth cookie
- the frontend updates application state and treats the session as logged in
One subtle detail matters here: the browser does not just need a token. It needs a recoverable session model.
In the implementation this article is based on, the frontend stores login state for route decisions, while the backend writes the token into an httpOnly cookie so that browser requests can be authenticated without exposing the token to frontend JavaScript.
That lets the application support two useful behaviors at once:
- route-level login awareness in the SPA
- safer cookie-backed browser auth on API requests
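The token request in that flow can be sketched as a standard OAuth2 password-grant call from the browser. The endpoint path, client id, and scope names below are illustrative assumptions, not details from the original system:

```typescript
// Build the form body for an OAuth2 password-grant token request.
// The client id and scope names here are illustrative assumptions.
function buildTokenRequestBody(
  username: string,
  password: string,
  scopes: string[],
  clientId: string
): URLSearchParams {
  const body = new URLSearchParams();
  body.set("grant_type", "password");
  body.set("username", username);
  body.set("password", password);
  body.set("scope", scopes.join(" ")); // space-delimited, per RFC 6749
  body.set("client_id", clientId);
  return body;
}

// Example browser-side call; credentials: "include" lets the backend
// set the httpOnly auth cookie on the response.
async function login(username: string, password: string, scopes: string[]) {
  const res = await fetch("/oauth/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: buildTokenRequestBody(username, password, scopes, "web-app"),
    credentials: "include",
  });
  if (!res.ok) throw new Error("login failed");
  return res.json();
}
```

The important detail is `credentials: "include"`: without it, the browser would discard the Set-Cookie header that carries the httpOnly auth cookie.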
React Router and Session Recovery
Single-page apps need a startup decision:
"Do I already have a valid session, or should I show the public entry path?"
That decision usually happens when the app loads.
In practice, the React app can:
- read any local session marker it maintains
- call a current-user endpoint
- hydrate the auth store with user data if the session is still valid
- clear local state if the API rejects the session
Private routes then become simple.
If the auth state says the user is logged in, render the protected route. Otherwise, redirect to the public login flow.
This is not just a UX concern. It keeps authorization decisions centralized:
- the router controls navigation
- the API still controls actual access
That is the right separation. Frontend route guards improve flow, but the backend remains authoritative.
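The startup decision behind a route guard can be reduced to a small, testable function. This is a sketch of the decision logic only; the function name and result shape are assumptions, and in a real app the API call would be asynchronous:

```typescript
type SessionState = "authenticated" | "anonymous";

interface CurrentUserResult {
  ok: boolean; // whether the current-user endpoint accepted the session
  user?: { id: string; scopes: string[] };
}

// Decide the startup auth state from the local marker and the API's answer.
// The API stays authoritative: a local marker alone never authenticates.
function resolveStartupSession(
  hasLocalMarker: boolean,
  apiResult: CurrentUserResult | null
): SessionState {
  if (!hasLocalMarker) return "anonymous"; // no marker: show the public entry path
  if (apiResult && apiResult.ok && apiResult.user) return "authenticated";
  return "anonymous"; // API rejected the session: clear local state
}
```

A React Router guard then only has to read the resulting state: render the protected route when it is "authenticated", otherwise redirect to login.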
Social Login Without Surrendering Local Control
Many applications need more than email and password.
A practical pattern for social sign-in is:
- the browser completes the provider-side step and obtains either an authorization code or provider access token
- the browser sends that artifact to the backend
- the backend exchanges or validates it directly with the provider
- the backend reads the provider identity payload
- the backend finds or creates the local user record
- the backend issues the same kind of local access token it would issue for a password login
That design is valuable because the application does not become dependent on provider tokens for its own authorization model.
The external provider proves identity. The local system still decides:
- which local account that maps to
- what scopes the user gets
- what internal token should be trusted by the API
This is one of the cleanest ways to combine social sign-in convenience with consistent internal authorization.
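The find-or-create step can be sketched as follows. The field names and default scope are illustrative assumptions, and a `Map` stands in for MongoDB:

```typescript
interface ProviderProfile {
  provider: string;       // e.g. "google" — illustrative
  providerUserId: string; // the provider's stable user id
  email: string;
}

interface LocalUser {
  id: string;
  email: string;
  scopes: string[];
}

// Translate a verified provider identity into a local user record.
// The Map stands in for the durable user store.
function findOrCreateLocalUser(
  profile: ProviderProfile,
  store: Map<string, LocalUser>
): LocalUser {
  const key = `${profile.provider}:${profile.providerUserId}`;
  const existing = store.get(key);
  if (existing) return existing;
  const user: LocalUser = {
    id: key, // a real system would mint its own internal id
    email: profile.email,
    scopes: ["profile:read"], // the local system decides scopes, not the provider
  };
  store.set(key, user);
  return user;
}
```

Note that the provider token never appears here: by the time this runs, the backend has already validated it, and everything downstream deals only in local identity.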
MongoDB as the Source of Truth
Redis is fast, but it should not be the system of record.
For authentication, the durable store needs to answer questions like:
- who is this user
- is this account active
- which scopes are allowed
- has this email been verified
- when does this token expire
MongoDB works well for this style of application because the records are document-shaped and evolve over time. User accounts, account states, reset workflows, and token records often fit naturally in document collections.
The token persistence layer usually benefits from storing:
- access token
- refresh token if used
- expiration timestamps
- client metadata
- user metadata needed for scope checks
Then, when a request arrives with a bearer token, the system can reconstruct the authorization context even if the cache entry has been evicted or has expired.
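A token record along those lines might be shaped like this. The field names are illustrative assumptions, not a prescribed schema:

```typescript
// Illustrative shape of a persisted token document.
interface TokenRecord {
  accessToken: string;
  refreshToken?: string;      // only present if refresh tokens are used
  accessTokenExpiresAt: Date;
  client: { id: string };
  user: { id: string; scopes: string[] }; // enough to rebuild the auth context
}

// Rebuild an authorization context from a stored record, e.g. on a cache miss.
function toAuthContext(record: TokenRecord, now: Date) {
  if (record.accessTokenExpiresAt.getTime() <= now.getTime()) return null; // expired
  return { userId: record.user.id, scopes: record.user.scopes };
}
```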
Redis as a Token Cache, Not a Security Fantasy
It is tempting to think Redis makes authentication "stateless."
It does not.
What Redis does well here is reduce latency and database pressure.
A practical pattern is:
- save the token durably in MongoDB
- write the same token into Redis with a short TTL
- on authenticated requests, check Redis first
- if Redis misses, load from MongoDB and repopulate Redis
This gives you:
- fast token lookup on hot sessions
- durable recovery if Redis restarts or evicts entries
- fewer database reads on protected requests
That design is much more resilient than pretending the cache is the only token store.
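The cache-aside lookup above can be sketched with plain maps. Here the maps stand in for Redis and MongoDB; a real implementation would use async clients and set a TTL on the cache write:

```typescript
// Cache-aside token lookup. The first Map stands in for Redis,
// the second for MongoDB; values are whatever auth context you store.
function lookupToken(
  token: string,
  cache: Map<string, string>,
  db: Map<string, string>
): string | null {
  const hit = cache.get(token);
  if (hit !== undefined) return hit;     // hot path: served from cache
  const record = db.get(token);
  if (record === undefined) return null; // unknown token: reject
  cache.set(token, record);              // repopulate the cache for the next request
  return record;
}
```

The ordering matters: the durable write happens at issuance time, so the cache can be lost at any moment without invalidating live sessions.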
Scope-Based Authorization
Roles are often too blunt.
Scopes are usually better.
Instead of saying only "admin" or "user," an authenticated principal can carry a list of allowed capabilities such as:
- read profile
- edit profile
- access paid features
- access staff tools
- perform a specific write action
The API middleware can then authenticate the token and verify whether the requested scope is present before letting the route handler run.
This pattern scales better than hard-coded route conditionals because authorization stays declarative.
The endpoint says what it requires. The token says what it grants.
That is much easier to reason about during maintenance.
Cookie-Based Browser Auth Requires CSRF Protection
Once a browser starts authenticating with cookies, CSRF protection is no longer optional.
One pragmatic approach is the double-submit pattern:
- issue an httpOnly auth cookie for the session
- issue a separate readable CSRF cookie
- require the browser to copy the CSRF value into a request header on state-changing requests
- compare cookie and header values server-side using timing-safe comparison
Why this works:
- an attacker can cause the browser to send cookies
- an attacker on another origin cannot normally read the CSRF cookie value, so they cannot supply the matching header
- without the matching header, the state-changing request is rejected
This is especially useful in systems that support both browser and non-browser clients.
Browser sessions can use cookies plus CSRF.
Non-browser clients can use explicit Authorization headers and skip CSRF checks because the threat model is different.
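The server-side comparison for the double-submit check can be done in constant time with Node's crypto module. Hashing both sides first sidesteps `timingSafeEqual`'s requirement that the buffers have equal length:

```typescript
import { createHash, timingSafeEqual } from "node:crypto";

// Compare the CSRF cookie value with the header value in constant time.
function csrfTokensMatch(cookieValue: string, headerValue: string): boolean {
  const a = createHash("sha256").update(cookieValue).digest();
  const b = createHash("sha256").update(headerValue).digest();
  return timingSafeEqual(a, b);
}
```

A naive `cookieValue === headerValue` comparison would work functionally, but a timing-safe comparison avoids leaking information about how many leading characters matched.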
Rate Limiting and Sensitive Endpoints
Authentication endpoints are not like normal read endpoints.
They need tighter controls because they attract:
- brute force attempts
- credential stuffing
- account enumeration
- bot-driven account creation
- password reset abuse
The safest pattern is layered rate limiting:
- reverse proxy limits for obvious path-level abuse
- application-level limits for intent-aware controls
- optional challenge checks on high-risk flows like registration and password reset
This architecture uses that layered model rather than relying on a single control.
That matters because account systems fail at the edges first.
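As a sketch of the application-level layer, here is a minimal fixed-window limiter. In-memory state like this only covers one process; across multiple API instances the counters would need to live in a shared store such as Redis:

```typescript
// Minimal fixed-window rate limiter for a sensitive endpoint such as login.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request identified by `key` (e.g. an IP) is allowed.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.counts.set(key, { windowStart: now, count: 1 }); // new window
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```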
Docker Compose as an Integration Tool
Authentication systems are hard to reason about when every dependency is mocked.
That is where Docker Compose helps.
Even if you do not run your production environment with Compose, it is still useful locally because it lets you start a realistic stack:
- MongoDB
- Redis
- API
- frontend
- optional reverse proxy
That gives you a better place to test:
- session bootstrapping
- cookie behavior
- cross-service environment configuration
- provider callback handling
- token cache behavior
- route protection and logout
Authentication bugs are often integration bugs. Compose is good at surfacing those early.
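A minimal Compose file for that stack might look like this. The service names, images, ports, and environment variables are illustrative assumptions, not the article's actual configuration:

```yaml
# Illustrative local stack; names, images, and ports are assumptions.
services:
  mongo:
    image: mongo:7
    volumes:
      - mongo-data:/data/db
  redis:
    image: redis:7
  api:
    build: ./api
    environment:
      MONGO_URL: mongodb://mongo:27017/app
      REDIS_URL: redis://redis:6379
    depends_on:
      - mongo
      - redis
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - api

volumes:
  mongo-data:
```

With this in place, `docker compose up` gives you a stack where cookie behavior, token caching, and service-to-service configuration can all be exercised together.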
A Generic Request Lifecycle
Once the system is in place, an authenticated browser request often looks like this:
- the user signs in through local credentials or a social provider
- the backend issues a local access token
- the backend persists the token in MongoDB and caches it in Redis
- the backend sets an auth cookie and a CSRF cookie
- the React app updates auth state and unlocks private routes
- subsequent browser requests automatically include the auth cookie
- state-changing requests also include the CSRF header
- the API resolves the token, verifies required scopes, and serves the request
That is the full system, not just the login screen.
Design Lessons
- OAuth mechanics and application authorization rules should be separated cleanly.
- Browser and non-browser clients should not be forced into the same auth transport.
- Redis is excellent for token lookup acceleration, but MongoDB should remain the durable source of truth.
- Route guards in React Router improve UX, but the API must remain authoritative.
- Cookie-based browser auth must be paired with CSRF protection.
- Social login is simplest when provider identity is translated into local tokens and local scopes.
- Docker Compose is valuable because auth bugs tend to appear between services, not inside isolated functions.
Final Thought
The most useful authentication architecture is rarely the most fashionable one.
It is the one that makes trust boundaries easy to explain.
If you can describe:
- where identity is proven
- where tokens are issued
- where authorization is enforced
- where state is persisted
- and how browser-specific risks are handled
then the system is probably on the right track.
That is a much better goal than simply saying, "we use OAuth."
What This Architecture Does Not Claim To Be
It is important to be precise about what this kind of implementation represents.
The architecture in this article demonstrates a real application-level authentication and authorization system:
- local token issuance through an OAuth2 server library
- scope-based authorization on API routes
- account lifecycle handling for signup, verification, password reset, logout, and deletion
- social sign-in mapped into local identities
- browser-safe session handling with cookies and CSRF protection
- operational hardening through layered rate limiting and proxy controls
It also has a reasonable basis for horizontal scaling because:
- token lookups can be served from Redis across multiple API instances
- durable user and token state lives outside the application process
- the frontend, API, cache, and database are cleanly separated into independently deployable services
That said, this is still not the same thing as IAM at scale in the enterprise sense.
It does not attempt to solve problems such as:
- enterprise identity directory integration with systems like Okta or Azure AD
- SAML or OIDC federation for workforce access
- centralized entitlement review workflows
- just-in-time access grants or approval-based internal access flows
- large-scale identity lifecycle automation for employees and internal systems
- audit and compliance-oriented access governance across many services and teams
So the honest framing is this:
This architecture is a strong example of production-minded OAuth2 and application authorization design for a web platform. It shows meaningful understanding of scopes, trust boundaries, session safety, and identity translation. But it does not, by itself, constitute a full IAM-at-scale platform.
That distinction matters.
There is real overlap between the two domains:
- defining scopes and permissions carefully
- enforcing trust boundaries consistently
- translating external identity into local authorization
- handling account state changes safely
- making the safe path the default path
But enterprise IAM adds another layer of complexity around organizational identity, internal access governance, compliance, and access operations at much larger scale.
If you are building a web application, the architecture in this article is a practical and credible foundation.
If you are building internal identity infrastructure for a large organization, this is only one part of the problem.