Monorepo with frontend (React + Vite + Tailwind) and backend (Express + TypeScript).
Demo: https://show-to-brand.onrender.com/
Prerequisites:
- Node.js 20+
- pnpm 9+ (`npm i -g pnpm`)
Environment:
- Copy `.env.example` to `.env` and adjust as needed. `OPENAI_API_KEY` is optional: users can supply their own key in the UI; the frontend sends it as `X-OpenAI-Key` and the backend uses it per request (see the sketch below).
- Set `ALLOWED_ORIGINS` to include your frontend origin in dev (e.g., `http://localhost:5173`).
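For reference, a minimal sketch of the per-request key resolution described above (the `resolveOpenAIKey` name and its shape are illustrative, not the actual backend code):

```ts
import type { Request } from "express";

// Prefer the key the browser sends in the X-OpenAI-Key header;
// otherwise fall back to the optional server-side OPENAI_API_KEY.
export function resolveOpenAIKey(req: Request): string | undefined {
  const headerKey = req.header("x-openai-key")?.trim();
  return headerKey || process.env.OPENAI_API_KEY;
}
```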
Install:
pnpm -w install
Run:
pnpm dev
URLs:
- Frontend (Vite): http://localhost:5173 (port may vary)
- Backend health: http://localhost:8080/api/health
In production, the backend serves the built SPA from app/frontend/dist and exposes the API under /api/*.
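A trimmed sketch of what that wiring can look like (the static-path resolution and catch-all route are assumptions; see the real server in `app/backend/src`):

```ts
import path from "node:path";
import express from "express";

const app = express();

// API routes are exposed under /api/*
app.get("/api/health", (_req, res) => res.json({ status: "ok" }));

// In production, serve the built SPA and fall back to index.html for client-side routing
if (process.env.NODE_ENV === "production") {
  const dist = path.resolve(process.cwd(), "app/frontend/dist"); // assumes the process starts from the repo root
  app.use(express.static(dist));
  app.get("*", (_req, res) => res.sendFile(path.join(dist, "index.html")));
}

app.listen(Number(process.env.PORT) || 8080);
```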
Production environment variables:
- `NODE_ENV=production`
- `OPENAI_API_KEY=sk-...` (optional; only needed if you want a server-side default when the client doesn't send `X-OpenAI-Key`)
- `PORT=8080` (or the platform-provided port)
- `ALLOWED_ORIGINS=https://your-domain` (use your public URL; for local prod use `http://localhost:8080`)
Build and run locally:
pnpm -w install
pnpm --filter ./app/frontend build
pnpm --filter ./app/backend build
export NODE_ENV=production
# Optional: server-side default key if the client does not pass X-OpenAI-Key
# export OPENAI_API_KEY=sk-...
export ALLOWED_ORIGINS=http://localhost:8080
export PORT=8080
node app/backend/dist/server.js
Open: http://localhost:8080
Health: http://localhost:8080/api/health
OpenAI check: http://localhost:8080/api/openai/check
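If you prefer a scripted smoke test over opening the browser, something along these lines works with Node 20's built-in `fetch` (the file name is just an example, and the check endpoint is assumed to accept GET):

```ts
// smoke-check.ts – run with e.g. `npx tsx smoke-check.ts`
const base = process.env.BASE_URL ?? "http://localhost:8080";

const health = await fetch(`${base}/api/health`);
console.log("health:", health.status, await health.text());

// Send a key in the X-OpenAI-Key header to verify OpenAI connectivity
const check = await fetch(`${base}/api/openai/check`, {
  headers: { "X-OpenAI-Key": process.env.OPENAI_API_KEY ?? "" },
});
console.log("openai check:", check.status, await check.text());
```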
Deploy on Render:
- Clone the repository.
- Create a “Web Service” on Render and connect the repo.
- Build command: `pnpm -w install && pnpm --filter ./app/frontend build && pnpm --filter ./app/backend build`
- Start command: `node app/backend/dist/server.js`
- Environment in Render: `NODE_ENV=production`, `ALLOWED_ORIGINS=https://<your-render-url>`, and (optional) `OPENAI_API_KEY=sk-...` if you want a server default.
- Leave `PORT` unset (Render injects it; the server reads `process.env.PORT`).
- (Optional) Health check path: `/api/health`.
Create a Dockerfile at the repo root (example):
# syntax=docker/dockerfile:1
FROM node:20-slim AS base
WORKDIR /app
# Install pnpm (pin the pnpm 9.x release you use locally)
RUN corepack enable && corepack prepare pnpm@9.0.0 --activate
# Copy workspace files
COPY pnpm-workspace.yaml pnpm-lock.yaml package.json .npmrc* .env.example* ./
COPY app ./app
# Install and build
RUN pnpm -w install --frozen-lockfile
RUN pnpm --filter ./app/frontend build && pnpm --filter ./app/backend build
# --- Runtime image ---
FROM node:20-slim
ENV NODE_ENV=production
WORKDIR /app
# Built output plus production node_modules (assumes the backend build does not bundle its
# dependencies; pnpm keeps real packages in the root node_modules/.pnpm store and symlinks them per package)
COPY --from=base /app/node_modules ./node_modules
COPY --from=base /app/app/backend/node_modules ./app/backend/node_modules
COPY --from=base /app/app/backend/package.json ./app/backend/package.json
COPY --from=base /app/app/backend/dist ./app/backend/dist
COPY --from=base /app/app/frontend/dist ./app/frontend/dist
COPY package.json pnpm-workspace.yaml ./
# Expose port
ENV PORT=8080
EXPOSE 8080
# Start server
CMD ["node", "app/backend/dist/server.js"]
Build and run:
docker build -t brand-analyzer:prod .
docker run --rm -p 8080:8080 \
-e NODE_ENV=production \
-e ALLOWED_ORIGINS=http://localhost:8080 \
brand-analyzer:prod
Open: http://localhost:8080
Usage:
- Paste your OpenAI API key in the header field (or skip it if the server has a default `OPENAI_API_KEY`).
- Click "Check key" to verify connectivity.
- Provide input via Upload (.txt/.srt), URL, or Text.
- Toggle “Use LLM” for OpenAI-based extraction (requires a valid key via header or env).
- Click Analyze and view results.
Scripts:
- `pnpm dev` – run frontend + backend in watch mode
- `pnpm build` – build all workspaces
- `pnpm lint` – lint all workspaces
- `pnpm test` – run tests (placeholder)
LLM extractor:
- Backend file: `app/backend/src/services/extractor/llm.ts`
- Model: `gpt-4o-mini` (changeable in code)
- Toggle via the UI (“Use LLM”) or send `useLLM: true` in the request (example below)
- Uses `X-OpenAI-Key` per request; falls back to the `OPENAI_API_KEY` env var if the header is missing
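A request sketch for the points above (the `/api/analyze` path and the `text` field are placeholders; check the backend routes for the actual endpoint and payload shape):

```ts
const res = await fetch("http://localhost:8080/api/analyze", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Per-request key; omit the header to fall back to the server's OPENAI_API_KEY
    "X-OpenAI-Key": process.env.OPENAI_API_KEY ?? "",
  },
  body: JSON.stringify({ text: "…transcript or article text…", useLLM: true }),
});
console.log(await res.json());
```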
Troubleshooting:
- Workspaces warning: ensure `pnpm-workspace.yaml` exists with `packages:` listing `app/*`.
- CORS issues: include the frontend origin in `ALLOWED_ORIGINS`; the backend allows the `X-OpenAI-Key` header (see the sketch below).
- LLM errors: ensure the browser sends `X-OpenAI-Key` or set `OPENAI_API_KEY` server-side; verify network access to OpenAI.
- Port conflicts: adjust `PORT` (backend) or the Vite port (`app/frontend/vite.config.ts`).
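For the CORS item above, a sketch of the kind of configuration the backend needs (uses the `cors` package; the comma-separated parsing of `ALLOWED_ORIGINS` is an assumption, and the repo's actual options may differ):

```ts
import cors from "cors";
import express from "express";

const app = express();

// Origins come from the ALLOWED_ORIGINS env var (assumed comma-separated)
const allowedOrigins = (process.env.ALLOWED_ORIGINS ?? "")
  .split(",")
  .map((origin) => origin.trim())
  .filter(Boolean);

app.use(
  cors({
    origin: allowedOrigins,
    // The custom header carrying the user's key must be allowed explicitly
    allowedHeaders: ["Content-Type", "X-OpenAI-Key"],
  })
);
```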
Roadmap:
- Rule-based extractor refinements and tests
- Full UI polish (shadcn/ui) and accessibility
- Unit/integration tests (Vitest + Supertest) – see the sketch below
- CI with lint/test/build
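A sketch of the kind of test the Vitest + Supertest item could start from (assumes the Express app is exported from a module such as `app/backend/src/app.ts`; adjust the import to the real one):

```ts
import { describe, expect, it } from "vitest";
import request from "supertest";
import { app } from "../src/app"; // hypothetical export of the Express app

describe("GET /api/health", () => {
  it("responds with 200", async () => {
    const res = await request(app).get("/api/health");
    expect(res.status).toBe(200);
  });
});
```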