diff --git a/.dev/README.md b/.dev/README.md
new file mode 100644
index 00000000..81eb18e0
--- /dev/null
+++ b/.dev/README.md
@@ -0,0 +1,237 @@
+# Development Patches (.dev)
+
+This directory contains development-only patches that enhance the local development experience but should **never be committed** to the main codebase.
+
+## Setup (run once after cloning)
+
+```bash
+bash .dev/setup.sh
+```
+
+This adds git aliases for convenient patch management.
+
+## Quick Start
+
+```bash
+# Check patch status
+bash .dev/dev-patch.sh status
+
+# Apply/remove ALL dev patches at once
+bash .dev/dev-patch.sh all-on
+bash .dev/dev-patch.sh all-off
+
+# Apply/remove individual patches
+bash .dev/dev-patch.sh on dev-login-bypass
+bash .dev/dev-patch.sh off dev-login-bypass
+bash .dev/dev-patch.sh toggle mock-storage
+```
+
+## Directory Structure
+
+```
+.dev/
+├── README.md       # This file
+├── setup.sh        # Run once to add git aliases
+├── dev-patch.sh    # Patch management script
+├── patches.yaml    # Patch state tracking (on/off)
+├── patches/        # Patch files
+│   ├── dev-login-bypass.patch
+│   └── mock-storage.patch
+└── hooks/          # Git hooks for safety
+    └── pre-commit-patch-guard.sh
+```
+
+## Available Patches
+
+### dev-login-bypass
+
+Adds a floating dev menu for quick user switching and creation without Google OAuth.
+
+**Features:**
+- Floating bug icon (draggable to any corner)
+- Create test users with random emails/names
+- Quick switch between dev users
+- Toggle admin/premium status per user
+
+**Files affected:**
+- `src/components/dev-menu/dev-floating-menu.tsx` (new)
+- `src/components/dev-menu/dev-user-card.tsx` (new)
+- `src/components/dev-menu/random-email-generator.ts` (new)
+- `src/fn/dev-auth.ts` (new)
+- `src/routes/dev-login.tsx` (new)
+- `src/routes/__root.tsx` (modified)
+- `src/routes/api/login/google/index.ts` (modified)
+
+**Requires:** `DEV_BYPASS_AUTH=true` in `.env`
+
+### mock-storage
+
+Bypasses R2/S3 storage when credentials aren't configured. Uses sample videos and images for development.
+
+**Features:**
+- No R2/S3 connection required
+- Sample videos from Google Storage (~2MB each, fast loading)
+- Thumbnails from Unsplash (tech/coding themed)
+- Hash-based selection ensures consistent content per segment
+- Console logging for storage operations
+
+**Files affected:**
+- `src/utils/storage/mock-storage.ts` (new)
+- `src/utils/storage/index.ts` (modified)
+- `src/fn/video-transcoding.ts` (modified)
+- `src/routes/-components/hero.tsx` (modified)
+- `src/utils/video-transcoding.ts` (modified)
+
+**Requires:** `DEV_MOCK_STORAGE=true` in `.env`
+
+## Patch Management Script
+
+The `dev-patch.sh` script provides a unified interface for managing patches:
+
+```bash
+# Commands
+bash .dev/dev-patch.sh status               # Show the status of all patches
+bash .dev/dev-patch.sh on <patch-name>      # Apply a patch
+bash .dev/dev-patch.sh off <patch-name>     # Remove a patch
+bash .dev/dev-patch.sh toggle <patch-name>  # Toggle a patch on/off
+bash .dev/dev-patch.sh all-on               # Apply all patches
+bash .dev/dev-patch.sh all-off              # Remove all patches
+bash .dev/dev-patch.sh rebuild              # Rebuild patches from yaml state
+bash .dev/dev-patch.sh check                # Check if any patches are on (for pre-commit)
+```
+
+### How It Works
+
+1. **State tracking**: `patches.yaml` tracks which patches are on/off
+2. **Rebuild approach**: When changing patches, all files are reset to a clean state, then all "on" patches are applied in order
+3. **Patch ordering**: Patches are applied in the order listed in `patches.yaml` (dev-login-bypass first, then mock-storage)
+
+## Git Aliases
+
+After running `bash .dev/setup.sh`, these commands are available:
+
+```bash
+git dev-status        # Show patch status
+git dev-on            # Apply all patches
+git dev-off           # Remove all patches
+git dev-patch         # Full patch management
+
+# Individual patches
+git login-bypass-on   # Apply dev-login-bypass
+git login-bypass-off  # Remove dev-login-bypass
+git mock-storage-on   # Apply mock-storage
+git mock-storage-off  # Remove mock-storage
+```
+
+## Pre-commit Hook Setup
+
+The hook prevents accidentally committing while dev patches are active.
+
+### Manual Setup
+
+```bash
+cp .dev/hooks/pre-commit-patch-guard.sh .git/hooks/pre-commit
+chmod +x .git/hooks/pre-commit
+```
+
+### With Husky
+
+Add to `.husky/pre-commit`:
+
+```bash
+.dev/hooks/pre-commit-patch-guard.sh
+```
+
+### With Lefthook
+
+Add to `lefthook.yml`:
+
+```yaml
+pre-commit:
+  commands:
+    patch-guard:
+      run: .dev/hooks/pre-commit-patch-guard.sh
+```
+
+## Creating New Patches
+
+### 1. Start from a clean state
+
+```bash
+bash .dev/dev-patch.sh all-off
+```
+
+### 2. Make your dev-only changes
+
+Edit files as needed for your new dev feature.
+
+### 3. Generate the patch
+
+```bash
+# For tracked file changes
+git diff > .dev/patches/<patch-name>.patch
+
+# For new files, add them first
+git add -N <new-files>
+git diff > .dev/patches/<patch-name>.patch
+```
+
+### 4. Add to patches.yaml
+
+```yaml
+patches:
+  dev-login-bypass: off
+  mock-storage: off
+  your-new-patch: off  # Add this line
+```
+
+### 5. Reset and test
+
+```bash
+git checkout -- .
+bash .dev/dev-patch.sh on your-new-patch
+```
+
+## Troubleshooting
+
+### Patch won't apply
+
+```bash
+# Check what's blocking
+git apply --check .dev/patches/<patch-name>.patch
+
+# Try with whitespace ignore
+git apply --ignore-whitespace .dev/patches/<patch-name>.patch
+
+# If conflicts, rebuild from clean state
+bash .dev/dev-patch.sh rebuild
+```
+
+### CRLF/LF issues (Windows)
+
+The script uses `--ignore-whitespace` to handle line-ending differences. If you still have issues:
+
+```bash
+# Convert the patch file to LF
+dos2unix .dev/patches/<patch-name>.patch
+```
+
+### Patches conflict with each other
+
+Patches are applied in order. If you add a new patch that modifies files also modified by earlier patches, make sure your patch assumes the earlier patches are already applied.
+
+### State out of sync
+
+If `patches.yaml` says patches are on but the files don't reflect it:
+
+```bash
+bash .dev/dev-patch.sh rebuild
+```
+
+### Regenerating a patch after codebase changes
+
+If the base code changes and a patch no longer applies:
+
+1. Start from a clean state: `bash .dev/dev-patch.sh all-off`
+2. Apply any patches that should come before it: `bash .dev/dev-patch.sh on <patch-name>`
+3. Manually make the changes for the broken patch
+4. Generate the new patch: `git diff > .dev/patches/<patch-name>.patch`
+5. For new files, include them with `git diff --no-index /dev/null <new-file> >> .dev/patches/<patch-name>.patch`
diff --git a/.dev/dev-patch.sh b/.dev/dev-patch.sh
new file mode 100644
index 00000000..2aa828e6
--- /dev/null
+++ b/.dev/dev-patch.sh
@@ -0,0 +1,137 @@
+#!/bin/bash
+# Dev Patch Manager
+# Usage: .dev/dev-patch.sh <command> [patch-name]
+
+set -e
+
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+PATCHES_DIR="$SCRIPT_DIR/patches"
+STATE_FILE="$SCRIPT_DIR/patches.yaml"
+
+RED='\033[0;31m'
+GREEN='\033[0;32m'
+YELLOW='\033[1;33m'
+NC='\033[0m'
+
+get_all_patches() {
+  grep -E "^  [a-z-]+:" "$STATE_FILE" | sed 's/:.*//' | tr -d ' '
+}
+
+get_state() {
+  local patch="$1"
+  grep -E "^  $patch:" "$STATE_FILE" | sed 's/.*: //' | tr -d ' '
+}
+
+set_state() {
+  local patch="$1"
+  local state="$2"
+  sed -i "s/^  $patch: .*/  $patch: $state/" "$STATE_FILE"
+}
+
+get_patch_files() {
+  local patch_file="$1"
+  grep -E "^diff --git" "$patch_file" | sed 's|diff --git a/\([^ ]*\) b/.*|\1|'
+}
+
+reset_patch_files() {
+  for patch in $(get_all_patches); do
+    local patch_file="$PATCHES_DIR/$patch.patch"
+    if [ -f "$patch_file" ]; then
+      # Use git apply --reverse to cleanly remove the patch (handles both modified and new files)
+      git apply --reverse --ignore-whitespace "$patch_file" 2>/dev/null || true
+    fi
+  done
+}
+
+apply_patch_raw() {
+  local patch="$1"
+  local patch_file="$PATCHES_DIR/$patch.patch"
+
+  [ ! -f "$patch_file" ] && echo -e "${RED}Patch not found: $patch_file${NC}" && return 1
+
+  git apply --ignore-whitespace "$patch_file" 2>/dev/null || \
+    git apply --ignore-whitespace --3way "$patch_file" 2>/dev/null || {
+      echo -e "${RED}Failed to apply patch: $patch${NC}"
+      return 1
+    }
+}
+
+rebuild_patches() {
+  reset_patch_files
+
+  for patch in $(get_all_patches); do
+    if [ "$(get_state "$patch")" = "on" ]; then
+      echo -e "${GREEN}Applying patch: $patch${NC}"
+      apply_patch_raw "$patch" || return 1
+    fi
+  done
+}
+
+apply_patch() {
+  local patch="$1"
+  [ "$(get_state "$patch")" = "on" ] && echo -e "${YELLOW}Patch '$patch' is already on${NC}" && return 0
+  set_state "$patch" "on"
+  rebuild_patches
+}
+
+remove_patch() {
+  local patch="$1"
+  [ "$(get_state "$patch")" = "off" ] && echo -e "${YELLOW}Patch '$patch' is already off${NC}" && return 0
+  set_state "$patch" "off"
+  rebuild_patches
+}
+
+status() {
+  echo "Dev Patches Status:"
+  echo "-------------------"
+  for patch in $(get_all_patches); do
+    local state=$(get_state "$patch")
+    [ "$state" = "on" ] && echo -e "  ${GREEN}$patch: $state${NC}" || echo -e "  $patch: $state"
+  done
+}
+
+check_any_on() {
+  grep -q ": on" "$STATE_FILE"
+}
+
+case "${1:-}" in
+  on) apply_patch "$2" ;;
+  off) remove_patch "$2" ;;
+  toggle) [ "$(get_state "$2")" = "on" ] && remove_patch "$2" || apply_patch "$2" ;;
+  status) status ;;
+  check)
+    if check_any_on; then
+      echo -e "${RED}ERROR: Dev patches are active. Run 'git dev-off' before committing.${NC}"
+      status
+      exit 1
+    fi
+    ;;
+  all-on)
+    for patch in $(get_all_patches); do set_state "$patch" "on"; done
+    rebuild_patches
+    echo ""
+    echo -e "${GREEN}All dev patches applied successfully!${NC}"
+    status
+    ;;
+  all-off)
+    for patch in $(get_all_patches); do set_state "$patch" "off"; done
+    rebuild_patches
+    echo ""
+    echo -e "${GREEN}All dev patches removed successfully!${NC}"
+    status
+    ;;
+  rebuild) rebuild_patches ;;
+  *)
+    echo "Usage: dev-patch.sh <command> [patch-name]"
+    echo ""
+    echo "Commands:"
+    echo "  on <patch-name>      Apply a patch"
+    echo "  off <patch-name>     Remove a patch"
+    echo "  toggle <patch-name>  Toggle a patch on/off"
+    echo "  status               Show all patches status"
+    echo "  check                Check if any patches are on (for pre-commit)"
+    echo "  all-on               Apply all patches"
+    echo "  all-off              Remove all patches"
+    echo "  rebuild              Rebuild patches from current state"
+    ;;
+esac
diff --git a/.dev/hooks/pre-commit-patch-guard.sh b/.dev/hooks/pre-commit-patch-guard.sh
new file mode 100644
index 00000000..1611d199
--- /dev/null
+++ b/.dev/hooks/pre-commit-patch-guard.sh
@@ -0,0 +1,13 @@
+#!/bin/bash
+# Pre-commit hook to prevent committing with dev patches active
+# Add to .git/hooks/pre-commit or use with husky/lefthook
+
+set -e
+
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+DEV_PATCH="$SCRIPT_DIR/../dev-patch.sh"
+
+# Use the dev-patch check command
+if [ -f "$DEV_PATCH" ]; then
+  bash "$DEV_PATCH" check
+fi
diff --git a/.dev/patches.yaml b/.dev/patches.yaml
new file mode 100644
index 00000000..81c1e01f
--- /dev/null
+++ b/.dev/patches.yaml
@@ -0,0 +1,6 @@
+# Dev Patches State
+# Toggle patches on/off with: git dev-patch toggle <patch-name>
+
+patches:
+  dev-login-bypass: off
+  mock-storage: off
diff --git a/.dev/patches/dev-login-bypass.patch b/.dev/patches/dev-login-bypass.patch
new file mode 100644
index 00000000..d60e9cb6
--- /dev/null
+++ b/.dev/patches/dev-login-bypass.patch
@@ -0,0 +1,871 @@
+diff --git a/src/routes/__root.tsx b/src/routes/__root.tsx
+index 
ca49a35..9d1b7af 100644 +--- a/src/routes/__root.tsx ++++ b/src/routes/__root.tsx +@@ -22,6 +22,8 @@ import "nprogress/nprogress.css"; + import { shouldShowEarlyAccessFn } from "~/fn/early-access"; + import { useAnalytics } from "~/hooks/use-analytics"; + import { publicEnv } from "~/utils/env-public"; ++import { getDevMenuConfigFn } from "~/fn/dev-auth"; ++import { DevFloatingMenu } from "~/components/dev-menu/dev-floating-menu"; + + // OpenGraph image configuration + const OG_IMAGE_PATH = "/marketing.png"; +@@ -39,8 +41,11 @@ export const Route = createRootRouteWithContext<{ queryClient: QueryClient }>()( + } + }, + loader: async () => { +- const shouldShowEarlyAccess = await shouldShowEarlyAccessFn(); +- return { shouldShowEarlyAccess }; ++ const [shouldShowEarlyAccess, devMenuConfig] = await Promise.all([ ++ shouldShowEarlyAccessFn(), ++ getDevMenuConfigFn(), ++ ]); ++ return { shouldShowEarlyAccess, devMenuConfig }; + }, + head: () => ({ + meta: [ +@@ -119,6 +124,10 @@ function RootDocument({ children }: { children: React.ReactNode }) { + const routerState = useRouterState(); + const loaderData = Route.useLoaderData(); + const shouldShowEarlyAccess = loaderData?.shouldShowEarlyAccess ?? false; ++ const devMenuConfig = loaderData?.devMenuConfig ?? 
{ ++ isEnabled: false, ++ currentUserId: null, ++ }; + const showFooter = + !routerState.location.pathname.startsWith("/learn") && + !routerState.location.pathname.startsWith("/admin") && +@@ -222,6 +231,9 @@ function RootDocument({ children }: { children: React.ReactNode }) { + + )} + ++ {devMenuConfig.isEnabled && ( ++ ++ )} + {/* + */} + +diff --git a/src/routes/api/login/google/index.ts b/src/routes/api/login/google/index.ts +index 7da2c25..3b53418 100644 +--- a/src/routes/api/login/google/index.ts ++++ b/src/routes/api/login/google/index.ts +@@ -12,6 +12,13 @@ export const Route = createFileRoute("/api/login/google/")({ + const url = new URL(request.url); + const redirectUri = url.searchParams.get("redirect_uri") || "/"; + ++ // Dev bypass - show dev login form instead of Google OAuth ++ if (process.env.DEV_BYPASS_AUTH === "true") { ++ const devLoginUrl = new URL("/dev-login", url.origin); ++ devLoginUrl.searchParams.set("redirect_uri", redirectUri); ++ return Response.redirect(devLoginUrl.href); ++ } ++ + const state = generateState(); + const codeVerifier = generateCodeVerifier(); + const authorizationInfo = googleAuth.createAuthorizationURL( +diff --git a/src/components/dev-menu/dev-floating-menu.tsx b/src/components/dev-menu/dev-floating-menu.tsx +new file mode 100644 +index 0000000..21dd6a0 +--- /dev/null ++++ b/src/components/dev-menu/dev-floating-menu.tsx +@@ -0,0 +1,324 @@ ++import { useState, useEffect, useCallback } from "react"; ++import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query"; ++import { Bug, Plus, RefreshCw, X } from "lucide-react"; ++import { Button } from "~/components/ui/button"; ++import { Input } from "~/components/ui/input"; ++import { Label } from "~/components/ui/label"; ++import { Checkbox } from "~/components/ui/checkbox"; ++import { DevUserCard, type DevUser } from "./dev-user-card"; ++import { ++ generateRandomEmail, ++ generateRandomName, ++} from "./random-email-generator"; ++import { getDevUsersFn, 
devLoginFn, switchDevUserFn } from "~/fn/dev-auth"; ++ ++type Corner = "top-left" | "top-right" | "bottom-left" | "bottom-right"; ++ ++const STORAGE_KEY = "dev-menu-corner"; ++ ++const cornerStyles: Record = { ++ "top-left": "top-4 left-4", ++ "top-right": "top-4 right-4", ++ "bottom-left": "bottom-4 left-4", ++ "bottom-right": "bottom-4 right-4", ++}; ++ ++type DevFloatingMenuProps = { ++ currentUserId: number | null; ++}; ++ ++export function DevFloatingMenu({ currentUserId }: DevFloatingMenuProps) { ++ const [isExpanded, setIsExpanded] = useState(false); ++ const [corner, setCorner] = useState("bottom-right"); ++ const [isDragging, setIsDragging] = useState(false); ++ const [showCreateForm, setShowCreateForm] = useState(false); ++ const [newUser, setNewUser] = useState({ ++ email: "", ++ name: "", ++ isAdmin: false, ++ isPremium: false, ++ }); ++ ++ const queryClient = useQueryClient(); ++ ++ // Load corner from localStorage ++ useEffect(() => { ++ const saved = localStorage.getItem(STORAGE_KEY); ++ if (saved && Object.keys(cornerStyles).includes(saved)) { ++ setCorner(saved as Corner); ++ } ++ }, []); ++ ++ // Save corner to localStorage ++ const saveCorner = useCallback((newCorner: Corner) => { ++ setCorner(newCorner); ++ localStorage.setItem(STORAGE_KEY, newCorner); ++ }, []); ++ ++ // Determine corner from drag end position ++ const handleDragEnd = useCallback( ++ (e: React.DragEvent) => { ++ setIsDragging(false); ++ const { clientX, clientY } = e; ++ const { innerWidth, innerHeight } = window; ++ ++ const isLeft = clientX < innerWidth / 2; ++ const isTop = clientY < innerHeight / 2; ++ ++ const newCorner: Corner = isTop ++ ? isLeft ++ ? "top-left" ++ : "top-right" ++ : isLeft ++ ? 
"bottom-left" ++ : "bottom-right"; ++ ++ saveCorner(newCorner); ++ }, ++ [saveCorner] ++ ); ++ ++ const { data: devUsers = [], isLoading: isLoadingUsers } = useQuery({ ++ queryKey: ["dev-users"], ++ queryFn: () => getDevUsersFn(), ++ enabled: isExpanded, ++ }); ++ ++ const switchMutation = useMutation({ ++ mutationFn: (userId: number) => switchDevUserFn({ data: { userId } }), ++ onSuccess: () => { ++ window.location.reload(); ++ }, ++ }); ++ ++ const createMutation = useMutation({ ++ mutationFn: (data: { ++ email: string; ++ name: string; ++ isAdmin: boolean; ++ isPremium: boolean; ++ }) => devLoginFn({ data }), ++ onSuccess: () => { ++ window.location.reload(); ++ }, ++ }); ++ ++ const handleCreateUser = () => { ++ if (!newUser.email || !newUser.name) return; ++ createMutation.mutate(newUser); ++ }; ++ ++ const handleRandomize = () => { ++ setNewUser((prev) => ({ ++ ...prev, ++ email: generateRandomEmail(), ++ name: generateRandomName(), ++ })); ++ }; ++ ++ // Initialize random values when form opens ++ useEffect(() => { ++ if (showCreateForm && !newUser.email) { ++ handleRandomize(); ++ } ++ }, [showCreateForm, newUser.email]); ++ ++ return ( ++
setIsDragging(true)} ++ onDragEnd={handleDragEnd} ++ > ++ {!isExpanded ? ( ++ ++ ) : ( ++
++
++
++ ++ Dev Menu ++
++ ++
++ ++
++ {!showCreateForm ? ( ++ <> ++
++ ++ ++
++ ++
++
++ {devUsers.map((user: DevUser) => ( ++ switchMutation.mutate(id)} ++ isLoading={switchMutation.isPending} ++ /> ++ ))} ++ {devUsers.length === 0 && !isLoadingUsers && ( ++
++ No dev users yet ++
++ )} ++
++
++ ++ ++ ++ ) : ( ++
++
++ ++ ++
++ ++
++
++ ++ ++ setNewUser((prev) => ({ ...prev, email: e.target.value })) ++ } ++ className="h-8 text-sm" ++ /> ++
++
++ ++ ++ setNewUser((prev) => ({ ...prev, name: e.target.value })) ++ } ++ className="h-8 text-sm" ++ /> ++
++
++ ++
++
++ ++ setNewUser((prev) => ({ ++ ...prev, ++ isAdmin: checked === true, ++ })) ++ } ++ /> ++ ++
++
++ ++ setNewUser((prev) => ({ ++ ...prev, ++ isPremium: checked === true, ++ })) ++ } ++ /> ++ ++
++
++ ++
++ ++ ++
++
++ )} ++
++ ++
++ Drag to reposition ++
++
++ )} ++
++ ); ++} +diff --git a/src/components/dev-menu/dev-user-card.tsx b/src/components/dev-menu/dev-user-card.tsx +new file mode 100644 +index 0000000..c04a458 +--- /dev/null ++++ b/src/components/dev-menu/dev-user-card.tsx +@@ -0,0 +1,75 @@ ++import { Badge } from "~/components/ui/badge"; ++import { Avatar, AvatarFallback, AvatarImage } from "~/components/ui/avatar"; ++ ++export type DevUser = { ++ id: number; ++ email: string; ++ name: string; ++ image: string; ++ isAdmin: boolean; ++ isPremium: boolean; ++}; ++ ++type DevUserCardProps = { ++ user: DevUser; ++ isCurrentUser: boolean; ++ onSwitch: (userId: number) => void; ++ isLoading: boolean; ++}; ++ ++export function DevUserCard({ ++ user, ++ isCurrentUser, ++ onSwitch, ++ isLoading, ++}: DevUserCardProps) { ++ return ( ++ ++ ); ++} +diff --git a/src/components/dev-menu/random-email-generator.ts b/src/components/dev-menu/random-email-generator.ts +new file mode 100644 +index 0000000..bf0b105 +--- /dev/null ++++ b/src/components/dev-menu/random-email-generator.ts +@@ -0,0 +1,68 @@ ++const adjectives = [ ++ "happy", ++ "swift", ++ "clever", ++ "bright", ++ "cool", ++ "wild", ++ "calm", ++ "bold", ++ "keen", ++ "wise", ++ "quick", ++ "brave", ++ "sharp", ++ "fresh", ++ "warm", ++ "quiet", ++ "loud", ++ "soft", ++ "dark", ++ "light", ++]; ++ ++const nouns = [ ++ "tiger", ++ "cloud", ++ "pixel", ++ "river", ++ "flame", ++ "storm", ++ "frost", ++ "wave", ++ "spark", ++ "stone", ++ "leaf", ++ "wind", ++ "star", ++ "moon", ++ "sun", ++ "tree", ++ "bird", ++ "fish", ++ "wolf", ++ "bear", ++]; ++ ++function randomItem(arr: T[]): T { ++ return arr[Math.floor(Math.random() * arr.length)]; ++} ++ ++function randomNumber(min: number, max: number): number { ++ return Math.floor(Math.random() * (max - min + 1)) + min; ++} ++ ++export function generateRandomEmail(): string { ++ const adj = randomItem(adjectives); ++ const noun = randomItem(nouns); ++ const num = randomNumber(10, 99); ++ return 
`${adj}-${noun}-${num}@localhost.test`; ++} ++ ++export function generateRandomName(): string { ++ const adj = randomItem(adjectives); ++ const noun = randomItem(nouns); ++ // Capitalize first letter of each word ++ const capitalize = (s: string) => s.charAt(0).toUpperCase() + s.slice(1); ++ return `${capitalize(adj)} ${capitalize(noun)}`; ++} +diff --git a/src/fn/dev-auth.ts b/src/fn/dev-auth.ts +new file mode 100644 +index 0000000..5a84b9d +--- /dev/null ++++ b/src/fn/dev-auth.ts +@@ -0,0 +1,180 @@ ++import { createServerFn } from "@tanstack/react-start"; ++import { database } from "~/db"; ++import { accounts, profiles, users } from "~/db/schema"; ++import { eq, like } from "drizzle-orm"; ++import { setSession, getCurrentUser } from "~/utils/session"; ++import { z } from "zod"; ++import { GoogleUser } from "~/use-cases/types"; ++import { getAccountByGoogleIdUseCase } from "~/use-cases/accounts"; ++import { createGoogleUserUseCase } from "~/use-cases/users"; ++ ++type DevLoginInput = { ++ email: string; ++ name: string; ++ isAdmin: boolean; ++ isPremium: boolean; ++}; ++ ++// DiceBear avatar styles that work reliably ++const DICEBEAR_STYLES = [ ++ "lorelei", ++ "avataaars", ++ "bottts", ++ "fun-emoji", ++ "notionists", ++ "open-peeps", ++ "personas", ++ "pixel-art", ++]; ++ ++// Simple hash for consistent style selection ++function simpleHash(str: string): number { ++ let hash = 0; ++ for (let i = 0; i < str.length; i++) { ++ hash = (hash << 5) - hash + str.charCodeAt(i); ++ hash = hash & hash; ++ } ++ return Math.abs(hash); ++} ++ ++// Generate a consistent DiceBear avatar URL based on email ++function getDevAvatarUrl(email: string): string { ++ const style = DICEBEAR_STYLES[simpleHash(email) % DICEBEAR_STYLES.length]; ++ return `https://api.dicebear.com/7.x/${style}/svg?seed=${encodeURIComponent(email)}&size=200`; ++} ++ ++// Create mock GoogleUser from dev login input ++function createMockGoogleUser(email: string, name: string): GoogleUser { ++ const nameParts 
= name.split(" "); ++ const givenName = nameParts[0] || "Dev"; ++ const familyName = nameParts.slice(1).join(" ") || "User"; ++ ++ return { ++ sub: `dev-${email.replace(/[^a-z0-9]/gi, "-")}`, ++ name, ++ given_name: givenName, ++ family_name: familyName, ++ picture: getDevAvatarUrl(email), ++ email, ++ email_verified: true, ++ locale: "en", ++ }; ++} ++ ++export const devLoginFn = createServerFn({ method: "POST" }) ++ .inputValidator((data: DevLoginInput) => data) ++ .handler(async ({ data }) => { ++ // Only allow in dev mode ++ if (process.env.DEV_BYPASS_AUTH !== "true") { ++ throw new Error("Dev login is disabled"); ++ } ++ ++ const { email, name, isAdmin, isPremium } = data; ++ ++ // Create mock Google user - same structure as real OAuth ++ const mockGoogleUser = createMockGoogleUser(email, name); ++ ++ // Use the SAME flow as real OAuth callback ++ const existingAccount = await getAccountByGoogleIdUseCase(mockGoogleUser.sub); ++ ++ if (existingAccount) { ++ // Update user flags (dev-only feature) ++ await database ++ .update(users) ++ .set({ isAdmin, isPremium }) ++ .where(eq(users.id, existingAccount.userId)); ++ ++ await setSession(existingAccount.userId); ++ return { success: true, userId: existingAccount.userId }; ++ } ++ ++ // Create user through the SAME use case as real OAuth ++ const userId = await createGoogleUserUseCase(mockGoogleUser); ++ ++ // Update user flags after creation (dev-only feature) ++ await database ++ .update(users) ++ .set({ isAdmin, isPremium }) ++ .where(eq(users.id, userId)); ++ ++ await setSession(userId); ++ return { success: true, userId }; ++ }); ++ ++// Get all dev users (users with googleId starting with "dev-") ++export const getDevUsersFn = createServerFn({ method: "GET" }).handler( ++ async () => { ++ if (process.env.DEV_BYPASS_AUTH !== "true") { ++ throw new Error("Dev features are disabled"); ++ } ++ ++ const devAccounts = await database.query.accounts.findMany({ ++ where: like(accounts.googleId, "dev-%"), ++ }); ++ ++ 
const userIds = devAccounts.map((a) => a.userId); ++ if (userIds.length === 0) { ++ return []; ++ } ++ ++ const devUsers = await Promise.all( ++ userIds.map(async (userId) => { ++ const user = await database.query.users.findFirst({ ++ where: eq(users.id, userId), ++ }); ++ const profile = await database.query.profiles.findFirst({ ++ where: eq(profiles.userId, userId), ++ }); ++ return { ++ id: userId, ++ email: user?.email ?? "", ++ name: profile?.displayName ?? "", ++ image: profile?.image ?? "", ++ isAdmin: user?.isAdmin ?? false, ++ isPremium: user?.isPremium ?? false, ++ }; ++ }) ++ ); ++ ++ return devUsers; ++ } ++); ++ ++// Switch to an existing dev user by userId ++export const switchDevUserFn = createServerFn({ method: "POST" }) ++ .inputValidator(z.object({ userId: z.number() })) ++ .handler(async ({ data }) => { ++ if (process.env.DEV_BYPASS_AUTH !== "true") { ++ throw new Error("Dev features are disabled"); ++ } ++ ++ const { userId } = data; ++ ++ // Verify this is a dev user ++ const account = await database.query.accounts.findFirst({ ++ where: eq(accounts.userId, userId), ++ }); ++ ++ if (!account || !account.googleId?.startsWith("dev-")) { ++ throw new Error("Not a dev user"); ++ } ++ ++ await setSession(userId); ++ return { success: true }; ++ }); ++ ++// Get dev menu configuration (enabled status and current user ID) ++export const getDevMenuConfigFn = createServerFn({ method: "GET" }).handler( ++ async () => { ++ const isEnabled = process.env.DEV_BYPASS_AUTH === "true"; ++ if (!isEnabled) { ++ return { isEnabled: false, currentUserId: null }; ++ } ++ ++ const user = await getCurrentUser(); ++ return { ++ isEnabled: true, ++ currentUserId: user?.id ?? 
null, ++ }; ++ } ++); +diff --git a/src/routes/dev-login.tsx b/src/routes/dev-login.tsx +new file mode 100644 +index 0000000..350f6e9 +--- /dev/null ++++ b/src/routes/dev-login.tsx +@@ -0,0 +1,128 @@ ++import { createFileRoute } from "@tanstack/react-router"; ++import { useState } from "react"; ++import { Button } from "~/components/ui/button"; ++import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "~/components/ui/card"; ++import { Input } from "~/components/ui/input"; ++import { Label } from "~/components/ui/label"; ++import { Checkbox } from "~/components/ui/checkbox"; ++import { devLoginFn } from "~/fn/dev-auth"; ++ ++export const Route = createFileRoute("/dev-login")({ ++ validateSearch: (search: Record) => ({ ++ redirect_uri: (search.redirect_uri as string) || "/", ++ }), ++ component: DevLoginPage, ++}); ++ ++function DevLoginPage() { ++ const { redirect_uri: redirectUri } = Route.useSearch(); ++ ++ const [formData, setFormData] = useState({ ++ email: "premium@localhost.test", ++ name: "Premium User", ++ isAdmin: false, ++ isPremium: true, ++ }); ++ const [isLoading, setIsLoading] = useState(false); ++ ++ const updateUserType = (isAdmin: boolean, isPremium: boolean) => { ++ let email = "user@localhost.test"; ++ let name = "Basic User"; ++ ++ if (isAdmin && isPremium) { ++ email = "admin-premium@localhost.test"; ++ name = "Admin Premium"; ++ } else if (isAdmin) { ++ email = "admin@localhost.test"; ++ name = "Admin User"; ++ } else if (isPremium) { ++ email = "premium@localhost.test"; ++ name = "Premium User"; ++ } ++ ++ setFormData({ email, name, isAdmin, isPremium }); ++ }; ++ ++ const handleSubmit = async (e: React.FormEvent) => { ++ e.preventDefault(); ++ setIsLoading(true); ++ ++ try { ++ await devLoginFn({ data: formData }); ++ // Hard redirect to pick up session cookie ++ window.location.href = redirectUri; ++ } catch (error) { ++ console.error("Dev login failed:", error); ++ setIsLoading(false); ++ } ++ }; ++ ++ return ( ++
++ ++ ++ ++ DEV ++ Dev Login ++ ++ ++ Bypass OAuth for local development. This only works when DEV_BYPASS_AUTH=true. ++ ++ ++ ++
++
++ ++ setFormData({ ...formData, email: e.target.value })} ++ placeholder="dev@localhost.test" ++ /> ++
++ ++
++ ++ setFormData({ ...formData, name: e.target.value })} ++ placeholder="Dev User" ++ /> ++
++ ++
++
++ ++ updateUserType(checked === true, formData.isPremium) ++ } ++ /> ++ ++
++ ++
++ ++ updateUserType(formData.isAdmin, checked === true) ++ } ++ /> ++ ++
++
++ ++ ++
++
++
++
++ ); ++} diff --git a/.dev/patches/mock-storage.patch b/.dev/patches/mock-storage.patch new file mode 100644 index 00000000..533f8a50 --- /dev/null +++ b/.dev/patches/mock-storage.patch @@ -0,0 +1,887 @@ +diff --git a/src/fn/video-transcoding.ts b/src/fn/video-transcoding.ts +index 5f742fe..b3095ff 100644 +--- a/src/fn/video-transcoding.ts ++++ b/src/fn/video-transcoding.ts +@@ -169,7 +169,7 @@ export const getAvailableQualitiesFn = createServerFn({ method: "GET" }) + // Generate presigned URLs for all available qualities + const urls: Record = {}; + for (const quality of qualities) { +- if (type === "r2") { ++ if (type === "r2" || type === "mock") { + urls[quality.quality] = await storage.getPresignedUrl(quality.key); + } else { + // For non-R2 storage, return the API route +@@ -198,8 +198,8 @@ export const getThumbnailUrlFn = createServerFn({ method: "GET" }) + const { segmentId } = data; + const { storage, type } = getStorage(); + +- // Only support R2 storage for thumbnails +- if (type !== "r2") { ++ // Only support R2/mock storage for thumbnails ++ if (type !== "r2" && type !== "mock") { + return { thumbnailUrl: null }; + } + +diff --git a/src/routes/-components/hero.tsx b/src/routes/-components/hero.tsx +index e060bdc..a628940 100644 +--- a/src/routes/-components/hero.tsx ++++ b/src/routes/-components/hero.tsx +@@ -1,194 +1,194 @@ +-import { Link } from "@tanstack/react-router"; +-import { useContinueSlug } from "~/hooks/use-continue-slug"; +-import { createServerFn } from "@tanstack/react-start"; +-import { VideoPlayer } from "~/routes/learn/-components/video-player"; +-import { useQuery } from "@tanstack/react-query"; +-import { Play, ShoppingCart } from "lucide-react"; +-import { getStorage } from "~/utils/storage"; +-import { getThumbnailKey } from "~/utils/video-transcoding"; +-import { database } from "~/db"; +-import { segments, modules } from "~/db/schema"; +-import { eq } from "drizzle-orm"; +- +-const getFirstVideoSegmentFn = 
createServerFn().handler(async () => { +- // Get segments ordered by module order, then segment order +- const result = await database +- .select({ +- segment: segments, +- moduleOrder: modules.order, +- }) +- .from(segments) +- .innerJoin(modules, eq(segments.moduleId, modules.id)) +- .orderBy(modules.order, segments.order); +- +- // Find the first segment that has a video and is not premium +- // (Landing page should show free preview content) +- const firstVideoSegment = result +- .map((row) => row.segment) +- .find( +- (segment) => +- segment.videoKey && !segment.isPremium && !segment.isComingSoon +- ); +- +- // Get thumbnail URL server-side if available +- let thumbnailUrl: string | null = null; +- if (firstVideoSegment?.videoKey) { +- const { storage, type } = getStorage(); +- if (type === "r2") { +- const thumbnailKey = +- firstVideoSegment.thumbnailKey || +- getThumbnailKey(firstVideoSegment.videoKey); +- const exists = await storage.exists(thumbnailKey); +- if (exists) { +- thumbnailUrl = await storage.getPresignedUrl(thumbnailKey); +- } +- } +- } +- +- return { segment: firstVideoSegment, thumbnailUrl }; +-}); +- +-export function HeroSection() { +- const continueSlug = useContinueSlug(); +- +- const { +- data: firstVideoData, +- isLoading, +- error, +- } = useQuery({ +- queryKey: ["first-video-segment"], +- queryFn: () => getFirstVideoSegmentFn(), +- staleTime: 1000 * 60 * 5, // 5 minutes +- gcTime: 1000 * 60 * 10, // 10 minutes +- }); +- +- const firstVideoSegment = firstVideoData?.segment; +- const thumbnailUrl = firstVideoData?.thumbnailUrl; +- +- return ( +-
+- {/* Modern AI-themed gradient background */} +-
+-
+- +- {/* AI circuit pattern overlay */} +-
+-
+-
+- +- {/* AI-themed floating elements */} +-
+-
+-
+-
+-
+-
+- +- {/* Content */} +-
+-
+-
+-
+- {/* Left side - Content */} +-
+- {/* Badge */} +-
+- +- Agentic Coding Mastery Course +-
+- +-

+- Coding is Changing, +- Master{" "} +- Agentic Coding{" "} +-

+- +-

+- Master AI-first development with Cursor IDE, Claude Code CLI, +- and advanced AI models. Learn how to leverage Opus 4.5, +- Composer1, GPT-5.1 Codex, and cutting-edge agentic programming +- techniques to accelerate your development workflow and build +- applications 10x faster than traditional programming methods. +-

+- +-
+- { +- if (e.key === " ") { +- e.preventDefault(); +- e.currentTarget.click(); +- } +- }} +- > +- +- Buy Now +- +- { +- if (e.key === " ") { +- e.preventDefault(); +- e.currentTarget.click(); +- } +- }} +- > +- +- Start Learning +- +-
+-
+- +- {/* Right side - Video player */} +-
+- {isLoading ? ( +-
+-
+- Loading video... +-
+-
+- ) : error ? ( +-
+-
+- Unable to load video +-
+-
+- ) : firstVideoSegment ? ( +-
+- {/* Video container with glass morphism effect */} +-
+-
+- +-
+- +- {/* Decorative elements - using theme colors */} +-
+-
+-
+-
+- ) : ( +-
+-
+- No video available +-
+-
+- )} +-
+-
+-
+-
+-
+- +- {/* Bottom gradient fade with theme accent */} +-
+-
+-
+- ); +-} ++import { Link } from "@tanstack/react-router"; ++import { useContinueSlug } from "~/hooks/use-continue-slug"; ++import { createServerFn } from "@tanstack/react-start"; ++import { VideoPlayer } from "~/routes/learn/-components/video-player"; ++import { useQuery } from "@tanstack/react-query"; ++import { Play, ShoppingCart } from "lucide-react"; ++import { getStorage } from "~/utils/storage"; ++import { getThumbnailKey } from "~/utils/video-transcoding"; ++import { database } from "~/db"; ++import { segments, modules } from "~/db/schema"; ++import { eq } from "drizzle-orm"; ++ ++const getFirstVideoSegmentFn = createServerFn().handler(async () => { ++ // Get segments ordered by module order, then segment order ++ const result = await database ++ .select({ ++ segment: segments, ++ moduleOrder: modules.order, ++ }) ++ .from(segments) ++ .innerJoin(modules, eq(segments.moduleId, modules.id)) ++ .orderBy(modules.order, segments.order); ++ ++ // Find the first segment that has a video and is not premium ++ // (Landing page should show free preview content) ++ const firstVideoSegment = result ++ .map((row) => row.segment) ++ .find( ++ (segment) => ++ segment.videoKey && !segment.isPremium && !segment.isComingSoon ++ ); ++ ++ // Get thumbnail URL server-side if available ++ let thumbnailUrl: string | null = null; ++ if (firstVideoSegment?.videoKey) { ++ const { storage, type } = getStorage(); ++ if (type === "r2" || type === "mock") { ++ const thumbnailKey = ++ firstVideoSegment.thumbnailKey || ++ getThumbnailKey(firstVideoSegment.videoKey); ++ const exists = await storage.exists(thumbnailKey); ++ if (exists) { ++ thumbnailUrl = await storage.getPresignedUrl(thumbnailKey); ++ } ++ } ++ } ++ ++ return { segment: firstVideoSegment, thumbnailUrl }; ++}); ++ ++export function HeroSection() { ++ const continueSlug = useContinueSlug(); ++ ++ const { ++ data: firstVideoData, ++ isLoading, ++ error, ++ } = useQuery({ ++ queryKey: ["first-video-segment"], ++ queryFn: () 
=> getFirstVideoSegmentFn(), ++ staleTime: 1000 * 60 * 5, // 5 minutes ++ gcTime: 1000 * 60 * 10, // 10 minutes ++ }); ++ ++ const firstVideoSegment = firstVideoData?.segment; ++ const thumbnailUrl = firstVideoData?.thumbnailUrl; ++ ++ return ( ++
++ {/* Modern AI-themed gradient background */} ++
++
++ ++ {/* AI circuit pattern overlay */} ++
++
++
++ ++ {/* AI-themed floating elements */} ++
++
++
++
++
++
++ ++ {/* Content */} ++
++
++
++
++ {/* Left side - Content */} ++
++ {/* Badge */} ++
++ ++ Agentic Coding Mastery Course ++
++ ++

++ Coding is Changing, ++ Master{" "} ++ Agentic Coding{" "} ++

++ ++

++ Master AI-first development with Cursor IDE, Claude Code CLI, ++ and advanced AI models. Learn how to leverage Opus 4.5, ++ Composer1, GPT-5.1 Codex, and cutting-edge agentic programming ++ techniques to accelerate your development workflow and build ++ applications 10x faster than traditional programming methods. ++

++ ++
++ { ++ if (e.key === " ") { ++ e.preventDefault(); ++ e.currentTarget.click(); ++ } ++ }} ++ > ++ ++ Buy Now ++ ++ { ++ if (e.key === " ") { ++ e.preventDefault(); ++ e.currentTarget.click(); ++ } ++ }} ++ > ++ ++ Start Learning ++ ++
++
++ ++ {/* Right side - Video player */} ++
++ {isLoading ? ( ++
++
++ Loading video... ++
++
++ ) : error ? ( ++
++
++ Unable to load video ++
++
++ ) : firstVideoSegment ? ( ++
++ {/* Video container with glass morphism effect */} ++
++
++ ++
++ ++ {/* Decorative elements - using theme colors */} ++
++
++
++
++ ) : ( ++
++
++ No video available ++
++
++ )} ++
++
++
++
++
++ ++ {/* Bottom gradient fade with theme accent */} ++
++
++
++ ); ++} +diff --git a/src/utils/storage/index.ts b/src/utils/storage/index.ts +index aa47e5f..6fc87aa 100644 +--- a/src/utils/storage/index.ts ++++ b/src/utils/storage/index.ts +@@ -1,13 +1,21 @@ + import type { IStorage } from "./storage.interface"; + import { R2Storage } from "./r2"; ++import { MockStorage } from "./mock-storage"; + + let storage: IStorage | null = null; + +-// Storage provider factory/singleton - R2 only +-export function getStorage(): { storage: IStorage; type: "r2" } { ++type StorageType = "r2" | "mock"; ++ ++// Storage provider factory/singleton ++export function getStorage(): { storage: IStorage; type: StorageType } { + if (!storage) { +- storage = new R2Storage(); ++ if (process.env.DEV_MOCK_STORAGE === "true") { ++ console.log("[Storage] Using MockStorage"); ++ storage = new MockStorage(); ++ } else { ++ storage = new R2Storage(); ++ } + } + +- return { storage, type: "r2" }; ++ return { storage, type: storage instanceof MockStorage ? "mock" : "r2" }; + } +diff --git a/src/utils/video-transcoding.ts b/src/utils/video-transcoding.ts +index 71d53ee..3111246 100644 +--- a/src/utils/video-transcoding.ts ++++ b/src/utils/video-transcoding.ts +@@ -1,166 +1,169 @@ +-import { exec } from "node:child_process"; +-import { promisify } from "node:util"; +-import { writeFile, unlink, readFile } from "node:fs/promises"; +-import { join } from "node:path"; +-import { tmpdir } from "node:os"; +- +-const execAsync = promisify(exec); +- +-export type VideoQuality = "720p" | "480p"; +- +-export interface ThumbnailOptions { +- inputPath: string; +- outputPath: string; +- width?: number; +- seekTime?: number; +-} +- +-export interface TranscodeOptions { +- inputPath: string; +- outputPath: string; +- quality: VideoQuality; +-} +- +-const FFMPEG_PRESET = "medium"; +-const FFMPEG_CRF = "23"; +- +-/** +- * Transcodes a video file to the specified quality using ffmpeg +- */ +-export async function transcodeVideo(options: TranscodeOptions): Promise<void> { +- const { 
inputPath, outputPath, quality } = options; +- +- // Determine target height based on quality +- const targetHeight = quality === "720p" ? "720" : "480"; +- +- // Build ffmpeg command +- // -vf "scale=-2:HEIGHT" maintains aspect ratio, sets height +- // -c:v libx264 uses H.264 codec +- // -preset medium balances speed vs compression +- // -crf 23 provides good quality (lower = better quality, 18-28 is typical range) +- // -c:a aac uses AAC audio codec +- const command = `ffmpeg -i "${inputPath}" -vf "scale=-2:${targetHeight}" -c:v libx264 -preset ${FFMPEG_PRESET} -crf ${FFMPEG_CRF} -c:a aac -y "${outputPath}"`; +- +- try { +- await execAsync(command); +- } catch (error) { +- throw new Error( +- `Failed to transcode video to ${quality}: ${error instanceof Error ? error.message : String(error)}` +- ); +- } +-} +- +-/** +- * Creates a temporary file path for video processing +- */ +-export function createTempVideoPath( +- prefix: string, +- suffix: string = ".mp4" +-): string { +- const timestamp = Date.now(); +- const random = Math.random().toString(36).substring(2, 9); +- return join(tmpdir(), `${prefix}_${timestamp}_${random}${suffix}`); +-} +- +-/** +- * Cleans up temporary files +- */ +-export async function cleanupTempFiles(...paths: string[]): Promise<void> { +- await Promise.allSettled( +- paths.map(async (path) => { +- try { +- await unlink(path); +- } catch (error) { +- // Ignore errors if file doesn't exist +- if ((error as NodeJS.ErrnoException).code !== "ENOENT") { +- console.error(`Failed to delete temp file ${path}:`, error); +- } +- } +- }) +- ); +-} +- +-/** +- * Writes a buffer to a temporary file +- */ +-export async function writeBufferToTempFile( +- buffer: Buffer, +- prefix: string, +- suffix: string = ".mp4" +-): Promise<string> { +- const tempPath = createTempVideoPath(prefix, suffix); +- await writeFile(tempPath, buffer); +- return tempPath; +-} +- +-/** +- * Extracts a thumbnail from a video file using ffmpeg +- * @param options - Thumbnail extraction 
options +- * @returns Buffer containing the JPEG image data +- */ +-export async function extractThumbnail( +- options: ThumbnailOptions +-): Promise<Buffer> { +- const { inputPath, outputPath, width = 640, seekTime = 1 } = options; +- +- // Create a temporary file for the initial extraction +- // Use a unique temp path to avoid conflicts +- const tempOutputPath = createTempThumbnailPath("temp_thumb"); +- +- // Build ffmpeg command to extract thumbnail +- // -ss 1 seeks to 1 second into the video (avoids black frames at start) +- // -vframes 1 extracts only 1 frame +- // -vf "scale=WIDTH:-1" scales to specified width maintaining aspect ratio +- // -q:v 2 sets JPEG quality (1-31, lower is better) +- const ffmpegCommand = `ffmpeg -ss ${seekTime} -i "${inputPath}" -vframes 1 -vf "scale=${width}:-1" -q:v 2 -y "${tempOutputPath}"`; +- +- try { +- // Extract thumbnail with ffmpeg +- await execAsync(ffmpegCommand); +- +- // Convert to progressive JPEG using ImageMagick for faster perceived loading +- // Progressive JPEGs display a low-quality version first and progressively improve +- // This makes the thumbnail appear faster even while still downloading +- const convertCommand = `convert "${tempOutputPath}" -interlace Plane -quality 85 "${outputPath}"`; +- await execAsync(convertCommand); +- +- // Clean up temp file +- try { +- await unlink(tempOutputPath); +- } catch { +- // Ignore cleanup errors +- } +- +- // Read the generated progressive JPEG thumbnail +- const thumbnailBuffer = await readFile(outputPath); +- return thumbnailBuffer; +- } catch (error) { +- // Clean up temp file on error +- try { +- await unlink(tempOutputPath); +- } catch { +- // Ignore cleanup errors +- } +- throw new Error( +- `Failed to extract thumbnail: ${error instanceof Error ? 
error.message : String(error)}` +- ); +- } +-} +- +-/** +- * Creates a temporary file path for thumbnail processing +- */ +-export function createTempThumbnailPath(prefix: string): string { +- const timestamp = Date.now(); +- const random = Math.random().toString(36).substring(2, 9); +- return join(tmpdir(), `${prefix}_${timestamp}_${random}.jpg`); +-} +- +-/** +- * Generates a thumbnail key from a base video key +- * @param baseKey - The original video key (e.g., "abc123.mp4") +- * @returns The thumbnail key (e.g., "abc123_thumb.jpg") +- */ +-export function getThumbnailKey(baseKey: string): string { +- return baseKey.replace(".mp4", "_thumb.jpg"); +-} ++import { exec } from "node:child_process"; ++import { promisify } from "node:util"; ++import { writeFile, unlink, readFile } from "node:fs/promises"; ++import { join } from "node:path"; ++import { tmpdir } from "node:os"; ++ ++const execAsync = promisify(exec); ++ ++export type VideoQuality = "720p" | "480p"; ++ ++export interface ThumbnailOptions { ++ inputPath: string; ++ outputPath: string; ++ width?: number; ++ seekTime?: number; ++} ++ ++export interface TranscodeOptions { ++ inputPath: string; ++ outputPath: string; ++ quality: VideoQuality; ++} ++ ++const FFMPEG_PRESET = "medium"; ++const FFMPEG_CRF = "23"; ++ ++/** ++ * Transcodes a video file to the specified quality using ffmpeg ++ */ ++export async function transcodeVideo(options: TranscodeOptions): Promise<void> { ++ const { inputPath, outputPath, quality } = options; ++ ++ // Determine target height based on quality ++ const targetHeight = quality === "720p" ? 
"720" : "480"; ++ ++ // Build ffmpeg command ++ // -vf "scale=-2:HEIGHT" maintains aspect ratio, sets height ++ // -c:v libx264 uses H.264 codec ++ // -preset medium balances speed vs compression ++ // -crf 23 provides good quality (lower = better quality, 18-28 is typical range) ++ // -c:a aac uses AAC audio codec ++ const command = `ffmpeg -i "${inputPath}" -vf "scale=-2:${targetHeight}" -c:v libx264 -preset ${FFMPEG_PRESET} -crf ${FFMPEG_CRF} -c:a aac -y "${outputPath}"`; ++ ++ try { ++ await execAsync(command); ++ } catch (error) { ++ throw new Error( ++ `Failed to transcode video to ${quality}: ${error instanceof Error ? error.message : String(error)}` ++ ); ++ } ++} ++ ++/** ++ * Creates a temporary file path for video processing ++ */ ++export function createTempVideoPath( ++ prefix: string, ++ suffix: string = ".mp4" ++): string { ++ const timestamp = Date.now(); ++ const random = Math.random().toString(36).substring(2, 9); ++ return join(tmpdir(), `${prefix}_${timestamp}_${random}${suffix}`); ++} ++ ++/** ++ * Cleans up temporary files ++ */ ++export async function cleanupTempFiles(...paths: string[]): Promise<void> { ++ await Promise.allSettled( ++ paths.map(async (path) => { ++ try { ++ await unlink(path); ++ } catch (error) { ++ // Ignore errors if file doesn't exist ++ if ((error as NodeJS.ErrnoException).code !== "ENOENT") { ++ console.error(`Failed to delete temp file ${path}:`, error); ++ } ++ } ++ }) ++ ); ++} ++ ++/** ++ * Writes a buffer to a temporary file ++ */ ++export async function writeBufferToTempFile( ++ buffer: Buffer, ++ prefix: string, ++ suffix: string = ".mp4" ++): Promise<string> { ++ const tempPath = createTempVideoPath(prefix, suffix); ++ await writeFile(tempPath, buffer); ++ return tempPath; ++} ++ ++/** ++ * Extracts a thumbnail from a video file using ffmpeg ++ * @param options - Thumbnail extraction options ++ * @returns Buffer containing the JPEG image data ++ */ ++export async function extractThumbnail( ++ options: ThumbnailOptions ++): 
Promise<Buffer> { ++ const { inputPath, outputPath, width = 640, seekTime = 1 } = options; ++ ++ // Create a temporary file for the initial extraction ++ // Use a unique temp path to avoid conflicts ++ const tempOutputPath = createTempThumbnailPath("temp_thumb"); ++ ++ // Build ffmpeg command to extract thumbnail ++ // -ss 1 seeks to 1 second into the video (avoids black frames at start) ++ // -vframes 1 extracts only 1 frame ++ // -vf "scale=WIDTH:-1" scales to specified width maintaining aspect ratio ++ // -q:v 2 sets JPEG quality (1-31, lower is better) ++ const ffmpegCommand = `ffmpeg -ss ${seekTime} -i "${inputPath}" -vframes 1 -vf "scale=${width}:-1" -q:v 2 -y "${tempOutputPath}"`; ++ ++ try { ++ // Extract thumbnail with ffmpeg ++ await execAsync(ffmpegCommand); ++ ++ // Convert to progressive JPEG using ImageMagick for faster perceived loading ++ // Progressive JPEGs display a low-quality version first and progressively improve ++ // This makes the thumbnail appear faster even while still downloading ++ const convertCommand = `convert "${tempOutputPath}" -interlace Plane -quality 85 "${outputPath}"`; ++ await execAsync(convertCommand); ++ ++ // Clean up temp file ++ try { ++ await unlink(tempOutputPath); ++ } catch { ++ // Ignore cleanup errors ++ } ++ ++ // Read the generated progressive JPEG thumbnail ++ const thumbnailBuffer = await readFile(outputPath); ++ return thumbnailBuffer; ++ } catch (error) { ++ // Clean up temp file on error ++ try { ++ await unlink(tempOutputPath); ++ } catch { ++ // Ignore cleanup errors ++ } ++ throw new Error( ++ `Failed to extract thumbnail: ${error instanceof Error ? 
error.message : String(error)}` ++ ); ++ } ++} ++ ++/** ++ * Creates a temporary file path for thumbnail processing ++ */ ++export function createTempThumbnailPath(prefix: string): string { ++ const timestamp = Date.now(); ++ const random = Math.random().toString(36).substring(2, 9); ++ return join(tmpdir(), `${prefix}_${timestamp}_${random}.jpg`); ++} ++ ++/** ++ * Generates a thumbnail key from a base video key ++ * @param baseKey - The original video key (e.g., "abc123.mp4") ++ * @returns The thumbnail key (e.g., "abc123_thumb.jpg") ++ */ ++export function getThumbnailKey(baseKey: string): string { ++ if (baseKey.endsWith(".mp4")) { ++ return baseKey.replace(".mp4", "_thumb.jpg"); ++ } ++ return `${baseKey}_thumb.jpg`; ++} +diff --git a/src/utils/storage/mock-storage.ts b/src/utils/storage/mock-storage.ts +new file mode 100644 +index 0000000..bddcaaf +--- /dev/null ++++ b/src/utils/storage/mock-storage.ts +@@ -0,0 +1,94 @@ ++import type { ++ IStorage, ++ StreamFileResponse, ++} from "./storage.interface"; ++ ++/** ++ * Mock storage for development when R2/S3 is unavailable. ++ * Returns placeholder video/image URLs (no referrer restrictions). 
++ */ ++export class MockStorage implements IStorage { ++ // Free sample videos from Google (~2MB each, fast loading) ++ private readonly SAMPLE_VIDEOS = [ ++ "https://storage.googleapis.com/gtv-videos-bucket/sample/ForBiggerBlazes.mp4", ++ "https://storage.googleapis.com/gtv-videos-bucket/sample/ForBiggerEscapes.mp4", ++ "https://storage.googleapis.com/gtv-videos-bucket/sample/ForBiggerJoyrides.mp4", ++ "https://storage.googleapis.com/gtv-videos-bucket/sample/ForBiggerMeltdowns.mp4", ++ ]; ++ ++ // Thumbnail images from Unsplash (tech/coding themed) ++ private readonly SAMPLE_IMAGES = [ ++ "https://images.unsplash.com/photo-1516321318423-f06f85e504b3?w=640&h=360&fit=crop", // code on screen ++ "https://images.unsplash.com/photo-1517694712202-14dd9538aa97?w=640&h=360&fit=crop", // laptop coding ++ "https://images.unsplash.com/photo-1555066931-4365d14bab8c?w=640&h=360&fit=crop", // code editor ++ "https://images.unsplash.com/photo-1461749280684-dccba630e2f6?w=640&h=360&fit=crop", // programming ++ ]; ++ ++ // Store "uploaded" files in memory ++ private readonly files = new Map<string, { data: Buffer; contentType: string }>(); ++ ++ private isImageKey(key: string): boolean { ++ return /\.(jpg|jpeg|png|gif|webp)$/i.test(key) || key.includes("_thumb"); ++ } ++ ++ private getConsistentIndex(key: string, arrayLength: number): number { ++ // Use key hash for consistent results (same key = same video/image) ++ let hash = 0; ++ for (let i = 0; i < key.length; i++) { ++ hash = ((hash << 5) - hash) + key.charCodeAt(i); ++ hash |= 0; ++ } ++ return Math.abs(hash) % arrayLength; ++ } ++ ++ async upload(key: string, data: Buffer, contentType: string = "video/mp4") { ++ console.log(`[MockStorage] Simulated upload: ${key} (${data.length} bytes)`); ++ this.files.set(key, { data, contentType }); ++ } ++ ++ async delete(key: string) { ++ console.log(`[MockStorage] Simulated delete: ${key}`); ++ this.files.delete(key); ++ } ++ ++ async exists(key: string): Promise<boolean> { ++ return true; ++ } ++ ++ async getStream( ++ key: string, ++ 
rangeHeader: string | null ++ ): Promise<StreamFileResponse> { ++ throw new Error( ++ "[MockStorage] getStream not supported. Use getPresignedUrl instead." ++ ); ++ } ++ ++ async getPresignedUrl(key: string): Promise<string> { ++ if (this.isImageKey(key)) { ++ const index = this.getConsistentIndex(key, this.SAMPLE_IMAGES.length); ++ const url = this.SAMPLE_IMAGES[index]; ++ console.log(`[MockStorage] Image: ${key} -> index ${index} -> ${url.split('/').pop()}`); ++ return url; ++ } ++ ++ const index = this.getConsistentIndex(key, this.SAMPLE_VIDEOS.length); ++ const url = this.SAMPLE_VIDEOS[index]; ++ console.log(`[MockStorage] Video: ${key} -> index ${index} -> ${url.split('/').pop()}`); ++ return url; ++ } ++ ++ async getPresignedUploadUrl(key: string, contentType: string = "video/mp4"): Promise<string> { ++ console.log(`[MockStorage] Returning mock upload URL for: ${key}`); ++ return `http://localhost:4000/api/mock-upload?key=${encodeURIComponent(key)}`; ++ } ++ ++ async getBuffer(key: string): Promise<Buffer> { ++ const file = this.files.get(key); ++ if (file) { ++ return file.data; ++ } ++ console.log(`[MockStorage] Returning empty buffer for: ${key}`); ++ return Buffer.alloc(0); ++ } ++} diff --git a/.dev/setup.sh b/.dev/setup.sh new file mode 100644 index 00000000..86898be6 --- /dev/null +++ b/.dev/setup.sh @@ -0,0 +1,30 @@ +#!/bin/bash +# Dev Patches Setup Script +# Run once after cloning to set up git aliases + +set -e + +echo "Setting up dev patches git aliases..." + +# Main commands +git config --local alias.dev-status '! bash .dev/dev-patch.sh status' +git config --local alias.dev-on '! bash .dev/dev-patch.sh all-on' +git config --local alias.dev-off '! bash .dev/dev-patch.sh all-off' +git config --local alias.dev-patch '! bash .dev/dev-patch.sh' + +# Individual patch shortcuts +git config --local alias.login-bypass-on '! bash .dev/dev-patch.sh on dev-login-bypass' +git config --local alias.login-bypass-off '! 
bash .dev/dev-patch.sh off dev-login-bypass' +git config --local alias.mock-storage-on '! bash .dev/dev-patch.sh on mock-storage' +git config --local alias.mock-storage-off '! bash .dev/dev-patch.sh off mock-storage' + +echo "" +echo "Done! Available commands:" +echo " git dev-status - Show patch status" +echo " git dev-on - Apply all patches" +echo " git dev-off - Remove all patches" +echo " git dev-patch - Full patch management" +echo "" +echo "Individual patches:" +echo " git login-bypass-on/off" +echo " git mock-storage-on/off" diff --git a/.env.sample b/.env.sample index f32c8263..743b8e18 100644 --- a/.env.sample +++ b/.env.sample @@ -30,4 +30,12 @@ AWS_SES_ACCESS_KEY_ID= AWS_SES_SECRET_ACCESS_KEY= AWS_SES_REGION=us-east-1 -OPENAI_API_KEY= \ No newline at end of file +OPENAI_API_KEY= +# ============================================================================ +# Dev Patches (optional - for local development only) +# ============================================================================ +# These variables enable dev-only features via the .dev patch system. +# See .dev/README.md for details on available patches. 
+# +# DEV_BYPASS_AUTH=true # Skip Google OAuth, use dev login menu instead +# DEV_MOCK_STORAGE=true # Use mock storage instead of R2/S3 diff --git a/.gitattributes b/.gitattributes new file mode 100644 index 00000000..1aef31b4 --- /dev/null +++ b/.gitattributes @@ -0,0 +1,32 @@ +# Auto detect text files and normalize to LF +* text=auto eol=lf + +# Force LF for shell scripts (critical for Linux/Mac) +*.sh text eol=lf + +# Force LF for common text files +*.ts text eol=lf +*.tsx text eol=lf +*.js text eol=lf +*.jsx text eol=lf +*.json text eol=lf +*.md text eol=lf +*.yml text eol=lf +*.yaml text eol=lf +*.css text eol=lf +*.html text eol=lf +*.sql text eol=lf +*.patch text eol=lf + +# Binary files +*.png binary +*.jpg binary +*.jpeg binary +*.gif binary +*.webp binary +*.ico binary +*.woff binary +*.woff2 binary +*.ttf binary +*.eot binary +*.pdf binary
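---

As a quick aside on the mock-storage patch above: its "hash-based selection ensures consistent content per segment" behavior comes from the `getConsistentIndex` helper in `mock-storage.ts`. The sketch below copies that helper verbatim so it can be read (and run) outside the diff; the `"abc123.mp4"` key is just an illustrative example:

```typescript
// Standalone copy of MockStorage's getConsistentIndex (from the patch above).
// Rolling string hash: hash = hash * 31 + charCode, clamped to a signed
// 32-bit integer each step so results are stable across runs and platforms.
function getConsistentIndex(key: string, arrayLength: number): number {
  let hash = 0;
  for (let i = 0; i < key.length; i++) {
    hash = (hash << 5) - hash + key.charCodeAt(i); // (h << 5) - h === h * 31
    hash |= 0; // truncate to 32 bits
  }
  return Math.abs(hash) % arrayLength;
}

// Same key -> same index, so a given segment always resolves to the same
// sample video/thumbnail out of the 4 hard-coded URLs.
const index = getConsistentIndex("abc123.mp4", 4); // deterministic, in 0..3
```

Because the index depends only on the storage key, mock content stays stable across page reloads and server restarts without any persisted state.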