
CPM SDK for Next.js

Modular prompts. One interface. Full control. A lightweight SDK to build AI-powered workflows in your Next.js apps — with modular prompts, unified API routing and zero backend config.

CPM modular context graph

Install in under a minute

Three steps: add the SDK, link to BrinPage Cloud, and launch the local dashboard.

1. Add the SDK

```bash
# Install
npm install @brinpage/cpm
```
2. Link to BrinPage Cloud (.env)

```env
BRINPAGE_API_KEY=your-cloud-key
BRINPAGE_BASE_URL=https://cloud.brinpage.com
```

No provider keys in your project. Requests are routed via BrinPage Cloud.

3. Run the dashboard

```bash
npx brinpage cpm
# runs on http://localhost:3027
```

Minimal setup for Next.js (App Router). All configuration—providers, models, temperature, max tokens, embeddings—is managed in the CPM Dashboard and synced via BrinPage Cloud. Your code just initializes the client and calls cpm.chat() or cpm.ask().

/app/api/cpm/ask/route.ts (server)

```ts
import { NextResponse } from 'next/server'
import { createClient, type ChatMessage } from '@brinpage/cpm'

export const runtime = 'edge' // optional

// The CPM Dashboard (synced with BrinPage Cloud) controls provider, model, temperature, and limits.
const cpm = createClient()

export async function POST(req: Request) {
  const body = await req.json()

  // Accept either "messages" (ChatML) or a simple "q" string
  const messages: ChatMessage[] = Array.isArray(body?.messages)
    ? body.messages
    : [{ role: 'user', content: String(body?.q ?? '') }]

  // Call CPM — routing & config handled by the dashboard + Cloud
  const result = await cpm.chat({ messages })

  return NextResponse.json(result)
}
```
Client example

```ts
// Send chat messages to your API route
const res = await fetch('/api/cpm/ask', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [
      { role: 'user', content: 'Give me a 2-line summary of CPM.' },
    ],
  }),
})

const { output } = await res.json()
console.log(output)
```
Minimal call

```ts
// Simpler form — just send a string
await fetch('/api/cpm/ask', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ q: 'Summarize CPM in one sentence.' }),
})
```
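The route shown earlier accepts both input shapes. As a standalone sketch of that normalization step (the `ChatMessage` type here is written out for illustration rather than imported from the SDK):

```typescript
// Illustrative message shape matching what the route handler expects.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string }

// Normalize either { messages: [...] } or { q: '...' } into ChatMessage[],
// mirroring the branch inside the POST handler.
function toMessages(body: unknown): ChatMessage[] {
  const b = body as { messages?: unknown; q?: unknown }
  return Array.isArray(b?.messages)
    ? (b.messages as ChatMessage[])
    : [{ role: 'user', content: String(b?.q ?? '') }]
}
```

Keeping the normalization as a pure function makes the route easy to unit-test without spinning up a server.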

Want the full setup, tips, and troubleshooting? Visit the CPM Installation guide.


Connect once. Build anywhere.

Modular knowledge for zero guesswork.

CPM treats prompts as components: a Base + stackable Modules. Declare what your task needs; CPM builds the final prompt deterministically—the same inputs always produce the same output prompt.

  • No more giant strings or copy-paste drift.
  • Attach only the modules a task actually needs.
  • Smaller, focused prompts → fewer tokens, clearer model behavior.
BrinPage CPM Base Prompt with stackable Context Modules — showing facts, rules, tone and workflows composing the final AI prompt.
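The Base + Modules idea can be sketched as deterministic string composition. This is an illustration of the concept, not the SDK's real API; the `Module` shape and `composePrompt` name are hypothetical:

```typescript
// Hypothetical module shape: each module contributes a named block of context.
type Module = { name: string; content: string }

// Deterministically compose a final prompt from a base plus selected modules.
// Modules are sorted by name, so attachment order never changes the output.
function composePrompt(base: string, modules: Module[]): string {
  const parts = [...modules]
    .sort((a, b) => a.name.localeCompare(b.name))
    .map((m) => `## ${m.name}\n${m.content}`)
  return [base, ...parts].join('\n\n')
}
```

Sorting before joining is what makes the build deterministic: the same base and module set always produce byte-identical prompts, regardless of the order modules were attached in.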

Your control room on localhost.

Run the dashboard on :3027. Switch models, tune parameters, and inspect every request — all connected through BrinPage Cloud. No provider setup. No lock-in.

  • Unified routing — OpenAI ↔︎ Gemini (more soon), managed via BrinPage Cloud.
  • Tuning from the UI — model, temperature, and caps without touching code.
  • Request inspector — diff modules and check the final JSON payload before execution.

Efficiency by design.

Less repeated context. Smarter routing. Fewer retries thanks to preflight validation.

Cost ≈ (prompt + response) × retries. CPM minimizes both.
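That back-of-the-envelope relation can be made concrete with a tiny estimator. The token counts and per-token price below are placeholder numbers for illustration, not real provider rates:

```typescript
// Rough cost model: (prompt tokens + response tokens) × attempts × price per token.
function estimateCost(
  promptTokens: number,
  responseTokens: number,
  attempts: number,
  pricePerToken: number,
): number {
  return (promptTokens + responseTokens) * attempts * pricePerToken
}

// A trimmed prompt plus one avoided retry compounds into a large saving:
const bloated = estimateCost(1200, 300, 2, 0.000002) // verbose prompt, one retry
const lean = estimateCost(800, 300, 1, 0.000002)     // focused prompt, first try
```

Because the two factors multiply, shrinking prompts and cutting retries reduce spend together rather than independently.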

BrinPage CPM Dashboard — Localhost control panel showing model routing, module inspection and runtime context.