1. Click [**Remix**](/?intent=remix)
2. Add environment variables:
   - `OPENAI_API_KEY` — for AI lead qualification
   - `GITHUB_TOKEN` — for accessing the GitHub API ([create one here](https://github.com/settings/tokens))

Each HTTP handler performs an early `OPENAI_API_KEY` check so the tester UI fails with a clear message instead of a confusing downstream error:

```ts
app.post("/", async (c) => {
  // Early API key check to avoid confusing errors when using the tester UI
  if (!Deno.env.get("OPENAI_API_KEY")) {
    return c.json(
      { error: "Add OPENAI_API_KEY in Environment Variables" },
      503,
    );
  }
  // ...
});

// Test endpoint for the welcome UI
app.post("/test", async (c) => {
  if (!Deno.env.get("OPENAI_API_KEY")) {
    return c.json(
      { error: "Add OPENAI_API_KEY in Environment Variables" },
      503,
    );
  }
  // ...
});

// Reanalyze a lead
app.post("/lead/:id/reanalyze", async (c) => {
  if (!Deno.env.get("OPENAI_API_KEY")) {
    return c.json(
      { error: "Add OPENAI_API_KEY in Environment Variables" },
      503,
    );
  }
  // ...
});
```

Imports and client setup:

```ts
import { readFile } from "https://esm.town/v/std/utils/index.ts";
import { Agent, run, RunResult, webSearchTool } from "npm:@openai/agents@0.3.0";
import { getLeadById, storeLead, updateLeadOutput } from "./db.ts";
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

function escapeHtml(s: string) {
  // ...
}
```

Streaming completion inside a `ReadableStream` source:

```ts
async start(controller) {
  try {
    const completion = await openai.chat.completions.create({
      model: "gpt-5-nano",
      stream: true,
    });
    // The Val std/openai client typically returns an async iterable for stream: true
    for await (const evt of completion as any) {
      const delta = evt?.choices?.[0]?.delta?.content ?? "";
      // ...
    }
  } // ...
}
```

Non-streaming variant:

```ts
async function nonStreamingRunbook(notes: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-5-nano",
    max_tokens: 900,
    // ...
  });
  // ...
}
```

Note: When changing a SQLite table's schema, change the table's name (e.g., add `_2` or `_3`) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
  // ...
});
```

Legacy `openai@3.3.0` client configured for iFlow (note the `basePath` override):

```ts
  options = {},
) => {
  // Initialize the OpenAI API client with custom configuration for iFlow
  const { Configuration, OpenAIApi } = await import(
    "https://esm.sh/openai@3.3.0"
  );
  const configuration = new Configuration({
    apiKey: process.env.OPENAI_API_KEY || process.env.OPENAI,
    basePath: process.env.OPENAI_BASE_URL || undefined,
  });
  const openai = new OpenAIApi(configuration);

  // Request a chat completion; wrap a bare string prompt as a single user message
  const messages = typeof prompt === "string"
    ? [{ role: "user", content: prompt }]
    : prompt;
  const model = process.env.MODEL_NAME || "gpt-3.5-turbo-0613";
  const { data } = await openai.createChatCompletion({
    model,
    messages: messages as any,
    // ...
  });
```

- Environment variables via `Deno.env.get()`
- Val Town std libraries used: `email` (forwarding), `openai` (LLM fallback)

## Code Standards

- **Geocoder**: US Census Bureau (`geocoding.geo.census.gov`) — free, no API key, handles DC intersections well
- **LLM fallback**: When deterministic parsing can't produce a geocodable address, OpenAI gpt-4o-mini (via Val Town `std/openai`) rewrites the location string before a second geocode attempt
- **Email parsing**: Location is extracted from the pattern …
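The geocode-then-LLM-fallback flow described in the Code Standards can be sketched as follows. This is a minimal sketch, not the project's actual code: `geocodeWithFallback`, `Geocode`, and `Rewrite` are hypothetical names, and the geocoder and LLM rewriter are passed in as plain functions so the control flow is visible.

```typescript
// Two-pass geocoding: try the deterministic geocoder first; only on failure
// ask the LLM to rewrite the location string, then geocode once more.
// Both dependencies are stand-ins for the Census geocoder and gpt-4o-mini.
type Geocode = (location: string) => Promise<{ lat: number; lon: number } | null>;
type Rewrite = (location: string) => Promise<string>;

async function geocodeWithFallback(
  location: string,
  geocode: Geocode,
  llmRewrite: Rewrite,
): Promise<{ lat: number; lon: number } | null> {
  const first = await geocode(location);
  if (first) return first; // deterministic parse succeeded; no LLM call needed
  const rewritten = await llmRewrite(location); // e.g. gpt-4o-mini cleans up the string
  return await geocode(rewritten); // second (final) geocode attempt
}
```

Injecting the two functions keeps the expensive LLM call strictly on the failure path and makes the flow easy to exercise with stubs.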