Search
criticEmail
@paulkinlan
An interactive, runnable TypeScript val by paulkinlan
Email
import { email } from "https://esm.town/v/std/email";
import { extractValInfo } from "https://esm.town/v/stevekrouse/extractValInfo";
import { OpenAI } from "npm:openai";
// Strip a leading ```html fence and a trailing ``` fence from a model response.
function stripHtmlBackticks(html: string): string {
  return html.replace(/^```html\n?/, "").replace(/\n?```$/, "");
}
export default async function(e: Email) {
const openai = new OpenAI();
console.log(`from: ${e.from} to: ${e.to} subject: ${e.subject}, cc: ${e.cc}, bcc: ${e.bcc}`);
let { from, to, subject } = e;

emailSummaryHandler
@stevekrouse
Email Summary Service This val is an email handler that replies to emails it receives with an LLM-generated summary. To use, forward an email to paulkinlan.emailSummaryHandler@valtown.email. Blog post: https://paul.kinlan.me/projects/email-summary-service/
Email
import { email } from "https://esm.town/v/std/email";
import { OpenAI } from "https://esm.town/v/std/openai";
import { extractValInfo } from "https://esm.town/v/stevekrouse/extractValInfo";
console.log(e.from, e.subject);
const openai = new OpenAI();
const summary = await openai.chat.completions.create({
messages: [
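The excerpt cuts off inside the messages array. A minimal sketch of how the full handler could look, assuming the prompt wording, model choice, and reply plumbing (the actual val may differ):

```ts
import { email } from "https://esm.town/v/std/email";
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(e: Email) {
  console.log(e.from, e.subject);
  const openai = new OpenAI();
  const summary = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumed model
    messages: [
      { role: "system", content: "Summarize the following email in a few bullet points." }, // assumed prompt
      { role: "user", content: `Subject: ${e.subject}\n\n${e.text ?? ""}` },
    ],
  });
  // Reply with the generated summary (delivery details assumed; std/email sends to the val owner by default).
  await email({
    subject: `Summary: ${e.subject}`,
    text: summary.choices[0].message.content ?? "",
  });
}
```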

yc_finder
@stevekrouse
YC Company Finder In an effort to get more companies using Val Town, I wanted to see which of our existing users were part of a YC company. I found this wonderful Google Sheet with all YC companies, and had Townie make me this interface to join it with an exported CSV of our users table. All the compute happens client-side, so it's safe for you to run on your customer lists too. If you want any changes, fork it & have Townie customize it for you. Feel free to send me a pull request or leave a comment :)
HTTP
brian@airbnb.com,Brian Chesky
drew@dropbox.com,Drew Houston
sam@openai.com,Sam Altman
tim@apple.com,Tim Cook
jeff@amazon.com,Jeff Bezos
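The description says all the matching happens client-side: an exported users CSV (like the sample rows above) is joined against the YC companies sheet. A rough sketch of that kind of client-side join on email domain; the field names and matching rule are assumptions, not the val's actual logic:

```ts
// Hypothetical row shapes: a user from the exported CSV, a company from the YC sheet.
interface UserRow { email: string; name: string; }
interface CompanyRow { name: string; website: string; }

// Join users to YC companies by comparing the email domain to the company website's hostname.
function matchUsersToCompanies(users: UserRow[], companies: CompanyRow[]) {
  const byDomain = new Map<string, CompanyRow>();
  for (const c of companies) {
    try {
      const host = new URL(c.website).hostname.replace(/^www\./, "");
      byDomain.set(host, c);
    } catch { /* skip rows without a parsable website */ }
  }
  return users.flatMap((u) => {
    const domain = u.email.split("@")[1]?.toLowerCase();
    const company = domain ? byDomain.get(domain) : undefined;
    return company ? [{ user: u, company }] : [];
  });
}
```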
Storyweaver
@aioe0x417a
@jsxImportSource https://esm.sh/react
HTTP
new Uint8Array(arrayBuffer).reduce((data, byte) => data + String.fromCharCode(byte), '')
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
try {
const imageAnalysis = await withTimeout(openai.chat.completions.create({
model: "gpt-4o",
? `Previous story summary: ${previousStory[previousStory.length - 1].content}`
const story = await withTimeout(openai.chat.completions.create({
model: "gpt-4o-mini",

cabinAdjacentTweets
@jonbo
Scans tweets and then uses an LLM to decide whether and where to send them. Forked from https://www.val.town/v/stevekrouse/twitterAlert
Cron
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/std/sqlite";
"Jay_Pitter",
const openai = new OpenAI();
async function isTweetProcessed(tweetId: string): Promise<boolean> {
const completion = await retryWithBackoff(() =>
openai.chat.completions.create({
model: "gpt-4o-mini",
emailSummaryHandler
@dazzag24
Email Summary Service This val is an email handler that replies to emails it receives with an LLM-generated summary. To use, forward an email to paulkinlan.emailSummaryHandler@valtown.email. Blog post: https://paul.kinlan.me/projects/email-summary-service/
Email
import { email } from "https://esm.town/v/std/email";
import { OpenAI } from "https://esm.town/v/std/openai";
import { extractValInfo } from "https://esm.town/v/stevekrouse/extractValInfo";
console.log(e.from, e.subject);
const openai = new OpenAI();
const summary = await openai.chat.completions.create({
messages: [
habitTrackerApp
@sdevanair
@jsxImportSource https://esm.sh/react@18.2.0
HTTP
try {
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
const motivationPrompt = `Generate a highly personalized, inspiring message for someone who has ${userProgress.totalHabits} habits, ${userProgress.completedHabits} completed habits, and an average streak of ${userProgress.averageStreak} days. Make it motivational and specific.`;
const response = await openai.chat.completions.create({
model: "gpt-4o-mini",
multilingualchatroom
@trob
A simple chat room to share with friends of other languages, where each user can talk in their own language and see others' messages in their own language.
Click and hold a translated message to see the original message.
Open the app in a new window to start your own, unique chatroom that you can share with your friends via the room URL.
TODO:
BUG: Fix the issue that keeps old usernames in the "[User] is typing" section after a user changes their name.
BUG: Username editing with backspace is glitchy.
UI: Update the title for each unique chatroom to make the difference clear.
UI: Make the app mobile friendly.
Feature: Let the message receiver select a confusing part of a translation, so the author sees those words highlighted and can reword the message or...
Feature: Bump a translation to a higher LLM for more accurate translation.
Feature: Use prior chat context for more accurate translations.
Feature: Add a video feed for non-verbals while chatting.
HTTP
const { blob } = await import("https://esm.town/v/std/blob");
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
// Simple rate limiting
try {
const completion = await openai.chat.completions.create({
messages: [
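The description explains that each message is shown to every user in their own language, with the original available on press-and-hold. A hedged sketch of the per-recipient translation call such a room might make; the model, prompt, and return shape are assumptions:

```ts
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();

// Translate one chat message into a recipient's language; the original text is kept
// alongside the translation so the client can reveal it on press-and-hold.
async function translateMessage(text: string, targetLanguage: string) {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumed model
    messages: [
      {
        role: "system",
        content: `Translate the user's message into ${targetLanguage}. Reply with only the translation.`,
      },
      { role: "user", content: text },
    ],
  });
  return {
    original: text,
    translated: completion.choices[0].message.content ?? text,
  };
}
```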
twitterAlert
@eseidel
Twitter/𝕏 keyword alerts
Custom notifications for when you, your company, or anything you care about is mentioned on Twitter. If you believe in Twitter/𝕏-driven development, you want to get notified when anyone is talking about your tech, even if they're not tagging you.
To get this Twitter Alert bot running for you, fork this val and modify the query and where the notification gets delivered.
1. Query: Change the keywords you want to get notified for and the excludes for what you don't want to get notified for. You can use Twitter's search operators to customize your query: match some collection of keywords, filter out others, and much more!
2. Notification: Below I'm sending these mentions to a public channel in our company Discord, but you can customize that to whatever you want: @std/email, Slack, Telegram, whatever.
Twitter Data & Limitations: The Twitter API has become unusable. This val gets Twitter data via SocialData, an affordable Twitter scraping API. To make this val easy for you to fork & use without signing up for another API, I am proxying SocialData via @stevekrouse/socialDataProxy. Val Town Pro users can call this proxy 100 times per day, so be sure not to set this cron to run more than once every 15 minutes. If you want to run it more, get your own SocialData API token and pay for it directly.
Cron
import { zodResponseFormat } from "https://esm.sh/openai/helpers/zod";
import { z } from "https://esm.sh/zod";
import { OpenAI } from "https://esm.town/v/std/openai";
import { discordWebhook } from "https://esm.town/v/stevekrouse/discordWebhook";
.join(" OR ") + " " + excludes;
const openai = new OpenAI();
const RelevanceSchema = z.object({
try {
const completion = await openai.beta.chat.completions.parse({
model: "gpt-4o-mini",
} catch (error) {
console.error("Error parsing OpenAI response:", error);
return { isRelevant: false, confidence: 0, reason: "Error in processing" };
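The excerpt shows a zod schema parsed with the SDK's structured-output helper. A sketch of the relevance check assembled from those fragments; the schema fields come from the fallback object above, while the prompt wording is an assumption:

```ts
import { zodResponseFormat } from "https://esm.sh/openai/helpers/zod";
import { z } from "https://esm.sh/zod";
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

// Field names taken from the fallback object in the excerpt above.
const RelevanceSchema = z.object({
  isRelevant: z.boolean(),
  confidence: z.number(),
  reason: z.string(),
});

// Ask the model whether a tweet is actually about the tracked keywords (prompt assumed).
async function checkRelevance(tweetText: string) {
  try {
    const completion = await openai.beta.chat.completions.parse({
      model: "gpt-4o-mini",
      messages: [
        { role: "system", content: "Decide whether this tweet is relevant to the tracked keywords." },
        { role: "user", content: tweetText },
      ],
      response_format: zodResponseFormat(RelevanceSchema, "relevance"),
    });
    return completion.choices[0].message.parsed;
  } catch (error) {
    console.error("Error parsing OpenAI response:", error);
    return { isRelevant: false, confidence: 0, reason: "Error in processing" };
  }
}
```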
contentTemplateApp
@awhitter
ok
HTTP
if (req.method === 'POST') {
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
try {
Full Content: ${fullContent}`;
const completion = await openai.chat.completions.create({
messages: [{ role: "user", content: prompt }],
add_to_habitify_from_todoist_w_ai
@nerdymomocat
// if you added new items to habitify after running this script
Cron
import Jimp from "npm:jimp";
import OpenAI from "npm:openai";
import { z } from "npm:zod";
const HABITIFY_API_KEY = process.env.HABITIFY_API_KEY;
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const DEF_TIMEZONE = "America/Los_Angeles"; // Get your timezone from here: https://stackoverflow.com/a/54500197
const todoistapi = new TodoistApi(TODOIST_API_KEY);
const oai = new OpenAI({
apiKey: OPENAI_API_KEY ?? undefined,
const client = Instructor({
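The excerpt stops mid-way through the Instructor client setup. A hedged sketch of how that wrapper is typically wired up, following instructor-js's documented pattern; the schema, mode, and task text here are made up for illustration and are not the val's actual code:

```ts
import OpenAI from "npm:openai";
import Instructor from "npm:@instructor-ai/instructor";
import { z } from "npm:zod";

const oai = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

// Wrap the OpenAI client so completions are validated against a zod schema.
const client = Instructor({ client: oai, mode: "TOOLS" });

// Hypothetical schema for mapping a Todoist task onto a Habitify habit log.
const HabitLogSchema = z.object({
  habitName: z.string(),
  quantity: z.number().optional(),
  date: z.string(), // ISO date, assumed
});

const habitLog = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Ran 5km this morning (example Todoist task)" }],
  response_model: { schema: HabitLogSchema, name: "HabitLog" },
});
```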
IndirectionAPI
@jxnblk
// Test without using this?
Script
const tabooResponse = await fetchOpenAIText(tabooPrompt);
return fetchOpenAIStream(cluePrompt);
async function fetchOpenAIText(content: string): Promise<string> {
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
export async function fetchOpenAIStream(content: string): Promise<Response> {
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
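The streaming variant is only partially visible above. A sketch of what fetchOpenAIStream could look like, assuming the @std/openai client accepts the usual SDK options including stream: true; the model choice and plain-text re-encoding are assumptions, not the val's actual code:

```ts
export async function fetchOpenAIStream(content: string): Promise<Response> {
  const { OpenAI } = await import("https://esm.town/v/std/openai");
  const openai = new OpenAI();
  // Assumes streaming works like the upstream OpenAI SDK.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumed model
    messages: [{ role: "user", content }],
    stream: true,
  });
  // Re-emit the token deltas as a plain-text HTTP stream.
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of completion) {
        controller.enqueue(encoder.encode(chunk.choices[0]?.delta?.content ?? ""));
      }
      controller.close();
    },
  });
  return new Response(body, { headers: { "Content-Type": "text/plain; charset=utf-8" } });
}
```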

cron_client_side_script_fork
@stevekrouse
CronGPT This is a minisite to help you create cron expressions, particularly for crons on Val Town. It was inspired by Cron Prompt, but also does the timezone conversion from wherever you are to UTC (typically the server timezone).
Tech:
Hono for routing (GET / and POST /compile)
Hono JSX
HTML (probably overcomplicates things; should remove)
@stevekrouse/openai, which is a light wrapper around @std/openai
HTTP
* HTML (probably overcomplicates things; should remove)
* @stevekrouse/openai, which is a light wrapper around @std/openai
/** @jsxImportSource npm:hono@3/jsx */
import { chat } from "https://esm.town/v/stevekrouse/openai";
import cronstrue from "npm:cronstrue";
import { Hono } from "npm:hono@3";
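The README lists the routes but the excerpt stops at the imports. A minimal sketch of the POST /compile handler, using @std/openai directly instead of the @stevekrouse/openai chat wrapper the val imports; the request shape, prompt, and model are assumptions:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
import cronstrue from "npm:cronstrue";
import { Hono } from "npm:hono@3";

const app = new Hono();

// POST /compile: turn a natural-language schedule (already shifted to UTC client-side)
// into a cron expression, then describe it back with cronstrue.
app.post("/compile", async (c) => {
  const { prompt } = await c.req.json(); // request shape assumed
  const openai = new OpenAI();
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumed model
    messages: [
      { role: "system", content: "Reply with only a standard 5-field cron expression." },
      { role: "user", content: prompt },
    ],
  });
  const cron = (completion.choices[0].message.content ?? "").trim();
  return c.json({ cron, description: cronstrue.toString(cron) });
});

export default app.fetch;
```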