Search

Results include substring matches and semantically similar vals.
poisedOrangeJellyfish
@jeffreyyoung
An interactive, runnable TypeScript val by jeffreyyoung
HTTP
Yes! The stocks are going up.
console.log(EventSource);
async function* forward(query: QueryRequest, botName: string, accessKey: string) {
console.log(query, botName, accessKey);
const req = new Request(`https://api.poe.com/bot/${botName}`, {
for await (const event of fetchEventSource(req)) {
console.log(event);
async function* fetchEventSource(request: Request) {
const response = await fetch(request);
console.log(response.status, response.statusText);
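The excerpt forwards a query to a Poe bot and consumes the reply as server-sent events through an async generator. Only part of the val's fetchEventSource is shown above; the following is a minimal sketch of what such a helper could look like, assuming a standard text/event-stream body and parsing only the event: and data: fields. It is a guess at a workable shape, not the val's actual implementation.
// Sketch of an SSE reader: yields { event, data } objects from a streaming response.
async function* fetchEventSource(request: Request) {
  const response = await fetch(request);
  const reader = response.body!.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += value;
    // SSE events are separated by a blank line.
    const chunks = buffer.split("\n\n");
    buffer = chunks.pop() ?? "";
    for (const chunk of chunks) {
      let event = "message";
      let data = "";
      for (const line of chunk.split("\n")) {
        if (line.startsWith("event:")) event = line.slice("event:".length).trim();
        else if (line.startsWith("data:")) data += line.slice("data:".length).trim();
      }
      yield { event, data };
    }
  }
}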
browserlessPuppeteerExample
@nisargio
An interactive, runnable TypeScript val by nisargio
Script
browserWSEndpoint: `wss://chrome.browserless.io?token=${process.env.browserlessKey}`,
const page = await browser.newPage();
await page.goto("https://en.wikipedia.org/wiki/OpenAI");
const intro = await page.evaluate(
`document.querySelector('p:nth-of-type(2)').innerText`,
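The fragments above connect Puppeteer to a hosted Chrome at browserless.io and scrape a paragraph from Wikipedia. A self-contained sketch of the same pattern might look like this; the import specifier and the use of Deno.env (instead of the excerpt's process.env) are assumptions, and browser.close() is an addition.
// Sketch: drive a remote browserless.io Chrome over WebSocket with Puppeteer.
import puppeteer from "npm:puppeteer-core";

const browser = await puppeteer.connect({
  browserWSEndpoint: `wss://chrome.browserless.io?token=${Deno.env.get("browserlessKey")}`,
});
const page = await browser.newPage();
await page.goto("https://en.wikipedia.org/wiki/OpenAI");

// Pull the article's second paragraph as plain text.
const intro = await page.evaluate(
  `document.querySelector('p:nth-of-type(2)').innerText`,
);
console.log(intro);

await browser.close();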
tryingciceroagain
@arthrod
@jsxImportSource https://esm.sh/react
HTTP
transition: opacity 2s ease, background-color 0.5s ease;
function AnimatedHeadline() {
const [animationStarted, setAnimationStarted] = useState(false);
transition: width 2.5s linear;
function SubheadlineAnimation() {
const [animationStarted, setAnimationStarted] = useState(false);
opacity: 0.8;
function App() {
const [isNavOpen, setIsNavOpen] = React.useState(false);
</Footer>
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") { client(); }
export default async function server(request: Request): Promise<Response> {
return new Response(
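These fragments follow the common single-file Val Town pattern: the same module renders the React app in the browser and serves the page over HTTP. Below is a minimal, hedged sketch of that shape; the markup is illustrative, not this val's actual UI.
/** @jsxImportSource https://esm.sh/react */
import { createRoot } from "https://esm.sh/react-dom/client";

function App() {
  return <h1>Hello from a single-file val</h1>;
}

// Runs only in the browser, where `document` exists.
function client() {
  createRoot(document.getElementById("root")!).render(<App />);
}
if (typeof document !== "undefined") { client(); }

// Runs on the server: return HTML that loads this same module as a script.
export default async function server(request: Request): Promise<Response> {
  return new Response(
    `<html><body><div id="root"></div>
      <script type="module" src="${import.meta.url}"></script>
    </body></html>`,
    { headers: { "content-type": "text/html" } },
  );
}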
myApi
@baiheinet
An interactive, runnable TypeScript val by baiheinet
Script
export function myApi(name) {
return "hi " + name;
valwriter
@janpaul123
[ ] streaming
[ ] send the code of the valwriter back to gpt (only if it's related, might need some threads, maybe a custom gpt would be a better fix, of course, could do it as a proxy...)
[ ] make it easy to send errors back to gpt
[ ] make it easy to get screenshots of the output back to gpt
HTTP
import { fetchText } from "https://esm.town/v/stevekrouse/fetchText";
import { chat } from "https://esm.town/v/stevekrouse/openai";
import cronstrue from "npm:cronstrue";
content: `/** @jsxImportSource npm:react */
export default function() {
return <h1>{new Date().toLocaleTimeString()}</h1>;
</html>,
export async function compile(description: string) {
const messages = [
await email({ subject: "Subject line", text: "Body of message" });
// OpenAI
import { OpenAI } from "https://esm.town/v/std/openai";
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
messages: [
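The OpenAI snippet in this excerpt is cut off; completed as a hedged sketch, it would look roughly like the following. The model name and prompt are placeholders, not values from the val.
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini", // placeholder model
  messages: [
    { role: "user", content: "Say hello in one short sentence." },
  ],
});
console.log(completion.choices[0].message.content);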
aqi
@stevekrouse
AQI Alerts: Get email alerts when AQI is unhealthy near you.
Set up:
1. Click Fork.
2. Change location (Line 4) to describe your location. It accepts fairly flexible English descriptions, which it turns into locations via nominatim's geocoder API.
3. Click Run.
Background: This val uses nominatim's geocoder to get your lat and lon, and air quality data from OpenAQ. It uses EPA's NowCast AQI Index calculation and severity levels. Learn more: https://www.val.town/v/stevekrouse.easyAQI
Cron
import { email } from "https://esm.town/v/std/email?v=9";
import { easyAQI } from "https://esm.town/v/stevekrouse/easyAQI?v=5";
export async function aqi(interval: Interval) {
const location = "downtown brooklyn"; // <-- change to place, city, or zip code
const data = await easyAQI({ location });
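Only the start of the cron handler is shown; a hedged completion of the flow the readme describes (geocode the location, fetch AQI, email when unhealthy) could look like the following. The shape of the data returned by easyAQI (aqi and severity fields) is an assumption based on the description, not confirmed from its source.
import { email } from "https://esm.town/v/std/email?v=9";
import { easyAQI } from "https://esm.town/v/stevekrouse/easyAQI?v=5";

export async function aqi(interval: Interval) {
  const location = "downtown brooklyn"; // <-- change to place, city, or zip code
  const data = await easyAQI({ location });
  // Only alert when the index is worse than "Good" (assumed severity labels).
  if (!data.severity.includes("Good")) {
    await email({
      subject: `AQI in ${location} is ${data.severity}`,
      text: `AQI is ${data.aqi}. Learn more: https://www.val.town/v/stevekrouse.easyAQI`,
    });
  }
}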
web_O5apQ7aQOz
@dhvanil
An interactive, runnable TypeScript val by dhvanil
HTTP
export async function web_O5apQ7aQOz(req) {
return new Response(`<!DOCTYPE html>
drops[x] = 1;
function draw() {
ctx.fillStyle = 'rgba(0, 0, 0, 0.05)';
// Reality test calculations
function runRealityTest() {
const pi = Math.PI;
anthropicProxy
@cephalization
THIS IS NO LONGER NECESSARY (see https://simonwillison.net/2024/Aug/23/anthropic-dangerous-direct-browser-access/). This val proxies Anthropic HTTP requests from a frontend client, like LangChain, so that you can use Anthropic APIs from the browser. Convert it to an HTTP val in order to use it (you may want to set up an ENV var / header to protect the endpoint with a secret key).
Script
import Anthropic from "npm:@anthropic-ai/sdk@0.24.3";
export default async function(req: Request): Promise<Response> {
if (req.method === "OPTIONS") {
return new Response(null, {
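The excerpt shows only the CORS preflight branch. A hedged sketch of the overall proxy idea, answering OPTIONS and forwarding everything else to the Anthropic API with a server-side key, might look like this; the header names, env var name, and forwarding details are assumptions, not the val's exact code.
export default async function (req: Request): Promise<Response> {
  const corsHeaders = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type, x-api-key, anthropic-version",
  };
  // Answer the browser's CORS preflight.
  if (req.method === "OPTIONS") {
    return new Response(null, { headers: corsHeaders });
  }

  // Forward the request body to the Anthropic Messages API with a server-side key.
  const upstream = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": Deno.env.get("ANTHROPIC_API_KEY") ?? "",
      "anthropic-version": "2023-06-01",
    },
    body: await req.text(),
  });
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { ...corsHeaders, "content-type": "application/json" },
  });
}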
chatSampleSystemRoleExpert
@webup
An interactive, runnable TypeScript val by webup
Script
## Initialization:
const prompt =
"张海立,驭势科技 UISEE 云脑研发总监。KubeSphere Ambassador,CNCF OpenFunction TOC Member,信通院“汽车云工作组”首批技术专家";
return await chat([
{ role: "system", content: system },
valle_tmp_54846529024345792066850117206065
@janpaul123
@jsxImportSource https://esm.sh/react
HTTP
import OpenAI from "npm:openai";
unless strictly necessary, for example use APIs that don't require a key, prefer internal
functions where possible. Unless specified, don't add error handling,
// The val should create a "export default async function main() {" which
// is the main function that gets executed, without any arguments. Don't return a Response object,
function write(text) {
function updateValName(valName) {
function saveVal() {
function openTab(tab) {
indexValsNeon
@janpaul123
Part of Val Town Semantic Search. Generates OpenAI embeddings for all public vals, and stores them in Neon, using the pg_vector extension.
- Create the vals_embeddings table in Neon if it doesn't already exist.
- Get all val names from the database of public vals, made by Achille Lacoin.
- Get all val names from the vals_embeddings table and compute the difference (which ones are missing).
- Iterate through all missing vals, get their code, get embeddings from OpenAI, and store the result in Neon.
- Can now be searched using janpaul123/semanticSearchNeon.
Cron
import { blob } from "https://esm.town/v/std/blob";
import OpenAI from "npm:openai";
import { truncateMessage } from "npm:openai-tokens";
// CREATE TABLE vals_embeddings (id TEXT PRIMARY KEY, embedding VECTOR(1536));
export default async function() {
const dimensions = 1536;
(await client.queryObject`SELECT id FROM vals_embeddings`).rows.map(row => row.id),
function idForVal(val: any): string {
return `${val.author_username}!!${val.name}!!${val.version}`;
newValsBatches.push(currentBatch);
const openai = new OpenAI();
for (const newValsBatch of newValsBatches) {
const code = getValCode(val);
const embedding = await openai.embeddings.create({
model: "text-embedding-3-small",
val_gFodtl1Mdm
@dhvanil
An interactive, runnable TypeScript val by dhvanil
HTTP
export async function val_gFodtl1Mdm(req) {
try {
// Execute the code directly and capture its result
// Attempt to exploit the rational number correction mechanism
const exploitRationalPrecision = () => {
// Create a function that forces rational interpretation
const forceRational = (num, denominator = 1000000) => {
// Convert any number to a rational approximation
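Guessing from the comments, forceRational approximates a float as an integer ratio over a fixed denominator; the sketch below is purely illustrative and not the val's actual code.
const forceRational = (num: number, denominator = 1000000) => {
  // Scale, round to the nearest integer numerator, and report the rounded value.
  const numerator = Math.round(num * denominator);
  return { numerator, denominator, value: numerator / denominator };
};

console.log(forceRational(Math.PI)); // { numerator: 3141593, denominator: 1000000, value: 3.141593 }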
myApi
@andrewn
An interactive, runnable TypeScript val by andrewn
Script
export function myApi() {
return { something: "is happening" };
poembuilder3
@stevekrouse
@jsxImportSource npm:hono@3/jsx
HTTP
/** @jsxImportSource npm:hono@3/jsx */
import { OpenAI } from "https://esm.town/v/std/openai?v=2";
import { sqlite } from "https://esm.town/v/std/sqlite?v=5";
import { Hono } from "npm:hono@3";
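The imports suggest the familiar Hono HTTP-val shape, where the app's fetch handler is the default export. A minimal hedged sketch of that shape follows; the route and markup are illustrative, not poembuilder3's actual code.
/** @jsxImportSource npm:hono@3/jsx */
import { Hono } from "npm:hono@3";

const app = new Hono();

// Single route returning server-rendered JSX.
app.get("/", (c) => c.html(<h1>Hello from a Hono HTTP val</h1>));

export default app.fetch;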
extract_opengraph_data
@cotr
An interactive, runnable TypeScript val by cotr
HTTP
import axios from "npm:axios";
import * as cheerio from "npm:cheerio";
export async function extractOpenGraphTags(request: Request) {
try {
const body = await request.json();
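The excerpt stops right after reading the request body. Below is a hedged sketch of the rest of the flow this val's name suggests: POST a JSON body with a url, fetch the page with axios, and collect its Open Graph meta tags with cheerio. The request and response shapes are assumptions, not the val's exact contract.
import axios from "npm:axios";
import * as cheerio from "npm:cheerio";

export async function extractOpenGraphTags(request: Request) {
  try {
    const body = await request.json();
    const { data: html } = await axios.get(body.url);
    const $ = cheerio.load(html);

    // Collect every <meta property="og:..."> tag into a plain object.
    const tags: Record<string, string> = {};
    $('meta[property^="og:"]').each((_, el) => {
      const property = $(el).attr("property");
      const content = $(el).attr("content");
      if (property && content) tags[property] = content;
    });

    return Response.json(tags);
  } catch (e) {
    return Response.json({ error: String(e) }, { status: 400 });
  }
}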