BreathingInstructional
@nemonicui
An instructional val that guides you through a breathing exercise to help with relaxation when you're under stress.
HTTP
"Repeat for 5-10 cycles"
function App() {
const [selectedTechnique, setSelectedTechnique] = useState(null);
cursor: 'pointer'
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") { client(); }
export default async function server(request: Request): Promise<Response> {
return new Response(`
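The preview above suggests the UI cycles through inhale/hold/exhale prompts. A hypothetical helper (not from this val) that maps elapsed time to a box-breathing phase, assuming a 4-4-4-4 second pattern:

```typescript
// Hypothetical helper: maps elapsed seconds to a box-breathing phase.
// The 4-4-4-4 second pattern is an assumption, not taken from the val.
type Phase = "inhale" | "hold" | "exhale" | "rest";

function breathingPhase(elapsedSeconds: number): Phase {
  const t = elapsedSeconds % 16; // one full 16-second cycle
  if (t < 4) return "inhale";
  if (t < 8) return "hold";
  if (t < 12) return "exhale";
  return "rest";
}
```

A UI like the one previewed could call this from a `setInterval` tick to decide which prompt to display.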
inspiringTomatoFlea
@abhinavtavildar
// Fetches a random joke.
Script
// ... imports ...
// Fetches a random joke.
async function fetchRandomJoke() {
  const response = await fetch(
    "https://official-joke-api.appspot.com/random_joke",
  );
  return response.json();
}
notify
@adjectiveallison
An interactive, runnable TypeScript val by adjectiveallison
HTTP
import { fetch } from "https://esm.town/v/std/fetch";
import process from "node:process";
export async function notify(request: Request) {
if (request.method === "OPTIONS") {
return new Response("", {
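The truncated OPTIONS branch above typically answers CORS preflight requests. A minimal sketch of the headers such a branch might return (assumed for illustration; the val's actual headers are not shown):

```typescript
// Hypothetical CORS preflight headers; the actual val may differ.
function corsHeaders(): Record<string, string> {
  return {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
  };
}
```

A handler would then return `new Response("", { headers: corsHeaders() })` from its OPTIONS branch.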
valle_tmp_071643302414527117543908170364975
@janpaul123
// This val will respond to any request with an HTML "Hello, world!" message with some fun CSS styles
HTTP
// This val will respond to any request with an HTML "Hello, world!" message with some fun CSS styles
export default async function(req: Request): Promise<Response> {
const html = `
<html>
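The preview cuts off at the opening tag. A complete val following the same pattern might look like this (a sketch; the markup and styles are illustrative, not the original's):

```typescript
// Sketch of an HTTP val returning styled HTML, following the pattern
// in the preview above. The body and styles are stand-ins.
export default async function helloWorld(req: Request): Promise<Response> {
  const html = `<html>
  <body style="font-family: sans-serif; color: tomato;">
    <h1>Hello, world!</h1>
  </body>
</html>`;
  return new Response(html, {
    headers: { "Content-Type": "text/html" },
  });
}
```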
redFerret
@maxm
An interactive, runnable TypeScript val by maxm
HTTP
export default async function (req: Request): Promise<Response> {
  return Response.json({ ok: true });
}
feedback
@lawrencewu
An interactive, runnable TypeScript val by lawrencewu
Script
export function feedback(content: string) {
  console.email("Feedback: " + content);
  return "ok";
}
knowledgeExplorer
@sharanbabu
* This val creates a modern, stylish knowledge explorer using the Cerebras LLM API.
* It allows users to enter a topic or select from suggestions, displays information in a centered card,
* and enables exploration of related topics or deeper dives using arrow keys or buttons.
HTTP
const SUGGESTIONS = ["Quantum Computing", "Renaissance Art", "Climate Change", "Artificial Intelligence", "Space Exploration"];
function App() {
const [topic, setTopic] = useState("");
</div>
function client() {
createRoot(document.getElementById("root")).render(<App />);
client();
async function server(request: Request): Promise<Response> {
if (request.method === "POST" && new URL(request.url).pathname === "/generate") {
} catch (error) {
console.error("Error calling Cerebras API:", error);
return new Response(JSON.stringify({ error: "Failed to generate content" }), {
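The truncated error path above can be sketched as a small helper (the 500 status code and headers are assumptions; the val's actual values aren't shown):

```typescript
// Hypothetical helper mirroring the error path in the preview:
// return a JSON error body with an explicit status code.
function errorResponse(message: string): Response {
  return new Response(JSON.stringify({ error: message }), {
    status: 500,
    headers: { "Content-Type": "application/json" },
  });
}
```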
prosperousYellowMule
@gokulnpc
Cerebras Inference template This val shows you how you can deploy an app using Cerebras Inference on Val Town
in seconds. What is Cerebras? Cerebras is an American chip manufacturer that
produces large wafer chips that deliver mind-blowing LLM inference speeds.
As of this writing on Jan 17, 2025, Cerebras Inference
provides Llama 3.1 8b, 3.1 70b, and 3.3 70b at a jaw-dropping 2k tokens per second –
that's 50x faster than what the frontier labs produce. Llama 3.3 70b at 2k tokens
per second is particularly noteworthy because it is a GPT-4-class model. This level
of intelligence at that level of speed will unlock whole new classes of applications.
Quick start
Set up Cerebras:
1. Sign up for Cerebras
2. Get a Cerebras API Key
3. Save it in a Val Town environment variable called CEREBRAS_API_KEY
Once Cerebras is set up in your Val Town account, there are two ways to get started:
1. Fork this app and customize it (or ask Townie AI to customize it)
2. Start a new chat with Townie AI and copy & paste the following instructions.
Use Cerebras for AI on the backend like so:
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: Deno.env.get("CEREBRAS_API_KEY"),
baseURL: "https://api.cerebras.ai/v1"
});
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
messages: [],
});
const generatedText = response.choices[0].message.content;
For example, the val in this template was created by asking
Townie AI to "Make a chatgpt clone", then I hit shift-enter twice,
pasted in the instructions on how to use Cerebras from above, and hit enter.
Townie built this app on its first try, in about 20 seconds.
Sample apps
Cerebras Searcher - a Perplexity clone that uses the SerpAPI to do RAG
and summaries with Cerebras (requires a SerpAPI key)
Cerebras Coder - an app that generates websites in a second with Cerebras
Cerebras Debater - an app that truly shows Cerebras's speed: it's Cerebras
talking to Cerebras in a debate
HTTP
Use Cerebras for AI on the backend like so:
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: Deno.env.get("CEREBRAS_API_KEY"),
return this.props.children;
function APIGeneratorApp() {
const [prompt, setPrompt] = useState("");
</div>
function App() {
return (
</ErrorBoundary>
function client() {
const rootElement = document.getElementById("root");
client();
export default async function server(request: Request): Promise<Response> {
if (request.method === "POST" && new URL(request.url).pathname === "/generate-api") {
const { OpenAI } = await import("https://esm.sh/openai");
try {
headers: { "Content-Type": "application/json" }
const client = new OpenAI({
apiKey: apiKey,
- Include error handling
- Demonstrate key functionality`;
const response = await client.chat.completions.create({
modestPurpleLouse
@gavin_nittoli_nu
// Fetches a random joke.
Script
import { email } from "https://esm.town/v/std/email?v=9";
// Fetches a random joke.
async function fetchRandomJoke() {
  const response = await fetch(
    "https://official-joke-api.appspot.com/random_joke",
  );
  return response.json();
}
sqliteUniverse
@postpostscript
sqliteUniverse: make queries against multiple vals or endpoints at the same time! Example: @postpostscript/sqliteUniverseExample
Todo:
[ ] tests‼️
[ ] update to support the following syntax: SELECT * FROM "@example/endpoint".someTable or SELECT * FROM "@example/endpoint:::someTable"
Script
batch,
export function sqliteUniverseWithOptions(options: SqliteUniverseOptions) {
return {
return batch(statements, options);
async function execute(
statement: InStatement,
return res;
async function batch(
statements: InStatement[],
return sqlite.batch(normalized);
async function createSqliteFromEndpointTables(
endpointTableMap: EndpointTableMap,
let sqlite = await interfaces.exact?.[endpoint];
if (sqlite instanceof Function) {
sqlite = await sqlite({ endpoint, tables });
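The `endpoint:::table` syntax proposed in the description could be split with a helper like this (hypothetical, not from the val):

```typescript
// Hypothetical parser for the "@example/endpoint:::someTable" syntax
// mentioned in the val's todo list; returns null when it doesn't match.
function parseQualifiedTable(
  name: string,
): { endpoint: string; table: string } | null {
  const match = name.match(/^@([^:]+\/[^:]+):::(.+)$/);
  if (!match) return null;
  return { endpoint: "@" + match[1], table: match[2] };
}
```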
qualityIvoryShrew
@ankitvaltown
@jsxImportSource https://esm.sh/react@18.2.0
HTTP
import { createRoot } from "https://esm.sh/react-dom@18.2.0/client";
function App() {
const [view, setView] = useState('login');
</div>
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") { client(); }
export default async function server(request: Request): Promise<Response> {
const { sqlite } = await import("https://esm.town/v/stevekrouse/sqlite");
formLogic
@iamseeley
// Adding event listeners after the window loads
Script
const promptKey = 'imagePromptValue';
export function updateForm() {
const model = document.getElementById('model').value;
document.getElementById('prompt').value = savedPrompt;
export async function handleFormSubmit(event) {
event.preventDefault();
handleImageResponse(data, imageSize, model);
function handleImageResponse(data, imageSize, model) {
const resultDiv = document.getElementById('result');
resultDiv.appendChild(errorElement);
export function handleReset() {
const resultDiv = document.getElementById('result');

API_URL
@pomdtr
Val Town API URL
When Val Town code runs on Val Town servers, we use a local URL so we can save time by skipping a roundtrip to the public internet. However, if you want to run vals that use our API (i.e. std library vals) locally, you'll want to use our public API's URL, https://api.val.town. We recommend importing and using std/API_URL whenever you use our API so that you are always using the most efficient route.
Example Usage
import { API_URL } from "https://esm.town/v/std/API_URL";
const response = await fetch(`${API_URL}/v1/me`, {
headers: {
Authorization: `Bearer ${Deno.env.get("valtown")}`,
Accept: "application/json",
},
});
const data = await response.json();
console.log(data)
Script
function envOrUndefined(key: string): string | undefined {
  // try/catch prevents crashes if the script doesn't have env access
  try {
    return Deno.env.get(key);
  } catch {}
}
export function getApiUrl(): string {
  return envOrUndefined("VALTOWN_API_URL") ?? "https://api.val.town";
}
export const API_URL = getApiUrl();
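The fallback behavior can be illustrated with a pure version of the same lookup (a hypothetical helper, for illustration only; the real val reads `Deno.env`):

```typescript
// Pure re-statement of the fallback: use the override when present,
// otherwise the public API URL. For illustration, not from the val.
function getApiUrlFrom(env: Record<string, string>): string {
  return env["VALTOWN_API_URL"] ?? "https://api.val.town";
}
```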
GDI_HelloWorldService
@rozek
This val is part of a series of examples to introduce "val.town" in my computer science course at
Stuttgart University of Applied Sciences. The idea is to motivate even first-semester students not to wait, but to put their
ideas into practice from the very beginning and implement web apps with a
frontend and backend. It contains a very simple HTTP endpoint responding with a static "Hello, World!".
To make it less boring, the response is rendered as ASCII art. In order to use it, send a request similar to the following: https://rozek-gdi_helloworldservice.web.val.run/ The code was created using Townie - with only a few small manual corrections. This val is licensed under the MIT License.
HTTP
export default async function (req: Request): Promise<Response> {
const asciiArt = `
| _ | __/ | | (_) | \\ V V / (_) | | | | (_| |_|
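The preview truncates the banner. A complete val following the same pattern could look like this (a sketch; the banner below is a stand-in, not the val's actual ASCII art, and the plain-text content type is an assumption):

```typescript
// Sketch of an HTTP val serving an ASCII-art banner as plain text,
// following the pattern in the preview. The art is a stand-in.
export default async function helloWorldService(
  req: Request,
): Promise<Response> {
  const asciiArt = `
 _   _ _
| | | (_)
| |_| | |
|  _  | |
|_| |_|_|
`;
  return new Response(asciiArt, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```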