countGithubLOCUI
@g
GitHub Line Counter GitHub Line Counter, live on val.town. Ever wondered how many lines of code are in a GitHub repo without the hassle of cloning it? Say hello to GitHub Line Counter, your friendly, web-based LOC inspector! What It Does This simple tool fetches the GitHub repository as a ZIP file,
decompresses it on the fly (thanks to fflate),
and counts the lines of code on val.town
with incredible bandwidth and speed. Built with Vanilla JS on the frontend and
powered by val.town on the backend,
it's lightweight, fast, and ready to use! How to Use Just paste the repo URL, hit "Count Lines," and watch the magic happen!
HTTP
* This val creates a website to count lines of code in GitHub repositories.
* It uses a server-side API to fetch the data and a client-side UI for user interaction.
* It now includes sharing functionality and URL manipulation for better user experience.
import countGithubLOC from 'https://esm.town/v/g/countGithubLOC';
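The ZIP-and-count trick described above is small enough to sketch. A minimal version, assuming the default branch is `main` and treating every file in the archive as text (the real val's branch detection and file filtering may differ):

```ts
import { unzipSync } from "https://esm.sh/fflate";

// Fetch the repo archive from GitHub, unzip it in memory, and count lines.
async function countLines(owner: string, repo: string): Promise<number> {
  const res = await fetch(`https://github.com/${owner}/${repo}/archive/refs/heads/main.zip`);
  if (!res.ok) throw new Error(`GitHub returned ${res.status}`);
  const files = unzipSync(new Uint8Array(await res.arrayBuffer()));
  let total = 0;
  for (const [name, bytes] of Object.entries(files)) {
    if (name.endsWith("/")) continue; // skip directory entries
    total += new TextDecoder().decode(bytes).split("\n").length;
  }
  return total;
}
```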
SongTagger
@benlenarts
Spotify Playback A val that sets up some endpoints for Spotify play state.
HTTP
export let spotifyRequestToken = ({ client_id, client_secret, code, redirect_uri }) =>
"Authorization": "Basic " + (new Buffer(client_id + ":" + client_secret).toString("base64")),
export let spotifyRefreshToken = async ({ refresh_token, client_id, client_secret }) =>
"Authorization": "Basic " + (new Buffer(client_id + ":" + client_secret).toString("base64")),
`https://accounts.spotify.com/authorize?response_type=code&client_id=${
encodeURIComponent(Deno.env.get("SPOTIFY_CLIENT_ID"))
client_id: Deno.env.get("SPOTIFY_CLIENT_ID"),
client_secret: Deno.env.get("SPOTIFY_CLIENT_SECRET"),
client_id: Deno.env.get("SPOTIFY_CLIENT_ID"),
client_secret: Deno.env.get("SPOTIFY_CLIENT_SECRET"),
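The excerpts above build the Basic auth header with Node's `Buffer`, which is not a global in Deno. A sketch of the refresh-token call using `btoa` instead; the parameter names follow the excerpt, everything else is an assumption about the val's internals:

```ts
export async function spotifyRefreshToken(
  { refresh_token, client_id, client_secret }: Record<string, string>,
) {
  // Exchange a refresh token for a new access token at Spotify's token endpoint.
  const res = await fetch("https://accounts.spotify.com/api/token", {
    method: "POST",
    headers: {
      "Authorization": "Basic " + btoa(`${client_id}:${client_secret}`),
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: new URLSearchParams({ grant_type: "refresh_token", refresh_token }),
  });
  return res.json();
}
```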
startup_clicker
@jaredsilver
@jsxImportSource https://esm.sh/react
HTTP
/** @jsxImportSource https://esm.sh/react */
import React, { useState, useCallback, useEffect, useRef } from "https://esm.sh/react";
import { createRoot } from "https://esm.sh/react-dom/client";
function App() {
</div>
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") { client(); }
export default async function server(request: Request): Promise<Response> {
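The excerpt above follows the common Val Town single-file React pattern: one module that hydrates on the client and serves the HTML shell from its server export. A minimal sketch of that pattern (the counter UI is an illustrative stand-in, not the actual startup_clicker game):

```tsx
/** @jsxImportSource https://esm.sh/react */
import React, { useState } from "https://esm.sh/react";
import { createRoot } from "https://esm.sh/react-dom/client";

function App() {
  const [count, setCount] = useState(0);
  return <button onClick={() => setCount(count + 1)}>Clicks: {count}</button>;
}

// Runs only in the browser, where `document` exists.
function client() {
  createRoot(document.getElementById("root")!).render(<App />);
}
if (typeof document !== "undefined") { client(); }

// Runs on Val Town: serve an HTML shell that loads this same module as the client bundle.
export default async function server(request: Request): Promise<Response> {
  return new Response(
    `<html><body><div id="root"></div>
       <script type="module" src="${import.meta.url}"></script>
     </body></html>`,
    { headers: { "Content-Type": "text/html" } },
  );
}
```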
sqliteExplorerApp
@robaggio
SQLite Explorer View and interact with your Val Town SQLite data. It's based on Steve's excellent SQLite Admin val, adding the ability to run SQLite queries directly in the interface. This new version has a revised UI that's heavily inspired by LibSQL Studio by invisal. It is now more of an SPA, with tables, queries, and results showing up on the same page. Install Install the latest stable version (v86) by forking this val: Authentication Log in to your SQLite Explorer with password authentication, using your Val Town API Token as the password. Todos / Plans
- [ ] improve error handling
- [ ] improve table formatting
- [ ] sticky table headers
- [x] add codemirror
- [ ] add loading indication to the run button (initial version shipped)
- [ ] add ability to favorite queries
- [ ] add saving of last query run for a table (started)
- [ ] add visible output for non-query statements
- [ ] add schema viewing
- [ ] add refresh to table list sidebar after CREATE/DROP/ALTER statements
- [ ] add automatic execution of initial select query on double click
- [x] add views to the sidebar
- [ ] add triggers to sidebar
- [ ] add upload from SQL, CSV and JSON
- [ ] add ability to connect to a non-val town Turso database
- [x] fix wonky sidebar separator height problem (thanks to @stevekrouse)
- [x] make result tables scrollable
- [x] add export to CSV and JSON (CSV and JSON helper functions written in this val. Thanks to @pomdtr for merging the initial version!)
- [x] add listener for cmd+enter to submit query
HTTP
- [ ] add schema viewing
- [ ] add refresh to table list sidebar after `CREATE/DROP/ALTER` statements
- [ ] add automatic execution of initial select query on double click
- [x] add views to the sidebar
- [ ] add triggers to sidebar
- [ ] add upload from SQL, CSV and JSON
- [ ] add ability to connect to a non-val town Turso database
- [x] fix wonky sidebar separator height problem (thanks to @stevekrouse)
- [x] make result tables scrollable
return c.render(
<main class="sidebar-layout">
<div class="sidebar">
<TablesList tables={[...tables, ...views]}></TablesList>
<Separator direction="horizontal"></Separator>
<div class="not-sidebar">
<EditorSection />
sqliteExplorerApp_DEV
@nbbaier
SQLite Explorer (Dev Branch) View and interact with your Val Town SQLite data. It's based on Steve's excellent SQLite Admin val, adding the ability to run SQLite queries directly in the interface. This new version has a revised UI that's heavily inspired by LibSQL Studio by invisal. It is now more of an SPA, with tables, queries, and results showing up on the same page. Install Install the latest stable version (v66) by forking this val: Authentication Log in to your SQLite Explorer with password authentication, using your Val Town API Token as the password. Todos / Plans
- [ ] improve error handling
- [ ] improve table formatting
- [ ] sticky table headers
- [x] add codemirror
- [ ] add loading indication to the run button (initial version shipped)
- [ ] add ability to favorite queries
- [ ] add saving of last query run for a table (started)
- [ ] add visible output for non-query statements
- [ ] add schema viewing
- [ ] add refresh to table list sidebar after CREATE/DROP/ALTER statements
- [ ] add automatic execution of initial select query on double click
- [x] add views to the sidebar
- [ ] add triggers to sidebar
- [ ] add upload from SQL, CSV and JSON
- [ ] add ability to connect to a non-val town Turso database
- [x] fix wonky sidebar separator height problem (thanks to @stevekrouse)
- [x] make result tables scrollable
- [x] add export to CSV and JSON (CSV and JSON helper functions written in this val. Thanks to @pomdtr for merging the initial version!)
- [x] add listener for cmd+enter to submit query
HTTP
- [ ] add schema viewing
- [ ] add refresh to table list sidebar after `CREATE/DROP/ALTER` statements
- [ ] add automatic execution of initial select query on double click
- [x] add views to the sidebar
- [ ] add triggers to sidebar
- [ ] add upload from SQL, CSV and JSON
- [ ] add ability to connect to a non-val town Turso database
- [x] fix wonky sidebar separator height problem (thanks to @stevekrouse)
- [x] make result tables scrollable
return c.render(
<main class="sidebar-layout">
<div class="sidebar">
<TablesList tables={[...tables, ...views]}></TablesList>
<Separator direction="horizontal"></Separator>
<div class="not-sidebar">
<EditorSection />
PriestGPTClient
@mjweaver01
PriestGPT Client Client for https://www.val.town/v/mjweaver01/PriestGPT Ask him about books or verses in the Bible, and he will be sure to give you a short homily about it!
HTTP
# PriestGPT Client
Client for https://www.val.town/v/mjweaver01/PriestGPT
Ask him about books or verses in the Bible, and he will be sure to give you a short homily about it!
</h1>
<p class="mb-4">Client for the <a class="underline" href="https://www.val.town/v/mjweaver01/PriestGPT">PriestGPT API</a></p>
<p class="mb-4">Ask him about books or verses in the bible, and he will be sure to give you a short sermon about it!</p>
masterfulPeachHookworm
@stevekrouse
Cerebras Inference template This val shows you how you can deploy an app using Cerebras Inference on Val Town
in seconds. What is Cerebras? Cerebras is an American chip manufacturer that
produces large wafer chips that deliver mind-blowing LLM inference speeds.
As of this writing on Jan 17, 2025, Cerebras Inference
provides Llama 3.1 8B, 3.1 70B, and 3.3 70B at a jaw-dropping 2k tokens per second,
which is 50x faster than what the frontier labs produce. Llama 3.3 70B at 2k tokens
per second is particularly noteworthy because it is
a GPT-4-class model. This level
of intelligence at that level of speed will unlock whole new classes of applications. Quick start There are two ways to get started: (1) fork this app and customize it (or ask Townie AI to customize it), or (2) start a new chat with Townie AI and copy & paste
the following instructions: Use Cerebras for AI on the backend like so:
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: "YOUR_CEREBRAS_API_KEY",
baseURL: "https://api.cerebras.ai/v1"
});
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
messages: [],
});
const generatedText = response.choices[0].message.content; For example, the val in this template was created by asking
Townie AI to "Make a chatgpt clone",
then I hit shift-enter twice, and then pasted in the instructions on
how to use Cerebras from above, then hit enter. Townie built this app on its first try, in about 20 seconds. Sample apps Cerebras Searcher - a Perplexity clone that uses the SerpAPI to do RAG
and summaries with Cerebras (requires a SerpAPI key) Cerebras Coder - an app that
generates websites in a second with Cerebras Cerebras Debater - an
app that truly shows Cerebras's speed: it's Cerebras talking to Cerebras in a debate
HTTP
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: "YOUR_CEREBRAS_API_KEY",
baseURL: "https://api.cerebras.ai/v1"
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
/** @jsxImportSource https://esm.sh/react@18.2.0 */
import { createRoot } from "https://esm.sh/react-dom@18.2.0/client";
import React, { useState } from "https://esm.sh/react@18.2.0";
function App() {
</div>
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") {
client();
export default async function server(request: Request): Promise<Response> {
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: Deno.env.get("CEREBRAS_API_KEY"),
try {
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
generateframeImage
@michaelwschultz
Gathers information and returns an image of this val Why I'm using this val for my 3-color e-ink display run by a Raspberry Pi Zero W. The Pi runs a cron job that tells it
to fetch this URL twice a day and render it to the display. Works like a charm. Right now I'm not displaying much, but I'm going to keep iterating on what type of information I want to display. How I'll post more info on my setup here later if you want to try something similar. But I assume this workflow could
be used for lots of different projects that don't have a ton of compute, or where you don't want to learn how to
actually draw things to a screen with the e-ink display libraries.
HTTP
/** @jsxImportSource https://esm.sh/react */
import React from "https://esm.sh/react";
import { renderToString } from "https://esm.sh/react-dom/server";
async function getWeather(latitude: number, longitude: number) {
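A sketch of what a `getWeather` helper like the one above could look like. Open-Meteo is an assumed provider here (free, no API key); the original val may fetch its weather data elsewhere:

```ts
async function getWeather(latitude: number, longitude: number) {
  // Open-Meteo's forecast endpoint; current_weather=true returns the latest conditions.
  const url =
    `https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current_weather=true`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Weather request failed: ${res.status}`);
  const data = await res.json();
  return data.current_weather; // { temperature, windspeed, weathercode, ... }
}
```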
valle_tmp_1232975987134350426195802232196885
@janpaul123
@jsxImportSource https://esm.sh/react
HTTP
/** @jsxImportSource https://esm.sh/react */
import valleGetValsContextWindow from "https://esm.town/v/janpaul123/valleGetValsContextWindow";
import archiveVal from "https://esm.town/v/nbbaier/archiveVal?v=10";
import _ from "npm:lodash@4";
import OpenAI from "npm:openai";
import { renderToString } from "npm:react-dom/server";
// Set these to your own
const username = "janpaul123";
spotify
@ejfox
// Thank you for the feedback! You're right, we need to handle cases where playlist images might be missing.
HTTP
import { Hono } from "npm:hono@3";
const app = new Hono();
const CLIENT_ID = "605b681271294a05ba5a574f790ac409";
const REDIRECT_URI = "https://ejfox-spotify.web.val.run/callback";
app.get("/", (c) => {
<div class="bg-gray-800 p-10 rounded-xl shadow-2xl max-w-md w-full">
<h1 class="text-4xl font-bold mb-8 text-center text-green-400">Spotify Playlist Viewer</h1>
<a href="https://accounts.spotify.com/authorize?client_id=${CLIENT_ID}&response_type=token&redirect_uri=${encodeURIComponent(REDIRECT_URI)}&scope=playlist-read-private"
class="block w-full text-center bg-green-500 hover:bg-green-600 text-white font-semibold py-3 px-6 rounded-full transition duration-300 ease-in-out transform hover:scale-105 focus:outline-none focus:ring-2 focus:ring-green-400 focus:ring-opacity-50">
Login with Spotify
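After the implicit-grant login above, the viewer still has to call the Spotify Web API with the access token returned in the redirect hash. A sketch of that follow-up request (the 50-item limit is arbitrary; error handling is minimal):

```ts
async function getPlaylists(accessToken: string) {
  // Spotify Web API: list the current user's playlists.
  const res = await fetch("https://api.spotify.com/v1/me/playlists?limit=50", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Spotify API error: ${res.status}`);
  const { items } = await res.json();
  return items; // each item has name, images (possibly empty), tracks.total, ...
}
```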
logicalBeigeSkink
@tecidax272
@jsxImportSource https://esm.sh/react
HTTP
/** @jsxImportSource https://esm.sh/react **/
import { renderToString } from "https://esm.sh/react-dom@18/server";
export default (req: Request) => {
return new Response(
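The excerpt cuts off mid-return; a complete minimal version of that handler looks like this (the page content is an illustrative placeholder, and React is pinned to 18 to match the `react-dom@18` import):

```tsx
/** @jsxImportSource https://esm.sh/react@18 */
import { renderToString } from "https://esm.sh/react-dom@18/server";

export default (req: Request) => {
  // Render a JSX tree to a string and return it as an HTML response.
  return new Response(
    renderToString(<h1>Hello from Val Town</h1>),
    { headers: { "Content-Type": "text/html" } },
  );
};
```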
adeptCoralKite
@stevekrouse
Cerebras Inference template This val shows you how you can deploy an app using Cerebras Inference on Val Town
in seconds. What is Cerebras? Cerebras is an American chip manufacturer that
produces large wafer chips that deliver mind-blowing LLM inference speeds.
As of this writing on Jan 17, 2025, Cerebras Inference
provides Llama 3.1 8B, 3.1 70B, and 3.3 70B at a jaw-dropping 2k tokens per second,
which is 50x faster than what the frontier labs produce. Llama 3.3 70B at 2k tokens
per second is particularly noteworthy because it is
a GPT-4-class model. This level
of intelligence at that level of speed will unlock whole new classes of applications. Quick start There are two ways to get started: (1) fork this app and customize it (or ask Townie AI to customize it), or (2) start a new chat with Townie AI and copy & paste
the following instructions: Use Cerebras for AI on the backend like so:
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: "YOUR_CEREBRAS_API_KEY",
baseURL: "https://api.cerebras.ai/v1"
});
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
messages: [],
});
const generatedText = response.choices[0].message.content; For example, the val in this template was created by asking
Townie AI to "Make a chatgpt clone",
then I hit shift-enter twice, and then pasted in the instructions on
how to use Cerebras from above, then hit enter. Townie built this app on its first try, in about 20 seconds. Sample apps Cerebras Searcher - a Perplexity clone that uses the SerpAPI to do RAG
and summaries with Cerebras (requires a SerpAPI key) Cerebras Coder - an app that
generates websites in a second with Cerebras Cerebras Debater - an
app that truly shows Cerebras's speed: it's Cerebras talking to Cerebras in a debate
HTTP
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: "YOUR_CEREBRAS_API_KEY",
baseURL: "https://api.cerebras.ai/v1"
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
/** @jsxImportSource https://esm.sh/react@18.2.0 */
import { createRoot } from "https://esm.sh/react-dom@18.2.0/client";
import React, { useState } from "https://esm.sh/react@18.2.0";
function App() {
</div>
function client() {
createRoot(document.getElementById("root")).render(<App />);
if (typeof document !== "undefined") {
client();
export default async function server(request: Request): Promise<Response> {
const { OpenAI } = await import("https://esm.sh/openai");
const client = new OpenAI({
apiKey: Deno.env.get("CEREBRAS_API_KEY"),
try {
const response = await client.chat.completions.create({
model: "llama-3.3-70b",
looseTomatoCougar
@willthereader
@jsxImportSource https://esm.sh/react
HTTP
/** @jsxImportSource https://esm.sh/react */
import React, { useState } from "https://esm.sh/react";
import { createRoot } from "https://esm.sh/react-dom/client";
function App() {
getModelBuilder
@webup
An interactive, runnable TypeScript val by webup
Script
// Set up LangSmith tracer
const { Client } = await import("npm:langsmith");
const { LangChainTracer } = await import("npm:langchain/callbacks");
const client = new Client({
apiUrl: "https://api.smith.langchain.com",
apiKey: process.env.LANGSMITH,
const tracer = new LangChainTracer({ client });
const callbacks = options?.verbose !== false ? [tracer] : [];
aiImageGenerator
@jumptoai
@jsxImportSource https://esm.sh/react
Script
/** @jsxImportSource https://esm.sh/react */
import React, { useState } from "https://esm.sh/react";
import { createRoot } from "https://esm.sh/react-dom/client";
function ImageGenerator() {
const [error, setError] = useState<string | null>(null);
const generateImage = async (e: React.FormEvent) => {
e.preventDefault();
<script type="module">
import React from "https://esm.sh/react";
import { createRoot } from "https://esm.sh/react-dom/client";
function ImageGenerator() {
if (rootElement) {
createRoot(rootElement).render(React.createElement(ImageGenerator));
renderApp();
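A sketch of what the `generateImage` handler in this component might do: POST the prompt to a backend endpoint and display the returned image. The `/generate` route and its `{ url }` response shape are hypothetical, not taken from the val:

```tsx
/** @jsxImportSource https://esm.sh/react */
import React, { useState } from "https://esm.sh/react";

function ImageGenerator() {
  const [prompt, setPrompt] = useState("");
  const [imageUrl, setImageUrl] = useState<string | null>(null);
  const [error, setError] = useState<string | null>(null);

  const generateImage = async (e: React.FormEvent) => {
    e.preventDefault();
    setError(null);
    try {
      // Hypothetical backend route that turns a prompt into an image URL.
      const res = await fetch("/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ prompt }),
      });
      if (!res.ok) throw new Error(`Request failed: ${res.status}`);
      const { url } = await res.json();
      setImageUrl(url);
    } catch (err) {
      setError((err as Error).message);
    }
  };

  return (
    <form onSubmit={generateImage}>
      <input value={prompt} onChange={(e) => setPrompt(e.target.value)} placeholder="Describe an image" />
      <button type="submit">Generate</button>
      {error && <p>{error}</p>}
      {imageUrl && <img src={imageUrl} alt={prompt} />}
    </form>
  );
}
```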