Search

Results include substring matches and semantically similar vals.
aiSimpleGroq
@yawnxyz
// set Deno.env.get("GROQ_API_KEY")
Script
// console.log(await ai("tell me a joke in Spanish"))
console.log(await ai("tell me a reddit joke", {
provider: "anthropic",
model: "claude-3-haiku-20240307",
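Only the call sites are visible in the excerpt above. As a hedged usage sketch, assuming the `ai` helper is imported from a companion val and accepts a prompt plus optional `{ provider, model }` overrides (the import URL and the env-var requirements here are assumptions, not the val's documented API):

```ts
// Usage sketch only: the import URL below is a guess at the companion val,
// and the option names are inferred from the excerpt above.
import { ai } from "https://esm.town/v/yawnxyz/ai";

// Assumes GROQ_API_KEY is set for the default provider, and ANTHROPIC_API_KEY
// when overriding the provider to "anthropic".
console.log(await ai("tell me a joke in Spanish"));

console.log(await ai("tell me a reddit joke", {
  provider: "anthropic",
  model: "claude-3-haiku-20240307",
}));
```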
webpage_summarizer
@spinningideas
@jsxImportSource https://esm.sh/react@18.2.0
HTTP
const exampleUrls = [
"https://arxiv.org/abs/2411.07279",
"https://www.anthropic.com/news/github-copilot",
"https://huggingface.co/papers/2403.09629",
"https://arxiv.org/abs/2305.20050",
VALLErun
@arash2060
The actual code for VALL-E: https://www.val.town/v/janpaul123/VALLE
HTTP
import { sleep } from "https://esm.town/v/stevekrouse/sleep?v=1";
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";
} else {
vercelModel = anthropic(model);
// Anthropic only accepts system messages at the very start.
const systemRole = model.includes("gpt") ? "system" : "user";
maxTokens: 8192,
headers: { "anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15" },
let messages = [
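The excerpt above is the core of the val's provider-selection logic. As a rough, self-contained sketch of that pattern with the Vercel AI SDK (the model name and prompts are placeholders, and the real val streams its responses rather than calling generateText):

```ts
// Sketch of the pattern in the excerpt above, not the val's actual code.
// Requires ANTHROPIC_API_KEY (or OPENAI_API_KEY for GPT models) in the environment.
import { generateText } from "npm:ai";
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";

const model = "claude-3-5-sonnet-20240620"; // placeholder model name

// Route to the matching provider based on the model name.
const vercelModel = model.includes("gpt") ? openai(model) : anthropic(model);

// Anthropic only accepts system messages at the very start, so non-GPT
// models get the system prompt downgraded to a user message.
const systemRole = model.includes("gpt") ? "system" : "user";

const { text } = await generateText({
  model: vercelModel,
  maxTokens: 8192,
  // Beta header from the excerpt: raises Claude 3.5 Sonnet's output cap to 8192 tokens.
  headers: { "anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15" },
  messages: [
    { role: systemRole, content: "You generate Val Town vals." },
    { role: "user", content: "Write a val that responds with 'hello world'." },
  ],
});

console.log(text);
```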
excessPlumFrog
@trantion
VALL-E LLM code generation for vals! Make apps with a frontend, backend, and database. It's a bit of work to get this running, but it's worth it. Fork this val to your own profile. Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in tempValsParentFolderId. If you want to use OpenAI models, you need to set the OPENAI_API_KEY env var. If you want to use Anthropic models, you need to set the ANTHROPIC_API_KEY env var. Create a Val Town API token, open the browser preview of this val, and use the API token as the password to log in.
HTTP
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-v
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environ
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API tok
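Condensed into code, the setup steps above amount to a folder ID plus at least one provider key. A minimal sketch of those checks (the env var names come from the README; the folder ID is a placeholder):

```ts
// Sketch of the configuration the setup steps describe; not part of the val.
const tempValsParentFolderId = "<your-temp-vals-folder-id>"; // ID taken from the folder's URL

const hasOpenAI = Boolean(Deno.env.get("OPENAI_API_KEY"));
const hasAnthropic = Boolean(Deno.env.get("ANTHROPIC_API_KEY"));

if (!hasOpenAI && !hasAnthropic) {
  throw new Error(
    "Set OPENAI_API_KEY and/or ANTHROPIC_API_KEY in your val's environment variables.",
  );
}
```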
VALLE
@arash2060
VALL-E LLM code generation for vals! Make apps with a frontend, backend, and database. It's a bit of work to get this running, but it's worth it. Fork this val to your own profile. Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in tempValsParentFolderId. If you want to use OpenAI models, you need to set the OPENAI_API_KEY env var. If you want to use Anthropic models, you need to set the ANTHROPIC_API_KEY env var. Create a Val Town API token, open the browser preview of this val, and use the API token as the password to log in.
HTTP
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-v
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environ
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API tok
tenseRoseTiglon
@MichaelNollox
VALL-E LLM code generation for vals! Make apps with a frontend, backend, and database. It's a bit of work to get this running, but it's worth it. Fork this val to your own profile. Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in tempValsParentFolderId. If you want to use OpenAI models, you need to set the OPENAI_API_KEY env var. If you want to use Anthropic models, you need to set the ANTHROPIC_API_KEY env var. Create a Val Town API token, open the browser preview of this val, and use the API token as the password to log in.
HTTP
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-v
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environ
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API tok
VALLE
@oijoijcoiejoijce
VALL-E LLM code generation for vals! Make apps with a frontend, backend, and database. It's a bit of work to get this running, but it's worth it. Fork this val to your own profile. Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in tempValsParentFolderId. If you want to use OpenAI models, you need to set the OPENAI_API_KEY env var. If you want to use Anthropic models, you need to set the ANTHROPIC_API_KEY env var. Create a Val Town API token, open the browser preview of this val, and use the API token as the password to log in.
HTTP
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-v
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environ
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API tok
VALLE
@MichaelNollox
VALL-E LLM code generation for vals! Make apps with a frontend, backend, and database. It's a bit of work to get this running, but it's worth it. Fork this val to your own profile. Make a folder for the temporary vals that get generated, take the ID from the URL, and put it in tempValsParentFolderId. If you want to use OpenAI models, you need to set the OPENAI_API_KEY env var. If you want to use Anthropic models, you need to set the ANTHROPIC_API_KEY env var. Create a Val Town API token, open the browser preview of this val, and use the API token as the password to log in.
HTTP
* If you want to use OpenAI models you need to set the `OPENAI_API_KEY` [env var](https://www.val.town/settings/environment-v
* If you want to use Anthropic models you need to set the `ANTHROPIC_API_KEY` [env var](https://www.val.town/settings/environ
* Create a [Val Town API token](https://www.val.town/settings/api), open the browser preview of this val, and use the API tok
coverpatch
@yawnxyz
// import { covertest as test } from "../covertest/covertest.ts";
Script
const aiResponse = await ai(input,
provider: "anthropic",
model: "claude-3-sonnet-20240229",
* @param {Object} [options={}] - Options for the patching process.
* @param {string} [options.provider="anthropic"] - The AI provider to use.
* @param {string} [options.model="claude-3-5-sonnet-20240620"] - The AI model to use.
let {
provider = "anthropic",
model = "claude-3-5-sonnet-20240620",
* @param {Object} [options={}] - Options for the patching process.
* @param {string} [options.provider="anthropic"] - The AI provider to use.
* @param {string} [options.model="claude-3-5-sonnet-20240620"] - The AI model to use.
let {
provider = "anthropic",
model = "claude-3-5-sonnet-20240620",
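The JSDoc above documents an options object that defaults to Anthropic's Claude 3.5 Sonnet. Roughly, the destructuring pattern in the excerpt behaves like this sketch (the function name and return value are placeholders, not the val's real signature):

```ts
// Illustrative only: `applyPatch` is a made-up name; the option names and
// defaults are the only parts taken from the excerpt above.
async function applyPatch(
  input: string,
  options: { provider?: string; model?: string } = {},
) {
  const {
    provider = "anthropic",               // used when the caller omits a provider
    model = "claude-3-5-sonnet-20240620", // default model for that provider
  } = options;
  // The val forwards these to its `ai` helper, roughly:
  //   const aiResponse = await ai(input, { provider, model });
  return { provider, model };
}

// Defaults apply when options are omitted; either field can be overridden.
console.log(await applyPatch("some code to patch"));
console.log(await applyPatch("some code to patch", { model: "claude-3-sonnet-20240229" }));
```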
topHNThreadByHour
@elsif_maj
// set at Thu Nov 30 2023 14:22:53 GMT+0000 (Coordinated Universal Time)
Email
// set at Thu Nov 30 2023 14:22:53 GMT+0000 (Coordinated Universal Time)
ofoiling passenger vessel [video]","Top thread on Hackernews for 17:00 is: Anthropic announces Claude 2.1 – 200k context, les
fileToDataURL
@stevekrouse
File to Data URL

Helpers to convert files to base64 & base64-encoded data URLs, which are particularly helpful for sending images to LLMs like ChatGPT, Anthropic, and Google.

### ChatGPT

Live example:

import { fileToDataURL } from "https://esm.town/v/stevekrouse/fileToDataURL";

const dataURL = await fileToDataURL(file);
const response = await chat([
  {
    role: "system",
    content:
      `You are an nutritionist. Estimate the calories. We only need a VERY ROUGH estimate. Respond ONLY in a JSON array with values conforming to: {ingredient: string, calories: number}`,
  },
  {
    role: "user",
    content: [{
      type: "image_url",
      image_url: {
        url: dataURL,
      },
    }],
  },
], {
  model: "gpt-4o",
  max_tokens: 200,
});

### Anthropic

[Live example](https://www.val.town/v/stevekrouse/roastbyEmail?v=14#L9-34)

import { fileToBase64 } from "https://esm.town/v/stevekrouse/fileToDataURL";

const base64File = await fileToBase64(file);
let res = await anthropic.messages.create({
  model: "claude-3-5-sonnet-20240620",
  max_tokens: 1024,
  messages: [
    {
      "role": "user",
      "content": [
        {
          "type": "image",
          "source": {
            "type": "base64",
            "media_type": file.type,
            "data": base64File,
          },
        },
        {
          "type": "text",
          "text": `Write an HTML email that evokes this photo in the funniest way possible, with code fences.`,
        },
      ],
    },
  ],
});

### Google

Live example:

import { fileToBase64 } from "https://esm.town/v/stevekrouse/fileToDataURL";

const base64Image = await fileToBase64(image);
const result = await model.generateContent([
  "Write all the names and authors of these books in JSON format. The response should be a valid JSON array of objects, each with 'title' and 'author' properties.",
  {
    inlineData: {
      data: base64Image,
      mimeType: image.type,
    },
  },
]);
Script
which are particularly helpful for sending images to LLMs
like ChatGPT, Anthropic, and Google.
### ChatGPT
max_tokens: 200,
### Anthropic
[Live example](https://www.val.town/v/stevekrouse/roastbyEmail?v=14#L9-34)
const base64File = await fileToBase64(file);
let res = await anthropic.messages.create({
model: "claude-3-5-sonnet-20240620",
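The excerpts show how fileToDataURL and fileToBase64 are called but not how they are implemented. A plausible reimplementation with standard web APIs (a guess based on the helpers' names and described behavior, not the val's actual source) could look like:

```ts
// Hypothetical implementations inferred from the helpers' names and the
// README above; the real val may differ.
export async function fileToBase64(file: File | Blob): Promise<string> {
  const bytes = new Uint8Array(await file.arrayBuffer());
  let binary = "";
  for (const byte of bytes) binary += String.fromCharCode(byte);
  return btoa(binary);
}

export async function fileToDataURL(file: File | Blob): Promise<string> {
  const mime = file.type || "application/octet-stream";
  return `data:${mime};base64,${await fileToBase64(file)}`;
}
```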
fileToDataURL
@purplesquirrelmedia
File to Data URL

Helpers to convert files to base64 & base64-encoded data URLs, which are particularly helpful for sending images to LLMs like ChatGPT, Anthropic, and Google.

### ChatGPT

Live example:

import { fileToDataURL } from "https://esm.town/v/stevekrouse/fileToDataURL";

const dataURL = await fileToDataURL(file);
const response = await chat([
  {
    role: "system",
    content:
      `You are an nutritionist. Estimate the calories. We only need a VERY ROUGH estimate. Respond ONLY in a JSON array with values conforming to: {ingredient: string, calories: number}`,
  },
  {
    role: "user",
    content: [{
      type: "image_url",
      image_url: {
        url: dataURL,
      },
    }],
  },
], {
  model: "gpt-4o",
  max_tokens: 200,
});

### Anthropic

[Live example](https://www.val.town/v/stevekrouse/roastbyEmail?v=14#L9-34)

import { fileToBase64 } from "https://esm.town/v/stevekrouse/fileToDataURL";

const base64File = await fileToBase64(file);
let res = await anthropic.messages.create({
  model: "claude-3-5-sonnet-20240620",
  max_tokens: 1024,
  messages: [
    {
      "role": "user",
      "content": [
        {
          "type": "image",
          "source": {
            "type": "base64",
            "media_type": file.type,
            "data": base64File,
          },
        },
        {
          "type": "text",
          "text": `Write an HTML email that evokes this photo in the funniest way possible, with code fences.`,
        },
      ],
    },
  ],
});

### Google

Live example:

import { fileToBase64 } from "https://esm.town/v/stevekrouse/fileToDataURL";

const base64Image = await fileToBase64(image);
const result = await model.generateContent([
  "Write all the names and authors of these books in JSON format. The response should be a valid JSON array of objects, each with 'title' and 'author' properties.",
  {
    inlineData: {
      data: base64Image,
      mimeType: image.type,
    },
  },
]);
Script
which are particularly helpful for sending images to LLMs
like ChatGPT, Anthropic, and Google.
### ChatGPT
max_tokens: 200,
### Anthropic
[Live example](https://www.val.town/v/stevekrouse/roastbyEmail?v=14#L9-34)
const base64File = await fileToBase64(file);
let res = await anthropic.messages.create({
model: "claude-3-5-sonnet-20240620",