A small Even Hub page for Even G2 glasses: you type or dictate a phrase, it turns into a stack of PNG slides (see src/signDimensions.json, currently 288×144 to use the SDK image cap) plus a status line, shipped through the official SDK. Navigation is charmingly retro—Prev, Next, Close—because your nose is not a trackpad.
This is a hub integration, not a replacement for a skilled interpreter. It does glossary words and fingerspelling-style slides; it will not win a prize for ASL grammar. It will put green-ish pixels where the hardware expects them.
- Node.js 18+
- The Even host app loading this page, or a normal browser with `?pc=1` if you just want the UI and preview (no bridge, no guilt).
```
npm install
npm run dev
```

Open the URL Vite prints. For keyboard cred: Ctrl+Enter sends from the textarea.
- `npm run dev` — local Vite server (default `http://localhost:5173`).
- `npm run sim` — after `npm run dev`, runs `evenhub-simulator` against the dev URL (see `package.json`). Use this for a quick hub-store-style smoke test of layout + bridge without real glasses.
- `npm run hub:qr` — prints a QR code for sideloading the same URL via the Even hub CLI. Replace `localhost` with your machine's LAN IP or hostname when loading on a real device on the network; `localhost` on the phone refers to the phone, not your PC.
- `app.json` — Even Hub manifest at the repo root (package_id, edition, entrypoint, SDK floor, and so on). Keep `version` in sync with `package.json` when you release.
- `npm run pack:hub` — runs a production build, then `evenhub pack app.json dist -o evensign.ehpk`. The `evensign.ehpk` file appears at the repo root (gitignored). In Even Hub, create or import a project from that package to install on glasses for device testing.
Production
```
npm run build
```

Deploy `dist/` to whatever URL your Even hub configuration uses.
- `package.json` and `app.json` versions should match (currently 1.1.0). Even Hub's `evenhub pack` requires the `app.json` `version` in `x.y.z` form (no prerelease suffix).
- `npm run test` — Vitest unit tests under `src/**/*.test.ts`.
- `npm run build` — runs tests, then typecheck + Vite.
- CI — `.github/workflows/ci.yml` runs `npm ci` and `npm run build` on push/PR to `main` or `master`.
- Fonts — JetBrains Mono is loaded from Google Fonts in `index.html` (network + third party). Self-host for stricter offline policies.
- Speech — the Speak button uses the Web Speech API; recognition may be handled by the browser/OS (sometimes cloud). See the in-app status line when listening.
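The Web Speech API wiring can be sketched roughly as below. The `SpeechRecognition` interface, `onresult`, and `transcript` are the standard browser API; passing the constructor in as a parameter (and hooking the result into this app's status line) is an illustrative assumption, not the actual code in `src/evenSignPage.ts`.

```typescript
// Minimal one-shot dictation sketch. The constructor is injected so the
// helper stays testable; in a browser you would pass the real
// SpeechRecognition (or webkitSpeechRecognition) class.
type RecognitionCtor = new () => any;

function listenOnce(Ctor: RecognitionCtor, onPhrase: (text: string) => void): void {
  const rec = new Ctor();
  rec.lang = "en-US";
  rec.interimResults = false; // final results only, like a one-shot dictation
  rec.onresult = (e: any) => onPhrase(e.results[0][0].transcript);
  rec.onerror = (e: any) => console.warn("speech error:", e?.error);
  rec.start();
}

// Browser usage (hypothetical wiring):
// const SR = (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
// listenOnce(SR, (phrase) => sendPhrase(phrase));
```

Recognition availability and cloud routing vary by browser/OS, which is why the in-app status line matters while listening.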
- Phrases are tokenised and matched against `const WORDS` in `src/signSlides.ts`.
- By default the UI uses word signs when available (same as "compact glossary"): one slide per known phrase when `public/signs/words/` has art — better for real chats than spelling every letter.
- Turn that off to spell glossary words letter-by-letter (practice / slow decoding), using `public/signs/alphabet/` and `public/signs/numbers/`.
- Unknown words are spelled out (A–Z / 0–9). Each slide gets a small status bar (LETTER / NUMBER / WORD, plus spell progress when you are inside a word), and the final bitmap is flattened to G2 green (R/B zero, G = luminance) in the browser so previews match the glasses.
- Preview timing is a bit faster when the deck is only letters and digits (fingerspelling), slower when it mixes word slides.
- Close on the device swaps the nav row to Yes / No. Double-tap Yes exits; No restores Prev / Next / Close.
- After Send, Alt+← / Alt+→ (when focus is not in a text field) steps slides in the preview the same way as glasses Prev / Next—handy on desktop hub where there is no ring.
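The "G2 green" flatten mentioned above (R/B zero, G = luminance) can be sketched like this. It operates on RGBA bytes as in canvas `ImageData`; the Rec. 709 luminance weights are an assumption about how luminance is computed — the actual logic lives in the app and `scripts/sign-green.mjs`.

```typescript
// Zero red/blue and move per-pixel luminance into the green channel,
// matching how the G2 display renders. Alpha is preserved.
function flattenToG2Green(rgba: Uint8ClampedArray): Uint8ClampedArray {
  const out = new Uint8ClampedArray(rgba.length);
  for (let i = 0; i < rgba.length; i += 4) {
    // Rec. 709 weights (assumed): lum = 0.2126 R + 0.7152 G + 0.0722 B
    const lum = 0.2126 * rgba[i] + 0.7152 * rgba[i + 1] + 0.0722 * rgba[i + 2];
    out[i] = 0;                   // R → 0
    out[i + 1] = Math.round(lum); // G ← luminance
    out[i + 2] = 0;               // B → 0
    out[i + 3] = rgba[i + 3];     // keep alpha
  }
  return out;
}
```

Running the same transform in the browser preview and the asset pipeline is what keeps the desktop preview honest about what the glasses will show.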
These are not bundled automatically; verify each file’s license on Commons before shipping.
- Category:ASL letters — many per-letter SVGs (often public domain). You can rasterise them with your own script into `public/signs/alphabet/` (match `src/signDimensions.json` and run through `scripts/sign-green.mjs` logic, or reuse `build-sign-assets` patterns).
- Asl alphabet gallaudet ann.svg — CC0 full chart (sometimes cleaner than the classic Gallaudet file); save as `scripts/tmp-alphabet.svg` and run `npm run build:signs`.
- Asl alphabet gallaudet.svg — the chart this repo's crop script targets (CC0).
- ASL alphabet datasets (ML photos, variable quality): Kaggle ASL alphabet (CC0), Roboflow ASL letters (public domain). Useful if you train a custom exporter—not drop-in art for every word in ASL.
ASL “real word” illustrations (not fingerspelling) are rarely CC0 as a complete set. Sites like Lifeprint and Signing Savvy are usually not redistributable inside an app; prefer your own PNGs under public/signs/words/ at the same size as signDimensions.json, or commission / license art.
Download the Gallaudet alphabet SVG (see public/signs/ATTRIBUTIONS.md, CC0), save as scripts/tmp-alphabet.svg (gitignored), then:
```
npm run build:signs      # alphabet + numbers PNGs from chart
npm run build:words      # word strips from glyphs (see `const WORDS` in `src/signSlides.ts`)
# or
npm run build:signs:all
```

Glossary — edit `const WORDS` in `src/signSlides.ts`, then run `npm run build:words` again.
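The glossary-vs-fingerspelling split described earlier can be sketched as below. The real `WORDS` structure lives in `src/signSlides.ts`; representing it as a plain string set, and the `Slide` shape, are illustrative assumptions only.

```typescript
// Hypothetical sketch: known phrases become one WORD slide, everything
// else is fingerspelled as LETTER / NUMBER slides (A–Z / 0–9).
type Slide = { kind: "WORD" | "LETTER" | "NUMBER"; value: string };

function phraseToSlides(phrase: string, words: Set<string>, useWordSigns = true): Slide[] {
  const slides: Slide[] = [];
  for (const token of phrase.toUpperCase().split(/\s+/).filter(Boolean)) {
    if (useWordSigns && words.has(token)) {
      slides.push({ kind: "WORD", value: token }); // one slide per known phrase
      continue;
    }
    for (const ch of token) {                      // fingerspell unknown tokens
      if (/[A-Z]/.test(ch)) slides.push({ kind: "LETTER", value: ch });
      else if (/[0-9]/.test(ch)) slides.push({ kind: "NUMBER", value: ch });
    }
  }
  return slides;
}
```

With `useWordSigns` off, every glossary word falls through to the letter-by-letter path, matching the practice/slow-decoding toggle described above.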
| Path | Role |
|---|---|
| `src/main.ts` | Bridge wait, boot |
| `src/evenSignPage.ts` | Input, speech, preview, toggles |
| `src/evenSignBridge.ts` | SDK layout, list events, image queue |
| `src/signSlides.ts` | Phrase → slide list; `const WORDS` glossary |
| `src/signRender.ts` | PNG bytes per slide |
| `src/signConstants.ts` | Slide + glasses layout constants; `assertGlassesLayout()` |
| `src/signDimensions.json` | Authoritative `SIGN_IMAGE_WIDTH` / `HEIGHT` for app + `npm run build:signs` |
| `scripts/build-sign-assets.mjs` | Gallaudet SVG → `alphabet/` + `numbers/` PNGs |
| `scripts/build-word-signs.mjs` | Glossary composites (multi-row + min glyph height for G2 readability) |
| `scripts/sign-green.mjs` | R/B → 0, G ← luminance for G2 |
| `vitest.config.ts` | Unit test runner config |
| `.github/workflows/ci.yml` | CI build |
MIT — see LICENSE.
Sign artwork is derived from material credited in public/signs/ATTRIBUTIONS.md (CC0).