Commit 4d5aa35
README: lead with 7x context benefit, not implementation detail
Tagline changed from "Embeddable LLM inference in pure C" to
"LLM inference with 7x longer context — pure C, zero dependencies".
Adds "Lossless KV cache compression" to the subtitle, in both the EN and KO READMEs.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

1 parent: dbabab0
2 files changed: 6 additions, 6 deletions
[Diff content not captured in the extraction; in each of the two files, line 5 was replaced and lines 8–9 were replaced.]