
long-context

davila7/claude-code-templates

Installs: 247
Install command
npx skills add https://github.com/davila7/claude-code-templates --skill long-context
Security audits
Gen Agent Trust Hub: PASS
Socket: PASS
Snyk: WARN
About this skill
Use Long Context techniques when you need to:

- Process long documents (32k, 64k, 128k+ tokens) with transformer models
- Extend the context windows of pre-trained models (LLaMA, Mistral, etc.)
- Implement efficient positional encodings (RoPE, ALiBi)
- Train models with length-extrapolation capabilities
- Deploy models that handle variable-length inputs efficiently
- Fine-tune existing models for longer contexts with minimal compute

Key techniques: RoPE (Rotary Position Embeddings), YaRN, ALiBi (Attention with Linear Biases), Position Interpolation.

Papers: RoFormer (arXiv 2104.09864), YaRN (arXiv 2309.00071), ALiBi (arXiv 2108.12409), Position Interpolation (arXiv 2306.15595).

RoPE:
- Encodes absolute position via a rotation matrix
- Provides relative position dependency in attention
- Enables length extrapolation
- Inter-token dependency decays with distance
- Compatible with linear attention
- Better extrapolation than absolute position encodings

YaRN:
- NTK-aware interpolation (Neural Tangent Kernel)
- Attention temperature scaling
- Efficient context extension (10× fewer tokens than baselines)
- Extends LLaMA to 128k tokens
- 2.5× fewer training steps than baselines
- State-of-the-art context window extension

ALiBi:
- No positional embeddings...
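The rotary encoding mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the skill's actual code; the split-half dimension pairing used here is one of two common layout conventions:

```python
import numpy as np

def rope_rotate(x, positions, base=10000.0):
    """Apply rotary position embeddings (RoPE) to x of shape (seq, dim).

    Dimension pairs (i, i + dim/2) are rotated by angle pos * theta_i,
    with theta_i = base**(-2i/dim), so attention scores between rotated
    queries and keys depend only on relative position.
    """
    seq, dim = x.shape
    half = dim // 2
    theta = base ** (-2.0 * np.arange(half) / dim)   # per-pair frequency
    angles = positions[:, None] * theta[None, :]     # (seq, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Relative-position property: shifting query and key positions by the
# same offset leaves their dot product unchanged.
q = np.random.default_rng(0).normal(size=(1, 8))
k = np.random.default_rng(1).normal(size=(1, 8))
s1 = rope_rotate(q, np.array([3]))[0] @ rope_rotate(k, np.array([7]))[0]
s2 = rope_rotate(q, np.array([103]))[0] @ rope_rotate(k, np.array([107]))[0]
assert np.isclose(s1, s2)
```

Position Interpolation follows directly from this formulation: to run a model trained on length L at a longer length L', scale `positions` by L / L' before rotating, which compresses the rotation angles back into the trained range.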
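ALiBi replaces positional embeddings with a per-head linear bias added to attention scores. A minimal sketch, using the paper's geometric slope sequence for a power-of-two head count; causal masking of future positions is assumed to happen elsewhere:

```python
import numpy as np

def alibi_bias(num_heads, seq_len):
    """Per-head additive attention bias: bias[h, i, j] = -slope_h * (i - j).

    Slopes form the geometric sequence 2**(-8/num_heads), 2**(-16/num_heads), ...
    Tokens further in the past receive a more negative bias, so each head
    attends with a different effective recency window.
    """
    slopes = 2.0 ** (-8.0 * (np.arange(num_heads) + 1) / num_heads)
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    dist = np.maximum(i - j, 0)            # 0 for future positions (masked later)
    return -slopes[:, None, None] * dist   # (num_heads, seq, seq)

bias = alibi_bias(8, 4)
# Head 0 has slope 2**-1 = 0.5: attending 3 tokens back costs -1.5.
assert bias[0, 3, 0] == -1.5
```

Because the bias grows linearly with distance rather than being learned per position, the same formula applies at any sequence length, which is what gives ALiBi its extrapolation behavior.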
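YaRN builds on "NTK-aware" interpolation, which stretches RoPE's low-frequency components by enlarging the rotary base instead of linearly compressing all positions. A sketch of the commonly used base adjustment; `scale` (the context-extension factor) and the function name are illustrative, not from the source:

```python
def ntk_scaled_base(base, scale, dim):
    """NTK-aware RoPE base adjustment: base' = base * scale**(dim / (dim - 2)).

    With the enlarged base, the lowest-frequency rotations complete roughly
    `scale` times fewer cycles over the extended context, while the highest
    frequencies (which encode fine-grained local order) stay nearly unchanged.
    """
    return base * scale ** (dim / (dim - 2))

# Extending a dim-128 RoPE model 4x enlarges the base by slightly more than 4x.
new_base = ntk_scaled_base(10000.0, 4.0, 128)
assert new_base > 4 * 10000.0
```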

Source description provided by the upstream skill listing. Community reviews and install context appear in the sections below.

Community Reviews

Latest reviews


No community reviews yet.

FAQ
What does long-context do?

long-context is listed in SkillJury, but the source summary is still sparse.

Is long-context good?

long-context does not have approved reviews yet, so SkillJury cannot publish a community verdict.

What agent does long-context work with?

long-context currently lists compatibility with codex, gemini-cli, opencode, cursor, github-copilot, claude-code.

What are alternatives to long-context?

Skills in the same category include telegram-bot-builder, flutter-app-size, sharp-edges, iterative-retrieval.

How do I install long-context?

npx skills add https://github.com/davila7/claude-code-templates --skill long-context

Related skills

More from davila7/claude-code-templates


Alternatives in Software Engineering