
training-llms-megatron

davila7/claude-code-templates

Megatron-Core trains LLMs from 2B to 462B parameters with up to 47% Model FLOP Utilization on H100 GPUs through advanced parallelism strategies.
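The "47% Model FLOP Utilization" figure can be sanity-checked with the standard MFU estimate: roughly 6 FLOPs per parameter per token for a forward+backward pass, divided by aggregate peak hardware FLOPs. A minimal sketch, assuming the common 6N approximation and an H100 BF16 peak of ~989 TFLOP/s per GPU (both assumptions, not figures from this listing):

```python
def mfu(n_params: float, tokens_per_sec: float, n_gpus: int,
        peak_flops_per_gpu: float = 989e12) -> float:
    """Estimate Model FLOP Utilization.

    Assumes ~6 FLOPs per parameter per token (forward + backward),
    and a per-GPU peak of 989 TFLOP/s (H100 BF16, dense).
    """
    achieved_flops = 6 * n_params * tokens_per_sec
    peak_flops = n_gpus * peak_flops_per_gpu
    return achieved_flops / peak_flops

# Hypothetical example: a 70B model at 10,000 tokens/s on 8 GPUs
print(round(mfu(70e9, 1e4, 8), 2))  # ~0.53, i.e. ~53% MFU
```

Under these assumptions, an MFU of 0.47 on H100s would mean the training run sustains roughly 465 TFLOP/s of useful model compute per GPU.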

Installs: 176
Install command
npx skills add https://github.com/davila7/claude-code-templates --skill training-llms-megatron
Security audits
Gen Agent Trust Hub: PASS
Socket: PASS
Snyk: PASS
Community Reviews

No community reviews yet.

FAQ
What does training-llms-megatron do?

Megatron-Core trains LLMs from 2B to 462B parameters with up to 47% Model FLOP Utilization on H100 GPUs through advanced parallelism strategies.
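The "advanced parallelism strategies" refer to combining tensor, pipeline, and data parallelism across the GPU fleet. In Megatron-style setups, the data-parallel size falls out of the world size once the tensor- and pipeline-parallel sizes are chosen. A minimal sketch of that decomposition (the function name is illustrative, not a Megatron-Core API):

```python
def data_parallel_size(world_size: int, tp: int, pp: int) -> int:
    """Derive data-parallel size from total GPUs and the
    tensor-parallel (tp) and pipeline-parallel (pp) sizes.

    The world size must be divisible by tp * pp; each model
    replica occupies tp * pp GPUs, and the remaining factor
    is the number of data-parallel replicas.
    """
    if world_size % (tp * pp) != 0:
        raise ValueError("world_size must be divisible by tp * pp")
    return world_size // (tp * pp)

# Hypothetical example: 64 GPUs, 8-way tensor parallel, 2-way pipeline parallel
print(data_parallel_size(64, tp=8, pp=2))  # 4 data-parallel replicas
```

Larger models in the 2B-462B range cited above shift more of the world size into tp and pp, leaving fewer data-parallel replicas per GPU count.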

Is training-llms-megatron good?

training-llms-megatron does not have approved reviews yet, so SkillJury cannot publish a community verdict.

What agent does training-llms-megatron work with?

training-llms-megatron currently lists compatibility with codex, gemini-cli, opencode, cursor, github-copilot, claude-code.

What are alternatives to training-llms-megatron?

Skills in the same category include telegram-bot-builder, flutter-app-size, sharp-edges, iterative-retrieval.

How do I install training-llms-megatron?

npx skills add https://github.com/davila7/claude-code-templates --skill training-llms-megatron
