𝚐𝔪𝟾𝚡𝚡𝟾
@gm8xx8
OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models
paper: arxiv.org/abs/2411.04905
project page: opencoder-llm.github.io

OpenCoder is a family of open and reproducible code language models, available in 1.5B and 8B parameter sizes and supporting both English and Chinese. It is pretrained on 2.5 trillion tokens (90% raw code, 10% code-related web data) and fine-tuned on 4.5M high-quality examples, reaching performance comparable to top-tier code models.

> Full transparency: releases model weights, inference code, training data, the data-processing pipeline and cleaning scripts, synthetic data, intermediate checkpoints, experimental results, and detailed training logs, making it one of the most transparent code models available.
> High-quality synthetic data: provides the synthetic data generation process along with the 4.5 million fine-tuning entries, a strong foundation for instruction tuning.
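For scale, the pretraining mix quoted in the post works out as follows (a quick back-of-envelope sketch; the 2.5T total and the 90/10 split are taken from the post, everything else is illustrative):

```python
# Back-of-envelope breakdown of OpenCoder's stated pretraining data:
# 2.5 trillion tokens, 90% raw code / 10% code-related web data.
TOTAL_TOKENS = 2.5e12

mix = {"raw code": 0.90, "code-related web data": 0.10}

# Tokens contributed by each source in the corpus.
token_counts = {source: int(TOTAL_TOKENS * share) for source, share in mix.items()}

for source, count in token_counts.items():
    print(f"{source}: {count / 1e12:.2f}T tokens")
# raw code: 2.25T tokens
# code-related web data: 0.25T tokens
```

So the code-related web slice alone (0.25T tokens) is larger than many full pretraining corpora, which helps explain the benchmark results the post describes.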
1 reply
0 recast
6 reactions

wizard not parzival
@alexpaden
welcome back 👑
1 reply
0 recast
2 reactions