LLM infrastructure for large-scale language and multimodal models at ByteDance Seed.
Haibin Lin works on LLM infrastructure at ByteDance Seed, focusing on training systems for large language and multimodal models from pre-training to post-training.
Haibin Lin is a researcher at ByteDance Seed working on LLM infrastructure and large-scale training systems. His public homepage describes work on optimizing training frameworks for large language and multimodal models, spanning large-scale pre-training as well as post-training reinforcement learning infrastructure. Public profiles and project links associate him with systems and infrastructure work on MegaScale, verl, BytePS, GluonNLP, and Apache MXNet, and his Google Scholar profile lists research interests in database systems, machine learning systems, and natural language processing.