LLMpeople
Public atlas: people first, reports as evidence, organizations as context.


Liang Xiang

In December 2024, ByteDance Seed publicly identified Liang Xiang as Head of the Doubao Foundation Model Team. Public Seed and DBLP records also list him as a coauthor on work covering model merging in LLM pre-training and ByteDance's LLM training infrastructure.

ByteDance Seed researcher working on large language models and AI infrastructure

1 organization · 2 reports

Profile status: updated


Trust signals

Profile completeness: 41%
Public sources: 3
Official sources: 1
Last reviewed: Mar 28, 2026
Scholar profile: updated · 3 public sources

Topics: large language models · AI infrastructure · model training · model merging

Current frame

ByteDance Seed researcher working on large language models and AI infrastructure

Organizations

ByteDance Seed (core)

Reports

Seed1.5-VL Technical Report (Vision-Language Models)
Seed1.5-Thinking: Advancing Superb Reasoning Models with Reinforcement Learning

Official and primary sources

Robust LLM Training Infrastructure at ByteDance. Official source · DBLP

Supporting sources

Peking University-ByteDance "Doubao Large Model System Software Joint Laboratory" established, focusing on key issues in AI system software. Supporting source · news · ByteDance Seed Team
Model Merging in Pre-training of Large Language Models. Supporting source · other · ByteDance Seed Team

LLMpeople is a public atlas for discovering frontier AI researchers with context, provenance, and respect.
