ByteDance-affiliated researcher working on LLM systems, serving, and diffusion language models.
Publicly listed on dblp as Xiaoying Jia 0005, with ByteDance as affiliation and an attached ORCID. The 2025 publications there cover high-performance LLM serving, model merging in LLM pre-training, and Seed Diffusion; the record also lists authorship on the Seed1.5-VL and Seed1.5-Thinking technical reports.
Xiaoying Jia is publicly associated with ByteDance through a homonym-disambiguated dblp profile and authorship on ByteDance Seed technical reports. The dblp record lists 2025 work including LiquidGEMM (high-performance LLM serving), Model Merging in Pre-training of Large Language Models, and Seed Diffusion, supporting a conservative profile centered on LLM systems and inference-related research.