updated · 1 public source
large language models · LLM serving · model merging · diffusion language models

Current frame

ByteDance-affiliated researcher working on LLM systems, serving, and diffusion language models.

Extended note

Xiaoying Jia is publicly associated with ByteDance through a homonym-disambiguated dblp profile and authorship on ByteDance Seed technical reports. The dblp record lists 2025 work including LiquidGEMM (high-performance LLM serving), Model Merging in Pre-training of Large Language Models, and Seed Diffusion, supporting a conservative profile centered on LLM systems and inference-related research.