Vincent Liu


Associate Professor @ Penn CIS

Levine Hall North Rm 574
3330 Walnut Street
Philadelphia, PA 19104-6389

I am actively looking for motivated, passionate PhD students! If you are interested, please apply here.


News

[Nov 2025]
Happy to welcome Daphne Liu (class of 2044) to the group!
[Sep 2025]
Position paper on network-attached AI accelerator disaggregation accepted to HotNets 2025.
[Sep 2025]
ParaFlex paper on multiplexed pipeline parallel LLMs accepted to SoCC 2025.
[Aug 2025]
CCEval was accepted to NSDI 2026.
[May 2025]
Congrats to Dr. Xinyi Chen and Dr. Kelvin Ng for completing their PhDs! They will be joining Meta and AWS, respectively, where they will both be working on software-hardware co-design for AI accelerators.
[Dec 2024]
InvisiFlow and Multiverse LLM training simulation papers were accepted to NSDI 2025.
[Dec 2024]
CausalMesh was selected for the SIGMOD Research Highlight Award!

About Me

I am an associate professor in the Department of Computer and Information Science at the University of Pennsylvania, where I am the director of the Distributed Systems Lab (DSL) and also lead the PennNetworks group. My work has been recognized by an NSF CAREER Award, a VMware Early Career Faculty Award, a Facebook Faculty Research Award, a Google Research Award, a SIGMOD Research Highlight Award, and several best paper awards at SIGCOMM and NSDI. Prior to Penn, I received my Ph.D. from the University of Washington, where I worked primarily with Tom Anderson and Arvind Krishnamurthy. Before that, my undergraduate research at the University of Texas at Austin was in the area of compilers and parallel systems.

My research interests are in the broad areas of distributed systems and networking, and my projects have touched on nearly every aspect of systems and networks. In each, my focus is on solving important, long-standing problems by introducing creative new approaches and bridging disciplines. Recent projects include, but are not limited to:

The exponential growth of large-scale ML models has created a fundamental mismatch between the granularity of hardware allocation and the increasingly dynamic requirements of inference and training workloads. Our work has made progress on orchestrating resource provisioning (AlpaServe, OSDI '23; ParaFlex, SoCC '25), kernel dispatch (Paella, SOSP '23; Genie, HotNets '25), and collective communication (TE-CCL, SIGCOMM '24).

In principle, serverless computing offers the promise of infinite scalability and pay-per-use billing, all behind a simpler interface for users. In practice, however, modern serverless offerings can introduce subtle issues with correctness (cf. Beldi, OSDI '20; μ2sls, POPL '23), performance (Mucache, NSDI '24), and pricing (λ-trim, ASPLOS '25).

As datacenter networks grow in complexity and AI training clusters reach scales of tens or hundreds of thousands of GPUs, the tools used to design and evaluate these networks have become a bottleneck. Our research explores techniques to increase the tractability of large cluster performance estimation by multiple orders of magnitude, using ML (MimicNet, SIGCOMM '21), Data-Oriented Design principles (DONS, SIGCOMM '23), GPU acceleration (Multiverse, NSDI '25), and statistical techniques (CCEval, NSDI '26).

While many modern applications can saturate increasingly high-bandwidth links during peak periods (e.g., an AllReduce operation), the vast majority of network capacity remains underutilized during normal operation. Our work has explored creative ways to leverage programmable network hardware to reclaim this "wasted" capacity for the benefit of applications and network operators. Efforts in this space include Mantis (SIGCOMM '20), OrbWeaver (NSDI '22), PrintQueue (SIGCOMM '22), Cebinae (SIGCOMM '22), Cowbird (SIGCOMM '23), Beaver (OSDI '24), and InvisiFlow (NSDI '25).

All of this research is made possible by my fantastic students:

Students

Graduated Students
I also worked closely with Jaewan Hong (student of Ion Stoica), Charles Kazer (now an instructor at Swarthmore College), John Sonchack (student of Jonathan Smith, now at Princeton), João Sedoc (student of Lyle Ungar, now at NYU Stern), and Max Demoulin (student of Boon Thau Loo and Linh T. X. Phan, now at DBOS).