Projects

FedLLM: Build Your Own Large Language Models on Proprietary Data using the FedML Platform

The FedML AI platform democratizes large language models (LLMs) by enabling enterprises to train their own models on proprietary data. Today, we release FedLLM, an MLOps-supported training pipeline for building domain-specific LLMs on proprietary data. The platform enables data collaboration, computation collaboration, and model collaboration, and supports training on centralized and geo-distributed GPU clusters, as well as federated learning across data silos. FedLLM is compatible with popular LLM libraries such as HuggingFace and DeepSpeed, and is designed to improve efficiency, security, and privacy. To get started, FedML users and developers only need to add about 100 lines of source code; the complex steps of deploying and orchestrating training in enterprise environments are all handled by the FedML MLOps platform.
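At the heart of federated learning across data silos is server-side aggregation of locally trained model parameters. The sketch below is a generic illustration of the standard FedAvg weighted average over per-layer NumPy arrays; it is an assumption-laden toy, not FedML's or FedLLM's actual implementation:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of per-layer parameter lists (generic FedAvg).

    client_weights: list of models, each a list of NumPy arrays (one per layer).
    client_sizes: number of local training samples per client, used as weights.
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(num_layers)
    ]

# Two hypothetical data silos with different amounts of local data.
silo_a = [np.array([1.0, 2.0])]   # one-layer toy "model"
silo_b = [np.array([3.0, 4.0])]
merged = fedavg([silo_a, silo_b], client_sizes=[100, 300])
# The larger silo (300 samples) pulls the average toward its weights.
```

In a real deployment the per-silo training and the aggregation step would be orchestrated by the MLOps platform rather than run in one process.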

Learning with Less Labeling (LwLL)

We propose a novel algorithm for semi-supervised classification that achieves state-of-the-art performance on standard benchmarks and outperforms prior work in the transfer setting by a large margin.
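A common building block in semi-supervised classification is confidence-based pseudo-labeling, where a model's high-confidence predictions on unlabeled data are reused as training labels. The sketch below is a generic illustration of that idea (the `pseudo_label` helper and threshold are hypothetical), not the algorithm proposed in this work:

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Keep unlabeled samples whose max class probability exceeds the
    threshold; return their indices and hard pseudo-labels."""
    confidence = probs.max(axis=1)
    keep = confidence >= threshold
    return np.where(keep)[0], probs[keep].argmax(axis=1)

# Hypothetical predicted class probabilities for three unlabeled samples.
probs = np.array([[0.98, 0.02],
                  [0.60, 0.40],
                  [0.05, 0.95]])
idx, labels = pseudo_label(probs)
# Only the two confident samples are pseudo-labeled; the ambiguous one is skipped.
```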

Trust in Multi-Party Human-Robot Interaction

We designed and evaluated a novel framework for robot mediation of a support group.

Telepresence Robot for K-12 Remote Education

We developed and evaluated various control methods and interfaces for mobile remote presence robots (using the [Ohmni](https://ohmnilabs.com/products/ohmnirobot/) robot) for remote K-12 education.

Infant-Robot Interaction as an Early Intervention Strategy

Our goal is to develop a socially assistive, non-contact, infant-robot interaction system that provides contingent positive feedback to increase exploration and expand early movement practice.