Research


  • High Accuracy and Efficient Training of Deep Neural Networks
    Distributed Data Intensive Systems Lab, Georgia Tech
    Aug 2018 – Present
    • Supervisor: Prof. Ling Liu
    • Focus: Deep Learning Training, Performance Optimization
    • Goal: Accelerate deep learning training and improve training efficiency via automatic hyper-parameter tuning.

  • Deep Learning Framework Performance Analysis and Optimization
    Distributed Data Intensive Systems Lab, Georgia Tech
    Aug 2017 – Present
    • Supervisor: Prof. Ling Liu
    • Focus: Deep Learning Systems, Performance Analysis
    • Goal: Build an efficient data and computing platform for deep learning.

  • Performance Analysis of Deep Learning with High-performance Storage
    Storage Systems Research Group, IBM Research
    May 2019 – Aug 2019
    • Mentors: Lunna Xu, Dr. Daniel Waddington
    • Focus: Storage Systems, Deep Learning Frameworks
    • Achievement: Analyzed the performance of deep learning with various storage backends, such as persistent memory and SSDs.

  • Accelerating Deep Learning with Direct-to-GPU Storage
    Storage Systems Research Group, IBM Research
    May 2018 – Aug 2018
    • Mentors: Amit Warke, Dr. Daniel Waddington
    • Focus: Storage Systems, Deep Learning Frameworks
    • Achievement: Integrated the Direct-to-GPU storage system into Caffe, achieving over a 2× performance improvement by reducing data-transfer overhead.

  • DeepEyes: A Deep Learning Powered Localization System with Multi-modal Sensors
    Distributed Data Intensive Systems Lab, Georgia Tech
    Aug 2017 – Present
    • Supervisor: Prof. Ling Liu
    • Focus: Localization, Deep Learning Systems
    • Goal: Implement an outdoor localization system based on deep learning models.

  • Parallel Graph Search Algorithms Analysis & Design
    National High-Performance Computing Center (Hefei), USTC
    Feb 2017 – Aug 2017
    • Supervisor: Prof. Yun Xu
    • Focus: Parallel Graph Search Algorithms, Breadth-First-Search (BFS)
    • Achievement: Designed a new parallel BFS algorithm with improved performance and load balancing.

  • Detecting Large-gap Code Clones
    National High-Performance Computing Center (Hefei), USTC
    Sep 2015 – Jul 2017
    • Supervisor: Prof. Yun Xu
    • Focus: Source Code Processing & Indexing, Edit Distance, Detection Algorithms
    • Achievement: CCAligner, a token-based large-gap code clone detector (ICSE ’18).

  • Summer Research Internship on Automatic Verification
    School of Computer Science, University of Birmingham
    Jul 2016 – Aug 2016
    • Supervisor: Prof. David Parker
    • Focus: LTS (Labeled Transition System) Model Checker, Game Model Checker
    • Achievement: Implemented an LTS model checker and a game model checker for PRISM, a widely used probabilistic model checker for system analysis, extending it to support non-probabilistic models.

  • Optimization for Distributed Applications
    Advanced Computer System Architecture Laboratory, USTC
    Sep 2015 – Jun 2016
    • Supervisor: Prof. Hong An
    • Focus: GROMACS, WRF
    • Achievement: 1st place on the WRF benchmark (1st: 100%, 2nd: 64.83%) in the Student Cluster Competition at ISC 2016.