Compilers + ML/AI/LLM: 2 PhD positions
Two PhD studentships are available in the following general areas:
Transformer Optimization:
Transformer-based large language models are increasingly popular but
require significant resources to train and deploy. This project will
explore compiler/algorithm co-design to dramatically reduce the cost
of training and deploying these models on real-world hardware. One
potential direction is to build on our prior work on neural
architecture search [1][2].
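
To give a flavour of compiler/algorithm co-design, here is a minimal
sketch of a hardware-aware architecture search loop. The search space,
cost model, and accuracy proxy are all illustrative stand-ins, not
taken from the cited work [1][2]; a real co-design loop would query a
compiler backend or measure latency on the target hardware.

import random

# Hypothetical search space over transformer configurations.
SEARCH_SPACE = {
    "num_layers": [6, 12, 24],
    "num_heads": [4, 8, 16],
    "hidden_dim": [256, 512, 1024],
}

def hardware_cost(cfg):
    # Stand-in cost model, roughly proportional to parameter count.
    # A real system would ask the compiler/hardware for this number.
    return cfg["num_layers"] * cfg["hidden_dim"] ** 2

def accuracy_proxy(cfg):
    # Stand-in quality estimate; in practice this would be a trained
    # predictor or a short training run.
    return cfg["num_layers"] * cfg["num_heads"] * cfg["hidden_dim"] ** 0.5

def random_search(budget, trials=100):
    # Return the best-scoring config whose cost fits the hardware budget.
    best, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        if hardware_cost(cfg) > budget:
            continue  # prune configs the hardware budget rules out
        score = accuracy_proxy(cfg)
        if score > best_score:
            best, best_score = cfg, score
    return best

print(random_search(budget=12 * 512 ** 2))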
Neural Lifting:
Translating programs from one language to another is a key requirement
in the development of new systems. This project aims to lift existing
binaries to an intermediate representation (IR) using large language
models and then use existing compiler technology to lower the IR. The
main challenges here are accuracy and scale. One potential direction
is to build on our prior work [3][4].
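
As a rough illustration of the lift-then-lower idea, here is a minimal
sketch in which the "binary" is modelled as an opaque Python callable,
the IR is a toy arithmetic expression, and candidate lifts are checked
by differential testing. Every name below is a hypothetical stand-in,
not an existing tool or the approach of the cited work [3][4].

from typing import Callable, Iterator, Optional

def propose_lift(binary: Callable[[int], int]) -> Iterator[str]:
    # Stand-in for LLM sampling: yields candidate IR strings, some
    # wrong and one right, mimicking noisy model output. A real
    # system would prompt a model and parse its answer.
    yield "x * 2"        # plausible but incorrect candidate
    yield "x * x + 1"    # correct for the toy binary below

def run_ir(ir: str, x: int) -> int:
    # Toy IR "interpreter": evaluates an arithmetic expression on x.
    return eval(ir, {"__builtins__": {}}, {"x": x})

def lift(binary: Callable[[int], int], tests) -> Optional[str]:
    # Validate each candidate against the binary's observed behaviour
    # before accepting it and handing it to a compiler for lowering.
    for ir in propose_lift(binary):
        if all(run_ir(ir, t) == binary(t) for t in tests):
            return ir
    return None  # no candidate survived differential testing

def toy_binary(x):
    # Opaque function standing in for a compiled binary.
    return x * x + 1

print(lift(toy_binary, tests=[0, 1, 2, 3]))  # -> "x * x + 1"

Rejecting any candidate that disagrees with the binary on even one
test input is one simple way to attack the accuracy challenge noted
above; scale raises further questions beyond this sketch.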
Flexibility:
The exact details are flexible depending on the candidate's interests and
background.