AI Inference on FPGAs


Context

Field-Programmable Gate Arrays (FPGAs) are flexible, reconfigurable hardware devices well suited to specialized computing tasks. For AI inference, they can be a practical alternative to CPUs and GPUs, especially where low power consumption matters. By exploiting fine-grained parallelism and skipping unnecessary computations, FPGAs can run tasks such as object detection or speech recognition efficiently, making them attractive for applications where AI models must run reliably on a tight energy budget.

Tasks
  • Adapt existing neural network models currently running on GPUs so that they execute on an FPGA platform, and evaluate them with respect to accuracy, latency, and energy efficiency; this work supports applications such as object detection or speech recognition in a robotic system
  • Quantization and pruning of artificial neural networks
  • Toolchain-based deployment using FINN, Vitis AI, or related tools
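To give a feel for the quantization and pruning tasks above, the sketch below shows per-tensor symmetric int8 quantization and simple magnitude pruning in plain Python. It is an illustrative toy, not tied to FINN or Vitis AI, and the function names (`quantize_int8`, `prune_smallest`) are ours:

```python
def quantize_int8(weights):
    """Per-tensor symmetric quantization: map floats to int8 range [-127, 127]."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights; error per weight is at most scale / 2."""
    return [qi * scale for qi in q]

def prune_smallest(weights, fraction):
    """Magnitude pruning: zero out (at least) the smallest `fraction` of weights.
    Ties at the cutoff magnitude are also zeroed."""
    k = int(len(weights) * fraction)
    cutoff = sorted(abs(w) for w in weights)[k - 1] if k else -1.0
    return [0.0 if abs(w) <= cutoff else w for w in weights]

weights = [0.42, -1.27, 0.05, 0.9]
q, scale = quantize_int8(weights)        # integer weights plus one float scale
approx = dequantize(q, scale)            # close to the original weights
sparse = prune_smallest(weights, 0.5)    # half of the weights become zero
```

Real toolchains go further (per-channel scales, quantization-aware training, structured pruning), but the basic idea is the same: smaller integer arithmetic and skipped zero weights are what make FPGA inference fast and energy-efficient.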

Prerequisites
  • Solid foundations in artificial neural networks
  • Familiarity with FPGAs is a plus