Abstract
Background: Compiling DNN models into high-efficiency executables is not easy: the compilation procedure often involves converting high-level model specifications into several different intermediate representations (IRs), e.g., graph IR and operator IR, and performing rule-based or learning-based optimizations from both platform-independent and platform-dependent perspectives.
This paper: MT-DLComp
Task: fuzzing deep learning compilers and localizing the detected bugs
Method: metamorphic testing — apply semantics-preserving mutations to DNN models, then use the metamorphic relation (original and mutated models must produce identical outputs) as the test oracle
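The core idea of semantics-preserving mutation can be sketched as a toy example. This is an illustrative sketch only, not the paper's actual implementation: the `model` and `mutate` helpers are hypothetical, and a real setup would compile both models with the compiler under test before comparing outputs.

```python
import numpy as np

def model(x):
    # Toy "DNN": one dense layer followed by ReLU.
    w = np.array([[0.5, -1.0], [2.0, 0.25]])
    return np.maximum(x @ w, 0.0)

def mutate(f):
    # Semantics-preserving mutation: wrap the computation with
    # identity operations (add 0, multiply by 1) that change the
    # graph structure but not the computed function.
    def mutant(x):
        y = (x + 0.0) * 1.0   # no-op edits on the input
        return f(y) + 0.0     # no-op edit on the output
    return mutant

x = np.array([[1.0, -2.0]])
original = model(x)
mutated = mutate(model)(x)

# Metamorphic relation: both versions must agree. After compilation,
# a divergence here would flag a compiler bug without needing a
# ground-truth oracle for the model's output.
assert np.allclose(original, mutated)
```

The point of the metamorphic relation is that no reference output is needed: any disagreement between the original and the mutated model, once both are compiled, implicates the compiler rather than the model.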
Experiments:
Targets: 4 popular DL compilers
Results: uncovered 4 new bugs