Tensor programs are critical in many domains. Existing frameworks, such as PyTorch, TensorFlow, and JAX, adopt operator-based programming to simplify programming, improve performance, and enable automatic differentiation. However, with the rapid development of tensor programs, operator-based programming shows significant limitations for irregular patterns, since it introduces a large amount of redundant computation or memory access.
In this work, we propose FreeTensor, a free-form domain-specific language that supports redundancy-avoiding programming by introducing fine-grained control flow. With optimizations including partial evaluation, dependence-aware transformations, and fine-grained automatic differentiation, FreeTensor is able to generate high-performance gradient programs on both CPU and GPU. Experiments show speedups over existing tensor programming frameworks of up to 5.10× without differentiation, and up to 127.74× after differentiation, for typical irregular tensor programs.
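To illustrate the kind of redundancy meant here, the following is a minimal NumPy sketch (a hypothetical illustration, not FreeTensor's actual API): a whole-tensor operator must materialize a full intermediate result, while element-level control flow computes only the entries that are actually needed.

```python
import numpy as np

n = 1024
x = np.random.rand(n, n)
idx = np.array([3, 17, 42])  # only a few rows are actually needed

# Operator-based style: a whole-tensor operator applies exp() to
# all n*n elements, then most of the result is discarded.
full = np.exp(x)          # O(n^2) work, mostly redundant
wanted_op = full[idx]     # keep only len(idx) rows

# Fine-grained control flow: compute only the required rows.
wanted_loop = np.empty((len(idx), n))
for i, r in enumerate(idx):      # O(len(idx) * n) work
    wanted_loop[i] = np.exp(x[r])

assert np.allclose(wanted_op, wanted_loop)
```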