I attended the 26th Information-Based Induction Sciences Workshop (IBIS), held in Kitakyushu, Japan, from October 29 to November 1.


The tutorials, lectures, and sessions were very inspiring. In particular, I enjoyed the following:

  • Tutorial on ML × Physics Simulations: I learned that there are three ways to use ML in physics simulations. First, we can discover the underlying equation that generates the data; the discovered model can be either an explicit mathematical equation or a black box. Second, we can use an NN to simulate a physical system quickly. I had assumed that HPC-based simulation is preferred because of its high accuracy, but I learned that ML-based simulation can balance accuracy and speed. The third way integrates the first two: given initial and boundary conditions, the model outputs the value at each time and position (a toy sketch of this idea follows the list).

  • Tutorial on Deep Learning Theory: I now understand intuitively why the universal approximation theorem holds. The proof felt like a puzzle game or building with LEGO blocks (a toy illustration of this intuition also follows the list). I want to learn more about the theory; maybe I will read “Mathematics for Deep Learning.”

  • Session on Optimal Transport: Making discrete problems differentiable can be applied in many areas. I felt that differentiable ranking could lead to new weakly supervised algorithms, since rankings are a common form of weak supervision (a minimal sketch appears after this list). I also became interested in applying optimal transport to distribution-shift problems.

  • Session on Tensor Network: I learned about tensor networks for the first time. I suspect there is some relationship between tensor decomposition and disentangled representation learning, since both decompose data into components (I tried a small decomposition myself; see the sketch after this list).

  • Lecture on GNN: I learned many interesting facts about GNNs. For example, many NNs can be seen as special cases of GNNs (a minimal message-passing layer illustrating this is sketched below). I will study the basics of GNNs. Since graphs are related to causality, there might be a way to integrate GNNs into ML × causality.
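
As a concrete note to myself on the third approach from the ML × physics tutorial, here is a minimal physics-informed-network-style sketch for a toy 1D heat equation. The equation, the architecture, and all hyperparameters are my own illustrative choices, not material from the tutorial.

```python
# Toy physics-informed sketch (my own example, not from the tutorial):
# the network maps (t, x) to u(t, x) and is trained so that the residual of
# the 1D heat equation u_t = alpha * u_xx vanishes at sampled points, while
# also fitting an assumed initial condition. Boundary terms are omitted for brevity.
import torch
import torch.nn as nn

alpha = 0.1  # illustrative diffusion coefficient

net = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

def pde_residual(t, x):
    """Residual u_t - alpha * u_xx at collocation points (t, x)."""
    t.requires_grad_(True)
    x.requires_grad_(True)
    u = net(torch.cat([t, x], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):
    t = torch.rand(256, 1)          # interior collocation points
    x = torch.rand(256, 1)
    t0 = torch.zeros(64, 1)         # initial-condition points at t = 0
    x0 = torch.rand(64, 1)
    u0 = torch.sin(torch.pi * x0)   # assumed initial condition u(0, x) = sin(pi x)
    loss = (pde_residual(t, x) ** 2).mean() \
         + ((net(torch.cat([t0, x0], dim=1)) - u0) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Once trained, the same network can be queried at any (t, x), which is what “outputting the value at each time and position” meant to me.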
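
On the universal approximation theorem, the toy script below captures the “LEGO” feeling for me: a one-hidden-layer ReLU network assembles a piecewise-linear approximation of a 1D function, piece by piece. It is my own illustration, not the proof presented in the tutorial.

```python
# Toy illustration of the universal-approximation intuition (my own example):
# a one-hidden-layer ReLU network can build any piecewise-linear function,
# and piecewise-linear functions can approximate any continuous one.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

target = np.sin                       # function to approximate on [0, 2*pi]
knots = np.linspace(0, 2 * np.pi, 30)
values = target(knots)

# Slopes of the piecewise-linear interpolant between consecutive knots.
slopes = np.diff(values) / np.diff(knots)
# Each hidden unit relu(x - knot) switches on at one knot; its output weight
# is the change in slope there, so the sum reproduces the interpolant
# ("LEGO" pieces stacked one by one).
weights = np.concatenate([[slopes[0]], np.diff(slopes)])

def one_layer_net(x):
    return values[0] + np.sum(weights * relu(x[:, None] - knots[:-1]), axis=1)

x = np.linspace(0, 2 * np.pi, 1000)
print(np.max(np.abs(one_layer_net(x) - target(x))))  # small approximation error
```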
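
For differentiable ranking, the simplest version I could write down is the pairwise-sigmoid soft rank below. It is my own simplification for intuition, not the optimal-transport-based method discussed in the session.

```python
# Minimal soft-rank sketch (my own simplification, not the OT-based method
# from the session): replace the hard comparison x_i > x_j with a sigmoid,
# so the resulting "rank" is differentiable in the scores.
import torch

def soft_rank(scores, temperature=0.1):
    """Soft ascending ranks: rank_i ≈ 1 + #{j : scores_j < scores_i}."""
    diff = (scores[:, None] - scores[None, :]) / temperature
    pairwise = torch.sigmoid(diff)          # ≈ 1 if scores_i > scores_j
    return 1.0 + pairwise.sum(dim=1) - 0.5  # drop the i == j term (sigmoid(0) = 0.5)

scores = torch.tensor([0.2, 1.5, -0.3], requires_grad=True)
ranks = soft_rank(scores)
print(ranks)            # close to [2, 3, 1]
ranks.sum().backward()  # gradients flow through the ranking
```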
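
To get a feel for tensor networks, I also tried the small sketch below, which factors a 3-way tensor into a tensor train (one of the simplest tensor networks) by repeated SVD. The tensor, the ranks, and the decomposition order are my own arbitrary choices, not code from the session.

```python
# Toy tensor-train (matrix product state) decomposition by repeated SVD
# (my own example): a 3-way tensor is factored into three smaller "cores",
# each carrying one mode of the data.
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))   # small 3-way tensor

# First split: unfold mode 1 against modes (2, 3) and truncate the SVD.
r1 = 3
U, S, Vt = np.linalg.svd(T.reshape(4, 5 * 6), full_matrices=False)
core1 = U[:, :r1]                                    # shape (4, r1)
rest = (np.diag(S[:r1]) @ Vt[:r1]).reshape(r1 * 5, 6)

# Second split: separate mode 2 from mode 3.
r2 = 3
U, S, Vt = np.linalg.svd(rest, full_matrices=False)
core2 = U[:, :r2].reshape(r1, 5, r2)                 # shape (r1, 5, r2)
core3 = np.diag(S[:r2]) @ Vt[:r2]                    # shape (r2, 6)

# Reconstruct and check the relative approximation error.
approx = np.einsum('ia,ajb,bk->ijk', core1, core2, core3)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))
```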
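
Finally, to start on the GNN basics, here is a single message-passing layer in plain numpy. It is only my own minimal illustration: with an identity adjacency matrix it collapses to an ordinary dense layer applied to each node, which is one way to read the claim that many NNs are special cases of GNNs.

```python
# Minimal message-passing layer (my own illustration, not from the lecture):
# each node averages its neighbors' features and applies a shared linear map.
import numpy as np

def gnn_layer(H, A, W):
    """H: (num_nodes, d_in) node features, A: (num_nodes, num_nodes) adjacency,
    W: (d_in, d_out) shared weights. Returns updated node features."""
    A_hat = A + np.eye(A.shape[0])                     # add self-loops
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return np.maximum(D_inv * (A_hat @ H) @ W, 0.0)    # mean aggregation + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)                 # a small path graph
H = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
print(gnn_layer(H, A, W))
```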