Many existing studies in deep generative modeling design the generative process as a deterministic function. In contrast, the recently proposed diffusion models achieve excellent performance by modeling generation as a stochastic process, improving upon the mode coverage of generative adversarial networks (GANs). However, this stochastic-process-based approach has a drawback: it requires a large number of discretization timesteps and therefore samples slowly.
Among the attempts to tackle this problem are approaches based on the Schrödinger bridge (SB) problem, which can construct a stochastic process between any two distributions. These SB-based approaches attempt to mitigate the disadvantage of diffusion models by training bidirectional stochastic processes. However, compared to generative models such as GANs, their sampling speed remains slow, and they require relatively long training times because both directions of the process must be trained.
Therefore, this study aims to reduce the number of timesteps and the training time required by applying regularization to existing SB-based models. Regularization terms are proposed that make the bidirectional stochastic processes consistent with each other and keep them stable even with a reduced number of timesteps. In addition, these terms are integrated into a single term to enable more efficient training in terms of computation time and memory usage.
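The core idea of the consistency regularization can be illustrated with a toy sketch. The code below is an assumption-laden simplification, not the paper's actual method: the learned forward and backward SB drift networks are replaced by linear drifts, noise is omitted, and the "single regularization term" is rendered as a cycle-consistency penalty that asks one coarse forward Euler step followed by one backward step to return to the starting point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear drifts standing in for the learned forward and
# backward SB networks (names, forms, and values are all assumptions).
a_fwd, a_bwd = 0.8, -0.7   # drift slopes of the two directions
dt = 0.25                   # deliberately coarse timestep (few-step regime)

def forward_step(x):
    # One Euler step of the forward process (diffusion noise omitted)
    return x + a_fwd * x * dt

def backward_step(x):
    # One Euler step of the backward process
    return x + a_bwd * x * dt

def consistency_reg(x):
    # Single combined penalty: a forward step followed by a backward
    # step should return to x; the squared deviation measures how
    # inconsistent the two directions are at this coarse timestep.
    return float(np.mean((backward_step(forward_step(x)) - x) ** 2))

x0 = rng.normal(size=256)
loss = consistency_reg(x0)
```

Minimizing such a penalty over the drift parameters would push the two directions toward mutual inverses, which is the intuition behind keeping the bidirectional processes consistent with fewer timesteps.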
The regularized SB-based stochastic process was applied to various generation tasks, and the desired bidirectional processes between different distributions were obtained. In addition, the possibility of using the pre-trained SB-based process to improve GAN training was explored.