Invertible Tabular GANs: Killing Two Birds with One Stone for Tabular Data Synthesis
- Jaehoon Lee
- Computer Science, Neural Information Processing Systems
- 8 February 2022
A generalized GAN framework for tabular synthesis is presented, which combines the adversarial training of GANs and the negative log-density regularization of invertible neural networks to improve the synthesis quality.
OCT-GAN: Neural ODE-based Conditional Tabular GANs
- Jayoung Kim, Jinsung Jeon, Jaehoon Lee, Jihyeon Hyeong, Noseong Park
- Computer Science, The Web Conference
- 19 April 2021
This work significantly improves the utility of state-of-the-art tabular data synthesis methods by designing a generator and discriminator based on neural ordinary differential equations (NODEs) and conducting experiments with 13 datasets.
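As an illustrative-only sketch of the neural ODE building block this abstract refers to (not a reproduction of OCT-GAN's actual generator or discriminator), a hidden state can be evolved by integrating a small parameterized vector field with a fixed-step Euler solver; all names here (`f`, `odeint_euler`) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 3
W = rng.normal(scale=0.5, size=(dim, dim))  # weights of a tiny vector field

def f(h, t):
    """dh/dt parameterized by a one-layer tanh network (toy stand-in)."""
    return np.tanh(W @ h)

def odeint_euler(h0, t0=0.0, t1=1.0, steps=20):
    """Integrate h'(t) = f(h, t) from t0 to t1 with fixed Euler steps."""
    h, dt = h0.copy(), (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * f(h, t0 + i * dt)
    return h

h0 = rng.normal(size=dim)
h1 = odeint_euler(h0)  # the "layer output" is the final ODE state
print(h1)
```

In a NODE-based GAN, such an integrated state would replace a stack of discrete layers inside the generator and discriminator; practical implementations use adaptive solvers rather than fixed-step Euler.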
EXIT: Extrapolation and Interpolation-based Neural Controlled Differential Equations for Time-series Classification and Forecasting
- Sheo Yon Jhin, Jaehoon Lee, Noseong Park
- Computer Science, The Web Conference
- 19 April 2022
This work redesigns the core part of NCDEs, i.e., generating a continuous path from a discrete time-series input, and proposes generating another latent continuous path using an encoder-decoder architecture, which corresponds to the interpolation process of NCDEs.
LORD: Lower-Dimensional Embedding of Log-Signature in Neural Rough Differential Equations
- Jaehoon Lee, Jinsung Jeon, Noseong Park
- Computer Science, International Conference on Learning…
- 19 April 2022
The encoder successfully combines the higher-depth and lower-depth log-signature knowledge, which greatly stabilizes the training process and increases model accuracy; the improvement ratio achieved by the method is up to 75% across various classification and forecasting evaluation metrics.
Scalable Graph Synthesis with Adj and 1 - Adj
- Jinsung Jeon, Jing Liu, S. Jajodia
- Computer Science, SDM
- 1 January 2021
Time Series Forecasting with Hypernetworks Generating Parameters in Advance
- Jaehoon Lee, C. Kim, Noseong Park
- Computer Science, ArXiv
- 22 November 2022
This work builds a hypernetwork that generates another target model’s parameters expected to perform well on future data, and shows that HyperGPA outperforms other baselines.
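The hypernetwork idea in this abstract can be sketched in a toy form: a small network maps a summary of recent data to the full parameter vector of a separate linear forecaster. This is a minimal illustration of the general concept, not HyperGPA itself; all names (`generate_target_params`, `target_forecast`, `window`) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

window = 4                 # the target model forecasts from the last 4 points
n_params = window + 1      # target model: y = w @ x + b  ->  4 weights + 1 bias

# Hypernetwork: one linear layer from a 2-dim data summary to target parameters.
hyper_W = rng.normal(scale=0.1, size=(n_params, 2))
hyper_b = np.zeros(n_params)

def generate_target_params(series):
    """Summarize recent data, then emit the target forecaster's parameters."""
    summary = np.array([series.mean(), series.std()])
    theta = hyper_W @ summary + hyper_b
    return theta[:window], theta[window]          # (weights, bias)

def target_forecast(series, weights, bias):
    """Apply the generated linear model to the last `window` observations."""
    return float(weights @ series[-window:] + bias)

series = np.sin(np.arange(20) * 0.3)
w, b = generate_target_params(series)
print(target_forecast(series, w, b))
```

In training, only the hypernetwork's parameters would be optimized (end to end through the generated target parameters), so the target model's weights are produced in advance of seeing the future data, as the abstract describes.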