About Me
I am an Assistant Professor in the School of Electrical Engineering at KAIST (Korea Advanced Institute of Science and Technology). Previously, I was an Applied Scientist at Adobe Firefly and a Postdoctoral Fellow at the Yale Institute for Foundations of Data Science (FDS). I obtained my Ph.D. from the School of Electrical Engineering at KAIST, where I was advised by Jinwoo Shin. I also received an M.S. in Electrical Engineering and a B.S. in Electrical Engineering with a minor in Mathematics, both from KAIST. My research focuses on the design and analysis of approximate algorithms for large-scale machine learning and its applications. In 2019, I was fortunate to receive the Microsoft Research Asia Fellowship. In 2024, Amin Karbasi, Amir Zandieh, and I were selected as recipients of the Roberts Innovation Fund Award for research on making AI more powerful and cost-effective.
Here is my CV (last updated: June 2023).
Contact
Email : insu.han@kaist.ac.kr
Office: N1 914, 291 Daehak-ro, Daejeon, South Korea
Publications
QJL: 1-Bit Quantized JL Transform for KV Cache Quantization with Zero Overhead
[paper]
Amir Zandieh, Majid Daliri, Insu Han
Under Review
Cell2Sentence: Teaching Large Language Models the Language of Biology
[paper]
Daniel Levine, Syed A Rizvi, Sacha Lévy, Nazreen Pallikkavaliyaveetil, David Zhang, Xingyu Chen, Sina Ghadermarzi, Ruiming Wu, Zihe Zheng, Ivan Vrkic, Anna Zhong, Daphne Raskin, Insu Han, Antonio Henrique de Oliveira Fonseca, Josue Ortega Caro, Amin Karbasi, Rahul Madhav Dhodapkar, David van Dijk
International Conference on Machine Learning (ICML) 2024
SubGen: Token Generation in Sublinear Time and Memory
[paper]
Amir Zandieh*, Insu Han*, Vahab Mirrokni, Amin Karbasi (*Equal contribution)
Under Review
HyperAttention: Long-context Attention in Near-Linear Time
[paper]
(authors in alphabetical order) Insu Han, Rajesh Jayaram, Amin Karbasi, Vahab Mirrokni, David P. Woodruff, Amir Zandieh
International Conference on Learning Representations (ICLR) 2024
Near Optimal Reconstruction of Spherical Harmonic Expansions
[paper]
Amir Zandieh, Insu Han, Haim Avron
Advances in Neural Information Processing Systems (NeurIPS) 2023
KDEformer: Accelerating Transformers via Kernel Density Estimation
[paper]
Amir Zandieh, Insu Han*, Majid Daliri*, Amin Karbasi (*Equal contribution)
International Conference on Machine Learning (ICML) 2023
Fast Neural Kernel Embeddings for General Activations
[paper][code, also implemented in Neural Tangents library]
Insu Han, Amir Zandieh, Jaehoon Lee, Roman Novak, Lechao Xiao, Amin Karbasi
Advances in Neural Information Processing Systems (NeurIPS) 2022
Scalable MCMC Sampling for Nonsymmetric Determinantal Point Processes
[paper][code]
Insu Han, Mike Gartrell, Elvis Dohmatob, Amin Karbasi
International Conference on Machine Learning (ICML) 2022, Long Presentation (118/5630=2%)
Random Gegenbauer Features for Scalable Kernel Methods
[paper]
Insu Han*, Amir Zandieh*, Haim Avron (*Equal contribution)
International Conference on Machine Learning (ICML) 2022, Long Presentation (118/5630=2%)
Scalable Sampling for Nonsymmetric Determinantal Point Processes
[paper][code]
Insu Han, Mike Gartrell, Jennifer Gillenwater, Elvis Dohmatob, Amin Karbasi
International Conference on Learning Representations (ICLR) 2022, Spotlight Presentation
Scaling Neural Tangent Kernels via Sketching and Random Features
[paper][code]
Amir Zandieh*, Insu Han*, Haim Avron, Neta Shoham, Chaewon Kim, Jinwoo Shin (*Equal contribution)
Advances in Neural Information Processing Systems (NeurIPS) 2021
Scalable Learning and MAP Inference for Nonsymmetric Determinantal Point Processes
[paper][code]
Mike Gartrell, Insu Han, Elvis Dohmatob, Jennifer Gillenwater and Victor-Emmanuel Brunel
International Conference on Learning Representations (ICLR) 2021, Oral Presentation (58/2997=1.8%)
Polynomial Tensor Sketch for Element-wise Function of Low-Rank Matrix
[paper][code]
Insu Han, Haim Avron and Jinwoo Shin
International Conference on Machine Learning (ICML) 2020
MAP Inference for Customized Determinantal Point Processes via Maximum Inner Product Search
[paper][code]
Insu Han and Jennifer Gillenwater
International Conference on Artificial Intelligence and Statistics (AISTATS) 2020
Stochastic Chebyshev Gradient Descent for Spectral Optimization
[paper][poster][video]
Insu Han, Haim Avron and Jinwoo Shin
Advances in Neural Information Processing Systems (NeurIPS) 2018, Spotlight Presentation (168/4856=3.5%)
Faster Greedy MAP Inference for Determinantal Point Processes
[paper][code][video]
Insu Han, Prabhanjan Kambadur, Kyoungsoo Park and Jinwoo Shin
International Conference on Machine Learning (ICML) 2017
Approximating Spectral Sums of Large-scale Matrices using Stochastic Chebyshev Approximations
[paper]
Insu Han, Dmitry Malioutov, Haim Avron and Jinwoo Shin
SIAM Journal on Scientific Computing (SISC) 2017
Large-scale Log-determinant Computation through Stochastic Chebyshev Expansions
[paper][code][video]
Insu Han, Dmitry Malioutov, and Jinwoo Shin
International Conference on Machine Learning (ICML) 2015
Research Experience
- [Summer 2019] Research intern at Google New York City with Jennifer Gillenwater
- [Spring 2018] Visiting student at Tel Aviv University with Haim Avron