Satoshi Hayakawa

This is Satoshi Hayakawa's research homepage (also available in Japanese).

I joined Sony in March 2024. I was a DPhil student at the Mathematical Institute, University of Oxford, from October 2020 to January 2024 (thesis). My research interests lie in applied probability, including mathematical statistics, numerical analysis and machine learning, with a particular focus on random convex hulls and kernel quadrature.

e-mail: hayakawa [at] alumni.u-tokyo.ac.jp
Google Scholar: link

News

2024.10
Two preprints have been released:
  • Distillation of Discrete Diffusion through Dimensional Correlations (arXiv; to be presented at the NeurIPS 2024 Workshop on Machine Learning and Compression)
  • Jump Your Steps: Optimizing Sampling Schedule of Discrete Diffusion Models (arXiv)
2024.04
I received a DPhil Thesis Prize from the Mathematical Institute at the University of Oxford. (link)
2024.03
Joined Sony Group Corporation.
2024.02
Our paper "Policy Gradient with Kernel Quadrature" (joint work with Tetsuro Morimura), which summarizes my research internship at CyberAgent AI Lab, has been published in Transactions on Machine Learning Research.
2024.01
I passed my viva voce (final oral examination) on 30 November 2023 and was officially granted leave to supplicate on 16 January 2024, marking the formal completion of my DPhil in Mathematics. The thesis is available online.
2024.01
Our paper "Adaptive Batch Sizes in Active Learning: A Probabilistic Numerics Approach" has been accepted for presentation at AISTATS 2024. (arXiv)
2023.07
Two papers have been published as part of the proceedings of ICML 2023:
  • Sampling-based Nyström Approximation and Kernel Quadrature (proceedings)
  • Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks with Quantum Computation (proceedings)
2023.05
Two papers have been published:
  • Hypercontractivity meets random convex hulls: analysis of randomized multivariate cubatures (Proceedings of the Royal Society A)
  • Convergence analysis of approximation formulas for analytic functions via duality for potential energy minimization (Japan Journal of Industrial and Applied Mathematics)
2023.01
Our paper "Estimating the probability that a given vector is in the convex hull of a random sample" has been published in Probability Theory and Related Fields.
2022.11
A blog post (in Japanese) on my summer internship at Preferred Networks has been released.
2022.09
Two papers have been accepted for presentation at NeurIPS 2022:
  • Positively weighted kernel quadrature via subsampling (arXiv)
  • Fast Bayesian inference with batch Bayesian quadrature via kernel recombination (arXiv)

Publications

Preprints

1. Satoshi Hayakawa, Yuhta Takida, Masaaki Imaizumi, Hiromi Wakaki, Yuki Mitsufuji. (2024). Distillation of discrete diffusion through dimensional correlations. (arXiv)
2. Yong-Hyun Park, Chieh-Hsin Lai, Satoshi Hayakawa, Yuhta Takida, Yuki Mitsufuji. (2024). Jump your steps: optimizing sampling schedule of discrete diffusion models. (arXiv)
3. Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Saad Hamid, Harald Oberhauser, Michael A Osborne. (2024). A quadrature approach for general-purpose batch Bayesian optimization via probabilistic lifting. (arXiv)

Refereed Papers

1. Satoshi Hayakawa, Tetsuro Morimura. (2024). Policy Gradient with Kernel Quadrature, Transactions on Machine Learning Research, published online. (link)
2. Masaki Adachi, Satoshi Hayakawa, Xingchen Wan, Martin Jørgensen, Vu Nguyen, Harald Oberhauser, Michael A Osborne. (2024). Adaptive batch sizes in active learning: a probabilistic numerics approach, Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024), accepted. (arXiv)
3. Satoshi Hayakawa and Ken'ichiro Tanaka. (2024). Convergence analysis of approximation formulas for analytic functions via duality for potential energy minimization, Japan Journal of Industrial and Applied Mathematics, 41, 105-127. (link)
4. Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A Osborne. (2023). SOBER: highly parallel Bayesian optimization and Bayesian quadrature over discrete and mixed spaces, ICML 2023 Workshop: Sampling and Optimization in Discrete Space. (arXiv)
5. Hayata Yamasaki, Sathyawageeswar Subramanian, Satoshi Hayakawa, Sho Sonoda. (2023). Quantum ridgelet transform: winning lottery ticket of neural networks with quantum computation, Proceedings of the 40th International Conference on Machine Learning (ICML 2023), 39008-39034. (arXiv, proceedings)
6. Satoshi Hayakawa, Harald Oberhauser and Terry Lyons. (2023). Sampling-based Nyström approximation and kernel quadrature, Proceedings of the 40th International Conference on Machine Learning (ICML 2023), 12678-12699. (arXiv, proceedings)
7. Satoshi Hayakawa, Harald Oberhauser and Terry Lyons. (2023). Hypercontractivity meets random convex hulls: analysis of randomized multivariate cubatures, Proceedings of the Royal Society A, 479(2273), 20220725. (link)
8. Satoshi Hayakawa, Terry Lyons and Harald Oberhauser. (2023). Estimating the probability that a given vector is in the convex hull of a random sample, Probability Theory and Related Fields, 185, 705-746. (link)
9. Masaki Adachi*, Satoshi Hayakawa*, Martin Jørgensen, Harald Oberhauser and Michael A Osborne. (2022). Fast Bayesian inference with batch Bayesian quadrature via kernel recombination, Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 16533-16547. (*equal contribution, arXiv, proceedings)
10. Satoshi Hayakawa, Harald Oberhauser and Terry Lyons. (2022). Positively weighted kernel quadrature via subsampling, Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 6886-6900. (arXiv, proceedings)
11. Satoshi Hayakawa and Ken'ichiro Tanaka. (2022). Monte Carlo construction of cubature on Wiener space, Japan Journal of Industrial and Applied Mathematics, 39(2), 543-571. (link)
12. Satoshi Hayakawa. (2021). Monte Carlo cubature construction, Japan Journal of Industrial and Applied Mathematics, 38(2), 561-577. (link)
13. Satoshi Hayakawa and Ken'ichiro Tanaka. (2020). Error bounds of potential theoretic numerical integration formulas in weighted Hardy spaces, JSIAM Letters, 12, 21-24. (link)
14. Satoshi Hayakawa and Taiji Suzuki. (2020). On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces, Neural Networks, 123, 343-361. (link)
15. Hisashi Hayakawa, Yusuke Ebihara, David P Hand, Satoshi Hayakawa, Sandeep Kumar, Shyamoli Mukherjee and B Veenadhari. (2018). Low-latitude Aurorae during the Extreme Space Weather Events in 1859, The Astrophysical Journal, 869(1), 57. (link)
16. Hisashi Hayakawa, Yusuke Ebihara, José M Vaquero, Kentaro Hattori, Víctor MS Carrasco, María de la Cruz Gallego, Satoshi Hayakawa, Yoshikazu Watanabe, Kiyomi Iwahashi, Harufumi Tamazawa, Akito D Kawamura and Hiroaki Isobe. (2018). A great space weather event in February 1730, Astronomy & Astrophysics, 616, A177. (link)
17. Hisashi Hayakawa, Yusuke Ebihara, David M. Willis, Kentaro Hattori, Alessandra S. Giunta, Matthew N. Wild, Satoshi Hayakawa, Shin Toriumi, Yasuyuki Mitsuma, Lee T. Macdonald, Kazunari Shibata, and Sam M. Silverman. (2018). The Great Space Weather Event during 1872 February Recorded in East Asia, The Astrophysical Journal, 862(1), 15. (link)

Talks

English

1. Nyström approximation and convex kernel quadrature, ICIAM 2023, Tokyo, August 24, 2023.
2. Positively weighted kernel quadrature and a refined analysis of Nyström approximation, IFAM Seminar, Liverpool, March 22, 2023.
3. Positively weighted kernel quadrature and a refined analysis of Nyström approximation, Probabilistic methods, Signatures, Cubature and Geometry, York, January 9, 2023.
4. Hypercontractivity meets random convex hulls: analysis of randomized multivariate cubatures, Berlin SPDE Seminar, Berlin, November 10, 2022.
5. Random convex hull, cubature, hypercontractivity, 50th Saint-Flour Probability Summer School, Saint-Flour, July 11-23, 2022.
6. Random convex hull, cubature, hypercontractivity, York University Stochastic Analysis Seminar, York, June 13, 2022.
7. Random convex hull, cubature, hypercontractivity, 15th Berlin-Oxford Young Researchers Meeting on Applied Stochastic Analysis, Berlin, May 12-14, 2022.
8. Random convex hull, cubature, hypercontractivity, Toulouse Probability Seminar, Toulouse, May 10, 2022.
9. Random convex hull, cubature, hypercontractivity, QIP-QML 2022, Vienna, March 10, 2022.
10. Random convex hull, cubature, hypercontractivity, Rough Path Interest Group at DataSig, Online, January 26, 2022.
11. Estimating the probability that a given vector is in the convex hull of a random sample, 14th Oxford-Berlin Young Researchers Meeting on Applied Stochastic Analysis, Online, February 10-12, 2021.

Japanese

1. Approximating expectation with random convex hulls and Bayesian quadrature, Japanese Joint Statistical Meeting, Kyoto, September 3-7, 2023.
2. Approximating expectation with random convex hulls and Bayesian quadrature, Probability Young Summer Seminar, Kyoto, August 28-31, 2023.
3. Optimization-free construction of kernel quadrature, JSIAM Annual Meeting, Sapporo, September 8-10, 2022.
4. Recombination, random convex hull, cubature, Tokyo Probability Seminar, Online, April 18, 2022.
5. Random convex hull, cubature, hypercontractivity, Probability Early-Spring Seminar, Online, March 3, 2022.
6. Random convex hulls and kernel quadrature, Probability Young Seminar Online, August 23, 2021.
7. Point processes and subsampling for numerical integration, Intersections of complex networks in real world and infinite particle systems II, Online, August 19, 2021.
8. Estimating the probability that a given vector is in the convex hull of a random sample, Probability Seminar at Kansai University, Online, April 24, 2021.
9. Optimization-based cubature construction and its application to cubature on Wiener space, Probability Young Seminar Online, September 7-9, 2020.
10. Monte Carlo cubature construction, 16th JSIAM Spring Meeting, Tokyo, March 3-4, 2020.
11. Convergence analysis of approximation formulas for analytic functions via duality for potential energy minimization, 82nd Kanazawa Analysis Seminar, Kanazawa, January 15, 2020.
12. On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces, Japanese Joint Statistical Meeting, Shiga, September 8-12, 2019. (slide)
13. Convergence analysis of approximation and numerical integration formulas for analytic functions via duality, JSIAM Annual Meeting, Tokyo, September 3-5, 2019.

Education

2020.10-2024.01
Doctor of Philosophy in Mathematics, the University of Oxford. (thesis)

Supervisors: Professor Terry Lyons & Associate Professor Harald Oberhauser

2019.04-2020.09
Master of Information Science and Technology, The University of Tokyo.

Supervisor: Associate Professor Ken'ichiro Tanaka

2015.04-2019.03
Bachelor of Engineering, The University of Tokyo.

Supervisor: Associate Professor Taiji Suzuki

Awards and Scholarships

Activities

Others