Associate Professor Fred Roosta

Associate Professor

School of Mathematics and Physics
Faculty of Science
fred.roosta@uq.edu.au
+61 7 3365 3259

Overview

Research Interests

  • Machine Learning
  • Numerical Optimization
  • Numerical Analysis
  • Randomized Algorithms
  • Computational Statistics
  • Scientific Computing

Qualifications

  • Doctor of Philosophy, The University of British Columbia

Publications

  • Berahas, Albert S., Roberts, Lindon and Roosta, Fred (2024). Non-uniform smoothness for gradient descent. Transactions on Machine Learning Research.

  • Hodgkinson, Liam, Van Der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2023). Monotonicity and double descent in uncertainty estimation with Gaussian processes. International Conference on Machine Learning, Honolulu, HI United States, 23-29 July 2023. San Diego, CA United States: International Conference on Machine Learning.

  • MacDonald, Samual, Foley, Helena, Yap, Melvyn, Johnston, Rebecca L., Steven, Kaiah, Koufariotis, Lambros T., Sharma, Sowmya, Wood, Scott, Addala, Venkateswar, Pearson, John V., Roosta, Fred, Waddell, Nicola, Kondrashova, Olga and Trzaskowski, Maciej (2023). Generalising uncertainty improves accuracy and safety of deep learning analytics applied to oncology. Scientific Reports, 13 (1) 7395, 1-14. doi: 10.1038/s41598-023-31126-5


Grants


Supervision

  • Doctor of Philosophy

  • Doctor of Philosophy

  • Doctor of Philosophy


Available Projects

  • Design, analysis, and implementation of novel optimization algorithms for modern non-convex machine learning problems.

  • To extend the application range of machine learning to scientific domains, this project will design, analyze and implement novel machine learning techniques that learn from data while conforming to known properties of the underlying scientific models.

  • In this project, invexity as a subclass of non-convexity will be explored. Particular emphasis is on the design of novel machine learning models that are (approximately) invex and, as a result, enjoy favorable properties in the context of optimization.


Publications

Book Chapter

  • Kylasa, Sudhir, Fang, Chih-Hao, Roosta, Fred and Grama, Ananth (2020). Parallel optimization techniques for machine learning. Parallel algorithms in computational science and engineering. (pp. 381-417) edited by Ananth Grama and Ahmed H. Sameh. Cham, Switzerland: Birkhäuser. doi: 10.1007/978-3-030-43736-7_13

  • Ye, Nan, Roosta-Khorasani, Farbod and Cui, Tiangang (2019). Optimization methods for inverse problems. 2017 MATRIX annals. (pp. 121-140) edited by David R. Wood, Jan de Gier, Cheryl E. Praeger and Terence Tao. Cham, Switzerland: Springer. doi: 10.1007/978-3-030-04161-8_9

Journal Article

Conference Publication

  • Hodgkinson, Liam, Van Der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2023). Monotonicity and double descent in uncertainty estimation with Gaussian processes. International Conference on Machine Learning, Honolulu, HI United States, 23-29 July 2023. San Diego, CA United States: International Conference on Machine Learning.

  • Nguyen, Dung, Zhao, Yan, Zhang, Yifan, Huynh, Anh Ngoc-Lan, Roosta, Fred, Hammer, Graeme, Chapman, Scott and Potgieter, Andries (2022). Crop type prediction utilising a long short-term memory with a self-attention for winter crops in Australia. IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17-22 July 2022. Piscataway, NJ, United States: Institute of Electrical and Electronics Engineers. doi: 10.1109/IGARSS46834.2022.9883737

  • van der Heide, Chris, Hodgkinson, Liam, Roosta, Fred and Kroese, Dirk (2021). Shadow Manifold Hamiltonian Monte Carlo. International Conference on Artificial Intelligence and Statistics, Online, 27-30 July 2021. Tempe, AZ, United States: ML Research Press.

  • Tsuchida, Russell, Pearce, Tim, van der Heide, Chris, Roosta, Fred and Gallagher, Marcus (2021). Avoiding kernel fixed points: Computing with ELU and GELU infinite networks. 35th AAAI Conference on Artificial Intelligence, AAAI 2021, Online, 2-9 February 2021. Menlo Park, CA United States: Association for the Advancement of Artificial Intelligence.

  • Feng, Zhili, Roosta, Fred and Woodruff, David P. (2021). Non-PSD matrix sketching with applications to regression and optimization. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).

  • Hodgkinson, Liam, van der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2021). Stochastic continuous normalizing flows: training SDEs as ODEs. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).

  • Fang, Chih-Hao, Kylasa, Sudhir B., Roosta, Fred, Mahoney, Michael W. and Grama, Ananth (2020). Newton-ADMM: a distributed GPU-accelerated optimizer for multiclass classification problems. International Conference on High Performance Computing, Networking, Storage and Analysis (SC), Atlanta, GA, United States, 9-19 November 2020. Piscataway, NJ, United States: IEEE Computer Society. doi: 10.1109/SC41405.2020.00061

  • Crane, Rixon and Roosta, Fred (2020). DINO: Distributed Newton-type optimization method. International Conference on Machine Learning, Virtual, 12-18 July 2020. San Diego, CA, United States: International Conference on Machine Learning.

  • Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2020). Second-order optimization for non-convex machine learning: an empirical study. SIAM International Conference on Data Mining, Cincinnati, OH, United States, 7-9 May 2020. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611976236.23

  • Crane, Rixon and Roosta, Fred (2019). DINGO: Distributed Newton-type method for gradient-norm optimization. Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8-14 December 2019. Maryland Heights, MO United States: Morgan Kaufmann Publishers.

  • Tsuchida, Russell, Roosta, Fred and Gallagher, Marcus (2019). Exchangeability and kernel invariance in trained MLPs. Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10-16 August 2019. Marina del Rey, CA USA: International Joint Conferences on Artificial Intelligence. doi: 10.24963/ijcai.2019/498

  • Kylasa, Sudhir, Roosta, Fred (Farbod), Mahoney, Michael W. and Grama, Ananth (2019). GPU accelerated sub-sampled Newton's method for convex classification problems. SIAM International Conference on Data Mining, Calgary, Canada, 2-4 May 2019. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611975673.79

  • Cheng, Xiang, Roosta-Khorasani, Farbod, Palombo, Stefan, Bartlett, Peter L. and Mahoney, Michael W. (2018). FLAG n’ FLARE: fast linearly-coupled adaptive gradient methods. Twenty-First International Conference on Artificial Intelligence and Statistics, Lanzarote, Canary Islands, 9-11 April 2018. Cambridge, MA, United States: MIT Press.

  • Wang, Shusen, Roosta-Khorasani, Farbod, Xu, Peng and Mahoney, Michael W. (2018). GIANT: Globally improved approximate Newton method for distributed optimization. 32nd Conference on Neural Information Processing Systems, NeurIPS 2018, Montreal, QC, Canada, 2 - 8 December, 2018. Maryland Heights, MO, United States: Neural information processing systems foundation.

  • Tsuchida, Russell, Roosta-Khorasani, Farbod and Gallagher, Marcus (2018). Invariance of weight distributions in rectified MLPs. 35th International Conference on Machine Learning, Stockholm, Sweden, 10-15 July 2018. Cambridge, MA, United States: MIT Press.

  • Levin, Keith, Roosta-Khorasani, Farbod, Mahoney, Michael W. and Priebe, Carey E. (2018). Out-of-sample extension of graph adjacency spectral embedding. 35th International Conference on Machine Learning, Stockholm, Sweden, 10-15 July 2018. Cambridge, MA, United States: MIT Press.

  • Bouchard, Kristofer E., Bujan, Alejandro F., Roosta-Khorasani, Farbod, Prabhat, Snijders, Jian-Hua Mao, Chang, Edward F., Mahoney, Michael W. and Bhattacharyya, Sharmodeep (2017). The Union of Intersections (UoI) method for interpretable data driven discovery and prediction. 31st Annual Conference on Neural Information Processing Systems (NIPS), Long Beach, CA United States, 4-9 December 2017. Maryland Heights, MO, United States: Morgan Kaufmann Publishers.

  • Shun, Julian, Roosta-Khorasani, Farbod, Fountoulakis, Kimon and Mahoney, Michael W. (2016). Parallel local graph clustering. International Conference on Very Large Data Bases, New Delhi, India, 5-9 September 2016. New York, United States: Association for Computing Machinery. doi: 10.14778/2994509.2994522

  • Xu, Peng, Yang, Jiyan, Roosta-Khorasani, Farbod, Re, Christopher and Mahoney, Michael (2016). Sub-sampled Newton methods with non-uniform sampling. Neural Information Processing Systems 2016, Barcelona, Spain, 5-10 December 2016. La Jolla, CA United States: Neural Information Processing Systems Foundation.

Grants (Administered at UQ)

PhD and MPhil Supervision

Current Supervision

  • Doctor of Philosophy — Principal Advisor

    Other advisors:

  • Doctor of Philosophy — Principal Advisor

  • Doctor of Philosophy — Principal Advisor

  • Doctor of Philosophy — Principal Advisor

    Other advisors:

  • Doctor of Philosophy — Principal Advisor

  • Doctor of Philosophy — Principal Advisor

  • Doctor of Philosophy — Associate Advisor

    Other advisors:

  • Doctor of Philosophy — Associate Advisor

Completed Supervision

Possible Research Projects

Note for students: The possible research projects listed on this page may not be comprehensive or up to date. Always feel free to contact the staff for more information, and also with your own research ideas.

  • Design, analysis, and implementation of novel optimization algorithms for modern non-convex machine learning problems.

  • To extend the application range of machine learning to scientific domains, this project will design, analyze and implement novel machine learning techniques that learn from data while conforming to known properties of the underlying scientific models.

  • In this project, invexity as a subclass of non-convexity will be explored. Particular emphasis is on the design of novel machine learning models that are (approximately) invex and, as a result, enjoy favorable properties in the context of optimization.

  • This project will investigate, both theoretically and empirically, novel statistical techniques to explore the trade-offs between high generalization performance and low model complexity for scientific machine learning.

  • This project will explore, both theoretically and empirically, the application of kernelized Stein discrepancy to the design of novel randomized algorithms in a diverse range of contexts, from optimization to statistical estimation.

  • This project aims to develop the next generation of second-order optimization methods for training complex machine learning models, with particular focus on scientific machine learning applications.

  • This project aims to design, analyze and implement efficient optimization algorithms suitable for distributed computing environments, with a focus on large-scale machine learning.
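Several of the projects above concern second-order and Newton-type methods, which also feature in the publication list (e.g. the sub-sampled Newton papers). As a loose, hypothetical illustration only — not code from any of these projects — a single sub-sampled Newton step for a 2-D L2-regularized logistic regression can be sketched as follows:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def subsampled_newton_step(w, X, y, lam=0.1, batch=32, rng=None):
    """One sub-sampled Newton step for 2-D L2-regularized logistic regression.

    The gradient is computed over all n points, but the 2x2 Hessian is
    estimated on a random subsample of size `batch` -- the core idea behind
    sub-sampled Newton methods.
    """
    rng = rng or random.Random(0)
    n = len(X)
    # Full gradient of the average logistic loss plus the L2 term lam * w.
    g0, g1 = lam * w[0], lam * w[1]
    for (x0, x1), yi in zip(X, y):
        r = sigmoid(w[0] * x0 + w[1] * x1) - yi
        g0 += r * x0 / n
        g1 += r * x1 / n
    # Hessian estimated on a uniform subsample, with lam on the diagonal.
    idx = rng.sample(range(n), min(batch, n))
    h00 = h01 = h11 = 0.0
    for i in idx:
        x0, x1 = X[i]
        p = sigmoid(w[0] * x0 + w[1] * x1)
        d = p * (1.0 - p) / len(idx)
        h00 += d * x0 * x0
        h01 += d * x0 * x1
        h11 += d * x1 * x1
    h00 += lam
    h11 += lam
    # Newton direction: solve the 2x2 system H s = g in closed form.
    det = h00 * h11 - h01 * h01
    s0 = (h11 * g0 - h01 * g1) / det
    s1 = (h00 * g1 - h01 * g0) / det
    return (w[0] - s0, w[1] - s1)

# Toy separable data: the label is 1 exactly when x0 + 2 * x1 > 0.
rng = random.Random(1)
X = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(200)]
y = [1.0 if x0 + 2 * x1 > 0 else 0.0 for (x0, x1) in X]
w = (0.0, 0.0)
for _ in range(15):
    w = subsampled_newton_step(w, X, y, rng=rng)
acc = sum(
    (sigmoid(w[0] * x0 + w[1] * x1) > 0.5) == (yi == 1.0)
    for (x0, x1), yi in zip(X, y)
) / len(X)
```

The full gradient keeps the step direction unbiased, while estimating the Hessian on a small subsample reduces the dominant per-iteration cost; this gradient/Hessian cost trade-off is exactly what sub-sampled second-order methods analyze.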