Parallel optimization techniques for machine learning
Kylasa, Sudhir, Fang, Chih-Hao, Roosta, Fred and Grama, Ananth (2020). Parallel optimization techniques for machine learning. Parallel algorithms in computational science and engineering. (pp. 381-417) edited by Ananth Grama and Ahmed H. Sameh. Cham, Switzerland: Birkhauser. doi: 10.1007/978-3-030-43736-7_13
Optimization methods for inverse problems
Ye, Nan, Roosta-Khorasani, Farbod and Cui, Tiangang (2019). Optimization methods for inverse problems. 2017 MATRIX annals. (pp. 121-140) edited by David R. Wood, Jan de Gier, Cheryl E. Praeger and Terence Tao. Cham, Switzerland: Springer. doi: 10.1007/978-3-030-04161-8_9
Generalising uncertainty improves accuracy and safety of deep learning analytics applied to oncology
MacDonald, Samual, Foley, Helena, Yap, Melvyn, Johnston, Rebecca L., Steven, Kaiah, Koufariotis, Lambros T., Sharma, Sowmya, Wood, Scott, Addala, Venkateswar, Pearson, John V., Roosta, Fred, Waddell, Nicola, Kondrashova, Olga and Trzaskowski, Maciej (2023). Generalising uncertainty improves accuracy and safety of deep learning analytics applied to oncology. Scientific Reports, 13 (1) 7395, 1-14. doi: 10.1038/s41598-023-31126-5
Inexact Newton-CG algorithms with complexity guarantees
Yao, Zhewei, Xu, Peng, Roosta, Fred, Wright, Stephen J. and Mahoney, Michael W. (2023). Inexact Newton-CG algorithms with complexity guarantees. IMA Journal of Numerical Analysis, 43 (3), 1855-1897. doi: 10.1093/imanum/drac043
MINRES: From negative curvature detection to monotonicity properties
Liu, Yang and Roosta, Fred (2022). MINRES: From negative curvature detection to monotonicity properties. SIAM Journal on Optimization, 32 (4), 2636-2661. doi: 10.1137/21m143666x
Confirming the Lassonde Curve through life cycle analysis and its effect on share price: A case study of three ASX listed gold companies
Rijsdijk, Timothy, Nehring, Micah, Kizil, Mehmet and Roosta, Fred (2022). Confirming the Lassonde Curve through life cycle analysis and its effect on share price: A case study of three ASX listed gold companies. Resources Policy, 77 102704, 1-12. doi: 10.1016/j.resourpol.2022.102704
Newton-MR: inexact Newton method with minimum residual sub-problem solver
Roosta, Fred, Liu, Yang, Xu, Peng and Mahoney, Michael W. (2022). Newton-MR: inexact Newton method with minimum residual sub-problem solver. EURO Journal on Computational Optimization, 10 100035, 1-44. doi: 10.1016/j.ejco.2022.100035
LSAR: efficient leverage score sampling algorithm for the analysis of big time series data
Eshragh, Ali, Roosta, Fred, Nazari, Asef and Mahoney, Michael W. (2022). LSAR: efficient leverage score sampling algorithm for the analysis of big time series data. Journal of Machine Learning Research, 23, 1-36.
Implicit Langevin algorithms for sampling from log-concave densities
Hodgkinson, Liam, Salomone, Robert and Roosta, Fred (2021). Implicit Langevin algorithms for sampling from log-concave densities. Journal of Machine Learning Research, 22 136, 1-30.
Evolution and application of digital technologies to predict crop type and crop phenology in agriculture
Potgieter, A. B., Zhao, Yan, Zarco-Tejada, Pablo J., Chenu, Karine, Zhang, Yifan, Porker, Kenton, Biddulph, Ben, Dang, Yash P., Neale, Tim, Roosta, Fred and Chapman, Scott (2021). Evolution and application of digital technologies to predict crop type and crop phenology in agriculture. In Silico Plants, 3 (1) diab017, 1-23. doi: 10.1093/insilicoplants/diab017
Inexact nonconvex Newton-type methods
Yao, Zhewei, Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2021). Inexact nonconvex Newton-type methods. INFORMS Journal on Optimization, 3 (2), 154-182. doi: 10.1287/ijoo.2019.0043
Convergence of Newton-MR under inexact Hessian information
Liu, Yang and Roosta, Fred (2021). Convergence of Newton-MR under inexact Hessian information. SIAM Journal on Optimization, 31 (1), 59-90. doi: 10.1137/19M1302211
Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings
Levin, Keith D., Roosta, Fred, Tang, Minh, Mahoney, Michael W. and Priebe, Carey E. (2021). Limit theorems for out-of-sample extensions of the adjacency and Laplacian spectral embeddings. Journal of Machine Learning Research, 22 194, 1-59.
Newton-type methods for non-convex optimization under inexact Hessian information
Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2020). Newton-type methods for non-convex optimization under inexact Hessian information. Mathematical Programming, 184 (1-2), 35-70. doi: 10.1007/s10107-019-01405-z
Sub-sampled Newton methods
Roosta-Khorasani, Farbod and Mahoney, Michael W. (2018). Sub-sampled Newton methods. Mathematical Programming, 174 (1-2), 293-326. doi: 10.1007/s10107-018-1346-5
Variational perspective on local graph clustering
Fountoulakis, Kimon, Roosta-Khorasani, Farbod, Shun, Julian, Cheng, Xiang and Mahoney, Michael W. (2017). Variational perspective on local graph clustering. Mathematical Programming, 174 (1-2), 553-573. doi: 10.1007/s10107-017-1214-8
Algorithms that satisfy a stopping criterion, probably
Ascher, Uri and Roosta-Khorasani, Farbod (2016). Algorithms that satisfy a stopping criterion, probably. Vietnam Journal of Mathematics, 44 (1), 49-69. doi: 10.1007/s10013-015-0167-6
Schur properties of convolutions of gamma random variables
Roosta-Khorasani, Farbod and Székely, Gábor J. (2015). Schur properties of convolutions of gamma random variables. Metrika, 78 (8), 997-1014. doi: 10.1007/s00184-015-0537-9
Improved bounds on sample size for implicit matrix trace estimators
Roosta-Khorasani, Farbod and Ascher, Uri (2015). Improved bounds on sample size for implicit matrix trace estimators. Foundations of Computational Mathematics, 15 (5), 1187-1212. doi: 10.1007/s10208-014-9220-1
Assessing stochastic algorithms for large scale nonlinear least squares problems using extremal probabilities of linear combinations of gamma random variables
Roosta-Khorasani, Farbod, Székely, Gábor J. and Ascher, Uri M. (2015). Assessing stochastic algorithms for large scale nonlinear least squares problems using extremal probabilities of linear combinations of gamma random variables. SIAM/ASA Journal on Uncertainty Quantification, 3 (1), 61-90. doi: 10.1137/14096311X
Data completion and stochastic algorithms for PDE inversion problems with many measurements
Roosta-Khorasani, Farbod, van den Doel, Kees and Ascher, Uri (2014). Data completion and stochastic algorithms for PDE inversion problems with many measurements. Electronic Transactions on Numerical Analysis, 42, 177-196.
Stochastic algorithms for inverse problems involving PDEs and many measurements
Roosta-Khorasani, Farbod, van den Doel, Kees and Ascher, Uri (2014). Stochastic algorithms for inverse problems involving PDEs and many measurements. SIAM Journal on Scientific Computing, 36 (5), S3-S22. doi: 10.1137/130922756
Crop type prediction utilising a long short-term memory with a self-attention for winter crops in Australia
Nguyen, Dung, Zhao, Yan, Zhang, Yifan, Huynh, Anh Ngoc-Lan, Roosta, Fred, Hammer, Graeme, Chapman, Scott and Potgieter, Andries (2022). Crop type prediction utilising a long short-term memory with a self-attention for winter crops in Australia. IGARSS 2022 - 2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17-22 July 2022. Piscataway, NJ, United States: Institute of Electrical and Electronics Engineers. doi: 10.1109/IGARSS46834.2022.9883737
Shadow Manifold Hamiltonian Monte Carlo
van der Heide, Chris, Hodgkinson, Liam, Roosta, Fred and Kroese, Dirk (2021). Shadow Manifold Hamiltonian Monte Carlo. International Conference on Artificial Intelligence and Statistics, Online, 27-30 July 2021. Tempe, AZ, United States: ML Research Press.
Avoiding kernel fixed points: Computing with ELU and GELU infinite networks
Tsuchida, Russell, Pearce, Tim, van der Heide, Chris, Roosta, Fred and Gallagher, Marcus (2021). Avoiding kernel fixed points: Computing with ELU and GELU infinite networks. 35th AAAI Conference on Artificial Intelligence, AAAI 2021, Online, 2-9 February 2021. Menlo Park, CA, United States: Association for the Advancement of Artificial Intelligence.
Non-PSD matrix sketching with applications to regression and optimization
Feng, Zhili, Roosta, Fred and Woodruff, David P. (2021). Non-PSD matrix sketching with applications to regression and optimization. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).
Stochastic continuous normalizing flows: training SDEs as ODEs
Hodgkinson, Liam, van der Heide, Chris, Roosta, Fred and Mahoney, Michael W. (2021). Stochastic continuous normalizing flows: training SDEs as ODEs. Conference on Uncertainty in Artificial Intelligence, Online, 27-29 July 2021. San Diego, CA, United States: Association For Uncertainty in Artificial Intelligence (AUAI).
Newton-ADMM: a distributed GPU-accelerated optimizer for multiclass classification problems
Fang, Chih-Hao, Kylasa, Sudhir B., Roosta, Fred, Mahoney, Michael W. and Grama, Ananth (2020). Newton-ADMM: a distributed GPU-accelerated optimizer for multiclass classification problems. International Conference on High Performance Computing, Networking, Storage and Analysis (SC), Atlanta, GA, United States, 9-19 November 2020. Piscataway, NJ, United States: IEEE Computer Society. doi: 10.1109/SC41405.2020.00061
DINO: Distributed Newton-type optimization method
Crane, Rixon and Roosta, Fred (2020). DINO: Distributed Newton-type optimization method. International Conference on Machine Learning, Virtual, 12-18 July 2020. San Diego, CA, United States: International Conference on Machine Learning.
Second-order optimization for non-convex machine learning: an empirical study
Xu, Peng, Roosta, Fred and Mahoney, Michael W. (2020). Second-order optimization for non-convex machine learning: an empirical study. SIAM International Conference on Data Mining, Cincinnati, OH, United States, 7-9 May 2020. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611976236.23
DINGO: Distributed Newton-type method for gradient-norm optimization
Crane, Rixon and Roosta, Fred (2019). DINGO: Distributed Newton-type method for gradient-norm optimization. Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8-14 December 2019. Maryland Heights, MO, United States: Morgan Kaufmann Publishers.
Exchangeability and kernel invariance in trained MLPs
Tsuchida, Russell, Roosta, Fred and Gallagher, Marcus (2019). Exchangeability and kernel invariance in trained MLPs. Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19), Macao, China, 10-16 August 2019. Marina del Rey, CA, United States: International Joint Conferences on Artificial Intelligence. doi: 10.24963/ijcai.2019/498
GPU accelerated sub-sampled Newton's method for convex classification problems
Kylasa, Sudhir, Roosta, Fred (Farbod), Mahoney, Michael W. and Grama, Ananth (2019). GPU accelerated sub-sampled Newton's method for convex classification problems. SIAM International Conference on Data Mining, Calgary, Canada, 2-4 May 2019. Philadelphia, PA, United States: Society for Industrial and Applied Mathematics. doi: 10.1137/1.9781611975673.79
FLAG n’ FLARE: fast linearly-coupled adaptive gradient methods
Cheng, Xiang, Roosta-Khorasani, Farbod, Palombo, Stefan, Bartlett, Peter L. and Mahoney, Michael W. (2018). FLAG n’ FLARE: fast linearly-coupled adaptive gradient methods. Twenty-First International Conference on Artificial Intelligence and Statistics, Lanzarote, Canary Islands, 9-11 April 2018. Cambridge, MA, United States: MIT Press.
GIANT: Globally improved approximate Newton method for distributed optimization
Wang, Shusen, Roosta-Khorasani, Farbod, Xu, Peng and Mahoney, Michael W. (2018). GIANT: Globally improved approximate Newton method for distributed optimization. 32nd Conference on Neural Information Processing Systems, NeurIPS 2018, Montreal, QC, Canada, 2-8 December 2018. Maryland Heights, MO, United States: Neural Information Processing Systems Foundation.
Invariance of weight distributions in rectified MLPs
Tsuchida, Russell, Roosta-Khorasani, Farbod and Gallagher, Marcus (2018). Invariance of weight distributions in rectified MLPs. 35th International Conference on Machine Learning, Stockholm, Sweden, 10-15 July 2018. Cambridge, MA, United States: MIT Press.
Out-of-sample extension of graph adjacency spectral embedding
Levin, Keith, Roosta-Khorasani, Farbod, Mahoney, Michael W. and Priebe, Carey E. (2018). Out-of-sample extension of graph adjacency spectral embedding. 35th International Conference on Machine Learning, Stockholm, Sweden, 10-15 July 2018. Cambridge, MA, United States: MIT Press.
The Union of Intersections (UoI) method for interpretable data driven discovery and prediction
Bouchard, Kristofer E., Bujan, Alejandro F., Roosta-Khorasani, Farbod, Prabhat, Snijders, Antoine M., Mao, Jian-Hua, Chang, Edward F., Mahoney, Michael W. and Bhattacharyya, Sharmodeep (2017). The Union of Intersections (UoI) method for interpretable data driven discovery and prediction. 31st Annual Conference on Neural Information Processing Systems (NIPS), Long Beach, CA, United States, 4-9 December 2017. Maryland Heights, MO, United States: Morgan Kaufmann Publishers.
Parallel local graph clustering
Shun, Julian, Roosta-Khorasani, Farbod, Fountoulakis, Kimon and Mahoney, Michael W. (2016). Parallel local graph clustering. International Conference on Very Large Data Bases, New Delhi, India, 5-9 September 2016. New York, United States: Association for Computing Machinery. doi: 10.14778/2994509.2994522
Sub-sampled Newton methods with non-uniform sampling
Xu, Peng, Yang, Jiyan, Roosta-Khorasani, Farbod, Re, Christopher and Mahoney, Michael (2016). Sub-sampled Newton methods with non-uniform sampling. Neural Information Processing Systems 2016, Barcelona, Spain, 5-10 December 2016. La Jolla, CA, United States: Neural Information Processing Systems Foundation.
ARC Training Centre for Information Resilience
(2021–2026) ARC Industrial Transformation Training Centres
CropVision: A next-generation system for predicting crop production
(2021–2025) ARC Linkage Projects
Big time series data and randomised numerical linear algebra
(2021) University of Melbourne
Approximate solutions to large Markov decision processes
(2019) University of Melbourne
Efficient Second-Order Optimisation Algorithms for Learning from Big Data
(2018–2023) ARC Discovery Early Career Researcher Award
Stochastic Simulation and Optimization Methods for Machine Learning
Doctor of Philosophy — Principal Advisor
Interpretable AI: Theory and Practice
Doctor of Philosophy — Principal Advisor
Newton-type methods for constrained optimization
Doctor of Philosophy — Principal Advisor
Advancing Deep Neural Network Reliability During Dataset Shift
Doctor of Philosophy — Principal Advisor
Characterizing Influence and Sensitivity in the Interpolating Regime
Doctor of Philosophy — Principal Advisor
Forecasting the Market Capitalisation of ASX Listed Junior Resource Companies through an Artificial Neural Network
Doctor of Philosophy — Associate Advisor
Newton-MR Methods for Non-convex Smooth Unconstrained Optimizations
(2023) Doctor of Philosophy — Principal Advisor
Efficient second-order optimisation methods for large scale machine learning
(2022) Doctor of Philosophy — Principal Advisor
Discounting-free Policy Gradient Reinforcement Learning from Transient States
(2022) Doctor of Philosophy — Associate Advisor
Results on Infinitely Wide Multi-layer Perceptrons
(2020) Doctor of Philosophy — Associate Advisor
Advances in Monte Carlo Methodology
(2018) Doctor of Philosophy — Associate Advisor
Note for students: The research projects listed on this page may not be comprehensive or up to date. Feel free to contact the staff member for more information, or to propose your own research ideas.
Non-convex Optimization for Machine Learning
Design, analysis, and implementation of novel algorithms for the optimization of modern non-convex machine learning problems.
Novel Machine Learning Models for Scientific Discovery
To extend the application range of machine learning to scientific domains, this project will design, analyze, and implement novel machine learning techniques that learn from data while conforming to the known properties of the underlying scientific models.
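For intuition, one standard way to encode such conformity is a penalised objective that augments the data-fitting loss with the residual of a known governing operator. The formulation below is an illustrative sketch of that idea, not the project's stated methodology.

```latex
% Illustrative only: fit the data while respecting a known operator N
% (e.g., the governing equations of the scientific model).
\min_{\theta} \;
    \underbrace{\frac{1}{n} \sum_{i=1}^{n}
        \bigl\| f_{\theta}(x_i) - y_i \bigr\|^2}_{\text{data fit}}
    \;+\; \lambda \,
    \underbrace{\bigl\| \mathcal{N}[f_{\theta}] \bigr\|^2}_{\text{consistency with the model}}
```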
Exploring Invexity in Optimization of Machine Learning Models
This project will explore invexity, a subclass of non-convexity. Particular emphasis is on the design of novel machine learning models that are (approximately) invex and, as a result, enjoy favorable properties in the context of optimization.
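For background, the textbook definition the blurb alludes to: a differentiable function is invex when a suitable kernel map replaces the difference x - y in the usual gradient inequality for convexity.

```latex
% f : R^n -> R differentiable is invex if there exists a map
% eta : R^n x R^n -> R^n such that, for all x and y,
f(x) - f(y) \;\ge\; \eta(x, y)^{\top} \nabla f(y)
% Taking eta(x, y) = x - y recovers the convex case, so every convex
% function is invex. Equivalently, f is invex if and only if every
% stationary point of f is a global minimizer; this is the favorable
% optimization property referred to above.
```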
Exploring the Predictivity-Parsimony Trade-off in Scientific Machine Learning
This project will investigate, both theoretically and empirically, novel statistical techniques for exploring the trade-off between high generalization performance and low model complexity in scientific machine learning.
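One classical instance of this trade-off, included purely as an illustration (the project's actual formulation may differ), is penalised empirical risk minimisation:

```latex
% lambda >= 0 trades predictive accuracy against model complexity;
% sweeping lambda traces out the predictivity-parsimony curve.
\min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n}
    \ell\bigl( f_{\theta}(x_i),\, y_i \bigr) \;+\; \lambda\, \Omega(\theta)
% e.g., Omega(theta) = ||theta||_1 favours sparse, parsimonious models.
```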
Randomized Algorithm Using Kernelized Stein Discrepancy
This project will explore, both theoretically and empirically, the use of kernelized Stein discrepancy in the design of novel randomized algorithms across a diverse range of contexts, from optimization to statistical estimation.
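To make the central quantity concrete, here is a minimal, self-contained estimator of the squared kernelized Stein discrepancy, using the standard V-statistic with an RBF kernel. The kernel choice, bandwidth, and Gaussian test target are illustrative assumptions, not design decisions of the project.

```python
import numpy as np

def ksd_rbf(samples, score, h=1.0):
    """V-statistic estimate of the squared kernelized Stein discrepancy
    between the empirical distribution of `samples` and a target density p
    with score function score(x) = grad log p(x), using the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    n, d = samples.shape
    s = np.array([score(x) for x in samples])         # (n, d) scores
    diff = samples[:, None, :] - samples[None, :, :]  # (n, n, d) x - y
    sqd = np.sum(diff ** 2, axis=-1)                  # (n, n) ||x - y||^2
    k = np.exp(-sqd / (2 * h ** 2))                   # (n, n) kernel matrix
    # Stein kernel: u_p(x, y) = s(x)'k s(y) + s(x)'grad_y k
    #               + s(y)'grad_x k + trace(grad_x grad_y k)
    term1 = (s @ s.T) * k
    term2 = np.einsum("id,ijd->ij", s, diff) * k / h ** 2   # s(x)'grad_y k
    term3 = -np.einsum("jd,ijd->ij", s, diff) * k / h ** 2  # s(y)'grad_x k
    term4 = (d / h ** 2 - sqd / h ** 4) * k
    return np.mean(term1 + term2 + term3 + term4)

rng = np.random.default_rng(0)
x = rng.standard_normal((200, 2))
print(ksd_rbf(x, lambda v: -v))        # standard-normal target: near zero
print(ksd_rbf(x + 2.0, lambda v: -v))  # shifted samples: clearly larger
```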
Second-order Optimization Algorithms for Machine Learning
This project aims to develop the next generation of second-order optimization methods for training complex machine learning models, with particular focus on scientific machine learning applications.
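As a rough illustration of the family of methods involved, below is a simplified sub-sampled Newton-CG loop in the spirit of the "Sub-sampled Newton methods" and "Inexact Newton-CG" papers listed above. All function names and the logistic-regression test problem are assumptions made for this sketch; practical variants add line searches, damping, and negative-curvature handling.

```python
import numpy as np

def cg(A, b, tol=1e-4, max_iter=100):
    """Plain conjugate gradient for A p = b, A symmetric positive definite."""
    p, r = np.zeros_like(b), b.copy()
    d = r.copy()
    for _ in range(max_iter):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        p = p + alpha * d
        r_new = r - alpha * Ad
        if np.linalg.norm(r_new) <= tol * np.linalg.norm(b):
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return p

def subsampled_newton_cg(grad, hess_sample, x0, n, batch=128, iters=50):
    """Full gradient; Hessian estimated on a random batch of the n data
    points; Newton system solved inexactly by CG (unit step for brevity)."""
    rng = np.random.default_rng(0)
    x = x0.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=min(batch, n), replace=False)
        x = x + cg(hess_sample(x, idx), -grad(x))
    return x

# Ridge-regularised logistic regression on synthetic data.
rng = np.random.default_rng(1)
n, d = 2000, 20
A = rng.standard_normal((n, d))
y = rng.integers(0, 2, n) * 2.0 - 1.0

def grad(x):
    z = -y * (A @ x)
    return A.T @ (-y / (1 + np.exp(-z))) / n + 1e-3 * x

def hess_sample(x, idx):
    Ai = A[idx]
    w = 1 / (1 + np.exp(-(Ai @ x)))            # sigmoid (sign of y cancels)
    return (Ai.T * (w * (1 - w))) @ Ai / len(idx) + 1e-3 * np.eye(d)

x_star = subsampled_newton_cg(grad, hess_sample, np.zeros(d), n)
print(np.linalg.norm(grad(x_star)))            # gradient norm should be small
```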
Distributed Optimization Algorithms for Large-scale Machine Learning
This project aims to design, analyze, and implement efficient optimization algorithms suitable for distributed computing environments, with a focus on large-scale machine learning.
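To illustrate the communication pattern such algorithms target, here is a single-process simulation of one averaged Newton-type step in the style of GIANT (listed above): each "worker" solves a Newton system against the Hessian of its own data shard, and the driver averages the directions. The loop stands in for parallel workers, local solves would typically use CG in practice, and all names here are illustrative.

```python
import numpy as np

def averaged_newton_step(x, shards, grad_full, local_hess, reg=1e-3):
    """One GIANT-style step: a single gradient all-reduce, then each
    worker solves its local Newton system and the directions are averaged.
    Only d-dimensional vectors would cross the network in a real system."""
    g = grad_full(x)
    d = len(x)
    dirs = [np.linalg.solve(local_hess(x, s) + reg * np.eye(d), -g)
            for s in shards]          # independent per-worker solves
    return x + np.mean(dirs, axis=0)

# Least squares split across 4 simulated workers.
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 10))
y = rng.standard_normal(1000)
shards = np.array_split(np.arange(1000), 4)
grad_full = lambda x: A.T @ (A @ x - y) / 1000
local_hess = lambda x, s: A[s].T @ A[s] / len(s)

x = np.zeros(10)
for _ in range(5):
    x = averaged_newton_step(x, shards, grad_full, local_hess)
print(np.linalg.norm(grad_full(x)))   # near zero after a few steps
```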