Mode
Internal

Study As
Full Time

Principal Supervisor
Professor Jiuyong Li

Main Campus
Mawson Lakes

Applications Close
02 Aug 2024

Study Level
PhD

Applications Open To
Domestic Candidate

Tuition Fees:

All domestic students are eligible for a fee waiver. Find out more about fees and conditions.

Project Stipend:
No stipend available

About This Project 

Foundation models are complex neural networks that are now used across a wide range of applications; ChatGPT is a typical example of a foundation model. Spatial-temporal foundation models have demonstrated remarkable accuracy and generalizability in weather and climate forecasting tasks. While large foundation models excel in performance, their inherent black-box nature raises serious concerns about transparency and trustworthiness, especially in critical decision-making processes such as drought and flood forecasting. Interpretable methods for large-scale climate foundation models are therefore urgently needed.

The goals of the project are twofold. The first is to develop new explanation methods for climate spatial-temporal foundation models, such as variable attribution methods that estimate the weight each input variable carries in a prediction. The second is to use causal discovery and inference approaches to explore predictions through counterfactual reasoning under different scenarios. The outcome will make the predictions of spatial-temporal foundation models transparent and support human-AI interactive decision-making.
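As a purely illustrative sketch of the first goal, the snippet below computes a simple gradient-times-input attribution that assigns a weight to each input variable of a toy spatial-temporal predictor. The model, variable names, and tensor shapes are hypothetical stand-ins for illustration only; they are not the project's method or any specific climate foundation model.

```python
# Illustrative sketch only (not the project's method): gradient-times-input
# attribution for a toy spatial-temporal predictor. Model, variable names and
# tensor shapes are hypothetical stand-ins.
import torch
import torch.nn as nn


class ToySpatialTemporalModel(nn.Module):
    """Maps inputs of shape (batch, variables, time, lat, lon) to a scalar forecast."""

    def __init__(self, n_vars: int):
        super().__init__()
        self.conv = nn.Conv3d(n_vars, 8, kernel_size=3, padding=1)
        self.head = nn.Linear(8, 1)

    def forward(self, x):
        h = torch.relu(self.conv(x))      # (batch, 8, time, lat, lon)
        h = h.mean(dim=(2, 3, 4))         # pool over time and space -> (batch, 8)
        return self.head(h).squeeze(-1)   # one forecast value per sample


def variable_attribution(model, x):
    """Estimate a weight per input variable via gradient x input,
    aggregated over the time and spatial dimensions."""
    x = x.clone().requires_grad_(True)
    model(x).sum().backward()
    contrib = (x.grad * x).sum(dim=(2, 3, 4))   # (batch, variables)
    return contrib / contrib.abs().sum(dim=1, keepdim=True).clamp(min=1e-8)


if __name__ == "__main__":
    variables = ["temperature", "pressure", "humidity", "wind"]   # hypothetical
    model = ToySpatialTemporalModel(n_vars=len(variables))
    x = torch.randn(2, len(variables), 6, 16, 16)   # 2 samples, 6 time steps, 16x16 grid
    for name, w in zip(variables, variable_attribution(model, x)[0].tolist()):
        print(f"{name}: {w:+.3f}")
```

Gradient-times-input is only one of many attribution schemes; the project would investigate methods better suited to the scale and structure of climate foundation models.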


What you’ll do 

The student will conduct fundamental research on designing model explanation methods for spatial-temporal foundation models. The project is expected to produce high-quality publications in top journals and conferences, which will build the team's track record for a potential future ARC Discovery application on explaining large foundation models for trustworthy decision-making. Technically, the project will develop new interpretable methods for climate foundation models, along with causal counterfactual explanation methods for exploring a prediction under different scenarios to support human-AI interactive decision-making. Because foundation models are used in many applications, the explanation methods developed here are potentially transferable to other domains, such as economic forecasting and scientific discovery.
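As a loose illustration of the counterfactual angle, the sketch below runs a basic "what-if" probe on the hypothetical toy model from the earlier snippet: it intervenes on one input variable and compares the resulting forecast with the original. This is only the simplest form of the scenario questions the project would formalise with proper causal discovery and counterfactual inference; the variable index and scaling factor are assumptions for illustration.

```python
# Illustrative sketch only: a basic "what-if" probe on the hypothetical toy
# model above. Scaling one variable and re-running the forecast is the
# simplest form of the scenario questions the project would formalise with
# causal discovery and counterfactual inference.
import torch


def what_if(model, x, var_index: int, scale: float):
    """Return the change in the forecast when one input variable is scaled."""
    with torch.no_grad():
        baseline = model(x)
        x_cf = x.clone()
        x_cf[:, var_index] = x_cf[:, var_index] * scale   # intervene on one variable
        return model(x_cf) - baseline


# Example question: "How would the forecast change if humidity were 20% higher?"
# delta = what_if(model, x, var_index=2, scale=1.2)
```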


Where you’ll be based 

This project is well aligned with the research theme of the Industrial AI Research Centre and the STEM research strategy. It contributes specifically to explainable AI, a crucial area within the broader field of AI. The project focuses on explaining foundation models, which, following the success of large language models such as ChatGPT, are now used in a wide range of applications, and demand for methods to explain them is expected to grow. The methods developed in the project are also potentially useful for other researchers in the university who use foundation models.

The supervisory team for this PhD project consists of experts in AI and machine learning. Both Prof Jiuyong Li and Lin Liu are experts in data mining and machine learning research and have studied causal discovery and inference for many years. Their work has been supported by three ARC Discovery projects, and they have jointly supervised more than 10 PhD graduates, including a recipient of an Ian Davey best thesis award. The team's expertise positions them well to lead the project and deliver high-quality research outcomes. Given the expected demand for explanation methods for foundation models in the coming years, the project will also help both supervisors build a track record in this area and support future ARC Discovery applications.


Financial Support  

This project is funded for reasonable research expenses. A fee offset for the standard term of the program is available to Australian and New Zealand citizens, and permanent residents of Australia, including permanent humanitarian visa holders. Additionally, any Australian Aboriginal and/or Torres Strait Islander applicant who holds an offer of admission without a living allowance will be eligible for the Aboriginal Enterprise Research Scholarship, valued at $50,291 per annum. Any Aboriginal Enterprise Research Scholarship recipient will also receive a fee waiver.


Eligibility and Selection 

This project is open to applications from Australian or New Zealand citizens, and Australian permanent residents or permanent humanitarian visa holders. International applicants are not invited to apply at this time.

Applicants must meet the eligibility criteria for entrance into a PhD. All applications that meet the eligibility and selection criteria will be considered for this project. Additionally, applicants must meet the project's selection criteria:
 

  • A master's or honours degree in computer science or a related area.
  • An IELTS test result with a minimum score of 6.5 if the honours/master's degree is not from Australia.
  • Good programming skills.
  • Research publications in reputable venues are desirable.

A merit selection process will be used to determine the successful candidate. The successful applicant is expected to study full-time and to be based at our Mawson Lakes Campus in the north of Adelaide. 


Essential Dates 

Applicants are expected to start in a timely fashion upon receipt of an offer. Extended deferral periods are not available. Applications close on 2 August 2024.

How to apply:

Applications must be lodged online; please note that UniSA does not accept applications via email.

For further support, see our step-by-step guide on how to apply, or contact the Graduate Research team on +61 8 8302 5880 (option 1) or by email at research.admissions@unisa.edu.au. You will receive a response within one working day.

