University of Melbourne
  • Melbourne, Victoria, Australia
Recent publications
This paper addresses the parameter estimation (PE) of the synchronous generator (SG) and its excitation system. In the proposed strategy, the field winding signals are assumed to be unavailable for accurate measurement. To this end, a robust unknown input reconstruction (UIR) observer is exploited to determine both unknown SG inputs, namely the mechanical torque and the field voltage, under the influence of disturbances. Moreover, data gathered from one of the SG identification tests, such as a power system stabilizer (PSS)-based method, is deployed for hierarchical PE of the SG and excitation system, in that the excitation system identification is conducted after reconstructing the field voltage and current. The recommended approach is applied to a 150 MW gas unit, and comparisons of the simulation and measurement data during validation tests demonstrate the high accuracy of the resultant parameters.
Camera pose estimation has long relied on geometry-based approaches and sparse 2D-3D keypoint correspondences. With the advent of deep learning methods, the median error in estimating camera pose parameters, i.e., the six parameters that describe position and rotation, denoted 6 Degrees of Freedom (6-DoF), has decreased from tens of meters to a few centimeters for indoor applications. For outdoor applications, errors can be quite large and highly dependent on variations in occlusion, contrast, brightness, repetitive structures, or blur introduced by camera motion. To address these limitations, we introduce B-Pose, a Bayesian convolutional deep network capable of not only automatically estimating the camera's pose parameters from a single RGB image but also of providing a measure of uncertainty in the parameter estimation. Reported experiments on outdoor and indoor datasets demonstrate that B-Pose outperforms SOTA techniques and generalizes better to unseen RGB images. A strong correlation is shown between the prediction error and the model's uncertainty, indicating that the prediction is almost always incorrect whenever the model's uncertainty is high. The code will be made publicly available.
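The abstract does not specify how B-Pose computes its uncertainty; one common approach for Bayesian convolutional networks is Monte Carlo sampling of stochastic forward passes (e.g., MC dropout). The sketch below illustrates only that generic idea, with `sample_pose` as a hypothetical stand-in for one stochastic forward pass of such a network:

```python
import numpy as np

def predictive_pose(sample_pose, n_samples=50):
    """Monte Carlo estimate of a 6-DoF pose and its uncertainty.

    `sample_pose` is a stand-in for one stochastic forward pass of a
    Bayesian network (e.g., with dropout active at test time); it should
    return a length-6 array (x, y, z, roll, pitch, yaw).
    """
    samples = np.stack([np.asarray(sample_pose()) for _ in range(n_samples)])
    # Mean across samples is the pose estimate; spread is the uncertainty.
    return samples.mean(axis=0), samples.std(axis=0)
```

A high per-parameter standard deviation would flag exactly the situation the abstract describes: predictions that are likely to be wrong.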
The use of antiresorptive therapies, such as bisphosphonates and denosumab, has helped to significantly advance the management of osteoporosis. The introduction, and eventual widespread use, of these medications soon led to increasing reports of a new adverse event called medication-related (previously bisphosphonate-related) osteonecrosis of the jaw (MRONJ). MRONJ remains a relatively rare condition associated with the use of antiresorptive agents for osteoporosis and is characterized by exposed areas of necrotic jaw bone. These areas of necrotic bone may range from an asymptomatic ulcer in the mouth to debilitating presentations complicated by pain and infection that may progress to involve oral-cutaneous fistulae and predispose to pathologic fractures of the mandible. Recognition of the potentially significant impact of this condition on quality of life has resulted in increased efforts to ensure screening of oral health and assessment of the need for dental treatment, particularly invasive procedures such as dental extractions, prior to commencement of these therapies. This chapter will provide an overview of the current understanding of MRONJ in relation to incidence, pathophysiology, risk factors and clinical presentation. In addition, it will provide a brief overview of approaches to prevention and management of this condition.
Forecasting models that are trained across sets of many time series, known as global forecasting models, have recently shown promising results in prestigious forecasting competitions and real-world applications, outperforming many state-of-the-art univariate forecasting techniques. This chapter provides insights into why global models are important for forecasting in the context of Big Data and how these models outperform traditional univariate models in the presence of large collections of related time series. Furthermore, we explain the data preparation steps of global model fitting and provide a brief history of the evolution of global models over the past few years. We also cover the recent theoretical discussions and intuitions around global models and share a summary of open-source frameworks available to implement global models.
Machine learning (ML)-based time series forecasting models often require and assume a certain degree of stationarity in the data when producing forecasts. However, in many real-world situations the data distributions are not stationary: they can change over time, reducing the accuracy of the forecasting models, a phenomenon known in the ML literature as concept drift. Handling concept drift is essential for many ML forecasting methods in use nowadays; however, prior work has proposed methods to handle concept drift only in the classification domain. To fill this gap, we explore concept drift handling methods in particular for Global Forecasting Models (GFM), which have recently gained popularity in the forecasting domain. We propose two new concept drift handling methods, namely Error Contribution Weighting (ECW) and Gradient Descent Weighting (GDW), based on a continuous adaptive weighting concept. These methods use two forecasting models that are separately trained with the most recent series and with all series, and the weighted average of the forecasts provided by the two models is taken as the final forecast. Using LightGBM as the underlying base learner, in our evaluation on three simulated datasets the proposed models achieve significantly higher accuracy than a set of statistical benchmarks and LightGBM baselines across four evaluation metrics.
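The two-model combination described above can be sketched minimally. The inverse-error weighting rule below is illustrative only, not the paper's actual ECW or GDW update rules, which are not reproduced in the abstract:

```python
import numpy as np

def combine_forecasts(f_all, f_recent, err_all, err_recent):
    """Weighted average of two models' forecasts.

    f_all / f_recent: forecasts from the model trained on all series
    and the model trained on the most recent series, respectively.
    Weights are inversely proportional to each model's recent error —
    an illustrative rule standing in for the ECW/GDW schemes.
    """
    inv = np.array([1.0 / err_all, 1.0 / err_recent])
    w = inv / inv.sum()  # weights sum to 1
    return w[0] * np.asarray(f_all, dtype=float) + w[1] * np.asarray(f_recent, dtype=float)
```

As drift accumulates, the recent-data model's error shrinks relative to the global model's, so its weight (and influence on the final forecast) grows continuously.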
Wind derivatives are financial instruments designed to mitigate losses caused by adverse wind conditions. With the rapid growth of wind power capacity due to efforts to reduce carbon emissions, the demand for wind derivatives to manage uncertainty in wind power production is expected to increase. However, existing wind derivative literature often assumes normally distributed wind speed, despite the presence of skewness and leptokurtosis in historical wind speed data. This paper investigates how the misspecification of wind speed models affects wind derivative prices and proposes the use of the generalized hyperbolic distribution to account for non-normality. The study develops risk-neutral approaches for pricing wind derivatives using the conditional Esscher transform, which can accommodate stochastic processes with any distribution, provided the moment-generating function exists. The analysis demonstrates that model risk varies depending on the choice of the underlying index and the derivative’s payoff structure. Therefore, caution should be exercised when choosing wind speed models. Essentially, model risk cannot be ignored in pricing wind speed derivatives.
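For reference, the (conditional) Esscher transform mentioned above has a standard textbook form; the paper's specific wind speed dynamics are not reproduced here. Given a conditional density $f$ with moment-generating function $M(\theta)$, the transformed density is

```latex
f_{\theta}(x) = \frac{e^{\theta x}\, f(x)}{M(\theta)},
\qquad
M(\theta) = \int e^{\theta x} f(x)\, dx ,
```

with $\theta$ chosen so that discounted prices are martingales under the transformed measure. This is why the approach works for any distribution whose moment-generating function exists, including the generalized hyperbolic distribution.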
Advances in image reconstruction using either single- or multimodality imaging data provide increasingly accurate three-dimensional (3D) patient-specific arterial models for shear stress evaluation using computational fluid dynamics (CFD). We aim to evaluate the impact on endothelial shear stress (ESS) derived from a simple image reconstruction using 3D quantitative coronary angiography (3D-QCA) versus a multimodality reconstruction method using optical coherence tomography (OCT) in patients' vessels treated with bioresorbable scaffolds. Seven vessels at baseline and five-year follow-up of seven patients from a previous CFD investigation were retrospectively selected for a head-to-head comparison of angiography-derived versus OCT-derived ESS. 3D-QCA significantly underestimated the minimum stent area (MSA) (−2.38 mm²) and the stent length (−1.46 mm) compared to OCT-fusion reconstructions. After carefully co-registering the region of interest for all cases with a sophisticated statistical method, the difference in MSA measurements, as well as the inability of angiography to visualise the strut footprint on the lumen surface, translated into higher angiography-derived than OCT-derived ESS (by 1.76 Pa, or 1.52 times, for the overlapping segment). The difference in ESS widened with a more restricted region of interest (1.97 Pa, or 1.63 times, within the scaffold segment). Angiography and OCT offer two distinct methods of ESS calculation. Angiography-derived ESS tends to overestimate ESS compared to OCT-derived ESS. Further investigation into the resolution of ESS analysis will be vital to the adoption of OCT-derived ESS.
Tau protein is implicated in the pathogenesis of Alzheimer's disease (AD) and other tauopathies, but its physiological function remains under debate. Mostly explored in the brain, tau is also expressed in the pancreas. We further explored the mechanism of tau's involvement in the regulation of glucose-stimulated insulin secretion (GSIS) in islet β-cells and established a potential relationship between type 2 diabetes mellitus (T2DM) and AD. We demonstrate that pancreatic tau is crucial for insulin secretion regulation and glucose homeostasis. Tau levels were found to be elevated in β-islet cells of patients with T2DM, and loss of tau enhanced insulin secretion in cell lines, Drosophila, and mice. Pharmacological or genetic suppression of tau in the db/db diabetic mouse model normalized glucose levels by promoting insulin secretion and was recapitulated by pharmacological inhibition of microtubule assembly. Clinical studies further showed that serum tau protein was positively correlated with blood glucose levels in healthy controls, a correlation that was lost in AD. These findings present tau as a common therapeutic target between AD and T2DM.
Background Mental-health-related stigma among physicians towards people with mental illnesses remains a barrier to quality care, yet few curricula provide training with a proactive focus on reducing the potential negative impacts of stigma. The aim of our study was to explore medical students' perspectives on what areas of learning should be targeted (where stigma presents) and how students could be supported to prevent the formation of negative attitudes. Methods Six focus group discussions were conducted with second-, third-, and fourth-year postgraduate medical students (n = 34) enrolled at The University of Melbourne Medical School in September–October 2021. Transcripts were analysed using inductive thematic analysis. Results In terms of where stigma presents, three main themes emerged: (1) through unpreparedness in dealing with patients with mental health conditions, (2) through noticing mentors expressing stigma, and (3) through the culture of medicine. The primary theme related to how best to support students to prevent negative attitudes from forming was building stigma resistance, to reduce the likelihood of perpetuating stigma towards patients with mental health conditions and therefore enhance patient care. The participants suggested six primary techniques to build stigma resistance: (1) reflection, (2) skills building, (3) patient experiences, (4) examples and exemplars, (5) clinical application and (6) transforming structural barriers. We suggest these techniques combine to form the ReSPECT model for stigma resistance in the curriculum. Conclusions The ReSPECT model derived from our research could provide a blueprint for medical educators to integrate stigma resistance throughout the curriculum from year one, to better equip medical students to reduce interpersonal stigma and perhaps self-stigma. Ultimately, building stigma resistance could enhance care towards patients with mental health conditions and hopefully improve patient outcomes.
Ohlson (2023. Empirical accounting seminars: Elephants in the room. Accounting, Economics, and Law: A Convivium) draws on his experience in empirical accounting seminars to identify five "elephants in the room". I interpret each of these elephants as either a variant or a symptom of p-hacking. I provide evidence of the prevalence of p-hacking in accounting research that complements Ohlson's (2023) observations. In this paper, I identify a number of steps that could be taken to reduce p-hacking in accounting research. I conjecture that facilitating and encouraging replication alone could have profound effects on the quality and quantity of empirical accounting research.
The concept of biological age has emerged as a measurement that reflects physiological and functional decline with ageing. Here we aimed to develop a deep neural network (DNN) model that predicts biological age from optical coherence tomography (OCT). A total of 84,753 high-quality OCT images from 53,159 individuals in the UK Biobank were included, among which 12,631 3D-OCT images from 8,541 participants without any reported medical conditions at baseline were used to develop an age prediction model. For the remaining 44,618 participants, the OCT age gap, the difference between the OCT-predicted age and chronological age, was calculated for each participant. Cox regression models assessed the association between OCT age gap and mortality. The DNN model predicted age with a mean absolute error of 3.27 years and showed a strong correlation of 0.85 with chronological age. After a median follow-up of 11.0 years (IQR 10.9–11.1 years), 2,429 deaths (5.44%) were recorded. For each 5-year increase in OCT age gap, there was an 8% increased mortality risk (hazard ratio [HR] = 1.08, 95% CI: 1.02–1.13, P = 0.004). Compared with an OCT age gap within ±4 years, an OCT age gap of less than −4 years was associated with a 16% decreased mortality risk (HR = 0.84, 95% CI: 0.75–0.94, P = 0.002), and an OCT age gap of more than 4 years with an 18% increased risk of death (HR = 1.18, 95% CI: 1.02–1.37, P = 0.026). OCT imaging could serve as an ageing biomarker to predict biological age with high accuracy, and the OCT age gap can be used as a marker of mortality risk.
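The hazard ratio above is reported per 5-year increase in OCT age gap. Assuming log-linearity of the Cox model (a standard modelling assumption, not a result stated in the abstract), it can be rescaled to other increments:

```python
# Rescale a hazard ratio reported per 5-year increase in OCT age gap
# to an approximate per-1-year hazard ratio, assuming log-linearity
# of the Cox model: log(HR) scales linearly with the increment.
hr_per_5y = 1.08
hr_per_1y = hr_per_5y ** (1 / 5)   # fifth root of the 5-year HR
```

Equivalently, a 10-year increase would correspond to roughly 1.08², about a 17% higher hazard.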
Prior infection can generate protective immunity against subsequent infection, although the efficacy of such immunity can vary considerably. Live-attenuated vaccines (LAVs) are one of the most effective methods for mimicking this natural process, and analysis of their efficacy has proven instrumental in the identification of protective immune mechanisms. Here, we address the question of what makes a LAV efficacious by characterising immune responses to a LAV, termed TAS2010, which is highly protective (80–90%) against lethal murine salmonellosis, in comparison with a moderately protective (40–50%) LAV, BRD509. Mice vaccinated with TAS2010 developed immunity systemically and were protected against gut-associated virulent infection in a CD4⁺ T cell-dependent manner. TAS2010-vaccinated mice showed increased activation of Th1 responses compared with their BRD509-vaccinated counterparts, leading to increased Th1 memory populations in both lymphoid and non-lymphoid organs. The optimal development of Th1-driven immunity was closely correlated with the activation of CD11b⁺ Ly6G-negative Ly6C-high inflammatory monocytes (IMs), whose activation can be modulated proportionally by bacterial load in vivo. Upon vaccination with the LAV, IMs expressed the T cell chemoattractant CXCL9, which attracted CD4⁺ T cells to the foci of infection, where IMs also served as a potent source of antigen presentation and of the Th1-promoting cytokine IL-12. The expression of MHC-II in IMs was rapidly upregulated following vaccination and then maintained at an elevated level in immune mice, suggesting IMs may have a role in sustained antigen stimulation. Our findings present a longitudinal analysis of CD4⁺ T cell development post-vaccination with an intracellular bacterial LAV, and highlight the benefit of inflammation in the development of Th1 immunity. Future studies focusing on the induction of IMs may reveal key strategies for improving vaccine-induced T cell immunity.
Importance Thromboprophylaxis for individuals receiving systemic anticancer therapies has proven to be effective. Potential to maximize benefits relies on improved risk-directed strategies, but existing risk models underperform in cohorts with lung and gastrointestinal cancers. Objective To assess the clinical benefits and safety of biomarker-driven thromboprophylaxis and to externally validate a biomarker thrombosis risk assessment model for individuals with lung and gastrointestinal cancers. Design, Setting, and Participants This open-label, phase 3 randomized clinical trial (Targeted Thromboprophylaxis in Ambulatory Patients Receiving Anticancer Therapies [TARGET-TP]) conducted from June 2018 to July 2021 (with 6-month primary follow-up) included adults aged 18 years or older commencing systemic anticancer therapies for lung or gastrointestinal cancers at 1 metropolitan and 4 regional hospitals in Australia. Thromboembolism risk assessment based on fibrinogen and D-dimer levels stratified individuals into low-risk (observation) and high-risk (randomized) cohorts. Interventions High-risk patients were randomized 1:1 to receive enoxaparin, 40 mg, subcutaneously daily for 90 days (extending up to 180 days according to ongoing risk) or no thromboprophylaxis (control). Main Outcomes and Measures The primary outcome was objectively confirmed thromboembolism at 180 days. Key secondary outcomes included bleeding, survival, and risk model validation. Results Of 782 eligible adults, 328 (42%) were enrolled in the trial (median age, 65 years [range, 30-88 years]; 176 male [54%]). Of these participants, 201 (61%) had gastrointestinal cancer, 127 (39%) had lung cancer, and 132 (40%) had metastatic disease; 200 (61%) were high risk (100 in each group), and 128 (39%) were low risk.
In the high-risk cohort, thromboembolism occurred in 8 individuals randomized to enoxaparin (8%) and 23 control individuals (23%) (hazard ratio [HR], 0.31; 95% CI, 0.15-0.70; P = .005; number needed to treat, 6.7). Thromboembolism occurred in 10 low-risk individuals (8%) (high-risk control vs low risk: HR, 3.33; 95% CI, 1.58-6.99; P = .002). Risk model sensitivity was 70%, and specificity was 61%. The rate of major bleeding was low, occurring in 1 participant randomized to enoxaparin (1%), 2 in the high-risk control group (2%), and 3 in the low-risk group (2%) (P = .88). Six-month mortality was 13% in the enoxaparin group vs 26% in the high-risk control group (HR, 0.48; 95% CI, 0.24-0.93; P = .03) and 7% in the low-risk group (vs high-risk control: HR, 4.71; 95% CI, 2.13-10.42; P < .001). Conclusions and Relevance In this randomized clinical trial of individuals with lung and gastrointestinal cancers who were stratified by risk score according to thrombosis risk, risk-directed thromboprophylaxis reduced thromboembolism with a desirable number needed to treat, without safety concerns, and with reduced mortality. Individuals at low risk avoided unnecessary intervention. The findings suggest that biomarker-driven, risk-directed primary thromboprophylaxis is an appropriate approach in this population. Trial Registration ANZCTR Identifier: ACTRN12618000811202
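The reported number needed to treat follows from the two event rates as the inverse of the absolute risk reduction. This simple-proportion check ignores the time-to-event analysis actually used in the trial:

```python
# NNT = 1 / absolute risk reduction, from the reported event counts
# (23/100 high-risk controls vs 8/100 enoxaparin participants).
risk_control = 23 / 100     # thromboembolism rate, high-risk controls
risk_enoxaparin = 8 / 100   # thromboembolism rate, enoxaparin group
nnt = 1 / (risk_control - risk_enoxaparin)   # matches the reported 6.7
```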
Queering the Map ( ) is a novel digital platform: a storymap, an anonymous collaborative record, an archive of queer experiences. To contribute to the platform, visitors make their own mark by clicking on an empty space on the map. As to what visitors contribute, the platform's About section suggests, simply, 'If it counts for you, then it counts for Queering the Map'. In this article, we probe this guiding principle. What does count in this context? What matters in the queer archive? Drawing on interviews with 14 site users and an analysis of nearly 2000 stories pinned to Australia on the map, we consider what platform practices reveal about queer collective memory-making, to illuminate the how and why of a queer archive. We see that relatability matters because of the affective, affirming and community-building seeds it can generate; situation matters because it is through participatory practices that recognition, visibility and community place-making are enacted; and the everyday matters as the archive's visitors collectively claim and gift their varied personal experiences. Through these themes we explore queer contributions, or how site visitors are oriented towards giving something of themselves to the archive. We discuss how archival properties of the platform are key to (queer) participation, and to meaning-making – as distinct, as queer, as a valued record. Queering the Map, we argue, is significant in how space is made for queer representation, carving new contours for archival 'evidence' and community histories.
Emulsion systems are extensively utilized in the food industry, including dairy products such as ice cream, as well as salad dressings, meat products, beverages, sauces, and mayonnaise. Meanwhile, diverse advanced technologies have been developed for emulsion preparation. Compared with other techniques, high-intensity ultrasound (HIUS) and high-pressure homogenization (HPH) are two emerging emulsification methods that are cost-effective, green, and environmentally friendly, and they have gained significant attention. HIUS-induced acoustic cavitation helps efficiently disrupt oil droplets, effectively producing a stable emulsion. HPH-induced shear stress, turbulence, and cavitation lead to droplet disruption, altering protein structure and the functional aspects of food. The key distinctions among emulsification devices are covered in this review, as are the mechanisms of the HIUS and HPH emulsification processes. Furthermore, the preparation of emulsions containing natural polymers (e.g., proteins, polysaccharides, and their complexes) is also discussed. Moreover, the review puts forward future trends and challenges for HIUS and HPH emulsification. HIUS and HPH can prepare many emulsifier-stabilized food emulsions (e.g., with proteins, polysaccharides, and protein–polysaccharide complexes). Appropriate HIUS and HPH treatment can improve emulsions' rheological and emulsifying properties and reduce emulsion droplet size. HIUS and HPH are suitable methods for developing stable protein–polysaccharide emulsions. Despite the numerous studies on ultrasonic- and homogenization-induced emulsifying properties in the recent literature, this review specifically focuses on summarizing the significant progress made in utilizing biopolymer-based protein–polysaccharide complex particles, which can provide valuable insights for designing new, sustainable, clean-label, and eco-friendly colloidal systems for food emulsions.
Practical Application Utilizing complex particle-stabilized emulsions is a promising approach towards developing safer, healthier, and more sustainable food products that meet legal requirements and industrial standards. Moreover, there is an increasing need for concentrated emulsions stabilized by biopolymer complex particles, which have been increasingly recognized by the scientific community, industry, and consumers for their potential health benefits in protecting against lifestyle-related diseases.
44,160 members
Mia Cobb
  • Faculty of Veterinary Science
Karin Maria Verspoor
  • School of Computing and Information Systems
Derrick D. Brown
  • Victorian College of the Arts
Grattan Street, 3010, Melbourne, Victoria, Australia