Majid Mohammadi

  • Doctor of Philosophy
  • Researcher at Vrije Universiteit Amsterdam

About

53
Publications
11,821
Reads
930
Citations
Current institution
Vrije Universiteit Amsterdam
Current position
  • Researcher

Publications

Publications (53)
Article
In recent years, employing Shapley values to compute feature importance has gained considerable attention. Calculating these values inherently necessitates managing an exponential number of parameters—a challenge commonly mitigated through an additivity assumption coupled with linear regression. This paper proposes a novel approach by modeling supe...
Preprint
Full-text available
This paper conceptualizes the Deep Weight Spaces (DWS) of neural architectures as hierarchical, fractal-like, coarse geometric structures observable at discrete integer scales through recursive dilation. We introduce a coarse group action termed the fractal transformation, $T_{r_k}$, acting under the symmetry group $G = (\mathbb{Z}, +)$, to analy...
Article
Full-text available
Ameliorating the public transport system is the top priority for legislators and planners alike. But to implement effective and long-lasting solutions, they must take into account not only the viewpoint of decision-makers on better solutions but also the necessity of including the public in the process of evolution. To estimate the public transport...
Chapter
In this study, we propose a probabilistic group decision-making method based on the Best-Worst Tradeoff method (BWT) and the Bayesian approach. BWT is a pairwise comparison method that is used to elicit the tradeoffs among a set of attributes (criteria) in a multi-criteria decision-making problem. While BWT is suitable for a single decision-maker s...
Chapter
Full-text available
Understanding the rationale behind the predictions made by machine learning models holds paramount importance across numerous applications. Various explainable models have been developed to shed light on these predictions by assessing the individual contributions of features to the outcome of black-box models. However, existing methods often overlo...
Preprint
Full-text available
Priorities in multi-criteria decision-making (MCDM) convey the relevance preference of one criterion over another, which is usually reflected by imposing the non-negativity and unit-sum constraints. The processing of such priorities differs from that of other, unconstrained data, but this point is often neglected by researchers, which results in fallac...
Article
Full-text available
Priorities in multi-criteria decision-making (MCDM) convey the relevance preference of one criterion over another, which is usually reflected by imposing the non-negativity and unit-sum constraints. The processing of such priorities differs from that of other, unconstrained data, but this point is often neglected by researchers, which results in fallac...
Chapter
This chapter introduces the fundamentals of supervised machine learning, namely the classification and regression problems. We explain each technique using an inspiring example and discuss how the corresponding algorithms work together with the data engineering pipelines. We also provide some guidelines for implementing a classification or r...
Article
Full-text available
This paper presents a new multi‐criteria decision‐making (MCDM) method, namely the ratio product model (RPM). We first overview two popular aggregating models: the weighted sum model (WSM) and the weighted product model (WPM). Then, we argue that the two models suffer from some fundamental issues mainly due to ignoring the ratio nature of the alter...
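As a minimal illustration of how the two aggregation models can disagree, the sketch below scores three hypothetical alternatives on two criteria (all numbers are made up for illustration and are not from the paper):

```python
import numpy as np

# Hypothetical decision matrix: 3 alternatives scored on 2 criteria,
# with criteria weights summing to one.
scores = np.array([[0.9, 0.4],
                   [0.6, 0.7],
                   [0.5, 0.9]])
weights = np.array([0.6, 0.4])

# Weighted sum model (WSM): additive aggregation of weighted scores.
wsm = scores @ weights

# Weighted product model (WPM): multiplicative aggregation.
wpm = np.prod(scores ** weights, axis=1)

# Higher aggregated score means a better alternative.
print("WSM ranking:", np.argsort(-wsm))  # -> [0 2 1]
print("WPM ranking:", np.argsort(-wpm))  # -> [0 1 2]
```

On this toy matrix the two models even rank the alternatives differently, which is the kind of sensitivity that motivates examining the aggregation rule itself.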
Chapter
The current multi-criteria decision-making (MCDM) ranking methods provide suggestions on the superiority of an alternative to other alternatives mainly based on the alternatives’ performance difference and the weights (relative importance) of the criteria as absolute values, while the aim of the MCDM is to only provide the relative importance of cr...
Article
Full-text available
Machine learning is widely used to predict software defect-prone components, facilitating testing and improving application quality. In a recent meta-analysis on binary classification for software defect prediction, the so-called researcher bias, i.e., the group that conducts the study, has been shown to play a critical role; the analysis, however,...
Preprint
Full-text available
This paper presents a Bayesian framework predicated on a probabilistic interpretation of the MCDM problems and encompasses several well-known multi-criteria decision-making (MCDM) methods. Owing to the flexibility of Bayesian models, the proposed framework can address several long-standing, fundamental challenges in MCDM, including group decision-m...
Article
Full-text available
Manufacturing firms that continued production activities during the COVID-19 pandemic have been taking necessary measures to cope with the risks it imposed. This study assesses the measures implemented by the Ready-Made Garments (RMG) sector in Bangladesh. With the increase in COVID-19 cases in Bangladesh, following government order, along with...
Article
The increasing heterogeneity of the VM offerings on public IaaS clouds gives rise to a very large number of deployment options for constructing distributed, multi-component cloud applications. However, selecting an appropriate deployment variant, i.e., a valid combination of deployment options, to meet required performance levels is non-trivia...
Chapter
Full-text available
In the Best-Worst Method (BWM), the criteria weights are typically characterized by intervals, each value in the interval representing an optimal weight for the associated criterion according to the preferences of the decision-maker. While the intervals can potentially provide the DM with more information, they make it challenging to process the wei...
Preprint
Full-text available
Health care professionals rely on treatment search engines to efficiently find adequate clinical trials and early access programs for their patients. However, doctors lose trust in the system if its underlying processes are unclear and unexplained. In this paper, a model-agnostic explainable method is developed to provide users with further informa...
Preprint
Full-text available
$\ell_1$ regularization has been used for logistic regression to circumvent overfitting and to use the estimated sparse coefficients for feature selection. However, the challenge of such a regularization is that the $\ell_1$ norm is not differentiable, making the standard algorithms for convex optimization not applicable to this problem. This paper...
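The non-differentiability mentioned above is commonly handled with proximal methods; the following is a minimal proximal-gradient (ISTA-style) sketch for $\ell_1$-regularized logistic regression on toy data, shown for intuition only and not the algorithm proposed in the paper:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm: shrinks each coordinate toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_logistic(X, y, lam, step=0.1, iters=500):
    """Proximal gradient descent for min_w logloss(w) + lam * ||w||_1, y in {0, 1}."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
        grad = X.T @ (p - y) / len(y)         # gradient of the smooth log-loss part
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy data: only the first feature is informative; the second is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(float)
w = l1_logistic(X, y, lam=0.1)
print(w)  # the uninformative coefficient is driven to (near) zero
```

The soft-thresholding step is what produces exact zeros in the coefficient vector, which is why this family of methods is popular for feature selection.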
Article
Full-text available
This paper is dedicated to applying ontology alignment systems to the heterogeneity problem in logistics. The primary motivation for doing so is to enable interoperability among different IT systems in logistics, all with their own database scheme. We first analyze different standards for logistics interoperability, which are implemented by XML sch...
Conference Paper
Infrastructure-as-Code (IaC) is increasingly adopted. However, little is known about how to best maintain and evolve it. Previous studies focused on defining Machine-Learning models to predict defect-prone blueprints using supervised binary classification. This class of techniques uses both defective and non-defective instances in the training phas...
Article
Full-text available
Ontology alignment is vital in Semantic Web technologies, with numerous applications in diverse disciplines. Due to the diversity and abundance of ontology alignment systems, a proper evaluation can portray the evolution of ontology alignment and depict the efficiency of a system for a particular domain. Evaluation can help system designers recognize t...
Chapter
The online spread of rumours in disasters can create panic and anxiety and disrupt crisis operations. Hence, it is crucial to take measures against such a distressing phenomenon, since it can turn into a crisis by itself. In this work, automatic rumour detection in natural disasters is addressed from an imbalanced learning perspective due to the...
Preprint
Full-text available
Simulated annealing-based ontology matching (SANOM) participates for the second time at the ontology alignment evaluation initiative (OAEI) 2019. This paper contains the configuration of SANOM and its results on the anatomy and conference tracks. In comparison to the OAEI 2017, SANOM has improved significantly, and its results are competitive with...
Article
Full-text available
Ontology alignment is an important and inescapable problem for interconnecting two ontologies describing the same concepts. The ontology alignment evaluation initiative (OAEI) has taken place for more than a decade to monitor and help the progress of the field and to systematically compare existing alignment systems. As of 2018, the evaluatio...
Article
Full-text available
One of the essential problems in multi-criteria decision-making (MCDM) is ranking a set of alternatives based on a set of criteria. In this regard, there exist several MCDM methods which rank the alternatives in different ways. As such, it would be worthwhile to try and arrive at a consensus on this important subject. In this paper, a new approach...
Article
Full-text available
This paper presents a discrete-time neurodynamic model to solve linear and quadratic programming subject to linear equality and inequality constraints. The new model is obtained by using an auxiliary variable and can be seen as a generalization of a neural model for bound constraints in the literature, in the sense that bound constraints lim...
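The flavor of such discrete-time dynamics can be illustrated with a generic projected-gradient iteration on a toy box-constrained QP (a standard projection iteration shown for intuition, not the specific model proposed in the paper):

```python
import numpy as np

# Toy QP: minimize 0.5 * x'Qx - b'x  subject to box constraints lo <= x <= hi.
Q = np.array([[2.0, 0.0],
              [0.0, 2.0]])
b = np.array([2.0, 2.0])            # unconstrained optimum is x = [1, 1]
lo, hi = np.zeros(2), np.array([0.5, 2.0])

# Discrete-time projection dynamic: x_{k+1} = P(x_k - eta * (Qx_k - b)),
# where P clips onto the box; the fixed point is the constrained optimum.
x = np.zeros(2)
eta = 0.2
for _ in range(200):
    x = np.clip(x - eta * (Q @ x - b), lo, hi)

print(x)  # -> approximately [0.5, 1.0]: the first coordinate hits its bound
```

For bound constraints the projection is a simple clip; more general equality/inequality constraints are what the auxiliary-variable construction in the paper is designed to handle.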
Article
Full-text available
Rumor spreading in online social networks can inflict serious damage on individual, organizational, and societal levels. This problem has been addressed via computational approaches in recent years. The dominant computational technique for the identification of rumors is binary classification, which uses both rumor and non-rumor instances for training. In th...
Article
The generalized lasso (GLasso) is an extension of the lasso regression in which there is an $\ell_1$ penalty term (or regularization) of the linearly transformed coefficient vector. Finding the optimal solution of GLasso is not straightforward since the...
Conference Paper
Full-text available
The logistics sector consists of a limited number of large enterprises and many Small and Medium-sized Enterprises (SMEs). These enterprises either have developed proprietary information systems or use Commercial Off-The-Shelf (COTS) systems tailored to their business processes. The result is a large number of heterogeneous systems interoperable via a larg...
Article
Full-text available
The fused lasso signal approximator (FLSA) is a vital optimization problem with extensive applications in signal processing and biomedical engineering. However, the optimization problem is difficult to solve since it is both nonsmooth and nonseparable. The existing numerical solutions involve the use of several auxiliary variables in order to dea...
Article
Full-text available
With the advancement of information technology, datasets with an enormous amount of data are available. The classification task on these datasets becomes more time- and memory-consuming as the amount of data increases. The support vector machine (SVM), which is arguably the most popular classification technique, has disappointing performance in dealing...
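One widely used remedy in this large-data setting is stochastic subgradient training of a linear SVM, which touches a single sample per update and so keeps memory at O(features). The following is a Pegasos-style sketch on toy data, shown for context rather than as the method proposed in the paper:

```python
import numpy as np

def pegasos(X, y, lam=0.1, epochs=100, seed=0):
    """Stochastic subgradient descent on the hinge-loss SVM objective:
    min_w  lam/2 * ||w||^2 + mean_i max(0, 1 - y_i * w'x_i),  y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)                  # decaying Pegasos step size
            if y[i] * (w @ X[i]) < 1:              # margin violated: move toward x_i
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                                  # margin satisfied: only shrink
                w = (1 - eta * lam) * w
    return w

# Toy linearly separable data: one feature plus a constant bias column.
X = np.array([[2.0, 1], [3.0, 1], [-2.0, 1], [-3.0, 1]])
y = np.array([1, 1, -1, -1])
w = pegasos(X, y)
print(np.sign(X @ w))  # recovers the labels on the training set
```

Because each step looks at one sample, the cost per update is independent of the dataset size, which is the property that matters when a batch SVM solver becomes too slow or memory-hungry.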
Article
Full-text available
The best-worst method (BWM) is a multi-criteria decision-making method which finds the optimal weights of a set of criteria based on the preferences of only one decision-maker (DM) (or evaluator). However, it cannot amalgamate the preferences of multiple decision-makers/evaluators in the so-called group decision-making problem. A typical way of agg...
Article
Full-text available
Ontology alignment is a fundamental task to reconcile the heterogeneity among various information systems using distinct information sources. Evolutionary algorithms (EAs) have already been considered the primary strategy to develop an ontology alignment system. However, such systems have two significant drawbacks: they either need a ground...
Article
Full-text available
Ontology alignment systems are evaluated by various performance scores which are usually computed by a ratio related directly to the frequency of the true positives. However, such ratios provide little information regarding the uncertainty of the overall performance of the corresponding systems. The comparison is also drawn merely by the juxtaposit...
Conference Paper
Full-text available
This paper describes the Ontology Alignment Evaluation Initiative 2017.5 pre-campaign. Like in 2012, when we transitioned the evaluation to the SEALS platform, we have also conducted a pre-campaign to assess the feasibility of moving to the HOBBIT platform. We report the experiences of this pre-campaign and discuss the future steps for the OAEI.
Conference Paper
Full-text available
Simulated annealing-based ontology matching (SANOM) participates for the second time at the ontology alignment evaluation initiative (OAEI) 2018. This paper contains the configuration of SANOM and its results on the anatomy and conference tracks. In comparison to the OAEI 2017, SANOM has improved significantly, and its results are competitive with...
Article
Full-text available
The identification of copy number variations (CNVs) helps the diagnosis of many diseases. One major hurdle in the path of CNV discovery is that the boundaries of normal and aberrant regions cannot be distinguished from the raw data, since various types of noise contaminate them. To tackle this challenge, total variation regularization is mostly...
Article
The l1-regularized least square problem has been considered in diverse fields. However, finding its solution is exacting as its objective function is not differentiable. In this paper, we propose a new one-layer neural network to find the optimal solution of the l1-regularized least squares problem. To solve the problem, we first convert it into a...
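For intuition, the identity-design special case of the $\ell_1$-regularized least-squares problem has a closed-form solution via elementwise soft-thresholding. This is a standard lasso fact, shown here for illustration rather than the paper's neural-network approach:

```python
import numpy as np

def soft_threshold(z, lam):
    """Elementwise minimizer of 0.5*(x - z)^2 + lam*|x|."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Special case with identity design: min_x 0.5*||x - b||^2 + lam*||x||_1.
# Large entries of b shrink by lam; small ones become exactly zero,
# which is the sparsity-inducing behavior of the l1 penalty.
b = np.array([3.0, -0.5, 1.2, 0.1])
x = soft_threshold(b, lam=1.0)
print(x)  # entries with |b_i| <= lam are zeroed; the rest shrink toward zero
```

For a general design matrix no such closed form exists, which is why iterative schemes (or, as in this paper, dedicated dynamical-system models) are needed.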
Article
The detection of DNA copy number variants (CNVs) is essential for the diagnosis and prognosis of multiple diseases including cancer. Array-based comparative genomic hybridization (aCGH) is a technique to find these aberrations. The available methods for CNV discovery are often predicated on several critical assumptions based on which various regula...
Article
Full-text available
Comparisons of ontology matching systems are typically performed by comparing their average performance over multiple datasets. However, this paper examines alignment systems using statistical inference, since averaging is statistically unsafe and inappropriate. The statistical tests for comparing two or multiple alignment systems are theoretica...
Article
Full-text available
Ontology alignment is widely used to find the correspondences between different ontologies in diverse fields. After discovering the alignments, several performance scores are available to evaluate them. The scores typically require the identified alignment and a reference containing the underlying actual correspondences of the given ontologies. The cur...
Preprint
Ontology alignment is widely used to find the correspondences between different ontologies in diverse fields. After discovering the alignments, several performance scores are available to evaluate them. The scores typically require the identified alignment and a reference containing the underlying actual correspondences of the given ontologies. The cur...
Conference Paper
Full-text available
In most machine learning problems, labeling the training data is an expensive or even impossible task. Crowdsourcing-based learning uses uncertain labels from many non-expert annotators instead of one reference label. Crowdsourcing-based linear regression is an efficient way of function estimation when many labels are available for each ins...
Conference Paper
In most machine learning problems, labeling the training data is an expensive or even impossible task. Crowdsourcing-based learning uses uncertain labels from many non-expert annotators instead of one reference label. Crowdsourcing-based linear regression is an efficient way of function estimation when many labels are available for each ins...
Article
One of the most important needs in the post-genome era is providing researchers with reliable and efficient computational tools to extract and analyze this huge amount of biological data, of which DNA copy number variation (CNV) is a vitally important one. Array-based comparative genomic hybridization (aCGH) is a common approach in order to d...
Article
This paper presents a general half quadratic framework for simultaneous analysis of the whole array comparative genomic hybridization (aCGH) profiles in a data set. The proposed framework accommodates different M-estimation loss functions and two underlying assumptions for aCGH profiles of a data set: sparsity and low rank. Using M-estimation loss...
Article
Full-text available
One of the central challenges in cancer research is identifying significant genes among thousands of others on a microarray. Since preventing the outbreak and progression of cancer is the ultimate goal in bioinformatics and computational biology, detection of the genes that are most involved is vital and crucial. In this article, we propose a Maximum–Minim...
Article
Full-text available
This paper presents a new method for analyzing array comparative genomic hybridization (aCGH) data based on Correntropy. A new formulation based on low-rank aCGH data and Correntropy is proposed, and its solution is presented based on the Half-Quadratic method. Compared to existing methods, the proposed method is more robust to high corruptions and var...

Questions

Question (1)
Question
Dear Javad,
I have seen some of your articles about solving optimization problems using recurrent neural networks. As you mentioned in the project, the neural solutions are fast since they can be implemented in parallel. May I ask how you implemented the neural networks? The only solution I have seen is to solve the dynamic equation governing the neural network using ODE tools (usually in Matlab), which are not efficient at all.
Thanks in advance for your answer.
Best,
Majid
