Amirmohammad Pasdar
Macquarie University · School of Computing

Doctor of Philosophy

About

20
Publications
6,077
Reads
157
Citations

Publications (20)
Conference Paper
In the United Arab Emirates, housing establishments offer citizens opportunities to apply for housing loans or grants. The eligibility for a loan or grant depends on the citizen's income. Those with incomes below a predetermined threshold can apply for grants, while those with higher incomes are eligible for loans. Along with proof of income, other...
Article
Full-text available
The emergence of fog computing has significantly enhanced real-time data processing by bringing computation resources closer to data sources. This adoption is very beneficial in the healthcare sector, where abundant time-sensitive processing tasks exist. Although such adoption is very promising, there is a challenge with the limited computational c...
Article
Full-text available
As a critical factor in ensuring the growth of the electronic auction (e-auction) domain, the privacy and security of the participants (sellers and buyers) must always be guaranteed. Traditionally, auction data, including participant details, are stored in a third party (auctioneer) database. This leads to a high risk of a single point of failure i...
Preprint
Full-text available
Edge computing has been gaining momentum with ever-increasing data at the edge of the network. In particular, huge amounts of video data and their real-time processing requirements have increasingly hindered the traditional cloud computing approach due to high bandwidth consumption and high latency. Edge computing in essence aims to overcom...
Chapter
Blockchain, a type of distributed ledger technology, has revolutionized the digital economy, such as cryptocurrencies and supply chain management, with its transparency, immutability, and decentralization properties. In addition, smart contracts are introduced to the blockchain to provide programmability, removing third parties from administration. Alt...
Article
Full-text available
The usage of blockchain technology has been significantly expanded with smart contracts and blockchain oracles. While smart contracts make it possible to automate the execution of an agreement between untrusted parties, oracles provide smart contracts with data external to a given blockchain, i.e., off-chain data. However, the validity and accuracy of su...
Preprint
Full-text available
The vulnerability of smartphones to cyberattacks has been a severe concern to users, arising from the integrity of installed applications (apps). Although applications are meant to provide legitimate and diversified on-the-go services, harmful and dangerous ones have also found feasible ways to penetrate smartphones for malicious behaviors...
Article
A blockchain is a form of distributed ledger technology where transactions as data state changes are permanently recorded securely and transparently without the need for third parties. Besides, introducing smart contracts to the blockchain has added programmability, revolutionizing the software ecosystem toward decentralized applications (DApps). A...
Preprint
Full-text available
Blockchain is a form of distributed ledger technology (DLT) where data is shared among users connected over the internet. Transactions are data state changes on the blockchain that are permanently recorded in a secure and transparent way without the need for a third party. Besides, the introduction of smart contracts to the blockchain has added prog...
Article
Full-text available
The interaction between artificial intelligence (AI), edge, and cloud is a fast-evolving realm in which pushing computation close to the data sources is increasingly adopted. Captured data may be processed locally (i.e., on the edge) or remotely in the clouds where abundant resources are available. While many emerging applications are processed in...
Chapter
Cloud computing has been widely adopted, in the forms of public clouds and private clouds, for many benefits, such as availability and cost-efficiency. In this paper, we address the problem of scheduling jobs across multiple clouds, including a private cloud, to optimize cost efficiency explicitly taking into account data privacy. In particular, th...
Article
Scientific workflows consist of many interdependent tasks dictated by their data dependencies. As these workflows are becoming resource-intensive in both data and computing, private clouds struggle to cope with their resource requirements. Private clouds' limitations are claimed to be addressed by public clouds; however, the complete offloading of...
Chapter
While private clouds are still widely adopted due primarily to privacy and security reasons, they are often less resilient with fluctuating workloads compared to public clouds. Workload surges in private clouds can be dealt with by offloading some workload/jobs to public clouds; this is referred to as cloud bursting. Although the dynamic use of pub...
Chapter
In this paper, we address the scheduling of scientific workflows in hybrid clouds considering data placement and present the Hybrid Scheduling for Hybrid Clouds (HSHC) algorithm. HSHC is a two-phase scheduling algorithm with a genetic algorithm based static phase and dynamic programming based dynamic phase. We evaluate HSHC with both a real-world s...
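The genetic-algorithm static phase described in the HSHC abstract above could be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the task runtimes, machine speeds, fitness function (makespan), and all GA parameters are assumptions chosen for the example.

```python
# Hypothetical sketch: evolve an assignment of workflow tasks to machines
# so that the makespan (finish time of the most loaded machine) shrinks.
# RUNTIMES, SPEEDS, and every GA parameter below are illustrative values.
import random

random.seed(0)

RUNTIMES = [4, 2, 7, 3, 5, 1, 6]   # per-task cost (assumed)
SPEEDS = [1.0, 2.0, 1.5]           # per-machine speed (assumed)

def makespan(assign):
    """Finish time of the busiest machine under a task->machine mapping."""
    load = [0.0] * len(SPEEDS)
    for task, m in enumerate(assign):
        load[m] += RUNTIMES[task] / SPEEDS[m]
    return max(load)

def evolve(pop_size=30, generations=60, mutation=0.1):
    n = len(RUNTIMES)
    # Random initial population of task->machine assignments.
    pop = [[random.randrange(len(SPEEDS)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)               # elitist: best assignments first
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:   # mutate: reassign one task
                child[random.randrange(n)] = random.randrange(len(SPEEDS))
            children.append(child)
        pop = parents + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```

In a two-phase scheme such as HSHC, an assignment like `best` would only be the static starting point; the dynamic phase would then adjust placements at runtime as conditions change.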
Conference Paper
Full-text available
Data play a central role in scientific workflows. In the cloud environment, many workflows require these data, whose size may reach terabytes or petabytes. Since these workflows consist of many interdependent tasks and each task in the workflow requires some dataset as its input, the data should be managed in order to pro...
Conference Paper
Full-text available
Nowadays, big data are transferred among data centers in cloud environments. These data consume network bandwidth and storage resources and may use up some computation. These resources are distributed geographically in the cloud environment. Since the policies of data centers differ from each other in cloud environments,...

Questions

Questions (2)
Question
Dear Researchers,
I'm just wondering why we should ALWAYS use a learning rate between [0, 1] in neural networks. What's the benefit of restricting the learning rate to that range?
Can we extend the range to get a bigger rate? For example, [0, 4]?
I've read about it, but I'm not convinced, so I would like to extend my understanding of this matter.
Regards,
Amirmohammad
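The intuition behind the question above can be made concrete with a tiny gradient-descent demo (my own illustration, not part of the profile). On f(x) = x², the gradient is 2x, so the update x ← x − lr·2x scales x by (1 − 2·lr) each step; the iteration only contracts toward the minimum when |1 − 2·lr| < 1, which for this function means 0 < lr < 1, while lr = 4 scales x by −7 and diverges.

```python
# Gradient descent on f(x) = x^2 to show why large learning rates diverge.
# Each step multiplies x by (1 - 2*lr), so convergence needs |1 - 2*lr| < 1.

def descend(lr, x0=1.0, steps=50):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x^2 is 2x
    return x

print(abs(descend(0.1)))  # factor 0.8 per step: shrinks toward 0
print(abs(descend(4.0)))  # factor -7 per step: blows up
```

The safe range depends on the curvature of the loss, not on a universal [0, 1] rule; for steeper functions even lr = 0.9 can diverge, which is why practical rates are usually much smaller than 1.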

Network

Cited By