Distributed Computing - Science method
Explore the latest publications in Distributed Computing, and find Distributed Computing experts.
Publications related to Distributed Computing (10,000)
Sorted by most recent
In this paper, we aim to develop distributed continuous-time algorithms over directed graphs to seek the Nash equilibrium in a noncooperative game. Motivated by the recent consensus-based designs, we present a distributed algorithm with a proportional gain for weight-balanced directed graphs. By further embedding a distributed estimator of the left...
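As a rough illustration of such consensus-based designs, here is a minimal sketch (not the paper's algorithm): three players with made-up quadratic costs exchange estimates of the full action profile over a directed ring, which is weight-balanced, and take local gradient steps on their own actions.

    import numpy as np

    # Toy game (made up): J_i(x) = 0.5*(x_i - c_i)^2 + 0.1 * x_i * sum_{j != i} x_j
    n = 3
    c = np.array([1.0, 2.0, 3.0])

    def grad_i(i, x):
        # partial derivative of J_i with respect to player i's own action x_i
        return (x[i] - c[i]) + 0.1 * (x.sum() - x[i])

    # Directed ring: player i only hears from player (i+1) % n; equal weights make it weight-balanced.
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = 1.0

    Z = np.zeros((n, n))            # Z[i] = player i's local estimate of the whole action profile
    alpha, beta = 0.3, 0.1          # consensus gain and gradient step size (tuned by hand)
    for _ in range(3000):
        Z = Z + alpha * (A @ Z - Z)                 # consensus on the estimates
        for i in range(n):
            Z[i, i] -= beta * grad_i(i, Z[i])       # gradient play on the player's own component
    print("actions:", np.diag(Z))   # approaches the Nash equilibrium (c_i - 0.5) / 0.9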
This work focuses on the study of a recently published dataset (Bogado et al., ATLAS Rucio transfers dataset, Zenodo, 2020) with data that allow us to reconstruct the lifetime of file transfers in the context of the Worldwide LHC Computing Grid (WLCG). Several models for Rule Time To Complete (TTC) prediction are presented and evaluated. The da...
Task graphs provide a simple way to describe scientific workflows (sets of tasks with dependencies) that can be executed on both HPC clusters and in the cloud. An important aspect of executing such graphs is the scheduling algorithm used. Many scheduling heuristics have been proposed in existing works; nevertheless, they are often tested in oversim...
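For context, a minimal list-scheduling heuristic of the kind such works benchmark (illustrative only; the four-task chain and the two identical workers are made up):

    # Greedy list scheduling of a task graph on identical workers: tasks are visited in a
    # dependency-respecting order, and each starts as soon as its predecessors have finished
    # and the earliest-available worker is free.
    tasks = {"load": 2, "clean": 3, "train": 5, "report": 1}     # task -> runtime (made up)
    deps = {"clean": ["load"], "train": ["clean"], "report": ["train"]}
    workers = [0.0, 0.0]                                         # time at which each worker is next free

    finish = {}
    for t in ["load", "clean", "train", "report"]:               # a topological order of the DAG
        ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
        w = min(range(len(workers)), key=lambda i: workers[i])   # pick the earliest-available worker
        start = max(ready, workers[w])
        finish[t] = start + tasks[t]
        workers[w] = finish[t]

    print("makespan:", max(finish.values()))                     # 11.0 for this chain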
Complex system theory is increasingly applied to develop control protocols for distributed computational and networking resources. The paper deals with the important subproblem of finding complex connected structures having excellent navigability properties using limited computational resources. Recently, the two-dimensional hyperbolic space turned...
In this work, we propose a multi-tier architectural model to separate functionality and security concerns for distributed cyber-physical systems. On the line of distributed computing, such systems require the identification of leaders for distribution of work, aggregation of results, etc. Further, we propose a fault-tolerant leader election algorit...
I am interested in the integration of several advanced technologies and techniques: machine learning and distributed computing, SQL, text processing, and challenging conventional thought processes.
Artificial intelligence is the capacity of a computer or a robot controlled by a computer to do activities that normally require human intelligence and judgement. Edge computing is a distributed computing paradigm that relocates processing and data storage closer to data sources. This is supposed to enhance reaction times while also conserving band...
Communicating state machines provide a formal foundation for distributed computation. Unfortunately, they are Turing-complete and, thus, challenging to analyse. In this paper, we classify restrictions on channels which have been proposed to work around the undecidability of verification questions. We compare half-duplex communication, existential B...
Cloud computing has taken over the high-performance distributed computing area, and it currently provides on-demand services and resource pooling over the web. As a result of constantly changing user service demand, the task scheduling problem has emerged as a critical analytical topic in cloud computing. The primary goal of scheduling tasks is to...
Named Data Networking (NDN) offers promising advantages in deploying next-generation service applications over distributed computing networks. We consider the problem of dynamic orchestration over a NDN-based computing network, in which nodes can be equipped with communication, computation, and data producing resources. Given a set of services with...
Distributed computing is a system that permits simple, ubiquitous, and on-demand network access to a shared pool of computing resources, for example routers, servers, software, and services, that can be easily distributed and deployed with minimal management effort or interaction between service providers. Distributed computing characte...
We consider the synthesis of three and four abstract polymer chains classified into two sexes, where the synthesis quality is determined by the pairing of different-sex chains. The use of some types of entangled biphotons in one-sided control produces a difference in the degree of superimposition between good and bad pairs that is unattainable with...
With the success of Bitcoin and the introduction of different uses of Blockchain, such as smart contracts in Ethereum, many researchers and industries have turned their attention to applications that use this technology. In response to the advantages and disadvantages of Blockchain, similar technologies have emerged with alterations to the original...
During the last two decades, a small set of distributed computing models for networks have emerged, among which LOCAL, CONGEST, and Broadcast Congested Clique (BCC) play a prominent role. We consider hybrid models resulting from combining these three models. That is, we analyze the computing power of models that allow one to, say, perform a constant numb...
A finite-time resilient consensus protocol (RCP) is developed for a connected network of agents, where communication between agents occurs locally, a few of the agents are malicious (MA), and the non-malicious or cooperating (CO) agents do not know the locations of the MA ones. Networks with a single leader and several followers as well as leaderle...
Learning health systems are challenged to combine computable biomedical knowledge (CBK) models. Using common technical capabilities of the World Wide Web (WWW), digital objects called Knowledge Objects, and a new pattern of activating CBK models brought forth here, we aim to show that it is possible to compose CBK models in more highly standardized...
Modern computationally-heavy applications are often time-sensitive, demanding distributed strategies to accelerate them. On the other hand, distributed computing suffers from the bottleneck of slow workers in practice. Distributed coded computing is an attractive solution that adds redundancy such that a subset of distributed computations suffices...
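A toy example of the redundancy idea (not the scheme studied in the paper): three workers compute coded pieces of a matrix-vector product so that any two of their results suffice to recover the answer.

    import numpy as np

    # Split A into two row blocks; a third "parity" worker computes (A1 + A2) @ x.
    A = np.random.rand(4, 3); x = np.random.rand(3)
    A1, A2 = A[:2], A[2:]
    results = {"w1": A1 @ x, "w2": A2 @ x, "w3": (A1 + A2) @ x}   # each worker's task

    # Suppose worker w2 is a straggler: recover its share from w1 and w3 instead of waiting.
    A2x = results["w3"] - results["w1"]
    recovered = np.concatenate([results["w1"], A2x])
    print(np.allclose(recovered, A @ x))   # True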
The heterogeneity of today's state-of-the-art computer architectures is confronting application developers with an immense degree of complexity which results from two major challenges. First, developers need to acquire profound knowledge about the programming models or the interaction models associated with each type of heterogeneous system resourc...
Nature-inspired algorithms play a very vibrant role in solving different optimization problems these days. The fundamental aim of naturalistic approaches is to boost competence, proficiency, and success in a task, while also helping to reduce energy use, cost, and size. Several computing techniques are taking the ben...
People’s cognition of objectively existing things is developing in the direction of digitization, and more and more of these things are constantly being presented in the form of data. In order to analyze constantly changing large-scale data and automatically extract valuable information, this paper analyzes the app...
In recent years, the landscape of computing paradigms has witnessed a gradual yet remarkable shift from monolithic computing to distributed and decentralized paradigms such as Internet of Things (IoT), Edge, Fog, Cloud, and Serverless. The frontiers of these computing technologies have been boosted by the shift from manually encoded algorithms to Artif...
This article highlights quantum Internet computing as referring to distributed quantum computing over the quantum Internet, analogous to (classical) Internet computing involving (classical) distributed computing over the (classical) Internet. Relevant to quantum Internet computing would be areas of study such as quantum protocols for distributed no...
The Internet-of-Things (IoT) edge allows cloud computing services for topology and location-sensitive distributed computing. As an immediate benefit, it improves network reliability and latency by enabling data access and processing rapidly and efficiently near IoT devices. However, it comes with several issues stemming from the complexity, the sec...
In this work, we extend the topology-based framework and method for the quantification and classification of general resilient asynchronous complexity. We present the arbitrary resilient asynchronous complexity theorem, applied to decision tasks in an iterated delayed model which is based on a series of communicating objects, each of which mainly co...
Computing shortest paths from a single source is one of the central problems studied in the CONGEST model of distributed computing. After many years in which no algorithmic progress was made, Elkin [STOC ‘17] provided the first improvement over the distributed Bellman-Ford algorithm. Since then, several improved algorithms have been published. The...
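For reference, the round structure of the distributed Bellman-Ford baseline mentioned above can be simulated as follows (a sketch; the example graph is made up):

    # Synchronous distributed Bellman-Ford: in each round, every node sends its current
    # distance estimate to its neighbours and keeps the best offer it receives.
    import math

    edges = {("s", "a"): 1, ("a", "b"): 2, ("s", "b"): 5, ("b", "c"): 1}   # undirected weights
    nodes = {"s", "a", "b", "c"}
    adj = {v: [] for v in nodes}
    for (u, v), w in edges.items():
        adj[u].append((v, w)); adj[v].append((u, w))

    dist = {v: (0 if v == "s" else math.inf) for v in nodes}
    for _ in range(len(nodes) - 1):          # at most n-1 rounds in the worst case
        new = dict(dist)
        for v in nodes:
            for u, w in adj[v]:
                new[v] = min(new[v], dist[u] + w)   # offers received from neighbours
        dist = new

    print(dist)   # {'s': 0, 'a': 1, 'b': 3, 'c': 4}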
Graphlet enumeration is a basic task in graph analysis with many applications. Thus it is important to be able to perform this task within a reasonable amount of time. However, this objective is challenging when the input graph is very large, with millions of nodes and edges. Known solutions are limited in terms of the scale of the graph that they...
Cloud computing is a model for delivering information services that provides flexible use of virtual servers, massive scalability, and managed services. With the dictionary definition out of the way, we can now proceed to describe distributed computing in terms of its fundamental features and how it functions alongside other...
Decentralized computation outsourcing should allow anyone to access the large amounts of computational power that exists in the Internet of Things. Unfortunately, when trusted third parties are removed to achieve this decentralization, ensuring an outsourced computation is performed correctly remains a significant challenge. In this paper, we provi...
Every year, millions of new devices are added to the Internet of things, which has both great benefits and serious security risks for user data privacy. It is the device owners’ responsibility to ensure that the ownership settings of Internet of things devices are maintained, allowing them to communicate with other user devices autonomously. The ul...
In this article, the processes of change occurring in industrial enterprises as a result of Industry 4.0 are considered from a conceptual point of view. For the sustainable development of the light and weaving industries, as well as the fashion industry, it is necessary to develop technologies for adapting the functioning mechanisms of industry enterpri...
The goal of this study is to explore emerging trends in cloud computing technology that can support an economic and social change. We apply the methods of entity linking, which links word strings to entities from a knowledge base, to extract main keywords in cloud computing from accumulated publications from 2004 to 2021. Results suggest that in cl...
Peer-to-Peer (P2P) networks are a distributed structure widely used in file sharing, distributed computing, and deep search engines. Each node plays the dual role of client and server, which further exacerbates the insecurity of P2P systems. To model the behavior of nodes in P2P systems, requesting nodes are abstracted into cu...
In the era of big data, to achieve efficient deep learning and computer vision over big data, developers need to build a computerized system that can simultaneously handle deep learning, computer vision, and large-scale data processing. The existing training dataset is reused, a...
2022 Second International Conference on Distributed Computing and High Performance Computing (DCHPC)
Cloud computing has become the backbone of the computing industry and offers subscription-based on-demand services. Through virtualization, which produces a virtual instance of a computer system running in an abstracted hardware layer, it has made it possible for us to share resources among many users. Contrary to early distributed computing models...
Parallel applications represented by Directed Acyclic Graphs (DAGs), known as Parallel Task Graphs (PTGs), which require long execution times and large amounts of storage, are executed on High Performance Computing (HPC) systems such as clusters. For the execution of these applications, a scheduler performs the scheduling and allocation of the resources conta...
Unmanned Aerial Vehicles (UAVs) are often studied as tools to perform data collection from Wireless Sensor Networks (WSNs). Path planning is a fundamental aspect of this endeavor. Works in the current literature assume that data are always ready to be retrieved when the UAV passes. This operational model is quite rigid and does not allow for the in...
Quantum communication networks are connected by various devices to achieve communication or distributed computing for users in remote locations. In order to solve the problem of generating temporary session key for secure communication in optical-ring quantum networks, a quantum key agreement protocol is proposed. In the key agreement protocols, an...
In distributed computing, a Byzantine fault is a condition where a component behaves inconsistently, showing different symptoms to different components of the system. Consensus among the correct components can be reached by appropriately crafted communication protocols even in the presence of Byzantine faults. Quantum-aided protocols built upon dis...
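As background to the phrase "appropriately crafted communication protocols": the classical resilience bound for reaching agreement without message authentication is

    \[
      n \;\ge\; 3f + 1,
    \]

where $n$ is the total number of components and $f$ is the number that may be Byzantine.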
With a promising future, cloud computing represents a significant turning point in the commercialization of distributed computing. On the other hand, the infrastructure cost of the cloud can reach an extraordinary level. Thus, the virtualization concept is applied in distributed computing frameworks to help clients and owners achieve bett...
In the shipping digitalisation process, the peak will be reached with the advent of a wholly autonomous and at the same time safe and reliable ship. Full autonomy could be obtained by two linked Artificial-Intelligence systems representing the ship navigator and the ship engineer that possess sensing and analysis skills, situational awareness, plan...
The recent advances in unmanned aerial vehicles (UAVs) enormously improve their utility and expand their application scope. UAV and swarm implementations further prevail in Smart City practices with the aid of edge computing and the urban Internet of Things. The lead–follow formation in UAV swarms is an important organizational scheme and has been adopt...
Diabetes has become one of the most perilous dangers to the human world. Many are becoming its victims and cannot escape it, regardless of how hard they try to keep it from progressing further. Distributed computing and the Internet of Things (IoT) are two technologies that play a vital part in the present l...
In the era of digital media, the rapidly increasing volume and complexity of multimedia data cause many problems in storing, processing, and querying information in a reasonable time. Feature extraction and processing time play an extremely important role in large-scale video retrieval systems and currently receive much attention from researchers....
Spanning trees are widely used in interconnection networks for routing, broadcasting, fault-tolerance, and securely delivering messages, as well as parallel and distributed computing. The problem of efficiently finding a maximal set of independent spanning trees (ISTs) is still open for arbitrary (general) interconnection topologies (graphs). The f...
Cloud computing is the origin of various distributed computing paradigms such as mobile cloud computing, mobile edge computing, fog computing, transparent computing, etc. The proposed work focuses mainly on mobile cloud computing and mobile edge computing. It discusses various frameworks for executing these computing paradigms and pricing models suitable to be incorporat...
Distributed computing has been a field of enthusiasm for the research community and the application development industry for a few decades now. The ease of development, deployment, and management of applications from a broad range of computing paradigms, and the capacity to handle applications over network-enabled...
Fog computing enables cloud and edge resource integration. It provides intelligent, decentralized processing of the amount of data generated by the Internet of Things (IoT) sensors for seamless integration of physical and cyber environments. This can create many benefits for society. The IoT framework uses wireless nodes to collect data and monitor...
We address problem-based scenario generation for two-stage stochastic programming by decomposing recourse functions. A novel scenario reduction procedure is proposed, agnostic to the specific problem and input distribution, computed using standard and efficient linear algebra algorithms that scale well. The method is especially well suited to addre...
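For readers unfamiliar with recourse functions, the generic two-stage stochastic program this refers to has the form (the notation is generic, not the paper's):

    \[
      \min_{x \in X} \; c^\top x \;+\; \mathbb{E}_{\xi}\!\left[\, Q(x,\xi) \,\right],
      \qquad
      Q(x,\xi) \;=\; \min_{y \ge 0} \left\{\, q(\xi)^\top y \;:\; W y = h(\xi) - T(\xi)\, x \,\right\},
    \]

and scenario generation replaces the expectation with a weighted sum over a small, representative set of scenarios.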
The Second International Conference on Distributed Computing and High Performance Computing (DCHPC2022), 2nd-3rd March 2022, Qom, Iran.
Coded distributed computing (CDC) is a new technique proposed with the purpose of decreasing the intense data exchange required for parallelizing distributed computing systems. Under the famous MapReduce paradigm, this coded approach has been shown to decrease this communication overhead by a factor that is linearly proportional to the overall comp...
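The factor mentioned here is usually expressed through the coded MapReduce computation-communication tradeoff. In the standard formulation (K nodes, computation load r, i.e., each Map task replicated r times), the achievable communication load is

    \[
      L(r) \;=\; \frac{1}{r}\left(1 - \frac{r}{K}\right),
    \]

an r-fold reduction relative to the uncoded load of $1 - r/K$; this is the standard result for the basic model and may differ in detail from the setting of this particular paper.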
Ubiquitous sensors and Internet of Things (IoT) technologies have revolutionized the sports industry, providing new methodologies for planning, effective coordination of training, and match analysis post game. New methods, including machine learning, image and video processing, have been developed for performance evaluation, allowing the analyst to...
In this work, we explore the problem of multi-user linearly-separable distributed computation, where $N$ servers help compute the desired functions (jobs) of $K$ users, and where each desired function can be written as a linear combination of up to $L$ (generally non-linear) subtasks (or sub-functions). Each server computes some of the subtasks, co...
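In symbols, the setting described is (the coefficients $a_{k,l}$ are generic notation introduced here, not taken from the abstract)

    \[
      F_k(\mathbf{x}) \;=\; \sum_{l=1}^{L} a_{k,l}\, f_l(\mathbf{x}), \qquad k = 1,\dots,K,
    \]

where each of the $N$ servers evaluates some of the (generally non-linear) subtasks $f_l$ so that every user $k$ can recover its own linear combination.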
This paper presents an in-depth study of a new mode of intelligent multi-distance English teaching with the help of virtual scenes of the Internet of Things. Virtual simulation technology is integrated into traditional IoT teaching, and the professional education of IoT application technology is explored; from the analysis of the current...
Hadoop is an open-source project from Apache with a distributed file system and the MapReduce distributed computing framework. The current Apache 2.0 license agreement supports on-demand payment by consumers for cloud platform services, helping users leverage their respective hardware to provide cloud services. In a cloud-based environment, there is a...
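As a reminder of the MapReduce model Hadoop implements, a minimal word count in plain Python (illustrative only; Hadoop distributes each phase across nodes and materializes the shuffle over the network):

    from collections import defaultdict

    documents = ["the quick brown fox", "the lazy dog"]   # made-up input records

    # Map phase: each input record is turned into (key, value) pairs independently.
    mapped = [(word, 1) for doc in documents for word in doc.split()]

    # Shuffle phase: pairs are grouped by key (the data exchange MapReduce systems optimize).
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)

    # Reduce phase: each key's values are aggregated, again independently per key.
    counts = {key: sum(values) for key, values in groups.items()}
    print(counts)   # {'the': 2, 'quick': 1, ...}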
In recent years, big data has gotten a lot of attention. There are certain issues in big data analytics as big data makes its way into corporations and businesses. The Apache Spark framework has grown in popularity as a tool for distributed data processing. Spark is a big data analytics engine with SQL, streaming, graph processing, and machine lear...
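A minimal PySpark usage sketch of the kind such analyses build on (assuming a local Spark installation; "logs.txt" is a placeholder input path):

    from pyspark.sql import SparkSession

    # Start a local Spark session and run a simple distributed aggregation.
    spark = SparkSession.builder.appName("example").getOrCreate()
    lines = spark.read.text("logs.txt")                    # placeholder input file
    words = lines.rdd.flatMap(lambda row: row.value.split())
    counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
    print(counts.take(5))                                  # a few (word, count) pairs
    spark.stop()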
Due to the increasing use of sensors/devices in smart cities, IoT/cloud data centers must provide adequate computing resources. Efficient resource management is one of the biggest challenges in distributed computing. This research proposes a solution to use the activity log of sensors to extract their activity patterns. These patterns contribute to the...
Cloud computing can provide unlimited computing resources on demand because of its inherently high scalability, which removes the need for cloud service providers to plan far ahead for hardware provisioning. Security is a prime challenge for promoting cloud computing in the present period. Trust is...
The rapid growth of digital forensics in the Blockchain paradigm is critical due to its heterogeneity and lack of transparency in evidence processing. This calls for a forensic framework for Blockchain which provides distributed computing, decentralization, and transparency of forensic investigation of digital evidence from a cross-border perspective...
The delivery of different services over the Internet is known as distributed computing. These resources include data storage, servers, databases, networking, and software, among other tools and applications. Cloud-based storage allows you to store files in a remote database rather than keeping...
Using cloud computing, businesses can adopt IT without incurring a significant upfront cost. The Internet has numerous benefits, but model security remains a concern, which negatively affects cloud adoption. The security challenge becomes even more difficult at the data center level, while additional dimensions such as model design, multitenancy, elasticity,...
Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. Large clouds often have functions distributed over numerous locations, each location being a data center. Although distributed computing is fa...
We study two fundamental problems of distributed computing, consensus and approximate agreement, through a novel approach for proving lower bounds and impossibility results, that we call the asynchronous speedup theorem. For a given $n$-process task $\Pi$ and a given computational model $M$, we define a new task, called the closure of $\Pi$ with re...