Chapter

Honeypot-Based Data Collection for Dark Web Investigations Using the Tor Network


Abstract

The Dark Web presents a challenging and complex environment in which cyber criminals conduct illicit activities with a high degree of anonymity and privacy. This chapter describes a honeypot-based data collection approach for Dark Web browsing that incorporates honeypots on three isolated virtual machines: production honeypots, an onion-website-based research honeypot (Honey Onion) offering illegal services, and a log server that collects and securely stores the honeypot logs. Experiments conducted over 14 days collected more than 250 requests on the Honey Onion service and in excess of 28,000 chat records from a Dark Web forum. The log server also monitored Honey Onion traffic, providing details such as packet types, timestamps, network data, and HTTP requests. The data collection results provide valuable insights into Dark Web activities, including malicious, benign and uncategorized activities. The data analysis identified common user categories such as malicious actors, researchers and security professionals, and uncategorized actors. The experimental results demonstrate that honeypot-based data collection can advance Dark Web investigations as well as enable the development of effective cyber security strategies and efforts to combat cyber crime in the Dark Web.
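The chapter's implementation is not reproduced here, but as a rough, hypothetical sketch of the Honey Onion component, the snippet below logs incoming HTTP requests (timestamp, method, path, headers) in a structured form that a separate log server could ingest. It is our own illustration, not the authors' code; the port, log file name, and the idea of fronting it with a Tor hidden service (torrc: HiddenServicePort 80 127.0.0.1:8080) are assumptions.

```python
# Minimal sketch of a request-logging web honeypot that could sit behind a
# Tor hidden service. Illustrative assumption, not the chapter's Honey Onion code.
import json
import logging
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

logging.basicConfig(filename="honey_onion.log", level=logging.INFO,
                    format="%(message)s")

class HoneyOnionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the fields the chapter mentions: timestamp, request details, headers.
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "method": self.command,
            "path": self.path,
            "headers": dict(self.headers),
            "client": self.client_address[0],  # 127.0.0.1 when fronted by Tor
        }
        logging.info(json.dumps(record))
        # Serve a plausible-looking decoy page.
        body = b"<html><body><h1>Service temporarily unavailable</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # suppress default stderr logging; everything goes to the JSON log

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), HoneyOnionHandler).serve_forever()
```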




Article
Full-text available
Gathering evidence in cybercrime is a complex process. Under this premise, it is very important to have cutting-edge methodologies that allow the observation of criminal phenomena on the Internet and protect the personal integrity of the investigator, especially when the content is extremely critical and is located in the so-called dark zone of the Internet (hereinafter Dark Web). This article develops a new research methodology in the field of cybercrime based on the use of honeypots. Its main objective is to evaluate the ability to de-anonymize users in the Tor network in order to collect geographic coordinates and demographic data on the Dark Web and obtain digital evidence related to child sexual exploitation material (CSEM) offenders. The information collected aims to map the consumption of CSEM through cartograms, with a particular focus on the Spanish territory. The outcomes demonstrate the viability of using honeypots to assess the prevalence of CSEM consumption on the Dark Web. Future improvements to the methodology are discussed.
Article
Full-text available
The use of the unindexed web, commonly known as the deep web and dark web, to commit or facilitate criminal activity has drastically increased over the past decade. The dark web is a dangerous place where all kinds of criminal activities take place. Despite advances in web forensic techniques, tools, and methodologies, few studies have formally tackled dark and deep web forensics and the technical differences in terms of investigative techniques and artefact identification and extraction. This study proposes a novel and comprehensive protocol to guide and assist digital forensic professionals in investigating crimes committed on or via the deep and dark web. The protocol, named D2WFP, establishes a new sequential approach for performing investigative activities by observing the order of volatility and implementing a systemic approach covering all browsing-related hives and artefacts, which ultimately improves accuracy and effectiveness. Rigorous quantitative and qualitative research was conducted by assessing the D2WFP, following a scientifically sound and comprehensive process in different scenarios; the results show a clear increase in the number of artefacts recovered when adopting the D2WFP, which outperforms current industry and open-source browsing forensic tools. The second contribution is the robust formulation of artefact correlation and cross-validation within the D2WFP, which enables digital forensic professionals to better document and structure their analysis of host-based deep and dark web browsing artefacts.
Article
Full-text available
Purpose: This paper aims to investigate the potential challenges that governments in the Commonwealth Caribbean are likely to face combating crimes facilitated by the dark Web.
Design/methodology/approach: The “lived experience” methodology, guided by a contextual systematic literature review, was used to ground the investigation of the research phenomena in the researchers’ collective experiences working in, living in and engaging in research with governments in the Commonwealth Caribbean.
Findings: The two major findings emerging from the analysis are that jurisdictional and technical challenges are major hindrances to the creation of an efficient and authoritative legislative framework and to the building of the capacity of governments in the Commonwealth Caribbean to confront the technicalities that affect systematic efforts to manage problems created by the dark Web.
Practical implications: The findings indicate the urgency with which authorities in the Caribbean region must reevaluate their administrative, legislative and investment priorities to emphasize cyber-risk management strategies that will enable their seamless and wholesome integration into this digital world.
Originality/value: The research aids in developing and extending theory and praxis related to the problematization of the dark Web for governments by situating the experiences of Small Island Developing States in the ongoing discourse.
Article
Full-text available
Dark web is a canopy concept that denotes any kind of illicit activity carried out by anonymous persons or organizations, which makes it difficult to trace. The illicit content on the dark web is constantly updated and changed. The collection and classification of such illegal activities are challenging tasks, as they are difficult and time-consuming. This problem has in recent times emerged as an issue that requires quick attention from both industry and academia. To this end, this article proposes a crawler that is capable of collecting dark web pages, cleaning them, and saving them in a document database. The crawler carries out an automatic classification of the gathered web pages into five classes, using a Linear Support Vector Classifier (SVC) and Naïve Bayes (NB) with Term Frequency-Inverse Document Frequency (TF-IDF) features. The experimental results revealed that accuracy rates of 92% and 81% were achieved by SVC and NB, respectively.
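As a hypothetical illustration of the classification step described above (not the authors' code), the following scikit-learn sketch builds a TF-IDF representation and trains the two named classifiers on a tiny placeholder corpus; the example pages, labels, and class names are invented.

```python
# Sketch of TF-IDF features feeding a Linear SVC and a Naive Bayes classifier,
# as in the pipeline described above. The in-line corpus and labels are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

pages = [
    "buy cloned credit cards and bank logins",
    "counterfeit passports and id cards for sale",
    "anonymous email providers and pgp tutorials",
    "hire ddos attack service hourly rates",
]
labels = ["financial-fraud", "counterfeits", "privacy", "hacking-services"]

new_page = ["fresh fullz and cvv dumps available"]

for clf in (LinearSVC(), MultinomialNB()):
    model = make_pipeline(TfidfVectorizer(), clf)  # TF-IDF features -> classifier
    model.fit(pages, labels)
    print(type(clf).__name__, "->", model.predict(new_page)[0])
```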
Article
Full-text available
The World Wide Web (WWW) consists of the surface web, deep web, and Dark Web, depending on the content shared and the access to these network layers. The Dark Web consists of Dark Net overlay networks that can be accessed through specific software and authorization schemes. The Dark Net has become a growing community where users focus on keeping their identities, personal information, and locations secret due to the diverse population base and well-known cyber threats. Furthermore, not much is known of the Dark Net from the user perspective, where there is often a misunderstanding of usage strategies. To understand this further, we conducted a systematic analysis of research relating to Dark Net privacy and security on N=200 academic papers, in which we also explored the user side. An evaluation of the secure end-user experience on the Dark Net establishes the motives of account initialization in overlaid networks such as Tor. This work delves into the evolution of Dark Net intelligence for improved cybercrime strategies across jurisdictions. The evaluation of the developing network infrastructure of the Dark Net raises meaningful questions on how to resolve the issue of increasing criminal activity on the Dark Web. We further examine the security features afforded to users, motives, and anonymity revocation. We also evaluate more closely nine user-study-focused papers, revealing the importance of conducting more research in this area. Our detailed systematic review of Dark Net security clearly shows the apparent research gaps, especially in the user-focused studies emphasized in the paper.
Article
Full-text available
The dark web is a section of the Internet that is not accessible to search engines and requires an anonymizing browser called Tor. Its hidden network and anonymity pave the way for illegal activities and help cybercriminals execute well-planned, coordinated, and malicious cyberattacks. Cyber security experts agree that online criminal activities are increasing exponentially and becoming more rampant and intense. These illegal cyber activities include various destructive crimes that may target a single person or a whole nation, for example, data breaches, ransomware attacks, black markets, mafias, and terrorist attacks. Maintaining data privacy and secrecy is thus the new dilemma of the era. This paper extensively reviews various attacks and attack patterns commonly applied in the dark web. We also classify these attacks in our unique trilogies classification system. Furthermore, a detailed overview of existing threat detection techniques and their limitations is discussed for anonymity-providing services such as Tor, I2P, and Freenet. Finally, the paper identifies significant weaknesses that make the dark web vulnerable to different attacks.
Article
Full-text available
As technology has become a pivotal part of life, it has also become a part of criminal life. Criminals use new technological developments to commit crimes, and investigators must adapt to these changes. Many people have become, and will become, victims of cybercrime, making it even more important for investigators to understand current methods used in cyber investigations. The two general categories of cyber investigations are digital forensics and open-source intelligence. Cyber investigations affect more than just the investigators, who must determine what tools they need to use based on the information that the tools provide and how effectively the tools and methods work. Tools are any application or device used by investigators, while methods are the process or technique of using a tool. This survey compares the most common methods available to investigators to determine what kind of evidence the methods provide and which of them are the most effective. To accomplish this, the survey establishes criteria for comparison and conducts an analysis of the tools in both mobile digital forensic and open-source intelligence investigations. We found that there is no single tool or method that can gather all the evidence that investigators require. Many of the tools must be combined to be most effective. However, some tools are more useful than others. Of all the methods used in mobile digital forensics, logical extraction and hex dumps are the most effective and least likely to cause damage to the data. Among the tools used in open-source intelligence, natural language processing has more applications and uses than any of the other options.
Article
Full-text available
Recently, online black markets and anonymous virtual currencies have become the subject of academic research, and we have gained a certain degree of knowledge about the dark web. However, as terms such as deep web and dark net, which have different meanings, have been used in similar contexts, discussions related to the dark web tend to be confusing. Therefore, in this paper, we discuss the differences between these concepts in an easy-to-understand manner, including their historical circumstances, and explain the technology known as onion routing used on the dark web.
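Onion routing, as explained above, wraps a message in one encryption layer per relay so that each relay can remove only its own layer and learn only the next hop. The following is a purely conceptual sketch of that layering using the Python cryptography package's Fernet primitive; it is an assumption for illustration and does not reflect Tor's actual circuit cryptography.

```python
# Conceptual sketch of onion routing's layered encryption, NOT Tor's real protocol.
from cryptography.fernet import Fernet

relays = ["entry", "middle", "exit"]
keys = {name: Fernet(Fernet.generate_key()) for name in relays}

# The client wraps the payload for the exit relay first, then middle, then entry.
payload = b"GET http://example.onion/ HTTP/1.1"
onion = payload
for name in reversed(relays):
    onion = keys[name].encrypt(onion)

# Each relay, in path order, peels exactly one layer; only the exit sees the payload.
for name in relays:
    onion = keys[name].decrypt(onion)

print(onion == payload)  # True: all layers removed
```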
Article
Full-text available
The Dark Web is one of the most challenging and untraceable mediums adopted by cyber criminals, terrorists, and state-sponsored spies to fulfil their illicit motives. Cyber-crimes happening inside the Dark Web are akin to real-world crimes. However, the sheer size, unpredictable ecosystem and anonymity provided by Dark Web services are the essential obstacles to tracing criminals. Evaluating the prevailing Dark Web crime threats is a crucial step towards discovering potential solutions to these cyber-crimes. In this paper, we appraise the Dark Web by analysing the crimes, their consequences and the enforcement methods applied, as well as future manoeuvres to lessen the crime threats. We used the Systematic Literature Review (SLR) method with the aspiration of providing direction and perspective on emerging crime threats in the Dark Web for researchers and specialists in the cyber security field. For this SLR, the 65 most relevant articles from leading electronic databases were selected for data extraction and synthesis to answer our predefined research questions. The result of this systematic literature review provides (i) comprehensive knowledge of the growing crimes proceeding within the Dark Web, (ii) an assessment of the social, economic and ethical impacts of the cyber-crimes happening inside the Dark Web, and (iii) an analysis of the challenges, established techniques and methods to locate criminals, together with their drawbacks. Our study reveals that more in-depth research is required to identify criminals in the Dark Web in new and prominent ways, that analysis of crypto markets and Dark Web discussion forums is crucial for forensic investigations, that the anonymity provided by Dark Web services can be used as a weapon to catch criminals, and that digital evidence should be analysed and processed in a way that allows law enforcement to seize criminals and shut down illicit sites in the Dark Web.
Article
Full-text available
The Internet today is beset with constant attacks targeting users and infrastructure. One popular method of detecting these attacks and the infected hosts behind them is to monitor unused network addresses, because many Internet threats propagate randomly. Deep web content cannot be indexed by search engines such as Google, Yahoo and Bing, and the Darknet lies within the deep web. The dark web has been intentionally hidden and is not accessible through a standard browser; it can be accessed by anyone who has The Onion Router (TOR) browser. TOR is a virtual and encrypted tunnel that allows people to hide their identity and network traffic and to use the Internet anonymously. The dark web is a virtual online market for almost anything, including but not limited to drugs, weapons, credit card data, forged documents, murder-for-hire services, narcotics and indecent pornography. Because of this, it is difficult for law enforcement agencies or digital forensic professionals to pinpoint the origin of traffic, or the location or ownership of any computer or person on the Darknet. Silk Road was only accessible via the TOR network and hidden from the mainstream web. There has been a lot of buzz around Bitcoin, the TOR network and the Darknet, because most Darknet sites carried out transactions through Bitcoin, an anonymous, peer-to-peer, distributed digital currency based on cryptographic principles (D. Rathod, 2017). In this research paper, the rise of and challenges posed by the Darknet are discussed, along with Darknet deployment and the reasons for its use.
Article
Full-text available
A honeypot is a type of security facility deliberately created to be probed, attacked, and compromised. It is often used for protecting production systems by detecting and deflecting unauthorized accesses. It is also useful for investigating the behavior of attackers, and in particular, unknown attacks. For the past 17 years plenty of effort has been invested in the research and development of honeypot techniques, and they have evolved to be an increasingly powerful means of defending against the creations of the blackhat community. In this paper, by studying a wide set of honeypots, the two essential elements of honeypots—the decoy and the captor— are captured and presented, together with two abstract organizational forms—independent and cooperative—where these two elements can be integrated. A novel decoy and captor (D-C) based taxonomy is proposed for the purpose of studying and classifying the various honeypot techniques. An extensive set of independent and cooperative honeypot projects and research that cover these techniques is surveyed under the taxonomy framework. Furthermore, two subsets of features from the taxonomy are identified, which can greatly influence the honeypot performances. These two subsets of features are applied to a number of typical independent and cooperative honeypots separately in order to validate the taxonomy and predict the honeypot development trends.
Conference Paper
Full-text available
In recent years, the amount and the sophistication of cyber attacks have increased significantly. This creates a plethora of challenges from a security perspective. First, for the efficient monitoring of a network, the generated alerts need to be presented and summarized in a meaningful manner. Second, additional analytics are required to identify sophisticated and correlated attacks. In particular, the detection of correlated attacks requires collaboration between different monitoring points. Cyber incident monitors are platforms utilized for supporting the tasks of network administrators and provide an initial step towards coping with the aforementioned challenges. In this paper, we present our cyber incident monitor TraCINg. TraCINg obtains alert data from honeypot sensors distributed across the world. The main contribution of this paper is a thoughtful discussion of the lessons learned, both from a design rationale perspective and from the analysis of data gathered during a five month deployment period. Furthermore, we show that even with a relatively small number of deployed sensors, it is possible to detect correlated attacks that target multiple sensors.
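The cross-sensor correlation idea mentioned above can be illustrated with a minimal, hypothetical sketch: group alerts by source and flag sources seen by more than one sensor. The alert fields and values below are invented and do not reflect TraCINg's actual data model.

```python
# Minimal sketch of cross-sensor correlation: flag sources that triggered
# alerts on more than one honeypot sensor. Field names are assumptions.
from collections import defaultdict

alerts = [
    {"sensor": "berlin", "src_ip": "203.0.113.7"},
    {"sensor": "tokyo",  "src_ip": "203.0.113.7"},
    {"sensor": "berlin", "src_ip": "198.51.100.23"},
    {"sensor": "sydney", "src_ip": "203.0.113.7"},
]

sensors_per_source = defaultdict(set)
for alert in alerts:
    sensors_per_source[alert["src_ip"]].add(alert["sensor"])

# A source hitting several geographically separate sensors suggests a coordinated scan.
correlated = {ip: hits for ip, hits in sensors_per_source.items() if len(hits) > 1}
print(correlated)  # {'203.0.113.7': {'berlin', 'tokyo', 'sydney'}}
```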
Conference Paper
Full-text available
In this paper we review recent advances in honeypots. Some notable proposals and their analyses are discussed, and the aspects of using honeypots in education and in hybrid environments with intrusion detection systems (IDS) are explained. We also describe the use of signature techniques in honeypots for traffic analysis. Finally, we summarize all of these aspects.
Article
Full-text available
We present Tor, a circuit-based low-latency anonymous communication service. This second-generation Onion Routing system addresses limitations in the original design by adding perfect forward secrecy, congestion control, directory servers, integrity checking, configurable exit policies, and a practical design for location-hidden services via rendezvous points. Tor works on the real-world Internet, requires no special privileges or kernel modifications, requires little synchronization or coordination between nodes, and provides a reasonable tradeoff between anonymity, usability, and efficiency. We briefly describe our experiences with an international network of more than 30 nodes. We close with a list of open problems in anonymous communication.
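In practice, Tor runs as a local client that applications reach through a SOCKS proxy (by default on 127.0.0.1:9050). The sketch below, an assumption rather than anything from the paper, routes an HTTP request through that proxy with the Python requests library; the socks5h scheme makes hostname resolution (including .onion addresses) happen inside Tor. It requires a running Tor client and `pip install requests[socks]`, and the example onion address is a placeholder.

```python
# Sketch of fetching a page through a locally running Tor client's SOCKS proxy.
# Assumes Tor is listening on 127.0.0.1:9050 and PySocks is installed;
# "exampleonionaddress.onion" is a placeholder, not a real service.
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h: resolve hostnames via Tor

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

response = session.get("http://exampleonionaddress.onion/", timeout=60)
print(response.status_code, len(response.content), "bytes")
```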
Article
Practice and Policy-Oriented Abstract
Law enforcement bodies have largely responded to the increase in darknet activities through site shutdowns, which involve significant investment of policing resources. Despite these efforts, new darknet sites continue to show up after the site takedowns. We offer a new look at this issue by assessing the viability of selectively targeting large drug vendors operating on darknet sites. We find that the arrest of a major drug vendor reduced subsequent transaction levels by 39% and the number of remaining vendors by 56% on Silk Road 2.0. This deterrent effect also spilled over to drug vendors located in countries beyond the prosecutorial jurisdiction of the arrested vendor. We further find that small darknet drug vendors were most deterred by the arrest and vendors selling dangerous drugs were relatively more deterred. Our study findings hold policy-relevant implications to government agencies and law enforcement. Whereas site shutdowns can disrupt these markets momentarily, the selective targeting of large-scale drug vendors should be given serious consideration and used to a broader extent. The design of future enforcement strategies should also account for the finding that darknet markets are made up of both small-scale drug dealers new to the drug trade and large-scale drug syndicates.
Article
From 2018 to 2021, seizures of counterfeit oxycodone pills containing non-pharmaceutical fentanyl or other novel synthetic opioids increased significantly, contributing to continuing increases in overdose mortality in North America. Evidence suggests that counterfeit pills are distributed through cryptomarkets. This article presents data regarding the availability and characteristics of oxycodone pills advertised on one major cryptomarket between January and March 2022. Collected data were processed using a dedicated Named Entity Recognition algorithm to identify oxycodone listings and categorize them as either counterfeit or pharmaceutical. Frequency of listings, average number of pills advertised, average prices per milligram, number of sales, and geographic indicators of shipment origin and destination were analyzed. In total, 2,665 listings were identified as oxycodone. 48.2% (1,285/2,665) of these listings were categorized as counterfeit oxycodone, advertising a total of 652,699 pills (93,242.7 pills per datapoint) offered at a lower price than pharmaceutical pills. Our data indicate the presence of a large volume of counterfeit oxycodone pills in both retail- and wholesale-level amounts, mostly targeting US and Canadian customers. These exploratory findings call for more research to develop epidemiological surveillance systems to track counterfeit pill and other drug availability in the Dark Web environment.
Chapter
The dark web is an integral part of the WWW that provides freedom of content hosting. It is accessible with specially designed browsers and tools built on peer-to-peer network technology, such as TOR, I2P, and Freenet. These tools help users exchange information on the dark web while remaining anonymous. In our research we used the TOR (The Onion Router) browser to study the dark web; it provides excellent anonymity to its users. Users mainly utilize the dark web for illegal activities; although accessing it is legal in most countries, its usage can arouse suspicion with the law. Categories like adult content, counterfeits, illicit markets, and weapons are prevalent. This research provides an analytical framework for automating the classification of web pages through scraping and analysis of content hosted on the dark web. The method we used can easily crawl data, classify the hosted content with a machine learning model, and categorize the hosted content as illegal or legal. The proposed framework contains machine learning classifier algorithms: Naïve Bayes with an accuracy of 0.87, Random Forest with an accuracy of 0.91, Linear SVM with an accuracy of 0.91, and Logistic Regression with an accuracy of 0.94. This study created a machine learning framework that can classify hosted text content on dark web websites; models were trained to classify the content and evaluated based on accuracy. The results validate the effectiveness of the proposed classification framework for analyzing text data, which is particularly relevant for smaller datasets, and they encourage further study of this growing phenomenon and support investigators examining illegal activities on the Dark Web.
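A hypothetical sketch of the comparison described above (the four named classifiers on TF-IDF features of page text) is given below using scikit-learn cross-validation; the tiny repeated corpus and its legal/illegal labels are placeholders, so the printed scores carry no meaning.

```python
# Sketch of comparing the four classifiers named above on TF-IDF features.
# The toy corpus is a placeholder, not the chapter's dataset.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = [
    "counterfeit passports and fake id cards for sale",
    "buy cloned credit cards and bank logins",
    "guide to configuring a tor relay safely",
    "open source privacy tools and pgp tutorials",
] * 3  # repeat so cross-validation has something to split
labels = ["illegal", "illegal", "legal", "legal"] * 3

classifiers = {
    "NaiveBayes": MultinomialNB(),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "LinearSVM": LinearSVC(),
    "LogisticRegression": LogisticRegression(max_iter=1000),
}

for name, clf in classifiers.items():
    pipeline = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipeline, texts, labels, cv=3)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```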
Article
Despite their growing popularity, cryptomarkets generate risks for participants. This has promoted the reemergence of a more personal transaction model on the dark web: single vendor shops. To date, little is known about how single vendors display trust to attract potential customers without relying on the structural trust provided by cryptomarkets’ review and escrow systems. A total of 108 single vendor shops were identified. A coding grid was used to determine whether vendors displayed any of the four categories of trust signals typically found on cryptomarkets (i.e., signals related to identity, marketing, security, and signals that directly express trust). While the majority of single vendor shops were involved in illicit drug dealing, other products such as electronics, weapons, and fake documents were also offered. On average, shops displayed few trust signals. However, variations between different kinds of vendors were found: while vendors involved in illicit drug dealing displayed more identity- and marketing-related trust signals, vendors involved in fraud displayed more security-related signals and signals that directly expressed trust. Differences between vendors might be due to the nature of the products they offer and to the level of competition in their respective markets.
Chapter
According to the World Economic Forum (WEF), the Fourth Industrial Revolution (4IR, also known as digital transformation) “has the potential to raise global income levels and improve the quality of life for populations around the world” (WEF 2020). However, “there has never been a time of greater promise or potential peril (increased cybercrime)” (WEF 2020). Key amongst these perils is the rapid increase of cybercrime, which now affects every sector of society. Furthermore, one must understand how the willful negligence and pursuit of nationalistic cyber policies and objectives of China, Russia and the United States (US) have given rise to a lawless and under-governed cyberspace where so-called state actors and non-state actors further political-military objectives, spread disinformation, interfere with democratic elections, collect ransoms and compromise critical infrastructure. This permissive environment has even facilitated crime in ways that no one could have previously imagined (the use of social media as a “megaphone” to spread hate speech, incite murder, and then broadcast these acts via the internet).
Chapter
This chapter presents the main differences of the surface web, Deep Web and Dark Web as well as their dependences. It further discusses the nature of the Dark Web and the structure of the most prominent darknets, namely, Tor, I2P and Freenet, and provides technical information regarding the technologies behind these darknets. In addition, it discusses the effects police actions on the surface web can have on the Dark Web, the “dilemma” of usage that anonymity technologies present, as well as the challenges LEAs face while trying to prevent and fight against crime and terrorism in the Dark Web.
Chapter
ZeroNet is a new decentralized web-like network of peer-to-peer users created in 2015. ZeroNet, described as an open, free and robust network, has attracted a rising number of users. Most P2P networks are applied to file-sharing or data-storage systems, and the characteristics of these networks are widely investigated. However, there are obvious differences between ZeroNet and conventional P2P networks. Existing research rarely involves ZeroNet, and the characteristics and robustness of ZeroNet are unknown. To tackle the aforementioned problem, the present study measures ZeroNet peer resources and site resources separately and, at the same time, proposes collection methods for both. Unlike other simulation experiments, the experiments in this paper are conducted in a real-world environment. This is also the first measurement study of ZeroNet. Experimental results show that the topology of the peer network in ZeroNet has scarce edges, short distances and low clustering coefficients, and its degree distribution exhibits a special distribution. These findings indicate that the peer network of ZeroNet has poor robustness, and the experimental results on ZeroNet resilience verify this issue. In addition, this paper presents an improved peer exchange method to enhance the robustness of ZeroNet. We also measure the topology characteristics, languages, sizes and versions of the sites in ZeroNet. We find that the size of the sites and the client version are also reasons for the low robustness of ZeroNet.
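The topology metrics named above (edge counts, average distances, clustering coefficients) can be computed on any measured peer graph; the following is a small, hypothetical networkx sketch that uses a synthetic random graph in place of the crawled ZeroNet topology.

```python
# Sketch of computing edge count, average clustering coefficient, and average
# shortest-path length on a peer graph. The synthetic graph stands in for
# a measured ZeroNet peer topology and is purely illustrative.
import networkx as nx

peer_graph = nx.erdos_renyi_graph(n=200, p=0.02, seed=42)  # placeholder topology
# Path-length metrics require a connected graph, so use the giant component.
giant = peer_graph.subgraph(max(nx.connected_components(peer_graph), key=len))

print("nodes:", peer_graph.number_of_nodes())
print("edges:", peer_graph.number_of_edges())
print("average clustering coefficient:", nx.average_clustering(peer_graph))
print("average shortest path (giant component):",
      nx.average_shortest_path_length(giant))
```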
Conference Paper
The Dark Web is known as the part of the Internet operated by decentralized and anonymity-preserving protocols like Tor. To date, the research community has focused on understanding the size and characteristics of the Dark Web and the services and goods that are offered in its underground markets. However, little is still known about the attack landscape in the Dark Web. For the traditional Web, it is now well understood how websites are exploited, as well as the important role played by Google Dorks and automated attack bots in forming some sort of "background attack noise" to which public websites are exposed. This paper tries to understand whether these basic concepts and components have a parallel in the Dark Web. In particular, by deploying a high interaction honeypot in the Tor network for a period of seven months, we conducted a measurement study of the types of attacks and of the attackers' behavior that affect this still relatively unknown corner of the Web.
Article
SSH attacks are of various types: SSH port scanning, SSH brute-force attacks, and attacks using a compromised SSH server. Attacks using a compromised server could be DoS attacks, phishing attacks, e-mail spamming, and so on. This paper asks whether attacks from a compromised SSH server can be segregated from other attacks using network flows. In this work, we categorize SSH attacks into two types. The first category consists of all attack activities after a successful compromise of an SSH server; we name these "severe" attacks. The second type includes all attacks leading up to a successful compromise; it consists of SSH port scanning, SSH brute-force attacks, and compromised SSH servers with no activities. The second category is named "not-so-severe" attacks. We employ machine learning algorithms, namely Naive Bayes, Logistic Regression, J48 decision tree, and Support Vector Machine, to classify these attacks. Suitable features were selected based on domain knowledge, a literature survey, and a feature selection technique. The performance metrics are accuracy, sensitivity, precision, and F-score.
Conference Paper
Honeypots are systems aimed at deceiving threat agents. In most of the cases the latter are cyber attackers with financial motivations, and malicious software with the ability to launch automated attacks. Honeypots are usually deployed as either production systems or as research units to study the methods employed by attackers. In this paper we present the results of two distinct research honeypots. The first acted as a malware collector, a device usually deployed in order to capture self-propagating malware and monitor their activity. The second acted as a decoy server, dropping but logging every malicious connection attempt. Both of these systems have remained online for a lengthy period of time to study the aforementioned malicious activity. During this assessment it was shown that human attackers and malicious software are constantly attacking servers, trying to break into systems or spread across networks. It was also shown that the usage of honeypots for malware monitoring and attack logging can be very effective and provide valuable data. Lastly, we present an open source visualization tool which was developed to help security professionals and researchers during the analysis and conclusion drawing phases, for use with one of the systems fielded in our study.
Article
Honeypots play an important role in collecting relevant information about malicious activities that happen on the Internet. In this paper, we are particularly interested in attacks targeting Web services. We therefore propose a honeypot implementation for Web services, called WS Honeypot. However, the data collected by honeypots can become very large, which greatly complicates the analysis task performed by the human analyst. As a solution for this problem, we propose in this paper an automatic technique to analyze the data collected from our WS Honeypot. The proposed approach is based on four machine learning methods: support vector machines, support vector regression, spectral clustering, and k-means clustering. Our main objectives are to analyze the collected data, automatically characterize the captured attacks, and detect denial-of-service and novel attacks.
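As a rough, invented illustration of the clustering side of such analysis (k-means over simple numeric request features), the sketch below groups captured requests into two clusters; the feature choices and values are assumptions, not the WS Honeypot's actual feature set.

```python
# Sketch of clustering honeypot requests with k-means, in the spirit of the
# analysis described above. Feature columns are assumptions:
# [request_bytes, n_params, seconds_since_last_request]
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

requests_features = np.array([
    [512, 2, 30.0], [498, 2, 28.5], [530, 3, 31.2],   # ordinary-looking traffic
    [16384, 40, 0.1], [15872, 38, 0.2],                # flood-like bursts
    [740, 5, 12.0], [705, 4, 14.3],
])

X = StandardScaler().fit_transform(requests_features)  # put features on one scale
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

print("cluster labels:", km.labels_)
print("cluster sizes:", np.bincount(km.labels_))  # the small cluster is the suspicious one
```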
Conference Paper
Information security is a growing concern today for organizations and individuals alike. This has led to growing interest in more aggressive forms of defense to supplement the existing methods. One of these methods involves the use of honeypots. A honeypot is a security resource whose value lies in being probed, attacked or compromised. In this paper we present an overview of honeypots and provide a starting point for persons who are interested in this technology. We examine different kinds of honeypots, honeypot concepts, and approaches to their implementation.
Article
We describe Freenet, an adaptive peer-to-peer network application that permits the publication, replication, and retrieval of data while protecting the anonymity of both authors and readers. Freenet operates as a network of identical nodes that collectively pool their storage space to store data files and cooperate to route requests to the most likely physical location of data. No broadcast search or centralized location index is employed. Files are referred to in a location-independent manner, and are dynamically replicated in locations near requestors and deleted from locations where there is no interest. It is infeasible to discover the true origin or destination of a file passing through the network, and difficult for a node operator to determine or be held responsible for the actual physical contents of her own node.
P. William, M. Jawale, A. Pawar, R. Bibave and P. Narode, Systematic approach for detection and assessment of Dark Web threat evolution, in Using Computational Intelligence for the Dark Web and Illicit Behavior Detection, R. Rawat, U. Kaur, S. Khan, R. Sikarwar and K. Sankaran (Eds.), IGI Global, Hershey, Pennsylvania, pp. 230-256, 2022.
F. Astolfi, J. Kroese and J. Van Oorschot, I2P - The Invisible Internet Project, Web Technology Report, Media Technology, Leiden University, Leiden, The Netherlands, 2015.