Figure 1. Classification of literature surveyed.

Source publication
Article
Full-text available
The dark web is a section of the Internet that is not accessible to search engines and requires an anonymizing browser called Tor. Its hidden network and anonymity pave the way for illegal activities and help cybercriminals to execute well-planned, coordinated, and malicious cyberattacks. Cyber security experts agree that online criminal activities...

Contexts in source publication

Context 1
... were then filtered manually for greater relevance by selecting only those articles focused on the topics or peripherally relevant to them. The final list comprised 79 articles, categorized into the five areas shown in Figure 1. In our review, almost 50% of the surveyed literature focuses on attacks, 36% on threat intelligence techniques, and 14% on the dark web's anonymity and the crimes taking place there. ...
Context 2
... employed a commonly used process to summarise the detection architecture. Figure 10 depicts the architectural framework analysis process [1]. ...
Context 3
... outcome of the models can be used by security agencies or law enforcement, which is the ultimate purpose of all the frameworks. It should be noted that the procedures and examples shown in Figure 10 and its accompanying description are not exhaustive. Table 5 presents some strategies from our peer-reviewed papers that use various models and tools to detect threats in the categories mentioned above. ...
Context 4
... this section discusses attacks on different browsers, Tor being the most popular, and categorizes them in detail. Figure 11 gives a detailed overview of the attacks on the dark web discussed in this section. ...
Context 5
... is transferred through cells, stored temporarily in a queue, and then sent to an output buffer before entering the network. A signal can be embedded in the traffic by gaining control of the cell count of the output buffer, as shown in Figure 12. Timing is essential when sending each 'symbol': waiting too briefly causes cells to be combined with those of other relays, while waiting too long looks suspicious and increases latency, which may cause the user to create a new circuit [10]. ...
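To make the mechanism concrete, here is a minimal, hypothetical simulation of the cell-counting idea: a burst size chosen by the attacker encodes each covert bit, and the gap between bursts is exactly the timing trade-off the excerpt describes. The symbol alphabet, burst sizes, and `send_cells` callback are illustrative assumptions, not the scheme from [10].

```python
import time

# Hypothetical sketch of the cell-counting signal idea: an
# attacker-controlled relay encodes one covert 'symbol' per flush
# by choosing how many cells leave the output buffer at once.
SYMBOL_TO_CELLS = {0: 1, 1: 3}   # assumed alphabet: 1 cell => bit 0, 3 cells => bit 1
SYMBOL_GAP = 0.05                # seconds between symbols; too short merges with
                                 # other traffic, too long adds suspicious latency

def embed_signal(bits, send_cells):
    """Flush the output buffer in attacker-chosen bursts."""
    for bit in bits:
        send_cells(SYMBOL_TO_CELLS[bit])  # burst size carries the bit
        time.sleep(SYMBOL_GAP)            # spacing keeps symbols separable

def decode_signal(burst_sizes):
    """Recover bits from the burst sizes observed at the other end."""
    inverse = {v: k for k, v in SYMBOL_TO_CELLS.items()}
    return [inverse[b] for b in burst_sizes if b in inverse]

# Toy usage: 'send_cells' just records burst sizes locally.
observed = []
embed_signal([1, 0, 1], observed.append)
assert decode_signal(observed) == [1, 0, 1]
```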
Context 6
... may seem harmless, but it can be used to attack the Tor network; this is called a Sybil attack. Figure 13 illustrates the datasets that feed the attack analysis: consensus and server descriptors, known malicious relays, and the exit map [10]. In a Sybil attack, an adversary controls many virtual identities to gain a disproportionately large influence over the network. ...
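One way such descriptor datasets are commonly mined for Sybil groups (in the spirit of tools like sybilhunter, though not necessarily the exact method behind Figure 13) is to flag sets of relays whose descriptor fields are suspiciously identical. The sketch below, with made-up field names and thresholds, illustrates that heuristic.

```python
from collections import defaultdict

# Hypothetical sketch: group relays by a fingerprint of descriptor
# fields; large groups of near-identical relays are Sybil candidates.
relays = [
    {"nickname": "relayA1", "version": "0.4.8.9", "bandwidth": 100, "first_seen_day": 712},
    {"nickname": "relayA2", "version": "0.4.8.9", "bandwidth": 100, "first_seen_day": 712},
    {"nickname": "relayA3", "version": "0.4.8.9", "bandwidth": 100, "first_seen_day": 712},
    {"nickname": "longrun", "version": "0.4.7.1", "bandwidth": 850, "first_seen_day": 12},
]

def sybil_candidates(relays, min_group=3):
    groups = defaultdict(list)
    for r in relays:
        # Relays set up en masse by one operator often share their Tor
        # version, bandwidth settings, and first-seen time almost exactly.
        key = (r["version"], r["bandwidth"], r["first_seen_day"])
        groups[key].append(r["nickname"])
    return [g for g in groups.values() if len(g) >= min_group]

print(sybil_candidates(relays))  # [['relayA1', 'relayA2', 'relayA3']]
```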

Citations

... This robust anonymity, while beneficial for legitimate users, also enables criminals to engage in illicit activities securely. For example, criminals build illegal trading services and anti-social propaganda services in the Tor network, and hackers use the Tor network to carry out anonymous ransom attacks and other hacking attacks [3][4][5][6][7][8]. ...
... The anonymity provided by Tor is increasingly exploited by criminals, who utilize it to establish illicit hidden services for the anonymous trading of firearms and drugs, as well as to conduct hacking attacks [3][4][5][6][7][8][27]. In response, researchers develop de-anonymization attacks on Tor to trace and identify these individuals. ...
Article
Full-text available
Criminals exploit the robust anonymity afforded by Tor for illicit purposes, prompting heightened interest among researchers in de-anonymization attacks on the Tor network. The execution of experiments on de-anonymization attacks within a real Tor network presents considerable challenges, hence the necessity for a simulation environment. However, existing methods for simulating the Tor network are inadequate regarding realism, flexibility, and scalability, with some being prohibitively expensive. In this paper, we develop a lightweight and scalable Tor network simulation environment based on Kubernetes (K8s), employing Docker containers to simulate Tor relays. The results demonstrate that a network of up to a thousand Tor relays can be simulated using just four standard hosts. Furthermore, two de-anonymization attack experiments were conducted within this simulated environment, which exhibited high levels of realism and flexibility. Finally, a hybrid networking approach combining multi-granularity relays was explored to further enhance the balance between realism and cost in Tor network simulations.
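As a rough sketch of the container-per-relay idea (not the authors' published artifacts), the Docker SDK for Python can launch a batch of relay containers from a prebuilt image; the image name 'tor-relay:latest', the torrc template, and the network name are all assumptions made for illustration.

```python
import docker  # pip install docker

# Hypothetical sketch: launch N Tor relays as Docker containers on one
# host. The image, torrc template, and bridge network are assumptions.
TORRC_TEMPLATE = """\
Nickname sim{idx}
ORPort {orport}
SocksPort 0
DataDirectory /var/lib/tor
TestingTorNetwork 1
"""

def launch_relays(n, base_port=5000):
    client = docker.from_env()
    containers = []
    for i in range(n):
        torrc = TORRC_TEMPLATE.format(idx=i, orport=base_port + i)
        c = client.containers.run(
            "tor-relay:latest",              # assumed prebuilt relay image
            detach=True,
            name=f"tor-sim-{i}",
            environment={"TORRC": torrc},    # assumed: entrypoint writes it out
            network="tor-sim-net",           # assumed user-defined bridge network
        )
        containers.append(c)
    return containers
```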
... The Dark Web thrives in the depths of encrypted networks, giving users anonymity and protecting them from surveillance, unlike the Surface Web, which comprises pages that are indexed and accessible using conventional search engines [17]. Privacy plays a crucial role in safeguarding the principles of free expression and enabling individuals to evade oppressive regimes [32]. However, the Dark Web is also a fertile environment for cyber criminals, illicit marketplaces and malicious activities. ...
... Security and flexibility were ensured by using virtual machines (VMs) to host web servers and log servers. Other studies [9,32] have demonstrated the importance of deploying multiple honeypots to increase data collection and address problems such as insufficient comment data filtration to prevent future security breaches. ...
Chapter
The Dark Web presents a challenging and complex environment where cyber criminals conduct illicit activities with high degrees of anonymity and privacy. This chapter describes a honeypot-based data collection approach for Dark Web browsing that incorporates honeypots on three isolated virtual machines, including production honeypots, an onion-website-based research honeypot (Honey Onion) offering illegal services and a log server that collects and securely stores the honeypot logs. Experiments conducted over 14 days collected more than 250 requests on the Honey Onion service and in excess of 28,000 chat records from the Dark Web forum. The log server also monitored Honey Onion traffic, providing details such as packet types, timestamps, network data, and HTTP requests. The data collection results provide valuable insights into Dark Web activities, including malicious, benign and uncategorized activities. The data analysis identified common user categories such as malicious actors, researchers and security professionals, and uncategorized actors. The experimental results demonstrate that honeypot-based data collection can advance Dark Web investigations as well as enable the development of effective cyber security strategies and efforts to combat cyber crime in the Dark Web.
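A minimal sketch of such a logging honeypot, assuming only the Python standard library: it serves a decoy page and appends every request's method, path, headers, and timestamp to a JSONL file. The port, log path, and decoy body are illustrative choices, not the chapter's deployment.

```python
import json, time
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_PATH = "honeypot_requests.jsonl"  # assumed log location

class HoneyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record everything we can observe about the visitor.
        record = {
            "ts": time.time(),
            "client": self.client_address[0],
            "method": self.command,
            "path": self.path,
            "headers": dict(self.headers),
        }
        with open(LOG_PATH, "a") as f:
            f.write(json.dumps(record) + "\n")
        # Serve an innocuous decoy page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Under construction</body></html>")

    def log_message(self, fmt, *args):
        pass  # suppress console noise; everything goes to the JSONL log

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HoneyHandler).serve_forever()
```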
... The dark web is a hidden part of the internet that requires special software like Tor to access, offering anonymity to both users and websites as discussed in [14]. ...
Conference Paper
Cyberspace contains vast amounts of information that is crucial for cybersecurity professionals to gather threat intelligence, prevent cyberattacks, and secure organizational networks. Unlike earlier and less targeted attacks, modern cyberattacks are more organized and sophisticated, often targeting specific groups, which leaves many users unaware of the vulnerable resources within cyberspace. The increasing freedom of information access in the deep and dark web has led many organizations to discover their data loose in these spaces. Therefore, creating methods to crawl and extract valuable information from the deep web is a critical concern. Some deep web content can be accessed through the surface web by submitting query forms to retrieve the needed information, but it is not as simple in all cases. This paper proposes a framework to identify these leaks and notify the relevant parties in time.
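A hedged sketch of the query-form probing the paper describes: submit watched terms (e.g., internal domains) to a search form and flag responses that echo them. The endpoint URL, form field name, and watchlist below are placeholders, not the paper's system.

```python
import requests  # pip install requests

# Hypothetical leak probe: query a deep-web search form with terms an
# organization wants monitored and flag responses that mention them.
SEARCH_URL = "https://example.org/search"   # placeholder form endpoint
WATCHLIST = ["corp-internal.example.com", "ProjectCodename"]  # placeholder terms

def check_for_leaks(term):
    resp = requests.post(SEARCH_URL, data={"q": term}, timeout=30)
    resp.raise_for_status()
    # Flag any watched string that surfaces in the response body.
    return [w for w in WATCHLIST if w.lower() in resp.text.lower()]

for term in WATCHLIST:
    found = check_for_leaks(term)
    if found:
        print(f"possible leak: {found} surfaced for query {term!r}")
```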
... This is just one of many illegal services that can be found on the deep web, making it a dangerous and illicit place to explore. 62 The digital black market has been growing at an unprecedented rate in recent years, with its vast scale and far-reaching operations expanding every year. The idea of a cyber black market may have started small, but with the growing consumer demand for illegal goods and services, it has become a lucrative venture for cybercriminals to exploit. ...
... 63 As mentioned, technology is advancing at an unprecedented pace, and many opportunities are becoming available not only to ordinary citizens but also to cybercriminals. This means that these criminals are constantly adapting and improving their tactics to evade detection and continue their illegal activities. Additionally, the anonymity provided by the internet and the ease of communication and exchange of information have made it difficult for law enforcement agencies to track and apprehend cybercriminals. ...
Article
Full-text available
In the twenty-first century, many positive events of technological progress have been accompanied by several negative ones. With the development of the digital world, criminals with special knowledge, cybercriminals, have become active and have developed many means in the depths of cyberspace with whose help they achieve their criminal and illegal goals. The presented paper aims to analyze, in detail and in depth, the system and working principles of the digital black markets created by cybercriminals on the dark side of the Internet. The paper also looks at what tools and products are available on the digital black market and why it is dangerous for digital black markets to operate successfully. The paper analyzes practical cases, legal studies, and other interdisciplinary studies to present the problems, as of the day it was written, in the legal struggle against digital black markets. The paper's primary purpose is to show why the unimpeded functioning of digital black markets on the dark web is dangerous, both for ordinary citizens and for state structures. In the final part of the article, based on the analysis of the processed literature and practical cases, recommendations are presented whose use is of particular importance for making the legal fight against digital black markets successful and for preventing such a dangerous phenomenon as digital black markets from existing in any part of the digital world.
... Another study [17] explains that a portion of the internet called the Dark Web is not included in search engine indexes. Data breaches, illicit drug use, pornography, human trafficking, and other unlawful activities make up 57% of the dark web's operations. ...
Article
The emergence of the digital age has ushered in unprecedented connectivity and convenience, but it has also brought about a surge in cybercrimes, posing significant challenges to security. Criminals exploit vulnerabilities in cyberspace to access sensitive information, perpetrate fraudulent activities, and undermine trust in online interactions. Despite the pervasive impact of cyber threats on various facets of society, awareness levels among the general populace remain a critical concern. This paper presents findings from a comprehensive survey aimed at assessing public awareness of cybercrimes and cybersecurity measures. The survey, conducted through a structured questionnaire, solicited responses from a diverse demographic sample, encompassing individuals' background information and their perceptions of cyber threats. Participants were queried on their familiarity with prevalent cyber scams, including phishing, email spoofing, lottery scams, credit card frauds, investment scams, vishing frauds, and gaming frauds. Key findings reveal gaps in awareness and understanding of cyber risks, underscoring the need for enhanced education and vigilance. Furthermore, insights into participants' responses regarding their experiences with cybercrimes, including measures taken to address incidents and financial losses incurred, shed light on the real-world impact of cyber threats. In conclusion, this paper emphasizes the importance of proactive measures to safeguard against cybercrimes and offers recommendations for bolstering cybersecurity awareness and resilience in the digital age. Key Words: Cybercrime, Scams, threats, Analysis, Awareness.
... The author (Saleem et al., 2022) went into greater detail about the technological features of websites hosted on Tor hidden services, proposing a dark crawler technique based on OSN (social network analysis) (Alshammery & Aljuboori, 2022) with content selection strategies for gathering information from cyber terrorists (Howell et al., 2022) and extremist weblinks. The author (Alshammery & Aljuboori, 2022) worked on a crawler-based system that feeds data to a search engine and specialises in detecting data from dangerous and problematic websites. ...
... Some of these websites are easily accessible by going to certain web pages. These sites can be used as dictionaries (similar to Hidden Wiki) (Howell et al., 2022) or to search the Tor network using specialised but limited search engines like Grams, DuckDuckGo (Howell et al., 2022), and TorSearch (Saleem et al., 2022), but only for limited anonymous services. ...
Article
Full-text available
The internet has grown beyond the limits of user visibility. The "Dark Web" or "Dark Net" refers to certain unknown portions of the internet that cannot be found using standard search methods. A number of computerised techniques for extracting or crawling the concealed data are being explored. All users can freely interact on the surface web. Personal identities may be found on the deep web, and the dark web (DW), a hub for anonymous data, is a haven for terrorists and cybercriminals to promote their ideologies and illegal activities. Officials in clandestine surveillance and cyberpolicing are always trying to track down offenders' trails or hints. The search for DW offenders might take five to ten years. The proposed study provides data from a DW mining and online-marketplace scenario across a few domains, as well as an overview for investigators to build an automated engine for scraping all dangerous information from related sites.
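A common building block for such scraping engines (offered here as a generic sketch, not the authors' pipeline) is to route HTTP requests through a local Tor client's SOCKS proxy so that .onion sites become reachable; the onion URL below is a placeholder.

```python
import requests  # pip install requests[socks]

# Route requests through a locally running Tor client's SOCKS proxy.
# 'socks5h' makes Tor itself resolve the .onion hostname.
TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_onion(url):
    resp = requests.get(url, proxies=TOR_PROXY, timeout=60)
    resp.raise_for_status()
    return resp.text

# Usage (placeholder address, requires Tor running locally):
# html = fetch_onion("http://exampleonionaddress.onion/")
```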
... Consequently, the content concealed within the deep web remains beyond the reach of search engine crawlers, rendering it invisible to conventional search queries (Ngo et al., 2023). Additionally, the content in the deep web is often dispersed and necessitates specific access credentials or pathways, further complicating the task of discovering and retrieving valuable resources (Saleem et al., 2022). ...
... · Ethical and Legal Considerations: Certain deep web content might raise ethical or legal concerns, encompassing scenarios like illicit marketplaces or the presence of sensitive personal data. Navigating these intricate ethical and legal dimensions can prove perplexing for users (Saleem et al., 2022). · Privacy Vulnerabilities: Venturing into the deep web exposes users to potential privacy vulnerabilities. ...
... The covert nature of the content and the employment of anonymizing networks can attract malicious actors. Users must exercise prudence to safeguard their privacy and security (Saleem et al., 2022). ...
Article
Full-text available
In the modern digital landscape, the deep web presents itself as a hidden realm containing a wealth of concealed knowledge beneath the surface web. This paper extensively examines the challenges associated with the deep web, elucidating its lack of indexation, access barriers, and ethical considerations. It also provides insights into effective strategies for navigating this domain, including advanced search methods, specialized search engines, and collaborative efforts. Furthermore, the paper explores the fusion of technology, education, and open-access initiatives as sustainable approaches for contemporary knowledge seekers. The utilization of technology, encompassing techniques like web scraping and data interpretation, emerges as a central aspect of accessing hidden insights. Nevertheless, this is complemented by digital literacy and ethical consciousness to ensure responsible exploration. Collaborative partnerships between technological pioneers and information experts facilitate wider access to deep web resources. In conclusion, the study emphasizes the importance of comprehending the complexities of the deep web to tap into its extensive knowledge reserves, ultimately empowering modern knowledge seekers.
... The onion routing network Tor, routing traffic through multiple volunteer servers, makes communication decentralized, obscure, and more secure for privacy-concerned entities [3]. Anonymity is guaranteed via advanced layered encryption using multiple security protocols and manifold encryption ciphers, e.g. ...
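The layered-encryption idea reads more concretely as code. Below is a toy illustration (using the `cryptography` package's Fernet cipher as a stand-in; real Tor uses its own circuit-level cryptography): the sender wraps the payload once per hop, and each hop peels exactly one layer with its own key, learning nothing about the rest of the path.

```python
from cryptography.fernet import Fernet  # pip install cryptography

hop_keys = [Fernet.generate_key() for _ in range(3)]  # one key per hop

def wrap(payload: bytes, keys) -> bytes:
    # Encrypt for the last hop first, so the first hop peels first.
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def peel(onion: bytes, key) -> bytes:
    # Each relay removes exactly one layer with its own key.
    return Fernet(key).decrypt(onion)

message = wrap(b"hello", hop_keys)
for key in hop_keys:            # relays peel layers in path order
    message = peel(message, key)
assert message == b"hello"
```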
Article
Full-text available
In the digital landscape of this privacy-concerned era, anonymous or Stealth Services (SSs) offer a promising future for entities who wish for obscurity and solitude in their communication. Rising privacy concerns are consistently increasing the usage of SSs, and the number of affected individuals grows day by day. Some SSs, being high stakes of the cyber world, need to be analyzed more critically. We categorize, analyze, and synthesize a comprehensive analysis of six prominent SSs (Tor, I2P, Freenet, Riffle, Lokinet and JAP) to derive in-depth comparative evaluation metrics across multiple aspects. Analytical and performance-based scrutiny elucidates the diverse approaches, their architectures, security features, flaws, and associated attacks. The synthesized examination offers a nuanced perspective on SSs and lights the way for future developments in cyber security and secrecy. Performance- and security-based evaluation reveals that Tor incurs performance bottlenecks, while the decentralized SS I2P enhances security with compromised throughput. Likewise, Freenet and JAP offer more data resilience but suffer from transparency and scalability issues. Although Riffle and Lokinet are innovative SS paradigms, both have more complexity and other trade-offs. These valuable insights will help researchers and decision-makers navigate the complexities of digital life, facilitating a deeper understanding of the challenges involved in selecting a secure communication network and offering improved stealth solutions.
... The user designates a portion of their hard disk within the system and utilizes it as a distributed storage system for sharing purposes. The Freenet system determines the privileges of insertion, retrieval, and deletion, while the specific placement of a shared file is decided by a unique routing key linked with it [61]. Therefore, every individual within the Freenet possesses knowledge about their immediate neighboring peers. ...
... End-to-end attacks can manifest in either an active or a passive form. In these attacks, the adversary monitors the entry and exit nodes situated at both ends [61]. ...
... By employing this methodology, it becomes feasible to adjust the size of each packet to a precise dimension, such as the maximum transmission unit (MTU). Similarly, multiple methodologies have been examined in diverse research endeavors to pad packet sizes both efficiently and effectively [61]. Artificial delays might also be introduced between packets to obscure traffic timing measurements, at some cost to traffic-flow efficiency. ...
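A small sketch of the fixed-size padding defence (a generic illustration, not the exact scheme in [61]): split the payload into constant-size cells with a length prefix and pad the final cell, so every unit on the wire has identical length regardless of the real message size.

```python
CELL_SIZE = 512  # bytes; fixed unit, analogous to padding toward the MTU

def to_fixed_cells(payload: bytes, cell_size: int = CELL_SIZE):
    """Fragment a payload into same-size cells, padding the last one."""
    cells = []
    for i in range(0, len(payload), cell_size - 2):
        chunk = payload[i : i + cell_size - 2]
        # A 2-byte length prefix lets the receiver strip the padding.
        cell = len(chunk).to_bytes(2, "big") + chunk
        cells.append(cell.ljust(cell_size, b"\x00"))
    return cells

def from_fixed_cells(cells):
    """Reassemble the payload, discarding per-cell padding."""
    out = b""
    for cell in cells:
        n = int.from_bytes(cell[:2], "big")
        out += cell[2 : 2 + n]
    return out

data = b"x" * 1337
assert from_fixed_cells(to_fixed_cells(data)) == data
```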
Article
Full-text available
The primary objective of an anonymity tool is to protect the anonymity of its users through the implementation of strong encryption and obfuscation techniques. As a result, it becomes very difficult to monitor and identify users’ activities on these networks. Moreover, such systems have strong defensive mechanisms to protect users against potential risks, including the extraction of traffic characteristics and website fingerprinting. However, the strong anonymity feature also functions as a refuge for those involved in illicit activities who aim to avoid being traced on the network. As a result, a substantial body of research has been undertaken to examine and classify encrypted traffic using machine-learning techniques. This paper presents a comprehensive examination of the existing approaches utilized for the categorization of anonymous traffic as well as encrypted network traffic inside the darknet. It also presents a comprehensive analysis of machine-learning (ML) methods for monitoring and identifying traffic attacks inside the darknet.
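The classification setups the paper surveys typically reduce to supervised learning over per-flow features. A minimal scikit-learn sketch with synthetic stand-in data (real studies use labelled captures such as CIC-Darknet2020; the feature count and class labels here are placeholders) looks like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Placeholder per-flow features (stand-ins for duration, byte counts,
# packet counts, inter-arrival statistics) and placeholder labels
# (e.g. tor / vpn / clear).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))
y = rng.integers(0, 3, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```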
... A major part of the internet that is hidden and not indexed by conventional search engines is referred to as the "darknet." It is frequently linked to illegal activities like the trafficking of illegal goods, including weapons, drugs, and stolen data [1], [2]. Darknet websites usually run on encrypted networks and need special software or configurations to access; this gives users anonymity and makes it harder for law enforcement authorities to track their movements [3]. ...
Article
Full-text available
Darknet refers to a significant portion of the internet that is hidden and not indexed by traditional search engines. It is often associated with illicit activities such as the trafficking of illicit goods, such as drugs, weapons, and stolen data. To keep our online cyber spaces safe in this era of rapid technological advancement and global connectivity, we should analyse and recognise darknet traffic. Beyond cybersecurity, this attention to detail includes safeguarding intellectual property, stopping illegal activity, and following the law. In order to improve accuracy and precision in identifying illicit activities, this study presents a novel approach named Active-Darknet that uses an active learning-based machine learning model for detecting darknet traffic. In order to guarantee high-quality analysis, our methodology includes extensive data preprocessing, such as numerically encoding categorical labels and improving the representation of minority classes using data balancing. In addition to machine learning models, we also use Deep Neural Networks (DNN), Bidirectional Long Short-Term Memory (BI-LSTM) and Flattened-DNN for experimentation. The majority of models exhibited encouraging outcomes; however, the models that utilised active learning, specifically the Random Forest (RF) and Decision Tree (DT) models, attained promising accuracy levels of 87%, rendering them the most efficient in detecting darknet traffic. Large-scale traffic analysis is greatly enhanced by this method, which also increases the detection process’s robustness and effectiveness.
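To illustrate the active-learning component, here is a generic uncertainty-sampling sketch on synthetic data (not the authors' Active-Darknet code): train on a small labelled seed set, query labels for the flows the classifier is least certain about, and retrain.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for flow features with a hidden binary label
# (e.g. darknet vs benign); real work would use captured traffic.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # hidden ground truth

labelled = list(range(20))                       # tiny labelled seed set
pool = [i for i in range(len(X)) if i not in labelled]

for round_ in range(5):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[labelled], y[labelled])
    proba = clf.predict_proba(X[pool])
    margin = np.abs(proba[:, 0] - proba[:, 1])   # small margin = least certain
    query = [pool[i] for i in np.argsort(margin)[:50]]
    labelled += query                            # 'oracle' provides their labels
    pool = [i for i in pool if i not in query]
    print(f"round {round_}: accuracy {clf.score(X, y):.3f}")
```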