Bhushan M. Manjre’s scientific contributions


Publications (5)


Exploring the Synergy between Artificial Intelligence and Computational Mathematics in Scientific Computing
Article · July 2024 · 16 Reads · Panamerican Mathematical Journal

Bhushan Manjre

The convergence of artificial intelligence (AI) and computational mathematics opens a new era in scientific computing, with unmatched opportunities to advance research and technology. This combination draws on the complementary strengths of AI's data-driven methods and the rigorous analytical models of computational mathematics to ease problem solving across a wide range of scientific fields. AI algorithms, particularly those based on machine learning and deep learning, excel at identifying patterns and making predictions from very large datasets, allowing them to tackle complex, high-dimensional problems that traditional computational methods struggle with. Computational mathematics, in turn, supplies the theoretical grounding and precision that make AI models more interpretable and reliable. By coupling AI with computational techniques such as numerical analysis, optimization, and differential equations, researchers can build hybrid models that are substantially faster and more accurate. This interdisciplinary approach not only accelerates simulation and modeling but also makes it feasible to work with larger and more complex data, helping scientists, engineers, and biologists reach important discoveries. Deploying AI-driven methods in high-performance computing environments further optimizes resource use, speeding up computation and lowering costs. As AI continues to mature, its algorithms grow better at learning from data and drawing inferences from it, so the computational methods they are paired with improve continuously as well. The symbiotic relationship between AI and computational mathematics also encourages new approaches to algorithm design, leading to advances that translate more readily into real-world applications. Ultimately, this fusion stands to change how scientific computing is done, enabling more sophisticated analyses, more accurate predictions, and new discoveries. Current research and development in this area underscores the importance of interdisciplinary collaboration.
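To make the hybrid pattern the abstract describes concrete, here is a minimal sketch under stated assumptions: a rigorous but expensive numerical solver is sampled to train a cheap data-driven surrogate. The toy ODE, the function `expensive_solve`, and the surrogate choice are all invented for illustration, not taken from the paper.

```python
# A minimal sketch, assuming a toy problem: an expensive numerical
# solver (SciPy ODE integration) is sampled to train a cheap
# data-driven surrogate. Names and the problem itself are
# illustrative assumptions, not the paper's method.
import numpy as np
from scipy.integrate import solve_ivp

def expensive_solve(k: float) -> float:
    """Numerically integrate y' = -k*y with y(0) = 1 and return y(1)."""
    sol = solve_ivp(lambda t, y: -k * y, (0.0, 1.0), [1.0], rtol=1e-8)
    return sol.y[0, -1]

# 1. Sample the rigorous numerical model to build training data.
ks = np.linspace(0.1, 3.0, 30)
ys = np.array([expensive_solve(k) for k in ks])

# 2. Fit a simple data-driven surrogate (polynomial least squares
#    stands in here for the machine-learning component).
surrogate = np.poly1d(np.polyfit(ks, ys, deg=5))

# 3. The surrogate now approximates the solver at negligible cost.
k_query = 1.7
print("solver:   ", expensive_solve(k_query))  # ~exp(-1.7) = 0.1827
print("surrogate:", surrogate(k_query))
```

Polynomial least squares stands in for the learned component; in practice a neural network would likely play that role, but the division of labor is the same: the numerical method supplies rigorous ground truth, the learned model supplies speed.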


Design of an Efficient Model for Enhanced Blockchain Forensics through Anomaly Detection, Graph Neural Networks, and Cross-Blockchain Analysis

May 2024 · 22 Reads

The emerging need for sophisticated forensic methodologies in blockchain technology stems from its burgeoning use as a decentralized ledger, coupled with a parallel increase in its exploitation for illicit activities. Existing forensic techniques in blockchain investigations often grapple with limitations such as inadequate anomaly detection, insufficient analysis across different blockchain networks, and a lack of effective tools for analyzing complex transaction relationships. Addressing these gaps, this research pioneers a novel framework integrating Anomaly Detection, Graph Neural Networks (GNNs), and Cross-Blockchain Analysis to significantly enhance forensic capabilities in blockchain investigations and operations. The proposed model employs Long Short-Term Memory (LSTM) networks for anomaly detection, capitalizing on their proficiency in modeling the temporal dynamics of blockchain transactions. The integration of LSTM addresses the critical challenge of detecting deviations in transaction patterns, a common shortcoming of current methods. Concurrently, Graph Neural Networks (GNNs) enable a nuanced analysis of transaction graphs, supporting effective clustering of wallet addresses. This aids in tracing the trajectory of funds and unveiling illicit actors within the blockchain network, a capability often underexplored in existing models. Furthermore, the introduction of Cross-Blockchain Analysis marks a significant stride in blockchain forensics, allowing data from diverse blockchain networks to be combined and examined, thereby offering a more comprehensive forensic view across scenarios. Empirical validation of the integrated framework on the CSAFE and CFReDS datasets shows improvements of 4.9% in precision, 4.5% in accuracy, 4.3% in recall, 5.9% in AUC, and 5.5% in processing speed over existing methods. These results underscore the efficiency and effectiveness of the proposed model in identifying suspicious transactions and clusters of wallet addresses. The impacts of this work are manifold and far-reaching: by augmenting the precision and scope of blockchain forensics, it empowers law enforcement and regulatory bodies in their pursuit of financial crimes within the blockchain space, and its comprehensive approach to investigating blockchain-based illicit activity offers valuable insights for improving regulatory measures. In conclusion, the fusion of Anomaly Detection, GNNs, and Cross-Blockchain Analysis heralds a significant advance in blockchain forensics, offering an effective, multifaceted way to combat the complexities of financial crime in the evolving blockchain landscape.

Keywords: Blockchain Forensics, Anomaly Detection, Graph Neural Networks, Cross-Blockchain Analysis, Long Short-Term Memory Networks
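To illustrate the graph side of this pipeline, here is a minimal sketch over invented wallet addresses and amounts: plain connected-components clustering stands in for the paper's GNN, and a simple deviation threshold stands in for its LSTM anomaly detector. None of this is the authors' actual model.

```python
# A minimal sketch, assuming invented wallet addresses and amounts.
# Connected-components clustering stands in for the paper's GNN,
# and a deviation threshold stands in for its LSTM anomaly detector;
# this is an illustration, not the authors' model.
import networkx as nx

transactions = [
    ("walletA", "walletB", 0.5),
    ("walletB", "walletC", 0.4),
    ("walletC", "walletA", 0.6),
    ("walletA", "walletC", 0.3),
    ("walletD", "walletE", 120.0),  # unusually large transfer
]

# Wallets become nodes, transactions become directed edges.
G = nx.DiGraph()
for src, dst, amount in transactions:
    G.add_edge(src, dst, amount=amount)

# Cluster wallet addresses by weak connectivity (GNN stand-in).
print("wallet clusters:", list(nx.weakly_connected_components(G)))

# Flag transfers whose amount deviates strongly from the mean
# (a crude stand-in for sequence-based anomaly detection).
amounts = [d["amount"] for _, _, d in G.edges(data=True)]
mean = sum(amounts) / len(amounts)
std = (sum((a - mean) ** 2 for a in amounts) / len(amounts)) ** 0.5
for u, v, d in G.edges(data=True):
    if abs(d["amount"] - mean) > 1.5 * std:
        print(f"suspicious transfer: {u} -> {v} ({d['amount']})")
```

The design point this preserves from the abstract is that wallet clustering and anomaly flagging operate on the same transaction graph, so a flagged transfer can immediately be traced through its cluster of related addresses.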


Wireless TCP Congestion Control Based on Loss Discrimination Approach Using Machine Learning

February 2024 · 47 Reads · 1 Citation · Lecture Notes in Electrical Engineering

Pooja G. Dhawane · Bhushan Manjre · [...]

The Transmission Control Protocol (TCP) is prominently used for both wired and wireless communication. In a wired network, data loss occurs only because of network congestion, and TCP handles this congestion by reducing the congestion window size (Cwn). In wireless communication, however, packet loss mostly arises from variable bandwidth, changes in network topology, host mobility, and similar conditions. Conventional TCP does not differentiate these situations and treats every loss as congestion loss. Because of this misinterpretation, such losses are handled as network congestion, and conventional TCP slows the sender by reducing the congestion window size (Cwn), which degrades network performance. As a solution, we propose a packet loss discrimination algorithm using a machine learning approach (ML-LDA) to eliminate this drawback of conventional TCP in wireless networks and thereby improve network performance. The proposed system, packet loss discrimination in wireless TCP congestion control with an ML approach, extends TCP Reno, and we anticipate better performance than Reno. Whenever packets are lost, the losses are classified to determine whether they are due to network congestion or to other random errors (such as a high bit error rate or link problems). Appropriate corrective measures are then invoked according to the cause of the packet loss.
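To illustrate the loss-discrimination idea, the toy sketch below trains a small classifier on two invented per-loss features (an RTT ratio and a delay trend) and halves the congestion window only for losses labeled as congestion. It is a stand-in under stated assumptions, not the paper's ML-LDA algorithm; the features, training data, and classifier choice are all illustrative.

```python
# A minimal sketch of loss discrimination, not the paper's ML-LDA:
# the features (rtt_ratio, delay_trend), the training data, and the
# classifier choice are all assumptions made for illustration.
from sklearn.tree import DecisionTreeClassifier

# Synthetic per-loss features: [rtt_ratio, delay_trend].
# Congestion losses tend to follow rising RTTs and queueing delay;
# random/wireless losses do not.
X = [
    [1.8, 0.9], [2.1, 1.2], [1.6, 0.7],   # congestion losses
    [1.0, 0.0], [0.9, -0.1], [1.1, 0.1],  # random/wireless losses
]
y = ["congestion", "congestion", "congestion",
     "random", "random", "random"]

clf = DecisionTreeClassifier(max_depth=2).fit(X, y)

def on_packet_loss(cwn: float, rtt_ratio: float, delay_trend: float) -> float:
    """Return the new congestion window after a loss event."""
    cause = clf.predict([[rtt_ratio, delay_trend]])[0]
    if cause == "congestion":
        return max(cwn / 2.0, 1.0)  # Reno-style multiplicative decrease
    return cwn                      # random loss: retransmit, keep Cwn

print(on_packet_loss(10.0, 2.0, 1.1))  # congestion loss -> 5.0
print(on_packet_loss(10.0, 1.0, 0.0))  # random loss     -> 10.0
```

The key behavior this captures from the abstract: a loss attributed to random wireless error triggers recovery without shrinking Cwn, so throughput is not sacrificed to a misdiagnosed congestion event.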



Citations (1)


... Recently, many proposed CCAs use learning techniques such as machine learning [334]–[343]. For these CCAs, the learnt rate update algorithm is a black box, which makes modeling the CCA infeasible. ...

Reference: Evaluating Transport Layer Congestion Control Algorithms: A Comprehensive Survey

Wireless TCP Congestion Control Based on Loss Discrimination Approach Using Machine Learning · Citing Chapter · February 2024 · Lecture Notes in Electrical Engineering