Identity and access management (IAM) systems, as information security systems, usually consist of a set of predefined tasks. The main task is authentication, since it is responsible for proving user identity to the service providers that cooperate with the IAM system. This paper provides a review of intelligent authentication research applicable to IAM systems. The reviewed studies are evaluated against a proposed set of key factors for intelligent authentication. Based on this evaluation, no research was found that implements an authentication scheme satisfying all of these key factors.
The evolution of technology has increased the rate of traffic accidents, which occur frequently and cause loss of life and property. Automatic traffic monitoring systems have therefore gradually attracted the attention of researchers seeking to improve traffic safety within the field of intelligent transport systems. In this paper, a cost-effective approach to automatic traffic incident detection based on the GSM system is proposed. The approach aims to reduce the death rate by combining a vibration sensor with the GSM system; the implementation is based on hardware (circuits) and software, with a graphical user interface (GUI) built in LabVIEW to process the data. Sensors are installed on each side of the vehicle. After an accident, an SMS is sent to the user, which assists in the search for and rescue of the vehicle involved in the accident.
Important information is exchanged between institutions over the internet and networks, and all of these data should be kept secret and properly secured. The personal information held by each of these institutions increasingly needs to be organized confidentially, which raises the need for cryptography systems that can encrypt personal and critical data so it can be shared with other centers via the internet without any privacy concerns. Chaotic behavior has been added to different phases of AES, but very few works apply it to key generation; choosing the Chebyshev polynomial provides a chaotic map that leads to a strong random key. Our system is based on a modified Advanced Encryption Standard (AES), with encryption and decryption in real time, taking into consideration the criticality of the image data being encrypted. The main encryption algorithm is unchanged; the modification replaces the key generation algorithm with a Chebyshev polynomial that generates a key of the required size.
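The key-generation idea can be sketched by iterating the Chebyshev map and harvesting one byte per iteration. This is only an illustrative sketch: the seed `x0`, the polynomial degree `p`, and the byte-extraction rule below are assumptions, since the abstract does not specify the paper's exact parameterization.

```python
import math

def chebyshev_keystream(x0, p, nbytes):
    """Iterate the Chebyshev map T_p(x) = cos(p*arccos(x)) and
    harvest one byte per iteration from the mapped state."""
    x = x0
    out = bytearray()
    for _ in range(nbytes):
        x = math.cos(p * math.acos(x))        # chaotic map; x stays in [-1, 1]
        out.append(int((x + 1) / 2 * 255.9999) & 0xFF)  # map [-1, 1] -> one byte
    return bytes(out)

# 16 bytes = the 128-bit key size AES-128 requires
key = chebyshev_keystream(x0=0.631, p=4, nbytes=16)
```

Because the map is deterministic, both parties can regenerate the same key from the shared seed; sensitivity to `x0` is what makes the output hard to predict without it.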
With the development of technology, computer science, computer networks and the transmission of multimedia between two or more parties, multimedia security has become an essential issue, since most systems have become easy to attack. In this paper, we propose a hybrid cipher model for secure multimedia using an AES and RC4 chain. The performance of this model is analyzed and evaluated by testing several parameters. The results show that the multimedia output is more distorted under the hybrid cipher.
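RC4, one half of the proposed chain, is compact enough to sketch in full. This is the standard KSA/PRGA formulation of the cipher itself, not the paper's specific chaining with AES:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA): permute the state array with the key
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR keystream with data
    i = j = 0
    out = bytearray()
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

Since encryption is a keystream XOR, applying `rc4` twice with the same key recovers the plaintext, which is what makes it easy to chain with a block cipher stage.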
This paper describes a web content filtering system that aims to block offensive material by using distributed agents. The proposed system uses the FCM algorithm together with other page features (title, metadata, warning message) to classify candidate websites into two types: white, which is considered acceptable, and black, which contains harmful material, taking English pornographic websites as a case study.
A wireless sensor network (WSN) consists of small sensor nodes with limited resources that sense, gather and transmit data to a base station. Sensors of various types are deployed ubiquitously and widely in varied environments, for instance wildlife reserves, battlefields, mobile networks and office buildings. Sensor nodes have restricted, non-replenishable power resources, and this is regarded as one of their most critical limits; all techniques and protocols applied on sensor nodes must take this power limitation into consideration. Data aggregation techniques are used by sensor nodes to minimize power consumption by organizing the communication among sensor nodes and eliminating redundancy in the sensed data. This paper proposes a lightweight modification of the data aggregation technique named Energy Aware Distributed Aggregation Tree (EADAT). The main principle of this development is to use the information available at the sensor nodes to pass the parent-node role among the sensor nodes in each cluster, by regularly nominating the sensor node with the highest remaining power. A model based on a tree network architecture was designed for validation and used with the NS2 simulator to test the proposed development. EADAT with and without the proposed development was applied to the designed model, and the results were promising.
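The parent-rotation principle can be sketched as a repeated election over residual energy. The node names, energy values and per-round energy costs below are illustrative assumptions, not figures from the paper:

```python
def elect_parent(cluster):
    """Pick the node with the highest residual energy as the new parent.
    `cluster` maps node id -> residual energy (illustrative units)."""
    return max(cluster, key=cluster.get)

def run_rounds(cluster, rounds, parent_cost=0.05, child_cost=0.01):
    """Rotate the parent role on a regular basis, as the proposed
    modification does, so no single node is drained first.
    The parent spends more energy per round (it aggregates and forwards)."""
    history = []
    for _ in range(rounds):
        parent = elect_parent(cluster)
        history.append(parent)
        for node in cluster:
            cluster[node] -= parent_cost if node == parent else child_cost
    return history

cluster = {"n1": 0.42, "n2": 0.88, "n3": 0.61}
parent = elect_parent(cluster)   # "n2" has the most remaining energy
```

Running several rounds shows the role migrating away from the initially strongest node as its energy is consumed faster than the others'.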
There is a significant need to compress medical images for communication and storage purposes. Most currently available compression techniques produce an extremely high compression ratio at the cost of high quality loss. In medical applications, the diagnostically significant regions (regions of interest) should retain high image quality. Therefore, it is preferable to compress the regions of interest using lossless compression techniques, while the diagnostically less significant regions (non-interest regions) can be compressed using lossy compression techniques. In this paper, a hybrid technique combining Set Partitioning in Hierarchical Trees (SPIHT) and the Bat-inspired algorithm is used for lossless compression of the region of interest, while the non-interest region is lossily compressed with the Discrete Cosine Transform (DCT) technique. The experimental results show that the proposed hybrid technique improves the compression performance and ratio, and that the use of the DCT increases compression performance with low computational complexity.
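The DCT stage for the non-interest region can be illustrated with a 1-D orthonormal DCT-II: transform, keep only the low-frequency coefficients, invert. This is a generic sketch of lossy DCT coding, not the paper's exact block size or quantization rule:

```python
import math

def dct2(block):
    """Orthonormal 1-D DCT-II of a list of samples."""
    N = len(block)
    out = []
    for k in range(N):
        s = sum(block[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
        scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(scale * s)
    return out

def idct2(coeffs):
    """Inverse of the orthonormal DCT-II above (DCT-III with matching scales)."""
    N = len(coeffs)
    out = []
    for n in range(N):
        s = coeffs[0] / math.sqrt(N)
        s += sum(math.sqrt(2 / N) * coeffs[k] * math.cos(math.pi * (n + 0.5) * k / N)
                 for k in range(1, N))
        out.append(s)
    return out

def lossy_compress(block, keep):
    """Zero out all but the first `keep` (low-frequency) DCT coefficients."""
    c = dct2(block)
    return [v if i < keep else 0.0 for i, v in enumerate(c)]
```

Keeping only the leading coefficients discards fine detail but preserves the overall intensity profile, which is acceptable for the diagnostically less significant regions.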
Wireless networks are widely used in offices, homes and public places, so security is one of the significant issues in keeping transmitted information safe. The applied security standards have been developed in response to the demand for high security and to advances in hardware and software. The currently available security standards are WEP, WPA, WPA2 and the under-development WPA3. These standards differ in the security level they offer, based on the authentication methods and encryption algorithms they employ. The major objective of this paper is to study these security standards and analyze them based on their features, in addition to presenting a detailed review of WPA3 and its improvements over the older standards. The conducted evaluations explain the differences among the Wi-Fi security standards in terms of the offered security level and the software and hardware requirements.
In this paper, a secure chatting application with end-to-end encryption for smartphones running the Android OS is proposed. This is achieved through public key cryptography. The proposed application uses the Elliptic Curve Diffie-Hellman (ECDH) key exchange algorithm to generate the key pair and exchange it, producing the shared key that is used for encrypting data with symmetric algorithms. The application allows users to communicate via text messages, voice messages and photos. For text message security, the standard AES algorithm with a 128-bit key is used; the generated 160-bit key is reduced to 128 bits by selecting its first 128 bits so that it can be used by AES. For voice and image security, the application uses the symmetric algorithm RC4.
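The 160-bit-to-128-bit truncation step described above can be sketched as follows. The SHA-1 call here merely simulates a 160-bit ECDH shared secret for the example; a real deployment would take the value from the ECDH exchange itself:

```python
import hashlib

def derive_aes128_key(shared_point: bytes) -> bytes:
    """Reduce a 160-bit shared secret to the 128-bit key AES-128 expects
    by keeping the first 16 bytes — the truncation rule the paper describes.
    (SHA-1 only simulates the 160-bit ECDH output for this sketch.)"""
    k160 = hashlib.sha1(shared_point).digest()   # 20 bytes = 160 bits
    return k160[:16]                             # first 128 bits

key = derive_aes128_key(b"ecdh-shared-point")
```

Both parties hold the same 160-bit value after the exchange, so truncating deterministically to the first 16 bytes yields the same AES key on both sides.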
Nowadays, text classification and sentiment analysis are considered among the most popular natural language processing (NLP) tasks. These techniques play a significant role in human activities and have an impact on daily behavior. Articles in fields such as politics and business express different opinions according to the writer's tendency, and a huge amount of data can be acquired from that differentiation, enabling the political orientation of an online article to be determined automatically. However, no corpus for political categorization in Arabic has been directed at this task, due to the lack of rich, representative resources for training an Arabic text classifier. We therefore introduce the Political Arabic Articles Dataset (PAAD), textual data collected from newspapers, social networks, general forums and ideology websites. The dataset consists of 206 articles distributed into three categories (Reform, Conservative and Revolutionary), which we offer to the research community on Arabic computational linguistics. We anticipate that this dataset will be a great aid for a variety of NLP tasks on Modern Standard Arabic, particularly political text classification. We present the data in raw form and as Excel files of four types: V1 raw data, V2 preprocessing, V3 root stemming and V4 light stemming.
Creating a useful portable browser or program is a demanding challenge for organisations due to the size constraints of the display area, and is additionally contingent on usability, internet connectivity, efficient backup power and mobility. Users also expect performance equivalent to that of microcomputers. Portable display programmers need to surmount these challenges without compromising users' concerns for safety and confidentiality. Various works over the past couple of years revealed the ability of a strongly perceived artistic architecture of a mobile website to act as a push to overcome clients' reduced skill. The investigation conducted in this report sought to determine the distinct impact of eight artistic display factors on a mobile website or program and on the corresponding clients. Organisations may thus take advantage of this knowledge to improve usage, trust, frequency of adoption and consequently financial rewards from m-commerce applications. This study uses a design with a screening objective. Data for the experiment were sourced through a web-based questionnaire program, while social network sites were used to disseminate the questionnaires in order to gather a good number of respondents. Ninety-six participants took part in the inquiry, of whom seventy-six completed the exercise. Findings revealed that artistic architecture bears a substantial impact on portable websites and programs, as well as influencing artefacts on such platforms; two features turned out to have the highest impact on the two factors studied. That the orientation of their effects is a reverse of the conjectured outcome suggests that future work should explore this issue.
Association rule discovery has emerged as a very important problem in knowledge discovery in databases and data mining, and a number of algorithms have been presented to mine association rules. Many factors affect the efficiency of rule-mining algorithms, such as the largeness, denseness and sparseness of the databases being mined, in addition to the number of items, the number and average sizes of transactions, the number and average sizes of frequent itemsets, and the number and average sizes of potentially maximal itemsets. It is impossible to change the characteristics of existing real-world databases to fairly test and determine the best and worst cases of rule-mining algorithms so that they can be used efficiently for present and future databases. Researchers therefore tend to construct artificial databases that control the qualitative and quantitative presence of the above-mentioned factors in order to test the efficiency of rule-mining algorithms and programs. The construction of such databases consumes a very large amount of time and effort. This research presents a software system, a generator, for constructing artificial databases.
Association rule mining is one of the important methods of data mining, but it produces a huge number of rules. These huge rule sets make analysts spend more time searching through them to find the interesting rules. One solution to this problem is to combine an association rule visualization method with a generalization method. The visualization method used here is graph-based; the generalization method is the Attribute Oriented Induction (AOI) algorithm. After the combination, the AOI is called Modified AOI because steps of the traditional AOI are removed or changed, and the graph technique is called the grouped graph method because it displays the aggregated rules that result from AOI. The results of this paper show a compression ratio that improves the clarity of the visualization, providing the ability to test and drill down into the rules, or to understand and roll them up.
Audio hiding is a method for embedding information into an audio signal. It seeks to do so in a robust fashion, while not perceivably degrading the host signal (cover audio). Hiding data in audio signals presents a variety of challenges, due in part to the wider dynamic and differential range of the Human Auditory System (HAS) compared to the other senses. Transforms are usually used for robust audio hiding (audio watermarking), but the hiding process is affected by the type of transform used. Therefore, this paper presents an evaluation of wavelet transform hiding in comparison with hiding based on selected other transforms (the Walsh transform and the cosine transform). To generate the audio stego cover, the embedding method computes the (Wavelet, Walsh, or Cosine) transform of the audio cover, replaces some transformed cover coefficients with secret audio message coefficients, and applies the inverse (Wavelet, Walsh, or Cosine) transform to the cover with the replaced coefficients. The extraction method computes the (Wavelet, Walsh, or Cosine) transform of the stego cover and extracts the secret audio message from it.
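The embed/extract cycle can be sketched with the Walsh (Hadamard) transform, one of the three transforms compared. The cover and secret sample values below are illustrative, and replacing the trailing coefficients is one simple choice of which coefficients to overwrite:

```python
def fwht(a):
    """Fast Walsh–Hadamard transform (input length must be a power of two)."""
    a = list(a)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def embed(cover, secret):
    """Replace trailing Walsh coefficients of the cover with the secret
    samples, then invert the transform to obtain the stego signal."""
    c = fwht(cover)
    c[-len(secret):] = secret
    n = len(cover)
    return [v / n for v in fwht(c)]   # inverse WHT = forward transform / n

def extract(stego, k):
    """Transform the stego signal and read back the k replaced coefficients."""
    return fwht(stego)[-k:]
```

Because the Walsh transform is its own inverse up to a factor of `n`, the extraction recovers the embedded coefficients exactly (up to floating-point error).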
The Internet of Things (IoT) refers to uniquely identifiable objects (things) which can communicate with other objects through the worldwide framework of the wireless/wired Internet. The communication among an expansive number of resource-constrained devices that produce substantial volumes of data affects the security and privacy of the involved objects. In this paper, we propose a lightweight protocol for IoT authentication based on two algorithms, LA1 and RA1, which are used for authentication and for generating the session key that is used for encryption.
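The abstract does not define LA1/RA1, so the following is only a generic challenge-response sketch of the authentication-plus-session-key idea, using HMAC-SHA256 with a pre-shared key as a stand-in:

```python
import hmac
import hashlib

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Device proves knowledge of the pre-shared key without sending it."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def authenticate(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Verifier recomputes the response and compares in constant time."""
    return hmac.compare_digest(respond(shared_key, challenge), response)

def session_key(shared_key: bytes, challenge: bytes) -> bytes:
    """Derive a per-session encryption key bound to this exchange."""
    return hmac.new(shared_key, b"session" + challenge, hashlib.sha256).digest()
```

Binding the session key to the challenge means a fresh nonce per connection yields a fresh key, which is the property a lightweight IoT handshake needs.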
License Plate Recognition (LPR) has become an important research issue in recent years due to its importance to a wide range of commercial applications. The first and most important stage of any LPR system is the localization of the number plate within the vehicle image. This paper presents a methodology for extracting Iraqi car number plates from vehicle images using two methods: the first is morphological operations and the second is edge detection. The main idea is to use these two different methods in such a way that the number plate of the vehicle can be extracted precisely. The algorithms can quickly and correctly detect and extract the number plate from the vehicle image even when there is a little noise in the image. The paper also compares the results of the two extraction methods. The software used to build the system is MATLAB R2014a.
The increased use of computers and the internet has been accompanied by the wide use of multimedia information, and the requirement for protecting this information has risen dramatically. To prevent confidential information from being tampered with, one needs to apply cryptographic techniques. Most cryptographic strategies share one weak point: the information is centralized. To overcome this drawback, secret sharing was introduced. It is a technique for distributing a secret among a group of members, such that every member owns a share of the secret, but only particular combinations of shares can reveal the secret; individual shares reveal nothing about it. The major challenge facing image secret sharing is the shadow size: the complete size of the minimum set of shares needed for revealing is greater than the original secret file. The core of this work is therefore to use different transform coding strategies to obtain the smallest possible share size. In this paper, a compressive sharing system for images using transform coding and the Blakley method is introduced. An appropriate transform (discrete cosine transform or wavelet transform) is applied to de-correlate the image samples; the output (the compressed image data) is then fed to a diffusion scheme that removes any statistical redundancy or bits of important attributes remaining in the compressed stream; finally, the (k, n) threshold secret sharing scheme is applied, where n is the number of generated shares and k is the minimum number of shares needed for revealing. To ensure a high security level, each produced share is passed through a stream cipher with an individual encryption key belonging to the shareholder.
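The final (k, n) threshold stage can be illustrated compactly. Note that the paper uses Blakley's geometric scheme; the sketch below substitutes Shamir's polynomial scheme, which realizes the same (k, n) threshold property in fewer lines, purely as an illustration of the idea:

```python
import random

P = 2**61 - 1   # a Mersenne prime; the field must exceed the secret chunk

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares so that any k of them reconstruct it.
    A random degree-(k-1) polynomial has the secret as its constant term;
    each share is one evaluation point of that polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # den^-1 via Fermat
    return total
```

Any k shares suffice and any k-1 reveal nothing about the constant term, which is exactly the threshold behavior the compressive sharing pipeline relies on.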
Immersive virtual reality is not just for gaming; it is poised to have a big impact on education as well, giving students an opportunity to interact with content in three-dimensional learning environments. Blended learning, according to the Innosight Institute, is "a formal education program in which a student learns at least in part through online delivery of content and instruction with some element of student control over time, place, path and/or pace". On the other hand, blended learning has several disadvantages, for example that learners with low motivation or bad study habits may fall behind. There is therefore an essential need to improve and develop the theory of blended learning by using virtual reality environments, removing these disadvantages and enhancing face-to-face learning with features such as excitement to make it more efficient. Virtual reality also reinforces the clarity of the scientific content of a lecture that a student may have missed, whether absent physically or only mentally, so the student can relive the atmosphere of the lecture and overcome the difficulties arising from blended or traditional learning. This approach is applied by building a specialized website application that uses virtual reality features in order to measure the effectiveness of this study on students; a questionnaire was then designed and its results were gathered. It was found that most students were excited and active and understood the lecture easily, with a high Likert scale score (4.74), but they found difficulty in using the VR tools, which scored low on the Likert scale (2.66).
The development of modern information technologies in medicine makes it timely to create national Information Systems (IS) for the joint activities of medical institutions, to improve the quality of health services and to improve management in the health sector. One of the components of healthcare is the Blood Service (BS) system. In this work, the concept of building a national system is considered using the example of the IS of the BS. The national IS of the BS aims to track relevant information on indicators of the quality of blood products through the information integration of BS establishments, making it possible to increase the level of infectious safety and the quality of transfusion care. Models for integrating the IS of the BS are offered at the conceptual level to organize information exchange between BS establishments, and the structures of the integrated-system models are analyzed in order to select a rational national IS of the BS.
In the last two decades, Content-Based Image Retrieval (CBIR) has been considered a topic of interest for researchers. It depends on analyzing the image's visual content, which can be done by extracting color, texture and shape features. Feature extraction is therefore one of the important steps in a CBIR system for representing the image completely. The color feature is the most widely used and most reliable among the visual features. This paper reviews different methods used to extract color features while taking the spatial information of the image into consideration, namely the Local Color Histogram, Color Correlogram, Row sum and Column sum, and Color Coherence Vectors.
CD-ROM technology and the databases stored on it have become indispensable for researchers and for beneficiaries of the advanced information services offered by libraries and information centers. Many regions of the world, including the developing countries, are witnessing a noticeable shift towards exploiting CD-ROM databases and using them over suitable national and regional information networks. Libraries and information centers in Iraq and Jordan have likewise taken important steps towards using these discs, although these experiences still need further care, appraisal, coordination and cooperation. From this, the objectives and importance of this research become clear.
A LAN chat messenger using TCP/IP offers reliable, secure and zero-cost communication among the staff members of a company, and this study also offers file transfer. It helps to solve communication problems related to time and cost. The proposed application facilitates information exchange among individuals by providing many communication options. It is a standalone application written in Java and tested on the LANs of our institute (the college's lab networks).
The Salsa20 cipher is speedier than the AES cipher and offers superior security, while Salsa8 and Salsa12 are specified for applications where the grade of security is less important than speed. The idea of this research is to suggest a Super Salsa keystream using matrices of various sizes (4x4, 4x8 and 4x16 arrays) to increase the complexity of the keystream and make it more resistant to linear and differential attacks. Furthermore, in each iteration the diffusion of the generated keystream increases, because the volume acting as one element of the array is not fixed. The generated keys of the suggested Super Salsa keystream are produced with simple operations and form a highly random keystream, passing the five benchmark tests. Likewise, it presents a state of equilibrium between complexity and speed for Salsa 8, 12 and 20.
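The add-rotate-XOR core that all Salsa variants (8, 12 and 20 rounds) share is the quarter-round, sketched here. This is the standard Salsa20 quarter-round, not the Super Salsa modification:

```python
def rotl32(v, c):
    """Rotate a 32-bit word left by c bits."""
    return ((v << c) | (v >> (32 - c))) & 0xFFFFFFFF

def quarterround(a, b, c, d):
    """Salsa20 quarter-round: each word is XORed with a rotated sum of
    two previously updated words, giving fast diffusion with only
    additions, rotations and XORs."""
    b ^= rotl32((a + d) & 0xFFFFFFFF, 7)
    c ^= rotl32((b + a) & 0xFFFFFFFF, 9)
    d ^= rotl32((c + b) & 0xFFFFFFFF, 13)
    a ^= rotl32((d + c) & 0xFFFFFFFF, 18)
    return a, b, c, d
```

A single set bit in the input already reaches all four output words after one quarter-round, which is the diffusion property the Super Salsa design aims to amplify with variable matrix sizes.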
Automated classification of text into predefined categories has always been considered a vital method in the natural language processing field. In this paper, methods based on the Radial Basis Function (RBF) and Fuzzy Radial Basis Function (FRBF) are used to solve the text classification problem: a set of features is extracted for each sentence in the document collection, and these features are fed to the FRBF and RBF to classify the documents. The Reuters 21578 dataset is used for the text classification experiments. The results showed that FRBF is more effective than RBF.
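An RBF layer's response to a sentence's feature vector is a Gaussian of its distance to each neuron center. A minimal sketch of that activation (the centers and width below are illustrative; the paper's training procedure is not reproduced):

```python
import math

def rbf(x, center, sigma):
    """Gaussian radial basis activation: the response decays with the
    squared distance between the feature vector and the neuron's center."""
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-d2 / (2 * sigma ** 2))

def rbf_layer(x, centers, sigma):
    """Activations of a whole hidden layer for one feature vector."""
    return [rbf(x, c, sigma) for c in centers]
```

A document is then assigned to the category whose prototype center produces the strongest activation; the fuzzy variant replaces this crisp assignment with graded memberships.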
In this paper, a study of the role of colour information in detecting the edges of an image was conducted. Different colour spaces with components corresponding to the attributes luminance, hue and saturation (i.e., HSV and LUV) were implemented. Two edge detection techniques were applied to each of the above colour spaces: the Sobel operator and the nonlinear Laplace operator. A proposed nonlinear Laplace operator was implemented, and encouraging results indicated its better efficiency than the traditional nonlinear Laplace operator. Different approaches were utilized to select a threshold value either manually or automatically; the automatic selection depends on the calculation of the mean colour gradient magnitude or on the accumulated histogram. A mechanism based on using two threshold boundaries to detect edges in colour spaces other than RGB was suggested and implemented, and the results indicated an improvement in the resulting edge image.
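The Sobel stage with a two-threshold-boundary edge decision can be sketched on a single grayscale channel; the threshold values and test image below are illustrative, and the paper applies the idea per colour component:

```python
def sobel_magnitude(img):
    """Gradient magnitude of a 2-D image given as a list of pixel rows."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # horizontal and vertical Sobel responses
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def dual_threshold(mag, low, high):
    """Mark a pixel as edge when its magnitude lies between the two
    threshold boundaries, suppressing both noise and saturation."""
    return [[low <= v <= high for v in row] for row in mag]
```

On a vertical step image the magnitude peaks along the step and is zero in flat regions, so the two boundaries cleanly separate edge pixels from both.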
Information security is a hugely trending topic in recent years. Many techniques and algorithms have been designed and developed to secure information and networks across the world. Cryptography is one of the most common tools for providing such security; nevertheless, steganography also provides good security by hiding data within a medium in a way that an attacker cannot sense the presence of the secret data. Compression does not normally imply any security; however, it scrambles the original encoding of the data and reduces its size by a measurable amount, which makes it well suited to hiding. In this paper, a system is proposed in which a secret image is compressed before encryption and hiding. The JPEG algorithm is used for compression, while the RC4 algorithm is used at the encryption stage due to its fast processing speed. The LSB (Least Significant Bit) technique is then applied to hide the secret data within the cover image.
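The final LSB hiding stage can be sketched directly; `cover` here is a flat list of pixel bytes, a simplification of the cover image:

```python
def embed_lsb(cover, payload: bytes):
    """Write the payload bits into the least significant bit of each
    cover byte, changing every pixel value by at most 1."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small for payload"
    stego = cover[:]
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # clear LSB, then set it
    return stego

def extract_lsb(stego, nbytes: int) -> bytes:
    """Read the LSBs back and reassemble them into bytes, MSB first."""
    bits = [b & 1 for b in stego[:nbytes * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k*8:(k+1)*8]))
                 for k in range(nbytes))
```

Because each pixel changes by at most one intensity level, the stego image is visually indistinguishable from the cover, which is the property the system relies on after compression and RC4 encryption.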
In recent years, data streaming has become more important day by day, considering the technologies employed to serve it and the number of terminals within the system that interact with it directly or indirectly. Smart devices now play an active role in the data streaming environment, alongside fog and cloud compatibility. This affects data collection, and it appears clearly with the new technologies provided and the increase in the number of users of such systems. Because of the growing number of users and the resources available, systems have started to employ the fog, moving computational power to the network edge. The fog is adopted to connect systems that stream data as objects. These inter-connected objects are expected to produce ever more significant data streams, produced at unique rates, in some cases to be analyzed nearly in real time. This paper introduces a survey of data streaming systems and technologies. It clarifies the main notions behind big data stream concepts as well as fog computing. From the presented study, the industrial and research communities can gain information about the requirements for creating a fog computing environment, with a clearer view of managing resources in the fog. The main objective of this paper is to provide a short brief on data streaming in a fog computing environment and to explain the major research fields within this area.
This century has seen a progressive evolution in IT. New techniques, gadgets and tools are being invented every day, which leads to the consumption of energy and resources. The planet needs a friendly environment in which the consumption of resources is balanced and temperature is decreased, so one of the most important responsibilities of humanity is providing green industry in order to achieve a clean environment. This paper is a review of a few vital writings in the field of green computing that underscore the importance of green computing for sustainable development.
Collision avoidance techniques aim to steer the robot away from obstacles with minimal total travel distance. Most collision avoidance algorithms have trouble with getting stuck in a local minimum. A new technique is proposed to avoid local minima in convex optimization-based path planning. The obstacle avoidance problem is treated as a convex optimization problem under system state and control constraints. The idea is to consider each obstacle as a convex set of points enclosed in a minimum-volume ellipsoid; the addition of the necessary offset distance and the modified motion path are also presented. In the analysis, the results demonstrated the effectiveness of the suggested motion planning using the convex optimization technique.
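The ellipsoid-based obstacle test described above can be illustrated with a small sketch. The function name, the 2-D example, and the way the offset is applied below are illustrative assumptions, not the paper's implementation: an obstacle is modeled as an ellipsoid (x - c)^T A (x - c) <= 1, and the safety offset inflates the ellipsoid before a candidate path point is tested.

```python
def inside_ellipsoid(point, center, shape, offset=0.0):
    """Return True if `point` lies inside the inflated ellipsoid
    (x - c)^T A (x - c) <= (1 + offset)^2, where A = `shape` (2x2).
    The offset keeps a safety margin around the obstacle."""
    dx = point[0] - center[0]
    dy = point[1] - center[1]
    q = (shape[0][0] * dx * dx + (shape[0][1] + shape[1][0]) * dx * dy
         + shape[1][1] * dy * dy)
    return q <= (1.0 + offset) ** 2

# Unit circle as the simplest ellipsoid: A = identity.
A = [[1.0, 0.0], [0.0, 1.0]]
```

A path planner would reject any waypoint for which `inside_ellipsoid` returns True, so the chosen offset directly controls how closely the modified motion path may skirt the obstacle.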
Random number generators are fundamental tools for cryptographic protocols and algorithms. The basic problems that face any crypto key generator are randomness, correlations, and the distribution of the key sequence states. This paper proposes a new method to enhance RNA crypto key generation. It is implemented by extending the crypto key with a polynomial convolution technique that extracts the mask filter from the same RNA key sequence, depending on the start and stop codon properties. This provides another high level of extension and generates a random, strong crypto key. The proposed approach passed the statistical measurements successfully and achieved a high rate of randomness (approximately 96%).
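As a rough illustration of the idea only (the paper's exact mask extraction from codon properties is not reproduced here, and every name and choice below is an assumption), the sketch derives a short mask filter from the positions of the AUG start codon in an RNA string and convolves it, modulo 2, with the key bits to extend the key.

```python
def extend_key(rna, key_bits, out_len):
    """Extend `key_bits` by convolving them (mod 2) with a mask filter
    derived from the same RNA sequence (here: AUG start-codon positions)."""
    # Mask filter: 1 where a 3-letter codon window equals the start codon.
    mask = [1 if rna[i:i + 3] == "AUG" else 0 for i in range(0, len(rna) - 2, 3)]
    if not any(mask):
        mask = [1]                      # fallback so the filter is non-trivial
    out = []
    for n in range(out_len):
        acc = 0
        for k, m in enumerate(mask):    # discrete convolution, taps = mask
            acc ^= m & key_bits[(n - k) % len(key_bits)]
        out.append(acc)
    return out
```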
The huge explosion of information over the World Wide Web forces us to use information security methods to keep it away from intruders. One of these security methods is information hiding. Its advantage over other security methods is that it hides the existence of data by embedding it inside a carrier. Image-based information hiding is one of the most widely used hiding methods, due to the image's capability of holding a large amount of data as well as its resistance to detectable distortion. In recent decades, statistical methods (types of steganalysis methods) have been used to detect the existence of hidden data; therefore, areas with color variation (edge areas) are used to hide data instead of smooth areas. In this paper, corner points are proposed for hiding data instead of edges, to avoid the statistical attacks used to expose hidden messages. Additionally, this paper proposes the clearing least significant bit (CLSB) method to retrieve data from the stego-image without sending a pixel map; this increases the security of the proposed corner-based hiding method. Experimental results show that the proposed method is robust against statistical attacks compared with edge- and sequential-based hiding methods. An SVM classifier also confirms that the proposed method outperforms previous methods on the Corel-1000 image dataset.
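A minimal sketch of LSB embedding at a chosen set of pixel positions (corner points in the paper; here simply supplied by the caller) may clarify the mechanics. The 16-bit length header used for extraction is an illustrative assumption, not the paper's CLSB scheme, which avoids transmitting a pixel map by a different means.

```python
def embed(pixels, positions, message_bits):
    """Hide a 16-bit length header plus `message_bits` in the LSBs of
    `pixels` (list of 0..255 ints) at the given `positions` (e.g. corner
    points detected in the cover image)."""
    header = [(len(message_bits) >> i) & 1 for i in range(15, -1, -1)]
    bits = header + message_bits
    out = list(pixels)
    for pos, bit in zip(positions, bits):
        out[pos] = (out[pos] & ~1) | bit
    return out

def extract(pixels, positions):
    """Recover the message by reading the length header, then that many bits."""
    lsb = [pixels[p] & 1 for p in positions]
    length = int("".join(map(str, lsb[:16])), 2)
    return lsb[16:16 + length]
```

Because both sides derive the same corner positions from the image content, no explicit pixel map needs to accompany the stego-image.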
Many papers have been published about manipulating computer viruses: instructions that impact a computer system and, after a period of incubation and reproduction, activate and demonstrate their presence.
Most viruses were designed to attack microcomputers, since microcomputers are widely used nowadays and have simple operating systems, which results in a lack of quality in their security systems. Connecting computers to networks and using copies of programs from unreliable sources, such as bulletin board systems, increases the risk of viral contact and the spread of viruses. Data encryption disguises data flowing through a network so that it is unintelligible to anyone monitoring the data. Encryption techniques can also be used to detect file modification, which may be caused either by unauthorized users or by viruses. This paper is concerned with viruses attacking users' system files (.exe and .com) in microcomputer systems; virus types, how they work, and anti-virus strategies are discussed. Finally, a detection strategy depending on encryption techniques built into the operating system is suggested to improve PC security and to prevent unauthorized users from inserting into programs commands that would cause system corruption.
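The encryption-based modification check suggested above amounts to a cryptographic integrity baseline. The sketch below uses a keyed HMAC over file contents as a modern stand-in for such a checksum (function names and the example files are illustrative, not the paper's design) to detect tampering with executable images.

```python
import hmac
import hashlib

def baseline(files, key):
    """Record a keyed digest of each file's contents (name -> hex digest)."""
    return {name: hmac.new(key, data, hashlib.sha256).hexdigest()
            for name, data in files.items()}

def modified(files, key, recorded):
    """Return the names whose contents no longer match the recorded digest."""
    current = baseline(files, key)
    return sorted(name for name in recorded
                  if current.get(name) != recorded[name])
```

Because the digest is keyed, a virus that patches an .exe or .com file cannot recompute a matching checksum without the secret key.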
There has been a great deal of discussion about null values in relational databases. The relational model was defined in 1969, and nulls were introduced in 1979. Unfortunately, there is no generally agreed solution to the null values problem. Null is a special marker which stands for a value that is undefined or unknown, meaning that no entry has been made; a missing-value mark is not a value, is not of a data type, and cannot be treated as a value by a database management system (DBMS). Since distributed database users work with more than a single database and data is distributed among several data sources or sites, the data must be precise; replication is allowed there, so complex problems appear, and perfect practical general approaches for the treatment of nulls are needed. A distributed database system, a hotel reservation control system, is designed based on different data sources at four sites, where each site represents a hotel; for more heterogeneity, different application programming languages are used. Five practical approaches, with their rules and algorithms, are designed for null value treatment across the distributed database sites.
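The special status of the null marker can be demonstrated with standard SQL three-valued logic; the in-memory SQLite table below is an illustrative example, not the hotel system from the paper.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE guest (name TEXT, phone TEXT)")
con.executemany("INSERT INTO guest VALUES (?, ?)",
                [("Ali", "555-1234"), ("Sara", None)])

# NULL is not equal to anything, not even NULL: the comparison evaluates
# to UNKNOWN, so a WHERE clause using '=' never matches the missing phone.
eq_null = con.execute("SELECT name FROM guest WHERE phone = NULL").fetchall()

# The correct test uses IS NULL; COALESCE substitutes a default value,
# one simple treatment approach for nulls at query time.
is_null = con.execute("SELECT name FROM guest WHERE phone IS NULL").fetchall()
shown = con.execute(
    "SELECT name, COALESCE(phone, 'unknown') FROM guest").fetchall()
```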
DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is one of the most attractive density-based clustering algorithms. It is characterized by its ability to detect clusters of various sizes and shapes in the presence of noise, but its performance degrades when the data have different densities. In this paper, we propose a new technique to separate data based on its density, together with a new sampling technique; the purpose of these techniques is to obtain data with homogeneous density. Experimental results on synthetic and real-world data show that the new technique enhances the clustering of DBSCAN to a large extent.
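For reference, a minimal, unoptimized DBSCAN in pure Python (an illustrative sketch, not the paper's modified algorithm) shows the single global eps/min_pts density threshold whose limitation on mixed-density data the paper's separation technique works around.

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Label each point with a cluster id (0, 1, ...) or -1 for noise."""
    labels = [None] * len(points)          # None = unvisited
    def neighbors(i):
        return [j for j, q in enumerate(points) if dist(points[i], q) <= eps]
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # noise (may become a border point)
            continue
        labels[i] = cid
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid            # noise reclaimed as border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb = neighbors(j)
            if len(nb) >= min_pts:         # core point: keep expanding
                queue.extend(nb)
        cid += 1
    return labels
```

With one fixed eps, a sparse but genuine cluster is misread as noise; separating the data into density-homogeneous subsets first lets each subset be clustered with an eps suited to it.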
Recently, different applications of wireless sensor networks (WSNs) in industrial fields using different data transfer protocols have been developed. As the energy of sensor nodes is limited, prolonging network lifetime is considered a significant issue in WSNs. To improve network longevity, researchers have addressed energy consumption in WSN routing protocols by using a modified Low Energy Adaptive Clustering Hierarchy (LEACH). This article presents developed, effective transfer protocols for autonomic WSNs. An efficient routing scheme for wireless sensor networks, regarded as significant components of electronic devices, is proposed. An optimal election probability for a node to become cluster head is presented. In addition, this article uses a Voronoi diagram, which decomposes the nodes into a zone around each node; this diagram is used in a management architecture for WSNs.
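In standard LEACH, the cluster-head election probability referred to above is governed by the threshold T(n) = p / (1 - p (r mod 1/p)), applied to nodes that have not served as cluster head in the last 1/p rounds. A small sketch of that standard rule (not necessarily the article's modified scheme):

```python
import random

def leach_threshold(p, round_no, was_ch_in_epoch):
    """LEACH threshold T(n): a node draws a uniform number in [0, 1) and
    becomes cluster head for this round if the draw falls below T(n)."""
    if was_ch_in_epoch:                    # already served in current epoch
        return 0.0
    return p / (1.0 - p * (round_no % round(1.0 / p)))

def elects_as_ch(p, round_no, was_ch_in_epoch, rng=random.random):
    return rng() < leach_threshold(p, round_no, was_ch_in_epoch)
```

The threshold rises toward 1 as the epoch ends, so every eligible node is guaranteed to serve as cluster head once per 1/p rounds, spreading the energy cost evenly.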
In this paper, the LT10 algorithm, which originally consists of four KASUMI elements, is proposed as a lightweight encryption algorithm. The proposed algorithm takes into account that IoT devices have limited computational abilities and that the smart home and IoT network information to be exchanged is sensitive. The key length is 128 bits and the block length is 128 bits.
Fifth-generation (5G) networks and millimeter waves (MM-W) hold tremendous promise to provide opportunities to revolutionize education, healthcare, business, and agriculture. Nevertheless, the generation of MM-W in the electrical domain is infeasible due to the bandwidth limitation of electronic components and radio frequency (RF) interference. The capability to generate MM-W in the optical domain can provide transportation of MM-W with low loss from the switching center to remote base stations. The present paper focuses on electro-optical up-conversion (EOU) techniques for optical generation and transmission of a 60-GHz MM-W signal. A comparative study is carried out between three different EOU techniques: frequency quadrupling, frequency sextupling, and frequency octotupling. The comparative study aims to show the strengths and weaknesses of the three EOU techniques and to evaluate each technique in terms of the electrical spurious suppression ratio (ESSR), as well as the influence of non-ideal phase shifting. The performance of the three EOU techniques after transmission over optical fiber is evaluated by an eye pattern test. The simulation results confirm that frequency quadrupling outperforms frequency sextupling and frequency octotupling.
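One practical difference between the three EOU techniques is the electrical drive frequency each requires: a higher multiplication factor lets a slower RF oscillator synthesize the same 60-GHz carrier. The arithmetic (factors 4, 6, and 8, as the abstract's quadrupling, sextupling, and octotupling imply) can be checked directly:

```python
def drive_frequency_ghz(target_ghz, factor):
    """RF local-oscillator frequency needed when the electro-optical
    up-conversion multiplies the electrical drive by `factor`."""
    return target_ghz / factor

# 60-GHz MM-W carrier via quadrupling, sextupling, and octotupling.
requirements = {f: drive_frequency_ghz(60.0, f) for f in (4, 6, 8)}
```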
All important information is exchanged between facilities using the internet and networks, and all these data should be kept secret and properly secured. The personal information held by each of these institutions increasingly needs to be organized secretly, which raises the need for cryptographic systems that can easily encrypt personal and critical data so it can be shared with other centers via the internet without any concerns about privacy. Chaotic behavior has been added to different phases of AES, but very few works apply it to key generation; choosing a Chebyshev polynomial provides a chaotic map that leads to a random, strong key. Our system is based on a modified Advanced Encryption Standard (AES), with encryption and decryption in real time, taking into consideration the criticality of the image data being encrypted. The main encryption algorithm is unchanged; the modification replaces the key generation algorithm with a Chebyshev polynomial that generates a key of the required key size.
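The chaotic behavior of the Chebyshev map T_k(x) = cos(k·arccos(x)) for k ≥ 2 can be used to stretch a seed into key material. A minimal sketch follows (the byte quantization and parameter names are illustrative assumptions, not the paper's exact key schedule):

```python
from math import cos, acos

def chebyshev_key(seed, degree, nbytes):
    """Iterate the Chebyshev map T_degree(x) = cos(degree * acos(x)) from
    `seed` in (-1, 1) and quantize each iterate to one key byte."""
    x, out = seed, bytearray()
    for _ in range(nbytes):
        x = cos(degree * acos(x))       # chaotic for degree >= 2
        out.append(int((x + 1.0) * 127.5) & 0xFF)
    return bytes(out)
```

Sensitivity to the seed is the property that matters here: two nearby seeds diverge after a few iterations, so the derived AES keys differ completely.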
Human and computer vision has a vital role in intelligent interaction with computers. Face recognition is a subject with a wide research area, and a big effort has been exerted in recent decades on face recognition, face detection, and face tracking; yet new algorithms for building fully automated systems are still required, and these algorithms should be robust and efficient. The first step of any face recognition system is face detection, whose goal is the extraction of the face region within an image, taking into consideration lighting, orientation, and pose variation; the more accurate this step, the better the face recognition result. This paper introduces a survey of techniques and methods for feature-based face detection.
In e-government, mining techniques are considered a procedure for extracting data from the related web application and converting it into useful knowledge. In addition, there are different methods of mining that can be applied to different government data. The main idea behind this paper is to produce a comprehensive study of previous research work on improving the speed of queries that access the database and on obtaining specific predictions. The provided study compares data mining methods, database management, and types of data. Moreover, a proposed model is introduced that puts these different methods together to improve online applications. These applications provide the ability to retrieve information, match keywords, index the database, and perform prediction on a vast amount of data.