Iraqi Journal for Computers and Informatics

Published by University of Information Technology and Communications

Online ISSN: 2520-4912 · Print ISSN: 2313-190X

Articles


INTELLIGENT AUTHENTICATION FOR IDENTITY AND ACCESS MANAGEMENT: A REVIEW PAPER
  • Article

May 2019


Iman Hadi
Identity and access management (IAM) systems usually consist of predefined tasks as an information security system. The main task is authentication, since it is responsible for proving user identity to the service providers that cooperate with the IAM. This paper provides a review of intelligent-authentication research applicable to IAM systems. The reviewed works are evaluated against a proposed set of key factors for intelligent authentication. Based on this evaluation, no research was found that implements an authentication satisfying all of these key factors.

Figure (2): Schematic diagram for Accident Detection (Vibration Sensor)
Figure (3) (a): GSM modem sending message to mobile phone by using LabVIEW
Figure (5): Block diagram of the SMS control part
Automatic Vehicle Accident Detection Based on GSM System
  • Article
  • Full-text available

December 2017


The evolution of technology has been accompanied by an increase in traffic accidents that frequently cause loss of life and property. Automatic traffic monitoring has therefore gradually attracted the attention of researchers seeking to improve traffic safety through intelligent transport systems. In this paper, a cost-effective approach to automatic traffic-incident detection based on the GSM system is proposed. The paper offers a practical way to reduce the death rate by using a vibration sensor and the GSM system; the implementation is based on hardware (circuits) and software, with a graphical user interface (GUI) built in LabVIEW to process the data. Sensors are installed on each side of the vehicle, and an SMS is sent to the user after an accident. This system assists in the search for and rescue of vehicles involved in accidents.

Fig.2. The operation of the Add Round Key [7]
Fig.3. An example of the AES S-Box [4]
Fig.6. Key schedule
Fig.7. One round of the proposed system
Fig.15. Original sample 1 histogram
AES WITH CHAOTIC USING CHEBYSHEV POLYNOMIAL

December 2018


All important information is exchanged between facilities over the internet and networks, so these data should be kept secret and properly secured. The personal information held by such institutions increasingly needs to be organized secretly, raising the need for cryptographic systems that can easily encrypt personal and critical data so it can be shared with other centers via the internet without privacy concerns. Chaotic behavior has been added to different phases of AES, but very few works apply it to key generation; choosing the Chebyshev polynomial provides a chaotic map that leads to a strong random key. Our system is based on a modified Advanced Encryption Standard (AES), with encryption and decryption in real time, taking into consideration the criticality of the image data being encrypted. The main encryption algorithm is unchanged; the modification replaces the key-generation algorithm with a Chebyshev polynomial that generates a key of the required size.
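
As a rough illustration of the key-generation idea described above, the sketch below iterates the Chebyshev map T_k(x) = cos(k·arccos(x)), which is chaotic on [-1, 1] for k ≥ 2, and quantizes the trajectory into key bytes. The function name, parameters, and quantization step are illustrative assumptions, not the paper's exact construction.

```python
import math

def chebyshev_keystream(x0, degree=4, nbytes=16, burn_in=100):
    """Generate key bytes from the Chebyshev chaotic map
    T_k(x) = cos(k * arccos(x)), chaotic on [-1, 1] for k >= 2.
    (Illustrative sketch; not the paper's exact scheme.)"""
    x = x0
    for _ in range(burn_in):              # discard the transient iterations
        x = math.cos(degree * math.acos(x))
    key = bytearray()
    for _ in range(nbytes):
        x = math.cos(degree * math.acos(x))
        key.append(int((x + 1) / 2 * 255) & 0xFF)  # map [-1, 1] -> [0, 255]
    return bytes(key)

key = chebyshev_keystream(0.612345, degree=4, nbytes=16)  # a 128-bit key
```

Because the map is deterministic but highly sensitive to the seed, the same seed always regenerates the same key while nearby seeds diverge quickly.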

Distributed Agents for Web Content Filtering

December 2016


This paper describes web content filtering aimed at blocking offensive material by using distributed agents. The proposed system uses the FCM algorithm together with other page features (title, metadata, warning message) to classify candidate websites into two types: white, considered acceptable, and black, containing harmful material, taking English pornographic websites as a case study.

MEDICAL IMAGES COMPRESSION BASED ON SPIHT AND BAT INSPIRED ALGORITHMS

May 2019


There is a significant necessity to compress medical images for the purposes of communication and storage. Most currently available compression techniques produce an extremely high compression ratio with a high quality loss. In medical applications, the diagnostically significant regions (the region of interest) should keep high image quality. It is therefore preferable to compress the region of interest with lossless techniques, whilst the diagnostically less significant regions (non-interest regions) can be compressed with lossy techniques. In this paper, a hybrid of the Set Partitioning in Hierarchical Trees (SPIHT) and Bat-inspired algorithms is utilized for lossless compression of the region of interest, and the non-interest region is lossily compressed with the Discrete Cosine Transform (DCT). The experimental results show that the proposed hybrid technique enhances the compression performance and ratio, and that the use of the DCT increases compression performance with low computational complexity.
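
For reference, the DCT applied to the non-interest region can be illustrated with a minimal pure-Python 1-D DCT-II and its inverse. This is a sketch only; a practical codec uses a fast 2-D transform followed by quantization.

```python
import math

def dct2(x):
    """Naive 1-D DCT-II, the transform family used by JPEG-style lossy coders."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]

def idct2(X):
    """Inverse transform (DCT-III with the matching 2/N scaling)."""
    N = len(X)
    return [(X[0] / 2 + sum(X[k] * math.cos(math.pi * (n + 0.5) * k / N)
                            for k in range(1, N))) * 2 / N
            for n in range(N)]
```

Lossy compression follows from discarding or coarsely quantizing the small high-frequency coefficients before inverting.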

Fig.3 Big data technologies [17]
Big Data Analytics: A Survey
Internet-based programs and communication techniques have become widely used and respected in the IT industry recently. A persistent source of "big data," or data that is enormous in volume, diverse in type, and has a complicated multidimensional structure, is internet applications and communications. Today, several measures are routinely performed with no assurance that any of them will be helpful in understanding the phenomenon of interest in an era of automatic, large-scale data collection. Online transactions that involve buying, selling, or even investing are all examples of e-commerce. As a result, they generate data that has a complex structure and a high dimension. The usual data storage techniques cannot handle those enormous volumes of data. There is a lot of work being done to find ways to minimize the dimensionality of big data in order to provide analytics reports that are even more accurate and data visualizations that are more interesting. As a result, the purpose of this survey study is to give an overview of big data analytics along with related problems and issues that go beyond technology.

Specifications of the test devices
The voice message duration and size
Voice message encryption/decryption time
The image message size, NPCR and UACI
Image message encryption/decryption time
Design of Secure Chatting Application with End to End Encryption for Android Platform

June 2017


In this paper, a secure chatting application with end-to-end encryption for smartphones running the Android OS is proposed, achieved through public-key cryptography techniques. The proposed application uses the Elliptic Curve Diffie-Hellman (ECDH) key-exchange algorithm to generate and exchange the key pair and produce the shared key, which is then used for the encryption of data by symmetric algorithms. The application allows users to communicate via text messages, voice messages, and photos. For text-message security, the standard AES algorithm with a 128-bit key is used: the generated 160-bit key is reduced to 128 bits by selecting its first 128 bits. For voice and image security, the application uses the symmetric algorithm RC4.
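
The key-derivation step described above (truncating a 160-bit shared key to the first 128 bits for AES) can be sketched as follows. Note that the sketch substitutes a toy classic Diffie-Hellman exchange for ECDH and hashes the shared secret to 160 bits with SHA-1; both are stand-ins for illustration, not the application's actual primitives.

```python
import hashlib
import secrets

# A published 768-bit MODP prime (RFC 2409, Oakley group 1) stands in for the
# elliptic-curve group; a real app would use a vetted ECDH implementation.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A63A3620FFFFFFFFFFFFFFFF", 16)
G = 2

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_aes128_key(my_priv, their_pub):
    """Derive the shared secret, hash it to 160 bits with SHA-1 (matching the
    paper's 160-bit key length), then keep the first 128 bits for AES-128."""
    secret = pow(their_pub, my_priv, P)
    key160 = hashlib.sha1(
        secret.to_bytes((secret.bit_length() + 7) // 8, "big")).digest()
    return key160[:16]                    # first 128 bits

a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_aes128_key(a_priv, b_pub) == shared_aes128_key(b_priv, a_pub)
```

Both parties end up with the same 16-byte key without it ever crossing the wire.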

PAAD: POLITICAL ARABIC ARTICLES DATASET FOR AUTOMATIC TEXT CATEGORIZATION

June 2020


Nowadays, text classification and sentiment analysis are considered among the most popular Natural Language Processing (NLP) tasks. Such techniques play a significant role in human activities and affect daily behavior. Articles in different fields, such as politics and business, represent different opinions according to the writer's tendency, and a huge amount of data is acquired through that differentiation, enabling the political orientation of an online article to be determined automatically. However, no corpus for political categorization has been directed toward this task in Arabic, due to the lack of rich, representative resources for training an Arabic text classifier. We therefore introduce the Political Arabic Articles Dataset (PAAD), textual data collected from newspapers, social networks, general forums, and an ideology website. The dataset comprises 206 articles distributed into three categories (Reform, Conservative, and Revolutionary) that we offer to the research community on Arabic computational linguistics. We anticipate that this dataset will be a great aid for a variety of NLP tasks on Modern Standard Arabic, particularly political text classification. We present the data in raw form and as Excel files of four types: V1 raw data, V2 preprocessing, V3 root stemming, and V4 light stemming.

Combining the Attribute Oriented Induction and Graph Visualization to Enhancement Association Rules Interpretation

December 2016


Association-rule mining is one of the important methods of data mining, but it yields a huge number of rules, forcing the analyst to spend considerable time searching through them to find the interesting ones. One solution to this problem is to combine an association-rule visualization method with a generalization method. The visualization method used here is graph-based; the generalization method is the Attribute Oriented Induction (AOI) algorithm. After combination, AOI is called Modified AOI because steps of the traditional AOI are removed or changed; the graph technique is likewise called the grouped-graph method because it displays the aggregated rules that result from AOI. The result is a compression ratio that gives clearer visualization and provides the ability to drill down into the rules or roll up to understand them.

Evaluation of Wavelet Transform Audio Hiding

December 2002


Audio hiding is a method for embedding information into an audio signal. It seeks to do so in a robust fashion, while not perceivably degrading the host signal (cover audio). Hiding data in audio signals presents a variety of challenges, due in part to the wider dynamic and differential range of the Human Auditory System (HAS) compared with other senses. Transforms are usually used for robust audio hiding (audio watermarking), but the hiding process is affected by the type of transform used. This paper therefore presents an evaluation of wavelet-transform hiding in comparison with hiding based on selected other transforms (the Walsh and cosine transforms). To generate the audio stego-cover, the method computes the (wavelet, Walsh, or cosine) transform of the audio cover, replaces some transformed cover coefficients with the coefficients of the secret audio message, and applies the inverse (wavelet, Walsh, or cosine) transform to the cover with the replaced coefficients. The extraction method computes the (wavelet, Walsh, or cosine) transform of the stego-cover and extracts the secret audio message.
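
The embed-by-coefficient-replacement scheme can be sketched with a one-level Haar wavelet, used here as an assumed, simplified stand-in for the transforms evaluated in the paper: transform the cover, overwrite the detail coefficients with the secret samples, and invert; extraction simply re-applies the forward transform.

```python
def haar_forward(x):
    """One-level Haar transform: per sample pair, the average (approximation)
    and the half-difference (detail)."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def embed(cover, secret):
    """Replace the detail coefficients with the secret samples.
    `secret` must have len(cover) // 2 samples."""
    approx, _ = haar_forward(cover)
    return haar_inverse(approx, list(secret))

def extract(stego):
    _, detail = haar_forward(stego)
    return detail
```

Replacing only the detail band keeps the coarse shape of the cover signal while the secret rides in the less perceptible components.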

REVIEW ON DETECTION OF RICE PLANT LEAVES DISEASES USING DATA AUGMENTATION AND TRANSFER LEARNING TECHNIQUES
  • Article
  • Full-text available

June 2023


The most important cereal crop in the world is rice (Oryza sativa). Over half of the world's population uses it as a staple food and energy source. Abiotic and biotic factors such as precipitation, soil fertility, temperature, pests, bacteria, and viruses impact the yield and quality of rice grain. Farmers spend a lot of time and money managing diseases, often relying on unaided visual inspection, which leads to unsanitary farming practices. The development of agricultural technology is greatly conducive to the automatic detection of pathogenic organisms in the leaves of rice plants. Several deep-learning algorithms are discussed, along with approaches to computer-vision problems such as image classification, object segmentation, and image analysis. The paper surveys methods for detecting, characterizing, and estimating diseases in a range of crops, and reviews two ways of increasing the number of images in a dataset: traditional augmentation methods and generative adversarial networks. The advantages demonstrated by prior deep-learning work in this field are also presented.


Compression image sharing using DCT- Wavelet transform and coding by Blackely method

June 2017


The increased use of computers and the internet has been accompanied by the wide use of multimedia information, and the requirement for protecting this information has risen dramatically. To prevent confidential information from being tampered with, cryptographic techniques must be applied. Most cryptographic strategies share one weak point: the information is centralized. To overcome this drawback, secret sharing was introduced: a technique to distribute a secret among a group of members such that every member owns a share of the secret, but only particular combinations of shares can reveal the secret, while individual shares reveal nothing. The major challenge facing image secret sharing is the shadow size: the complete size of the minimum set of shares needed for revealing is greater than the original secret file. The core of this work is therefore to use transform-coding strategies to make the share size as small as possible. In this paper, a compressive sharing system for images using transform coding and the Blakley method is introduced. An appropriate transform (the discrete cosine transform or wavelet transform) is applied to de-correlate the image samples; the output (compressed image data) is fed to a diffusion scheme that removes any statistical redundancy remaining in the compressed stream; and finally a (k, n) threshold secret-sharing scheme is applied, where n is the number of generated shares and k is the minimum number of shares needed for revealing. To ensure a high security level, each produced share is passed through a stream cipher with an individual encryption key belonging to the shareholder.

Method of interpreting arithmetic means values
Summarized of the Chebyshev Theory
One-Sample Statistics
Test Statistics
Test statistics a,b
Blended Learning Using Virtual Reality Environments

June 2017


Immersive virtual reality isn't just for gaming; it is poised to have a big impact on education as well, giving students an opportunity to interact with content in three-dimensional learning environments. Blended learning, according to the Innosight Institute, is "a formal education program in which a student learns at least in part through online delivery of content and instruction with some element of student control over time, place, path and/or pace". On the other hand, blended learning has disadvantages: learners with low motivation or bad study habits may fall behind, among others. There is therefore an essential need to improve and develop blended learning by using virtual-reality environments, removing these disadvantages and enriching face-to-face learning with features such as excitement and greater efficiency. It also reinforces the clarity of the scientific content of a lecture that a student may have missed, whether absent physically or only mentally, since the student can relive the atmosphere of the lecture and overcome the difficulties arising from blended or traditional learning. This approach is applied by building a specialized website application that allows using virtual-reality features in order to measure the effectiveness of this study on students; a questionnaire was then designed and its results gathered. It was found that most students were excited, active, and understood the lecture easily, with a high Likert-scale score (4.74), but they found difficulties in using the VR tools, which scored low (2.66).

Figure 2: The scheme of information exchange between BS establishments with distributed data storage
Figure 3: The scheme of information exchange between BS establishments with combined data storage
The Concept of Building a Model of the National Blood Information System

June 2017


The development of modern information technologies in medicine makes urgent the creation of national Information Systems (IS) for the joint activities of medical institutions, improving the quality of health services and management in the health sector. One component of healthcare is the Blood Service (BS) system. In this work, the concept of building a national system is considered using the IS of the BS as an example. The national IS of the BS aims to track relevant information on indicators of the quality of blood products through the information integration of BS establishments, making it possible to increase the level of infectious safety and the quality of transfusion care. Models for integrating the IS of the BS are offered at the conceptual level for organizing information exchange between BS establishments, and the structures of the integrated-system models are analyzed to select a rational national IS of the BS.

COLOR FEATURE WITH SPATIAL INFORMATION EXTRACTION METHODS FOR CBIR: A REVIEW

May 2019


In the last two decades, Content-Based Image Retrieval (CBIR) has been a topic of interest for researchers. It depends on analysis of the image's visual content, which is done by extracting color, texture, and shape features. Feature extraction is therefore one of the important steps in a CBIR system for representing the image completely. Color is the most widely used and most reliable of the visual features. This paper reviews different methods, namely the Local Color Histogram, Color Correlogram, Row Sum and Column Sum, and Color Coherence Vectors, used to extract color features while taking the spatial information of the image into consideration.
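
A Local Color Histogram, the first of the reviewed methods, can be sketched as follows: the image is divided into a grid of cells and a quantized RGB histogram is computed per cell, so the color statistics retain coarse spatial information that one global histogram loses. The bin count and grid size here are arbitrary illustrative choices.

```python
def local_color_histograms(pixels, width, height, bins=4, grid=2):
    """Quantized RGB histogram per grid cell.
    `pixels` is a row-major list of (r, g, b) tuples; `bins` should divide 256."""
    step = 256 // bins
    hists = [[0] * (bins ** 3) for _ in range(grid * grid)]
    for idx, (r, g, b) in enumerate(pixels):
        x, y = idx % width, idx // width
        cell = (y * grid // height) * grid + (x * grid // width)
        bucket = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hists[cell][bucket] += 1
    return hists
```

Retrieval then compares the per-cell histograms of a query image against those of the database images, so two images with the same colors in different regions no longer match perfectly.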

DESIGN AND IMPLEMENT CHAT PROGRAM USING TCP/IP

June 2018


LAN Chat Messenger, built over TCP/IP, offers reliable, secure, zero-cost communication among the staff members of a company, and also offers file transfer. It helps solve communication problems related to time and cost. The proposed protocol facilitates information exchange among individuals by providing many communication options. It is a standalone application written in Java and tested on the LANs of our institute (the college's lab networks).
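
The request-response core of such a LAN messenger can be sketched with a minimal TCP echo exchange over the loopback interface (in Python rather than the Java of the original, and handling a single client for brevity):

```python
import socket
import threading

def run_server(sock):
    """Echo each received chunk back to the sender (one client, for brevity)."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        while data:
            conn.sendall(data)            # echo the chat message back
            data = conn.recv(1024)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))             # port 0: the OS picks a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_server, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello over TCP\n")
reply = client.recv(1024)
client.close()
```

TCP's in-order, acknowledged delivery is what gives a chat application of this kind its reliability without extra application-level retransmission logic.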

A SUGGESTED SUPER SALSA STREAM CIPHER

December 2018


The Salsa20 cipher is faster than AES and offers strong security. Salsa8 and Salsa12 are specified for applications where the grade of security is less important than speed. The idea of this research is to suggest a Super Salsa keystream using matrices of various sizes (4×4, 4×8, 4×16) to increase the complexity of the keystream and make it more resistant to linear and differential attacks. Furthermore, in each iteration the diffusion of the generated keystream increases, because the volume acting as one element of the array is not fixed. The generated keys of the suggested Super Salsa keystream use simple operations and form a highly random keystream, passing the five benchmark tests. It likewise presents an equilibrium between complexity and speed for Salsa 8, 12, and 20.
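
The building block being varied here is the Salsa quarter-round; for reference, a direct transcription of Bernstein's specification (the proposed Super Salsa construction itself differs, as described above):

```python
def rotl32(v, n):
    """Rotate a 32-bit word left by n bits."""
    return ((v << n) | (v >> (32 - n))) & 0xFFFFFFFF

def quarterround(y0, y1, y2, y3):
    """The Salsa20 quarter-round: add, rotate, XOR on four 32-bit words."""
    y1 ^= rotl32((y0 + y3) & 0xFFFFFFFF, 7)
    y2 ^= rotl32((y1 + y0) & 0xFFFFFFFF, 9)
    y3 ^= rotl32((y2 + y1) & 0xFFFFFFFF, 13)
    y0 ^= rotl32((y3 + y2) & 0xFFFFFFFF, 18)
    return y0, y1, y2, y3
```

Salsa20's row and column rounds apply this quarter-round across a 4×4 word state; the 8-, 12-, and 20-round variants differ only in how many double rounds they run.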


Figure 1: Steps of RC4.
Figure 2: Steps embedding.
Figure 3: Steps of Extraction
The Results
Enhance the Hiding Image by Using Compression and Securing Techniques

June 2017


Information security is a hugely trending topic in recent years. Many techniques and algorithms have been designed and developed to secure information and networks across the world. Cryptography is one of the most common tools to provide such security. Steganography also provides good security by hiding data within a medium in a way that an attacker cannot sense the presence of the secret data. Compression does not normally imply any security; however, it scrambles the original encoding of the data and reduces its size by a measurable amount, which makes it well suited to hiding. In this paper a system is proposed in which a secret image is compressed before encryption and hiding. The JPEG algorithm is used for compression, the RC4 algorithm is used at the encryption stage due to its fast processing speed, and the LSB (Least Significant Bit) technique is then applied to hide the secret data within the cover image.
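
The final hiding step, LSB embedding, can be sketched over plain byte arrays; in real use it would be applied to the cover image's pixel bytes after JPEG compression and RC4 encryption of the secret:

```python
def embed_lsb(cover, payload):
    """Hide payload bytes in the least significant bits of cover bytes.
    The cover must supply at least 8 bytes per payload byte."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit    # overwrite only the lowest bit
    return bytes(stego)

def extract_lsb(stego, nbytes):
    """Read the low bits back out and reassemble the payload bytes."""
    out = bytearray()
    for b in range(nbytes):
        byte = 0
        for i in range(8):
            byte |= (stego[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)
```

Each cover byte changes by at most one unit, which is why LSB substitution is visually imperceptible in image data.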

A STUDY ON DATA STREAMING IN FOG COMPUTING ENVIRONMENT

May 2019


In recent years, data streaming has become more important day by day, considering the technologies employed to serve it and the number of terminals that interact with the system, directly or indirectly. Smart devices now play an active role in the data-streaming environment, alongside fog and cloud compatibility. Data collection is visibly affected by the new technologies and by the growing number of users of such systems; because of this growth in users and resources, systems have started moving computational power to the fog, at the network edge. Each streaming source is treated as a connected object, and these inter-connected objects are expected to produce ever more significant data streams at their own rates, in some cases to be analyzed in near real time. This paper surveys data-streaming technologies, clarifying the main notions behind big data stream concepts as well as fog computing. From the presented study, the industrial and research communities can learn the requirements for creating a fog-computing environment, with a clearer view of managing resources in the fog. The main objective of this paper is to provide a short brief on data streaming in the fog-computing environment and to explain the major research fields within this area.

Review of Recycling of E-DATA Through Green Computing

December 2017


This century has seen a progressive evolution in IT: new techniques, gadgets, and tools are being invented every day. This consumes energy and resources. The planet needs a friendly environment in which resource consumption is balanced and temperature is decreased, so one of the most important responsibilities of humans is to provide green industry in order to obtain a pure environment. This paper reviews a few vital works in the field of green computing that underscore its importance for sustainable development.


Figure (2): The convex hull with minimum ellipsoid around the obstacle's points
Figure (3): The shortest distance path planning problem with obstacle's points
ROBOTIC MOTION PLANNING USING CONVEX OPTIMIZATION METHODS

December 2019


Collision-avoidance techniques aim to steer the robot away from obstacles with minimal total travel distance. Most collision-avoidance algorithms can get stuck in a local minimum. A new technique avoids local minima in convex-optimization-based path planning: the obstacle-avoidance problem is treated as a convex optimization problem under system state and control constraints. The idea is to represent each obstacle as a convex set of points enclosed in a minimum-volume ellipsoid, to add the necessary offset distance, and to compute the modified motion path. In the analysis, the results demonstrate the effectiveness of the suggested motion planning using the convex optimization technique.
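
The first geometric step, collecting an obstacle's points into a convex set, can be illustrated with Andrew's monotone-chain convex hull. This is a sketch of the preprocessing only; fitting the minimum-volume ellipsoid and solving the constrained path problem require a convex-optimization solver.

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull of 2-D points, in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means a clockwise turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]        # endpoints shared between chains
```

Only the hull vertices matter for the enclosing ellipsoid, so interior obstacle points can be discarded before the optimization stage.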


virus Detection Using Cryptography Algorithm

June 1996


Many papers have been published about computer viruses: instructions that impact a computer system and, after a period of incubation and reproduction, activate and demonstrate their presence. Most viruses were designed to attack microcomputers, since microcomputers are widely used nowadays and have simple operating systems, resulting in a lack of quality in their security systems. Connecting computers in networks and using copies of programs from unreliable sources, such as bulletin board systems, increase the risk of viral contact and the spread of viruses. Data encryption disguises data flowing through a network so that it is unintelligible to anyone monitoring the data. Encryption techniques can also be used to detect file modification, whether caused by unauthorized users or by viruses. This paper concerns viruses attacking system files (.exe and .com) in microcomputer systems; virus types, how they work, and anti-virus strategies are discussed. Finally, a detection strategy depending on encryption techniques built into the operating system is suggested to improve PC security and to prevent unauthorized users from inserting into programs commands that would cause system corruption.
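
The modification-detection idea can be sketched with a cryptographic digest (SHA-256 here, a modern stand-in for the encryption-based check the paper proposes): record a fingerprint of the clean executable, then flag any change to its bytes.

```python
import hashlib

def fingerprint(data):
    """Record a fingerprint of an executable's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_modified(data, stored_fp):
    """Any change to the file (e.g. appended viral code) changes the digest."""
    return fingerprint(data) != stored_fp

clean = b"MZ\x90\x00...original program bytes..."   # illustrative stand-in
fp = fingerprint(clean)
infected = clean + b"VIRUS"                          # simulated infection
```

The same check works whether the file was altered by a virus appending code or by an unauthorized user editing it, which is the property the paper relies on.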

A Survey on Cybercrime Using Social Media

June 2023


There is growing interest in automating crime detection and prevention for large populations as a result of the increased usage of social media for victimization and criminal activities. This area is frequently researched due to its potential for enabling criminals to reach a large audience. While several studies have investigated specific crimes on social media, a comprehensive review paper that examines all types of social media crimes, their similarities, and detection methods is still lacking. The identification of similarities among crimes and detection methods can facilitate knowledge and data transfer across domains. The goal of this study is to collect a library of social media crimes and establish their connections using a crime taxonomy. The survey also identifies publicly accessible datasets and offers areas for additional study in this area.

Null Values Treatment in Distributed Databases

December 2002


There has been a great deal of discussion about null values in relational databases. The relational model was defined in 1969, and nulls were introduced in 1979. Unfortunately, there is no generally agreed solution to the null-values problem. Null is a special marker which stands for a value undefined or unknown, meaning that no entry has been made; a missing-value mark is not a value, is not of a data type, and cannot be treated as a value by the Database Management System (DBMS). Since distributed-database users work with more than a single database, and data is distributed among several data sources or sites with replication allowed, the data must be precise, complex problems appear, and perfect, practical, general approaches for the treatment of nulls are needed. A distributed database system, a hotel reservation control system, is designed based on different data sources at four sites, each site representing a hotel, with different application programming languages used for greater heterogeneity. Five practical approaches, with their rules and algorithms, are designed for null-value treatment across the distributed database sites.

Enhancing of DBSCAN based on Sampling and Densitybased Separation

December 2016


DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is one of the most attractive density-based clustering algorithms. It is characterized by its ability to detect clusters of various sizes and shapes in the presence of noise, but its performance degrades when data have different densities. In this paper, we propose a new technique to separate data based on density, together with a new sampling technique; the purpose of both is to obtain data with homogeneous density. Experimental results on synthetic and real-world data show that the new technique enhances DBSCAN's clustering to a large extent.
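
For reference, the baseline DBSCAN behavior the paper builds on, clusters of arbitrary shape plus noise labels, can be sketched in a few lines (a naive O(n²) version; `eps` and `min_pts` follow the usual definitions):

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # noise (may later become a border)
            continue
        labels[i] = cluster                # i is a core point: start a cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # border point reached from a core
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:         # expand only through core points
                queue.extend(nb)
        cluster += 1
    return labels
```

With one global `eps`, a density suited to a dense cluster dissolves a sparse one into noise, which is exactly the different-densities weakness the paper targets.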

EFFICIENT ROUTING PROTOCOL ALGORITHM FOR WIRELESS SENSOR NETWORKS

June 2018


Recently, different applications of wireless sensor networks (WSNs) in industrial fields, using different data-transfer protocols, have been developed. As the energy of sensor nodes is limited, prolonging network lifetime in WSNs is a significant concern. To improve network longevity, researchers have addressed energy consumption in WSN routing protocols with a modified Low Energy Adaptive Clustering Hierarchy (LEACH). This article presents an efficient transfer protocol for autonomic WSNs: an efficient routing scheme for wireless sensor networks, regarded as significant components of electronic devices. An optimal election probability for a node to become cluster head is presented. In addition, the article uses a Voronoi diagram, which decomposes the nodes into a zone around each node, in the management architecture for WSNs.
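
The cluster-head election mentioned above is commonly formulated, in LEACH and its variants, through a per-round threshold T(n) = P / (1 − P·(r mod 1/P)) for nodes that have not recently served as head. The sketch below is the textbook formula, not necessarily the article's modified election probability:

```python
def leach_threshold(p, r, was_head_recently):
    """LEACH cluster-head election threshold T(n).

    p: desired fraction of cluster heads per round
    r: current round number
    A node that has NOT been cluster head in the last 1/p rounds becomes head
    this round when a uniform random draw in [0, 1) falls below T(n).
    """
    if was_head_recently:
        return 0.0                         # ineligible until the cycle resets
    return p / (1 - p * (r % round(1 / p)))
```

The denominator shrinks as the cycle progresses, so nodes that have not yet served become certain to be elected by round 1/p − 1, rotating the energy-expensive head role evenly.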

Figure (1): The proposed authentication protocol architecture. 2.1 Device Identity Holder (DIH): The DIH is a smart card that needs to be integrated into the devices connected to IoT networks. Because it is a smart card, it includes the inherent security functions specific to smart cards, with several security attributes provided by the operating system and the chip hardware. The DIH includes the data required to access the device account. Ki and GIDN are generally stored in every DIH. The GIDN (Global IoT Device Number) usually consists of at most 15 digits assigned uniquely to each device worldwide. The individual device authentication key (Ki) is a 128-bit random number; it is the cryptographic origin of the session-key generator, providing these keys and authenticating the device with the network. Ki is strictly protected and stored in the device's DIH, and the DIH is itself protected. The secrecy of Ki and GIDN is responsible for the confidentiality and authentication of device data; anyone who discovers these numbers can impersonate a legitimate device. The (LA1 and RA1) IoT device algorithms are also implemented in every DIH. This lets the operator change and determine these algorithms independently from hardware manufacturers and other operators. Therefore, authentication still works when a device is roaming on other networks.
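The challenge-response role of Ki can be sketched as follows. HMAC-SHA256 stands in for the LA1 algorithm, whose internals are operator-defined, so this is an illustrative assumption rather than the paper's algorithm:

```python
import hashlib
import hmac
import secrets

def la1_response(ki: bytes, challenge: bytes) -> bytes:
    """Device-side signed response to the network's random challenge."""
    return hmac.new(ki, challenge, hashlib.sha256).digest()

def authenticate(ki_on_network: bytes, ki_in_dih: bytes) -> bool:
    """Network issues a fresh challenge; the device must prove it holds Ki."""
    challenge = secrets.token_bytes(16)
    expected = la1_response(ki_on_network, challenge)
    received = la1_response(ki_in_dih, challenge)
    return hmac.compare_digest(expected, received)

ki = secrets.token_bytes(16)   # the 128-bit individual authentication key
ok = authenticate(ki, ki)
```

Because Ki never leaves the DIH and only the signed response crosses the network, an eavesdropper on the challenge/response exchange learns nothing that allows impersonation.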
Figure (2): LA1 algorithm
LT10: A LIGHTWEIGHT PROPOSED ENCRYPTION ALGORITHM FOR IOT

June 2018

·

33 Reads

In this paper, the LT10 algorithm, which consists of four KASUMI elements, is proposed as a lightweight encryption algorithm. The proposed algorithm takes into account that IoT devices have limited computation abilities and that the smart-home and IoT-network information to be exchanged is sensitive. The key length is 128 bits and the block length is 128 bits.
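KASUMI's internals are not reproduced in the abstract, so the sketch below shows only the generic Feistel structure that KASUMI-based ciphers share, on a 128-bit block; the round function and subkeys are placeholders, not KASUMI's FO/FI functions or LT10's key schedule:

```python
MASK64 = 0xFFFFFFFFFFFFFFFF

def f(half: int, subkey: int) -> int:
    """Placeholder round function on a 64-bit half block (not KASUMI's)."""
    return ((half * 0x9E3779B97F4A7C15) ^ subkey) & MASK64

def feistel_encrypt(block: int, subkeys):
    """Encrypt a 128-bit block as two 64-bit halves over len(subkeys) rounds."""
    left, right = block >> 64, block & MASK64
    for k in subkeys:
        left, right = right, left ^ f(right, k)
    return (left << 64) | right

def feistel_decrypt(block: int, subkeys):
    """The Feistel structure makes decryption the reverse-key mirror of encryption."""
    left, right = block >> 64, block & MASK64
    for k in reversed(subkeys):
        left, right = right ^ f(left, k), left
    return (left << 64) | right

keys = [0x0123, 0x4567, 0x89AB, 0xCDEF]   # toy subkeys, not a real key schedule
pt = 0x00112233445566778899AABBCCDDEEFF
ct = feistel_encrypt(pt, keys)
```

The appeal of this structure for constrained IoT devices is that decryption reuses the same round function with the subkeys reversed, so encryption and decryption share almost all of their code.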

Figure(2) : Experimental set-up for technique [10]
Figure (7): Eye diagrams at a distance of 60 km. (a) Technique [9] (b) Technique [10] (c) Technique [11]
COMPARISON OF THREE DIFFERENT EOU TECHNIQUES FOR FIFTH-GENERATION MM-W WIRELESS NETWORKS

December 2019

·

25 Reads

Fifth-generation (5G) and millimeter-wave (MM-W) technologies hold tremendous promise to revolutionize education, healthcare, business, and agriculture. Nevertheless, the generation of MM-W in the electrical domain is infeasible due to the bandwidth limitation of electronic components and radio-frequency (RF) interference. The capability to generate MM-W in the optical domain allows transporting MM-W signals with low loss from the switching center to remote base stations. The present paper focuses on electro-optical up-conversion (EOU) techniques for optical generation and transmission of a 60-GHz MM-W signal. A comparative study is carried out between three different EOU techniques: frequency-quadrupling, frequency-sextupling, and frequency-octotupling. The study aims to show the strengths and weaknesses of the three EOU techniques and to evaluate each technique in terms of the electrical spurious suppression ratio (ESSR), as well as the influence of non-ideal phase shifting. The performance of the three EOU techniques after transmission over optical fiber is evaluated by an eye-pattern test. The simulation results confirm that frequency-quadrupling outperforms the frequency-sextupling and frequency-octotupling techniques.
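The three multiplication factors determine the electrical local-oscillator (LO) frequency needed to reach the 60-GHz carrier; the relationship follows directly from the factors themselves, with no assumptions about the modulator setups:

```python
# Required electrical drive frequency for each EOU technique targeting 60 GHz.
TARGET_GHZ = 60.0
factors = {"quadrupling": 4, "sextupling": 6, "octotupling": 8}

# A higher multiplication factor lets cheaper, lower-frequency electronics
# drive the modulator, at the cost of harder spurious-tone suppression.
lo_freqs = {name: TARGET_GHZ / n for name, n in factors.items()}
```

This is the basic trade-off the comparison explores: octotupling needs only a 7.5-GHz drive but, as the results show, quadrupling from 15 GHz gives the best ESSR and eye-diagram performance.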

Fig.2. the Operation of the Add Round Key [7]
Fig.3. An Example of the AES S-Box [4]
Fig.6. Key Schedule
Fig.16. encrypted sample1 histogram
FEATURE-BASED FACE DETECTION: A SURVEY

December 2018

·

110 Reads

All important information is exchanged between facilities using the internet and networks, and all these data should be kept secret and properly secured. The personal information held by each of these institutions increasingly needs to be organized secretly, raising the need for cryptography systems that can easily encrypt personal and critical data so it can be shared with other centers via the internet without concerns about privacy. Chaotic behavior has been added to different phases of AES, but very few works apply it to key generation; choosing the Chebyshev polynomial provides a chaotic map that leads to a random, strong key. Our system is based on a modified Advanced Encryption Standard (AES), with encryption and decryption in real time, taking into consideration the criticality of the image data being encrypted. The main encryption algorithm is unchanged; the modification replaces the key generation algorithm with a Chebyshev polynomial to generate a key of the required key size.
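The key-generation idea can be sketched with the Chebyshev chaotic map x_{n+1} = cos(k·arccos(x_n)), which is chaotic for k ≥ 2. The byte-quantisation rule below is an assumption, since the paper's exact extraction step is not given in the abstract:

```python
import math

def chebyshev_keystream(x0: float, k: int, n_bytes: int) -> bytes:
    """Iterate the Chebyshev map and quantise each state into one key byte."""
    x = x0
    out = bytearray()
    for _ in range(n_bytes):
        x = math.cos(k * math.acos(x))          # state stays in [-1, 1]
        out.append(int((x + 1) / 2 * 255) & 0xFF)  # map [-1, 1] onto a byte
    return bytes(out)

# A 128-bit AES key derived from a secret seed (x0, k).
key = chebyshev_keystream(x0=0.3, k=4, n_bytes=16)
```

The generator is deterministic for a given seed, so both parties derive the same key, while the map's sensitivity to the initial value makes the keystream hard to predict without the seed.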

FEATURE-BASED FACE DETECTION: A SURVEY

June 2018

·

19 Reads

Human and computer vision has a vital role in intelligent interaction with computers. Face recognition is a subject with a wide research area, and a great effort has been exerted in recent decades on face recognition, face detection, and face tracking; yet new algorithms for building fully automated systems are still required, and these algorithms should be robust and efficient. The first step of any face recognition system is face detection, whose goal is the extraction of the face region within an image, taking into consideration lighting, orientation, and pose variation; the more accurate this step is, the better the face recognition result will be. This paper introduces a survey of techniques and methods of feature-based face detection.

Figure (1): Wavelet Transform Decomposition tree
IMAGE HIDING ON HIGH FREQUENCY SPEECH COMPONENTS USING WAVELET PACKET TRANSFORM

June 2018

·

12 Reads

This paper proposes a method for security through hiding an image inside a speech signal by replacing the high-frequency components of the speech signal with the image data. The high-frequency speech components are separated and analyzed using the Wavelet Packet Transform (WPT), and the signal is then remixed to create a new speech signal with an embedded image. The algorithm is implemented in MATLAB 15 and is designed to achieve the best image hiding: the reconstruction rate was more than 94% while maintaining the same size of the speech signal, avoiding the need for a more powerful channel to handle the task. The best results were achieved with higher speech resolution (more bits per sample) and longer periods (more samples in the media file).
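A one-level Haar decomposition (a simplification of the paper's full Wavelet Packet Transform) is enough to sketch the replace-and-remix idea; the signal values and payload below are illustrative:

```python
def haar_forward(x):
    """Split an even-length signal into approximation and detail halves."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    """Rebuild the signal from its approximation and detail coefficients."""
    out = []
    for ai, di in zip(a, d):
        out.extend([ai + di, ai - di])
    return out

def hide(speech, payload):
    """Replace the leading detail (high-frequency) coefficients with the payload."""
    a, d = haar_forward(speech)
    assert len(payload) <= len(d)
    d = list(payload) + d[len(payload):]
    return haar_inverse(a, d)          # remix into a stego speech signal

def recover(stego, n):
    """Read the first n detail coefficients back out of the stego signal."""
    _, d = haar_forward(stego)
    return d[:n]

speech = [0.1, 0.2, 0.0, -0.1, 0.3, 0.25, -0.2, -0.15]
payload = [0.5, -0.5]                  # two "image" samples to embed
stego = hide(speech, payload)
```

Because the transform is invertible, the stego signal keeps the same length as the original, which matches the paper's goal of not enlarging the media file.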

HOSPITAL PHARMACY MANAGEMENT SYSTEM

December 2018

·

32,923 Reads

Generally, electronic technology has been implemented to automate traditional systems, and different management systems in different scopes have been presented. These systems include services provided to companies as well as people, such as healthcare. Traditional data management systems for pharmacies, for example, suffer from limited capacity, time consumption, medicine accessibility, managing the medicine store, and the need for qualified staff meeting the employer's expectations. In this paper, a hospital e-pharmacy system is proposed to facilitate the job and overcome the mentioned problems. A data management system for Iraqi hospital pharmacies is proposed, divided into two main parts: a database and Graphical User Interface (GUI) frames. The database, built using SQL Server, contains the pharmacy information related to medicines, patient information, etc.; the GUI frames ease the use of the proposed system by unskilled users. The proposed system is responsible for monitoring and controlling the work of the hospital pharmacy in terms of managing medicine issuing and ordering and hospital reports.


Fig 1: The circuit design. 1. Circuit Components (hardware part): The components of the circuit design are: 1. Arduino UNO, a microcontroller board based on the ATmega328. It has a total of 14 digital input/output pins, 6 of which can be used as PWM outputs.
Figure 2: The Second Circuit Design Using (IR).
Figure 3: The Block Diagram of the Circuit Design. Suppose Lane 1 gets its green light first; hence, in all the other lanes, the corresponding red lights are turned on. If the emergency case is in the third lane, the vehicle driver (ambulance) can send a message or signal using GSM or IR to the traffic controller. Lane 1 is turned off after its time is finished and its red light is turned on; Lane 3 then gets its green light, and the other lanes get red lights. In this way, the waiting time for the emergency case becomes less than the waiting time of the normal case. We can also reduce traffic at intersections using this proposal. For example, if Lane 2 has many vehicles and Lane 4 has few vehicles, the waiting time causes crowding in Lane 2; to solve this problem, the time of Lane 2 is increased and the time of Lane 4 is decreased. The working of the traffic light controller is shown in Figure 4 below:
Design and Implementation Smart Traffic Light

December 2018

·

13,511 Reads

The increase in the number of vehicles on streets has led to traffic congestion. This work is suggested in order to reduce the waiting time in emergency cases. It is divided into two parts, a hardware part and a software part. The first hardware circuit is a model consisting of a four-lane traffic light junction together with a GSM (Global System for Mobile Communications) module; the GSM module and the traffic-light lamps are connected to an Arduino UNO, which processes every signal coming from the input (GSM) in software and drives the outputs (lamps). The second hardware circuit is a model consisting of the same components as the first, except that the GSM module is replaced with an IR (infrared) remote. The goal of this work is to help in emergency cases: the opening and closing of the traffic light are controlled using the GSM system and IR, and the time of each lane is controlled, which reduces crowding.
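The preemption logic described above can be sketched as a small scheduling function. The lane numbers and green times are illustrative assumptions; the actual implementation is Arduino firmware driven by GSM/IR events:

```python
from collections import deque

def run_cycle(lanes, green_times, emergency=None):
    """Return (lane, seconds) pairs in the order lanes get green.

    The current lane finishes its green time; a lane flagged by a GSM/IR
    emergency message then jumps to the front of the remaining rotation.
    """
    order = deque(lanes)
    schedule = []
    current = order.popleft()
    schedule.append((current, green_times[current]))   # current green finishes
    if emergency is not None and emergency in order:
        order.remove(emergency)
        order.appendleft(emergency)                    # emergency lane goes next
    while order:
        lane = order.popleft()
        schedule.append((lane, green_times[lane]))
    return schedule

# Longer green for the crowded lane 2, shorter for the quiet lane 4.
times = {1: 30, 2: 45, 3: 30, 4: 15}
plan = run_cycle([1, 2, 3, 4], times, emergency=3)
```

With an emergency in lane 3, the rotation becomes 1, 3, 2, 4, so the ambulance waits only for the current green to finish rather than for the full cycle.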

WITH THE AGE OF OBJECT ORIENTED PROGRAMMING (OOP). HAVE WE PASSED MCCARTHY'S THEORY?
There are some recent arguments in the computer science community regarding Object Oriented Programming (OOP). These discussions focus on one of the major questions: are we starting a new era of the theory of computer science? In this short paper we try to answer the most significant question: with the use of OOP, have we passed McCarthy's theory? Are we starting a new theory of computer science? We will first look at McCarthy's theory and the principles of OOP, then finally attempt to answer the question and present our claims.

Fig.1 Hierarchical relations related to validated and verified requirements [16].
Fig 2: Continuous V&V process in software System Development Life Cycle [9][19].
VERIFICATION AND VALIDATION OF A SOFTWARE: A REVIEW OF THE LITERATURE

June 2020

·

4,533 Reads

With the development of the Internet, building software is often essential, yet succeeding in a software project is complicated, and there is a necessity to deliver software of top quality. This can be accomplished by using Verification and Validation (V&V) procedures throughout the development process. The main aim of V&V is to check whether the created software meets the needs and specifications of clients. V&V is considered a collection of testing and analysis activities across the software's full life cycle. Rapid developments in software V&V have been of high importance in producing approaches and tools for identifying possible concurrent bugs and thereby verifying the correctness of software, reflecting the efficiency of modern software V&V. The main aim of this study is a retrospective review of various researches in software V&V and a comparison between them. In the modern competitive software world, developers must deliver quality products on time, verifying that the software functions properly and validating the product against each of the client's requirements. The significance of V&V in software development lies in maintaining software quality. V&V approaches are utilized in all stages of the System Development Life Cycle. Furthermore, the presented study also provides the objectives of V&V and describes V&V tools that can be used in the software development process and the way they improve software quality.

CDW data compared with other domains [9]
Data, ETL Tool and Purpose Perspectives
CLINICAL DATA WAREHOUSE: A REVIEW

December 2018

·

139 Reads

Clinical decisions are crucial because they are related to human lives. Thus, managers and decision makers in the clinical environment seek new solutions that can support their decisions. A clinical data warehouse (CDW) is an important solution that is used to achieve clinical stakeholders’ goals by merging heterogeneous data sources in a central repository and using this repository to find answers related to the strategic clinical domain, thereby supporting clinical decisions. CDW implementation faces numerous obstacles, starting with the data sources and ending with the tools that view the clinical information. This paper presents a systematic overview of purpose of CDWs as well as the characteristics; requirements; data sources; extract, transform and load (ETL) process; security and privacy concerns; design approach; architecture; and challenges and difficulties related to implementing a successful CDW. PubMed and Google Scholar are used to find papers related to CDW. Among the total of 784 papers, only 42 are included in the literature review. These papers are classified based on five perspectives, namely methodology, data, system, ETL tool and purpose, to find insights related to aspects of CDW. This review can contribute answers to questions related to CDW and provide recommendations for implementing a successful CDW.


Requirements and Needs for Operating Electronic Computers

December 1980

·

2 Reads

Requirements and needs for operating electronic computers.

General Specifications of the Mechanized Education Management System

December 1982

·

1 Read

General specifications of the mechanized education management system.

The Method of Introducing the Electronic Computer into an Enterprise

June 1982

·

4 Reads

After all the preceding discussions held throughout the various meetings, it became clear that this unified code forms the first basis for launching unified work among the sister Arab countries to serve the Arabic language in all respects, and that it is open to improvement, provided that such improvement is a unified collective effort in the coming stages, along the lines of what has been done so far with respect to this auspicious code.

Using Electronic Computers in the Analysis of Radiographic Images

December 1981

In this research we present a new method for automatically identifying the internal organs of the human body in tomographic radiographic images taken by a CT scanner (CT-SCANNER), and we compare this method with other well-known methods.

Computer Graphics Applications in Representing and Processing Maps and Statistical Data on the Electronic Computer

December 1987

·

8 Reads

This research aims to design an interactive software package for processing geographic maps of Iraq and statistical data, with the aid of computer graphics routines, on a personal computer assembled locally under the name Al-Warkaa 6001 and using the BASIC language, as a model of useful graphics applications in education and learning, or in the fields of regional and urban planning, and so on.

Game Theory, Its Military Applications, and Reaching the Optimal Strategy Using the Electronic Computer

December 1980

·

16 Reads

Game theory has great importance in applied fields that involve competitive situations with multiple participating parties. Perhaps the most prominent of these fields are those representing situations of military conflict and warfare, which require a fast and precise solution to determine the optimal winning strategy. All the ready-made computer packages available in this field go no further than using linear programming as a method for treating these situations, and they were originally designed to handle matters with no direct connection to game theory.
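As the translated abstract notes, computer treatments usually reduce the game to linear programming; for a 2x2 zero-sum game without a saddle point, the optimal mixed strategy also has a closed form, sketched here with illustrative payoffs (not taken from the paper):

```python
def solve_2x2(a11, a12, a21, a22):
    """Row player's optimal mix (p for row 1) and the game value.

    Assumes the 2x2 zero-sum game has no saddle point, so the optimal
    strategy equalises the expected payoff against both columns:
        p*a11 + (1-p)*a21 == p*a12 + (1-p)*a22
    """
    denom = a11 - a12 - a21 + a22
    p = (a22 - a21) / denom
    value = (a11 * a22 - a12 * a21) / denom
    return p, value

# Illustrative payoff matrix (payoffs to the row player).
p, v = solve_2x2(3, -1, -2, 4)
```

Playing row 1 with probability 0.6 guarantees the row player an expected payoff of 1.0 whichever column the opponent chooses, which is exactly what the linear-programming formulation would also find.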

The Electronic Signals Channel

June 1981

The pressing need to provide Arabic-language sources on the electronic computer is one of the most important reasons that led me to take up this subject, the first part of which is provided here, hoping it will be the starting point for completing translation work of this kind in the near future. The subject at hand is an article translated from the chapter "Peripheral Devices" (Ch. 3) of the book by Ivan Flores. The article defines the electronic signals channel and its operation within the electronic computer, the units that supervise it, and the units to and from which it transfers information. The article is supported by simple illustrative figures that clearly convey its ideas and information. If this article is completed in all its parts, it can serve as good academic material for workers in the field of electronic computers who want deeper details about computer components, and it may serve beginning maintenance engineers, systems engineers, and programmers alike.

Fuzzy Logic: An Introduction to the Design of Artificial Intelligence Systems

June 1987

·

37 Reads

This research presents, in an introductory manner, the theory of fuzzy sets and their logic as an integrated system through which machines with superior artificial intelligence can be designed. The concept of the graded membership function is explained, and the basic fuzzy-logic operations are defined, along with the fuzzy implication connective and its relatives. We then move on to examining the principles of fuzzy inference with an example that uses fuzzy logic, and in conclusion we point to the method of transferring the meanings of linguistic terms into a fuzzy algorithm using linguistic approximation.
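The basic fuzzy-set operations the paper introduces (graded membership, union, intersection, complement) can be sketched as follows; the temperature sets and the sample input are illustrative assumptions:

```python
def triangular(a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

fuzzy_and = min                 # intersection of membership grades
fuzzy_or = max                  # union of membership grades

def fuzzy_not(m):               # complement of a membership grade
    return 1.0 - m

# Assumed linguistic temperature sets, in degrees Celsius.
warm = triangular(15, 25, 35)
hot = triangular(30, 40, 50)

x = 32
grade = fuzzy_or(warm(x), hot(x))   # membership of 32 C in "warm OR hot"
```

A temperature of 32 degrees belongs partially to both sets at once (grade 0.3 in "warm", 0.2 in "hot"), which is exactly the graded membership that distinguishes fuzzy sets from crisp ones.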