Identity and access management (IAM) systems, as information security systems, usually consist of a set of predefined tasks. The main task is authentication, since it is responsible for proving user identity to the service providers that cooperate with the IAM system. This paper provides a review of intelligent authentication research applicable to IAM systems. These studies are evaluated against a proposed set of key factors for intelligent authentication. Based on this evaluation, no research was found that implements an authentication scheme satisfying all of these key factors.
The evolution of technology has increased the rate of traffic accidents, which occur frequently and cause loss of life and property. Therefore, automatic traffic monitoring systems have gradually attracted the attention of researchers working to improve traffic safety through the field of intelligent transport systems. In this paper, a cost-effective approach to automatic traffic incident detection based on the GSM system is proposed. The paper provides a practical solution to reduce the death rate by using a vibration sensor together with the GSM system; the implementation is based on hardware (circuits) and software, with a graphical user interface (GUI) built in LabVIEW to process the data. Sensors are installed on each side of the vehicle. An SMS is sent to the user after an accident, so the system can assist in the search for and rescue of vehicles involved in accidents.
All the important information is exchanged between facilities using the internet and networks, so all of these data should be kept secret and properly secured. The personal information held by each of these institutions increasingly needs to be organized confidentially, which raises the need for cryptographic systems that can easily encrypt personal and critical data so it can be shared with other centers via the internet without concerns about privacy. Chaotic behavior has been added to different phases of AES, but very few works apply it to key generation; choosing the Chebyshev polynomial provides a chaotic map that leads to a strong random key. Our system is based on a modified Advanced Encryption Standard (AES), with encryption and decryption in real time, taking into consideration the criticality of the image data being encrypted. The main encryption algorithm is unchanged; the modification replaces the key generation algorithm with a Chebyshev polynomial that generates a key of the required size.
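The key-generation idea described above can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: the map degree, the seed, and the byte-extraction rule are all assumptions made here for demonstration.

```python
import math

def chebyshev_key(seed: float, degree: int, n_bytes: int) -> bytes:
    """Iterate the Chebyshev map x -> cos(degree * arccos(x)) and
    harvest one byte per iterate (illustrative extraction rule)."""
    x = seed  # seed must lie in [-1, 1]
    out = bytearray()
    for _ in range(n_bytes):
        x = math.cos(degree * math.acos(x))
        # map x from [-1, 1] to an 8-bit value
        out.append(int((x + 1.0) / 2.0 * 255.0) & 0xFF)
    return bytes(out)

# 16 bytes = the 128-bit key size required by AES-128
key = chebyshev_key(seed=0.3517, degree=4, n_bytes=16)
```

Because the map is chaotic, a tiny change in the seed yields a completely different key stream, which is the property the paper relies on for key strength.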
This paper describes a web content filtering system that aims to block offensive material by using distributed agents. The proposed system uses the FCM algorithm together with other page features (title, metadata, warning message) to classify candidate websites into two types: white, which is considered acceptable, and black, which contains harmful material, taking English pornographic websites as a case study.
There is a significant need to compress medical images for the purposes of communication and storage. Most currently available compression techniques produce an extremely high compression ratio at the cost of high quality loss. In medical applications, the diagnostically significant regions (regions of interest) should retain high image quality. Therefore, it is preferable to compress the regions of interest with lossless compression techniques, while the diagnostically less significant regions (non-interest regions) can be compressed with lossy techniques. In this paper, a hybrid technique combining Set Partitioning in Hierarchical Trees (SPIHT) and the Bat-inspired algorithm is used for lossless compression of the region of interest, while the non-interest region is lossily compressed with the Discrete Cosine Transform (DCT). The experimental results show that the proposed hybrid technique enhances compression performance and ratio, and that the use of the DCT increases compression performance with low computational complexity.
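The DCT step applied to the non-interest region can be illustrated with a naive 2-D DCT-II and its inverse. This is a textbook sketch for clarity (O(N^4), orthonormal form), not the paper's implementation; production codecs use fast fixed-size transforms.

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of an N x N block (orthonormal form)."""
    n = len(block)
    def c(k): return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    return [[c(u) * c(v) * sum(
                block[x][y]
                * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                for x in range(n) for y in range(n))
             for v in range(n)] for u in range(n)]

def idct2(coef):
    """Inverse 2-D DCT: exact reconstruction up to float error."""
    n = len(coef)
    def c(k): return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    return [[sum(c(u) * c(v) * coef[u][v]
                 * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                 * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                 for u in range(n) for v in range(n))
             for y in range(n)] for x in range(n)]
```

Lossy compression then comes from quantizing or discarding the high-frequency coefficients before the inverse transform; the transform itself is invertible.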
Internet-based programs and communication techniques have become widely used and respected in the IT industry recently. Internet applications and communications are a persistent source of "big data": data that is enormous in volume, diverse in type, and has a complicated multidimensional structure. Today, in an era of automatic, large-scale data collection, several measures are routinely taken with no assurance that any of them will be helpful in understanding the phenomenon of interest. Online transactions that involve buying, selling, or even investing are all examples of e-commerce, and they generate data with a complex structure and high dimensionality. The usual data storage techniques cannot handle such enormous volumes of data, and a great deal of work is being done to find ways to reduce the dimensionality of big data in order to provide more accurate analytics reports and more interesting data visualizations. The purpose of this survey is therefore to give an overview of big data analytics along with related problems and issues that go beyond technology.
In this paper, a secure chatting application with end-to-end encryption for smartphones running the Android OS is proposed. This is achieved by the use of public-key cryptography techniques. The proposed application uses the Elliptic Curve Diffie-Hellman (ECDH) key exchange algorithm to generate the key pairs and exchange them to produce the shared key that will be used for data encryption by symmetric algorithms. The application allows users to communicate via text messages, voice messages, and photos. For text message security, the standard AES algorithm with a 128-bit key is used: the generated 160-bit key is reduced to 128 bits by selecting its first 128 bits so it can be used by AES. For voice and image security, the application uses the symmetric RC4 algorithm.
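The truncation step described above (fitting a 160-bit ECDH output to an AES-128 key) can be sketched as follows; the `shared` value is a stand-in for a real ECDH shared secret, which the sketch does not compute.

```python
def truncate_key(shared_secret: bytes, out_bits: int = 128) -> bytes:
    """Take the first out_bits of the shared secret, as the paper does
    when reducing a 160-bit ECDH output to an AES-128 key."""
    assert out_bits % 8 == 0 and len(shared_secret) * 8 >= out_bits
    return shared_secret[: out_bits // 8]

shared = bytes(range(20))        # stand-in for a 160-bit ECDH shared secret
aes_key = truncate_key(shared)   # first 16 bytes -> AES-128 key
```

Note that common practice is to pass the shared secret through a key derivation function (e.g. HKDF) rather than truncating it directly; the sketch only mirrors the paper's stated approach.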
Nowadays, text classification and sentiment analysis are considered among the most popular Natural Language Processing (NLP) tasks. Such techniques play a significant role in human activities and affect daily behavior. Articles in different fields such as politics and business express different opinions according to the writer's tendency, and a huge amount of data can be acquired from that differentiation, enabling the political orientation of an online article to be managed automatically. However, no corpus for political categorization in Arabic has been directed toward this task, due to the lack of rich, representative resources for training an Arabic text classifier. Therefore, we introduce the Political Arabic Articles Dataset (PAAD), textual data collected from newspapers, social networks, general forums, and ideology websites. The dataset consists of 206 articles distributed into three categories (Reform, Conservative, and Revolutionary), which we offer to the research community on Arabic computational linguistics. We anticipate that this dataset will be a great aid for a variety of NLP tasks on Modern Standard Arabic, particularly political text classification. We present the data in raw form and as Excel files of four types: V1 raw data, V2 preprocessed, V3 root-stemmed, and V4 light-stemmed.
Association rule mining is one of the important methods of data mining, but it produces a huge number of rules. These huge rule sets force the analyst to spend considerable time searching through them for the interesting rules. One solution to this problem is to combine an association rule visualization method with a generalization method. The visualization method used here is graph-based; the generalization method is the Attribute-Oriented Induction (AOI) algorithm. After the combination, AOI is called Modified AOI because steps of the traditional AOI are removed or changed, and the graph technique is called the grouped graph method because it displays the aggregated rules that result from AOI. The results of this paper are compression ratios that give clearer visualization, providing the ability to drill down into the rules to test and examine them, or to roll up to understand them.
Audio hiding is a method for embedding information into an audio signal. It seeks to do so in a robust fashion, while not perceptibly degrading the host signal (cover audio). Hiding data in audio signals presents a variety of challenges, due in part to the wider dynamic and differential range of the Human Auditory System (HAS) as compared to other senses. Transforms are usually used for robust audio hiding (audio watermarking), but the hiding process is affected by the type of transform used. Therefore, this paper presents an evaluation of wavelet transform hiding in comparison with hiding based on selected other transforms (the Walsh and cosine transforms). To generate the audio stego-cover, the method computes the (Wavelet, Walsh, or Cosine) transform of the audio cover, replaces some transformed cover coefficients with secret audio message coefficients, and applies the inverse (Wavelet, Walsh, or Cosine) transform to the cover with the replaced coefficients. The extraction method computes the (Wavelet, Walsh, or Cosine) transform of the stego-cover and extracts the secret audio message.
The most important cereal crop in the world is rice (Oryza sativa). Over half of the world's population uses it as a staple food and energy source. Abiotic and biotic factors such as precipitation, soil fertility, temperature, pests, bacteria, and viruses, among others, impact the yield and quality of rice grain. Farmers spend a lot of time and money managing diseases, often relying on naked-eye inspection, which leads to poor farming practices. The development of agricultural technology greatly contributes to the automatic detection of pathogenic organisms in the leaves of rice plants. Several deep learning algorithms are discussed, along with approaches to computer vision problems such as image classification, object segmentation, and image analysis. The paper surveys many methods for detecting, characterizing, and estimating diseases in a range of crops. Methods of increasing the number of images in a dataset are also shown: first, traditional augmentation methods, and second, generative adversarial networks. Many advantages of the work done in the field of deep learning are demonstrated in this paper.
The increased use of computers and the internet has been accompanied by the wide use of multimedia information, and the requirement for protecting this information has risen dramatically. To prevent confidential information from being tampered with, one needs to apply cryptographic techniques. Most cryptographic strategies share one weak point: the information is centralized. To overcome this drawback, secret sharing was introduced. It is a technique for distributing a secret among a group of members such that every member owns a share of the secret, but only particular combinations of shares can reveal the secret, while individual shares reveal nothing about it. The major challenge facing image secret sharing is the shadow size: the complete size of the minimum set of shares needed for revealing is greater than the original secret file. The core of this work is therefore to use different transform coding strategies to obtain the smallest possible share size. In this paper, a compressive sharing system for images using transform coding and the Blakley method is introduced. An appropriate transform (discrete cosine transform or wavelet transform) is applied to de-correlate the image samples; the output (the compressed image data) is then fed to a diffusion scheme that removes any statistical redundancy or bits of important attributes remaining within the compressed stream; finally, a (k, n) threshold secret sharing scheme is applied, where n is the number of generated shares and k is the minimum number of shares needed for revealing. To ensure a high security level, each produced share is passed through a stream cipher with an individual encryption key belonging to the shareholder.
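The paper uses Blakley's geometric method; as a compact illustration of the same (k, n) threshold property, here is a sketch of Shamir's polynomial scheme over a prime field. The prime and the integer encoding of the secret are illustrative choices, not the paper's.

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for 16-byte secrets

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any k shares recover the secret exactly, while k - 1 shares reveal nothing, which is the threshold property the abstract describes.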
Immersive virtual reality isn’t just for gaming. It’s poised to have a big impact on education as well, giving students an
opportunity to interact with content in three-dimensional learning environments.
Blended learning, according to the Innosight Institute, is "a formal education program in which a student learns at least in part through online delivery of content and instruction, with some element of student control over time, place, path, and/or pace". On the other hand, blended learning has disadvantages: learners with low motivation or bad study habits may fall behind, among others. So there is an essential need to improve and develop blended learning by using virtual reality environments, eliminating these disadvantages and enriching face-to-face learning with features such as excitement to make it more efficient. Virtual reality also reinforces the clarity of the scientific content of a lecture that a student may have missed, whether absent physically or mentally, since the student can relive the atmosphere of the lecture and overcome the difficulties that result from blended or traditional learning. This approach is applied by building a specialized website application that allows the use of virtual reality in order to measure the effectiveness of this study on students; a questionnaire was then designed and its results were gathered. It was found that most students were excited and active and understood the lecture easily, with a high Likert scale score (4.74), but they found difficulty in using the VR tools, which received a low Likert scale score (2.66).
The development of modern information technologies in medicine makes timely the creation of national Information Systems (IS) for the joint activities of medical institutions, improving the quality of health services and management in the health sector. One of the components of healthcare is the Blood Service (BS) system. In this work, the concept of building a national system is considered using the IS of the BS as an example. The national IS of the BS aims to track relevant information on indicators of the quality of blood products through the information integration of BS establishments, making it possible to increase the level of infectious safety and the quality of transfusion care. Models for integrating the IS of the BS are offered at the conceptual level to organize information exchange between BS establishments, and the structures of the integrated system models are analyzed to select a rational national IS of the BS.
In the last two decades, Content-Based Image Retrieval (CBIR) has been a topic of interest for researchers. It depends on analysis of the image's visual content, which can be done by extracting color, texture, and shape features. Feature extraction is therefore one of the important steps in a CBIR system for representing the image completely. The color feature is the most widely used and most reliable among the visual features. This paper reviews different methods used to extract color features while taking the spatial information of the image into consideration, namely the Local Color Histogram, Color Correlogram, Row Sum and Column Sum, and Color Coherence Vectors.
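The basic building block behind several of these descriptors is a quantized color histogram; the Local Color Histogram applies it per image block rather than globally. A minimal sketch (the bin count and RGB pixel representation are illustrative assumptions):

```python
def color_histogram(pixels, bins_per_channel=4):
    """Quantized color histogram: each RGB channel is cut into
    `bins_per_channel` ranges, giving bins**3 buckets, then normalized.
    `pixels` is a list of (r, g, b) tuples with values in 0..255."""
    n_bins = bins_per_channel ** 3
    hist = [0.0] * n_bins
    step = 256 // bins_per_channel
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel ** 2
               + (g // step) * bins_per_channel
               + (b // step))
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]
```

For the Local Color Histogram, one would split the image into a grid of blocks, compute this histogram per block, and concatenate the results, which is what preserves the spatial information the review highlights.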
LAN Chat Messenger, based on TCP/IP, offers reliable, secure, zero-cost communication among the staff members of a company, and also offers file transfer. It helps solve communication problems related to time and cost. The proposed protocol facilitates information exchange among individuals by providing many communication options. It is a standalone application written in Java and tested on the LANs of our institute (the college's lab networks).
The Salsa20 cipher is faster than AES and offers superior security; Salsa8 and Salsa12 are specified for applications where the grade of security matters less than speed. The idea of this research is to suggest a Super Salsa keystream using matrices of various sizes (4×4, 4×8, and 4×16 arrays) to increase the complexity of the keystream and make it more resistant to linear and differential attacks. Furthermore, in each iteration, the diffusion of the generated keystream increases because the volume change acting on any one element of the array is not fixed. The generated keys of the suggested Super Salsa keystream rely on simple operations and form a highly random keystream, passing the five benchmark tests, while presenting a balance between complexity and speed for Salsa 8, 12, and 20.
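The "simple operations" underlying all Salsa variants are additions, XORs, and rotations, organized into the quarter-round below (taken from Bernstein's Salsa20 specification; the surrounding Super Salsa matrix changes are not sketched here).

```python
def rotl32(x, n):
    """32-bit left rotation."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def quarterround(y0, y1, y2, y3):
    """The Salsa20 quarter-round: add, rotate, XOR on 32-bit words."""
    y1 ^= rotl32((y0 + y3) & 0xFFFFFFFF, 7)
    y2 ^= rotl32((y1 + y0) & 0xFFFFFFFF, 9)
    y3 ^= rotl32((y2 + y1) & 0xFFFFFFFF, 13)
    y0 ^= rotl32((y3 + y2) & 0xFFFFFFFF, 18)
    return y0, y1, y2, y3
```

Salsa20/8, /12, and /20 differ only in how many double-rounds of these quarter-rounds are applied, which is exactly the complexity-versus-speed trade-off the abstract discusses.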
Information security is a hugely trending topic in recent years. Many techniques and algorithms have been designed and developed to achieve the security of information and networks across the world. Cryptography is one of the most common tools for providing such security. Steganography also provides good security, by hiding data within a medium in a way that prevents an attacker from sensing the presence of the secret data. Compression does not normally imply any security; however, it scrambles the original encoding of the data and reduces its size by a measurable amount, which makes it well suited to hiding. In this paper, a system is proposed in which a secret image is compressed before encryption and hiding. The JPEG algorithm is used for compression, while at the encryption stage the RC4 algorithm is used due to its fast processing speed. The LSB (Least Significant Bit) technique is then applied to hide the secret data within the cover image.
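The RC4 stage can be sketched directly, since the cipher is short and symmetric (the same call encrypts and decrypts). This is standard RC4, shown for illustration; note RC4 is considered weak by modern standards.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher: key-scheduling (KSA) then keystream
    generation (PRGA); encrypting twice with the same key decrypts."""
    # key-scheduling algorithm (KSA)
    s = list(range(256))
    j = 0
    for i in range(256):
        j = (j + s[i] + key[i % len(key)]) % 256
        s[i], s[j] = s[j], s[i]
    # pseudo-random generation algorithm (PRGA), XORed with the data
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + s[i]) % 256
        s[i], s[j] = s[j], s[i]
        out.append(byte ^ s[(s[i] + s[j]) % 256])
    return bytes(out)
```

Its speed comes from the fact that each output byte costs only a few additions, swaps, and one XOR, which is why the paper picks it for the encryption stage.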
In recent years, data streaming has become more important day by day, considering the technologies employed to serve it and the number of terminals within such systems that interact with them directly or indirectly. Smart devices now play an active role in the data streaming environment, along with fog and cloud compatibility. This affects data collection, and it appears clearly with the new technologies provided and the increase in the number of users of such systems. Because of the number of users and the resources available, systems have started to move computational power to the fog, that is, to the network edge. Systems that stream data are treated as inter-connected objects, and those objects are expected to produce ever more significant data streams, generated at unique rates and in some cases analyzed in near real time. In this paper, a survey of data streaming systems and technologies is introduced. It clarifies the main notions behind big data stream concepts as well as fog computing. From the presented study, the industrial and research communities can learn the requirements for creating a fog computing environment, with a clearer view of managing resources in the fog. The main objective of this paper is to provide a short brief on data streaming in fog computing environments and to explain the major research fields within this area.
This century has seen a progressive evolution in IT: new techniques, gadgets, and tools are being invented every day. This leads to the consumption of energy and resources. The planet needs a friendly environment in which resource consumption is balanced and temperature is decreased, so one of the most important responsibilities of humans is fostering green industry in order to obtain a clean environment. This paper reviews a few vital writings in the field of green computing that underscore the importance of green computing for sustainable development.
Collision avoidance techniques tend to steer the robot away from obstacles with minimal total travel distance. Most collision avoidance algorithms have trouble with getting stuck in a local minimum. A new technique is proposed to avoid local minima in convex optimization-based path planning. The obstacle avoidance problem is treated as a convex optimization problem under system state and control constraints. The idea is to consider each obstacle as a convex set of points enclosed in a minimum-volume ellipsoid; the addition of the necessary offset distance and the modified motion path are also presented. In the analysis, the results demonstrate the effectiveness of the suggested motion planning using the convex optimization technique.
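The enclosing-ellipsoid idea can be illustrated with a simplified stand-in. The true minimum-volume (Löwner-John) ellipsoid requires a convex solver; the sketch below instead builds a cruder axis-aligned ellipsoid that is still guaranteed to contain all obstacle points, plus the offset margin the paper mentions.

```python
def enclosing_ellipsoid(points, offset=0.0):
    """Axis-aligned ellipsoid containing all points:
    center = per-axis mean; semi-axis_i = sqrt(d) * max deviation on
    axis i (a simple guaranteed bound, NOT the minimum-volume
    ellipsoid). `offset` inflates every semi-axis as a safety margin."""
    d = len(points[0])
    center = [sum(p[i] for p in points) / len(points) for i in range(d)]
    semi = [max(max(abs(p[i] - center[i]) for p in points) * d ** 0.5,
                1e-9) + offset
            for i in range(d)]
    return center, semi

def inside(point, center, semi):
    """True when sum((x_i - c_i)^2 / a_i^2) <= 1."""
    return sum(((point[i] - center[i]) / semi[i]) ** 2
               for i in range(len(point))) <= 1.0 + 1e-9
```

The containment guarantee follows because each term of the sum is at most 1/d, so the total never exceeds 1; a solver-based minimum-volume ellipsoid would simply shrink this set while keeping the same convex constraint form.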
Many papers have been published about computer viruses: instructions that impact a computer system and, after a period of incubation and reproduction, activate and demonstrate their presence. Most viruses were designed to attack microcomputers, since microcomputers are widely used nowadays and have simple operating systems, which results in a lack of quality in their security systems. Connecting computers to networks and using copies of programs from unreliable sources, such as bulletin board systems, increase the risk of viral contact and the spread of viruses. Data encryption disguises data flowing through a network so that it is unintelligible to anyone monitoring the data; encryption techniques can also be used to detect file modification, which may be caused either by unauthorized users or by viruses. This paper concerns viruses attacking system files (.exe and .com) in microcomputer systems: virus types, how they work, and anti-virus strategies are discussed. Finally, a detection strategy depending on encryption techniques built into the operating system is suggested to improve PC security and prevent unauthorized users from inserting into programs commands that would cause system corruption.
There is growing interest in automating crime detection and prevention for large populations as a result of the increased usage of social media for victimization and criminal activities. This area is frequently researched due to its potential for enabling criminals to reach a large audience. While several studies have investigated specific crimes on social media, a comprehensive review paper that examines all types of social media crimes, their similarities, and detection methods is still lacking. The identification of similarities among crimes and detection methods can facilitate knowledge and data transfer across domains. The goal of this study is to collect a library of social media crimes and establish their connections using a crime taxonomy. The survey also identifies publicly accessible datasets and offers areas for additional study in this area.
There has been a great deal of discussion about null values in relational databases. The relational model was defined in 1969, and nulls were introduced in 1979. Unfortunately, there is no generally agreed solution to the null values problem. Null is a special marker that stands for a value undefined or unknown, meaning that no entry has been made; a missing-value mark is not a value, is not of a data type, and cannot be treated as a value by a Database Management System (DBMS). Since distributed database users work with more than a single database, and data is distributed among several data sources or sites where replication is allowed, the data must be precise; complex problems therefore appear, and there is a need for practical, general approaches for the treatment of nulls. A distributed database system, a hotel reservation control system, is designed based on different data sources at four sites, each site representing a hotel, using different application programming languages for greater heterogeneity. Five practical approaches, with their rules and algorithms, are designed for the treatment of null values across the distributed database sites.
DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is one of the most attractive density-based clustering algorithms. It is characterized by its ability to detect clusters of various sizes and shapes in the presence of noise, but its performance degrades when the data have different densities. In this paper, we propose a new technique to separate the data based on its density, together with a new sampling technique; the purpose of these new techniques is to obtain data with homogeneous density. The experimental results on synthetic and real-world data show that the new technique enhances the clustering of DBSCAN to a large extent.
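For reference, the baseline algorithm the paper improves on can be sketched in a few lines. This is plain DBSCAN with Euclidean distance (the density-separation and sampling steps proposed by the paper are not shown):

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps * eps]
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1            # provisionally noise
            continue
        cluster += 1                  # i is a core point: start a cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # noise reclaimed as a border point
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:  # j is also core: expand the cluster
                queue.extend(n for n in nbrs
                             if labels[n] is None or labels[n] == -1)
    return labels
```

The weakness the paper targets is visible here: a single global `eps` cannot fit regions of different density, which is why the proposed technique first separates the data into density-homogeneous groups.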
Recently, different applications of wireless sensor networks (WSNs) in industrial fields using different data transfer protocols have been developed. As the energy of sensor nodes is limited, prolonging network lifetime in WSNs is a significant concern. To improve network longevity, researchers have addressed energy consumption in WSN routing protocols by modifying the Low-Energy Adaptive Clustering Hierarchy (LEACH). This article presents an effective transfer protocol developed for autonomic WSNs: an efficient routing scheme for wireless sensor networks, regarded as significant components of electronic devices, is proposed, and an optimal election probability for a node to become cluster head is presented. In addition, this article uses a Voronoi diagram, which decomposes the nodes into a zone around each node; this diagram is used in the management architecture for WSNs.
In this paper, the LT10 algorithm, which consists of four KASUMI elements, is proposed as a lightweight encryption algorithm. The proposed algorithm takes into account that IoT devices have limited computational abilities and that the smart-home and IoT network information to be exchanged is sensitive. The key length is 128 bits and the block length is 128 bits.
Fifth-generation (5G) communications and millimeter-waves (MM-W) hold tremendous promise to provide opportunities to revolutionize education, healthcare, business, and agriculture. Nevertheless, the generation of MM-W in the electrical domain is infeasible due to the bandwidth limitation of electronic components and radio frequency (RF) interference. The capability to generate MM-W in the optical domain allows MM-W to be transported with low loss from a switching center to remote base stations. The present paper focuses on electro-optical up-conversion (EOU) techniques for the optical generation and transmission of a 60-GHz MM-W signal. A comparative study is carried out between three different EOU techniques: frequency-quadrupling, frequency-sextupling, and frequency-octotupling. The study aims at showing the strengths and weaknesses of the three techniques and evaluating each in terms of electrical spurious suppression ratio (ESSR), as well as the influence of non-ideal phase shifting. The performance of the three EOU techniques after transmission over optical fiber is evaluated by an eye pattern test. The simulation results confirm that frequency-quadrupling outperforms frequency-sextupling and frequency-octotupling.
Human and computer vision has a vital role in intelligent interaction with computers. Face recognition is one of the subjects with a wide research area, and a big effort has been exerted in recent decades on face recognition, face detection, and face tracking; yet new algorithms for building fully automated systems are still required, and these algorithms should be robust and efficient. The first step of any face recognition system is face detection, whose goal is the extraction of the face region within an image, taking into consideration lighting, orientation, and pose variation; the more accurate this step is, the better the face recognition results will be. This paper introduces a survey of techniques and methods for feature-based face detection.
This paper proposes a method for security through hiding an image inside a speech signal, by replacing the high-frequency components of the speech signal with the image data. The high-frequency speech components are separated and analyzed using the Wavelet Packet Transform (WPT), and the signal is then remixed to create a new speech signal with an embedded image. The algorithm is implemented in MATLAB 15 and is designed to achieve the best image hiding: the reconstruction rate was more than 94%, while maintaining the same size of the speech signal to avoid the need for a more powerful channel to handle the task. The best results were achieved with higher speech resolution (more bits per sample) and longer periods (more samples in the media file).
Generally, electronic technology has been used to automate traditional systems, and different kinds of management systems in different scopes have been presented. These systems include services provided to companies as well as to people, such as healthcare. Traditional data management systems for a pharmacy, as an example, suffer from limited capacity, time consumption, medicine accessibility, management of the medicine store, and the need for qualified staff meeting the employer's expectations. In this paper, a hospital e-pharmacy system is proposed in order to facilitate the job and overcome the mentioned problems. A data management system for an Iraqi hospital's pharmacy is proposed, divided into two main parts: the database and the Graphical User Interface (GUI) frames. The database, built using SQL Server, contains the pharmacy information related to the medicines, patient information, etc.; the GUI frames ease the use of the proposed system by unskilled users. The proposed system is responsible for monitoring and controlling the work of the hospital pharmacy in terms of medicine issuing, ordering, and hospital reports.
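The kind of database the abstract describes could be sketched as below. All table and column names are invented for illustration, and SQLite stands in for the SQL Server back end the paper actually uses.

```python
import sqlite3

# Illustrative schema for a hospital pharmacy database (names are assumptions).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE medicine (
    id       INTEGER PRIMARY KEY,
    name     TEXT NOT NULL,
    quantity INTEGER NOT NULL,
    expiry   TEXT
);
CREATE TABLE patient (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE issue (
    medicine_id INTEGER REFERENCES medicine(id),
    patient_id  INTEGER REFERENCES patient(id),
    amount      INTEGER NOT NULL,
    issued_on   TEXT
);
""")

def issue_medicine(cur, medicine_id, patient_id, amount, date):
    """Record a medicine issue and decrement stock, refusing to oversell."""
    (stock,) = cur.execute(
        "SELECT quantity FROM medicine WHERE id = ?", (medicine_id,)
    ).fetchone()
    if stock < amount:
        raise ValueError("insufficient stock")
    cur.execute("UPDATE medicine SET quantity = quantity - ? WHERE id = ?",
                (amount, medicine_id))
    cur.execute("INSERT INTO issue VALUES (?, ?, ?, ?)",
                (medicine_id, patient_id, amount, date))
```

The issue table then directly supports the reporting features the abstract mentions (issuing, ordering, and hospital reports) via simple aggregate queries.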
The increase in the number of vehicles on streets has led to traffic congestion. This work is suggested in order to reduce the waiting time in cases of emergency. The work is divided into two parts, a hardware part and a software part. The first circuit is a model of a four-lane traffic-light junction that also has a GSM (Global System for Mobile Communications) module. The GSM module and the lamps of the traffic light are connected to an Arduino UNO. The Arduino passes every signal coming from the input (GSM) to the software and displays it on the outputs (lamps). The second circuit is a model consisting of the same components as the first circuit, except that the GSM module is replaced with an IR (infrared) remote. The goal of this work is to help in emergency cases: the opening and closing of the traffic light are controlled using the GSM system and IR, and the timing of each lane is controlled, which reduces crowding.
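The control logic described above, a normal rotation of green lanes that an emergency request can preempt, can be sketched as a small state machine. This is an illustration of the idea only, not the Arduino firmware; the lane numbering and the single-request behaviour are assumptions.

```python
class TrafficController:
    """Round-robin traffic-light rotation with emergency preemption,
    mirroring the GSM/IR trigger described in the abstract."""

    def __init__(self, lanes=4):
        self.lanes = lanes
        self.current = 0       # index of the lane whose light is green
        self.emergency = None  # lane requested via GSM SMS or IR, if any

    def request_emergency(self, lane):
        """Simulates an incoming SMS (GSM) or IR command asking for priority."""
        self.emergency = lane

    def step(self):
        """Advance one timing cycle; an emergency request preempts the rotation."""
        if self.emergency is not None:
            self.current = self.emergency
            self.emergency = None
        else:
            self.current = (self.current + 1) % self.lanes
        return self.current
```

On the real hardware, `step()` would correspond to the Arduino switching the lamp outputs at the end of each lane's timing interval.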
There are some recent arguments in the computer science community regarding Object-Oriented Programming (OOP). The discussion focuses on one major question: with the use of OOP, have we moved past McCarthy's theory, and are we starting a new theory of computer science? In this short paper we try to answer this significant question. We first look at McCarthy's theory and the principles of OOP, and finally we attempt to answer the question and present our claims.
With the development of the Internet, making software is often essential, yet succeeding in a project's development is complicated, and there is a necessity to deliver software of top quality. This can be accomplished by using the procedures of Verification and Validation (V&V) throughout the development process. The main aim of V&V is checking whether the created software meets the needs and specifications of clients. V&V is considered a collection of testing and analysis activities across the software's full life cycle. Quick developments in software V&V have been of high importance in developing approaches and tools for identifying possible concurrent bugs and thereby verifying the correctness of software, reflecting the efficiency of modern software V&V. The main aim of this study is a retrospective review of various researches in software V&V and a comparison between them.
In the modern competitive world of software, developers must deliver on-time quality products, verifying that the software functions properly and validating the product against each of the client's requirements. The significance of V&V in software development lies in maintaining software quality. V&V approaches are utilized in all stages of the System Development Life Cycle. Furthermore, the presented study provides the objectives of V&V and describes V&V tools that can be used in the software development process and the way they improve software quality.
Clinical decisions are crucial because they are related to human lives. Thus, managers and decision makers in
the clinical environment seek new solutions that can support their decisions. A clinical data warehouse (CDW) is an
important solution that is used to achieve clinical stakeholders’ goals by merging heterogeneous data sources in a central
repository and using this repository to find answers related to the strategic clinical domain, thereby supporting clinical
decisions. CDW implementation faces numerous obstacles, starting with the data sources and ending with the tools that
view the clinical information. This paper presents a systematic overview of the purpose of CDWs as well as their characteristics;
requirements; data sources; extract, transform and load (ETL) process; security and privacy concerns; design approach;
architecture; and challenges and difficulties related to implementing a successful CDW. PubMed and Google Scholar
are used to find papers related to CDW. Among the total of 784 papers, only 42 are included in the literature review. These
papers are classified based on five perspectives, namely methodology, data, system, ETL tool and purpose, to find
insights related to aspects of CDW. This review can contribute answers to questions related to CDW and provide
recommendations for implementing a successful CDW.
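The extract, transform and load (ETL) process at the heart of the CDW pipeline reviewed above can be sketched in miniature. The source records, field names, and normalization rules here are invented purely for illustration; a real CDW would pull from clinical systems such as an EHR or a laboratory database.

```python
# Minimal ETL sketch for a clinical data warehouse pipeline.

def extract(sources):
    """Gather raw records from heterogeneous source systems."""
    for source in sources:
        yield from source

def transform(record):
    """Normalize differing field names into one warehouse schema."""
    return {
        "patient_id": record.get("pid") or record.get("patient"),
        "lab_value": float(record.get("value", 0)),
    }

def load(records, warehouse):
    """Append normalized records to the central repository."""
    for r in records:
        warehouse.append(r)

# Two toy sources with incompatible field names, as heterogeneous systems have.
sources = [
    [{"pid": 1, "value": "7.2"}],      # e.g. a lab-system export
    [{"patient": 2, "value": "5.9"}],  # e.g. an EHR extract
]
warehouse = []
load((transform(r) for r in extract(sources)), warehouse)
```

The point of the sketch is the review's central observation: most CDW implementation effort sits in the transform step, where heterogeneous schemas are reconciled before loading.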
After all of the foregoing discussions held throughout the various meetings, it became clear that this unified code forms the first basis for launching joint work among the sister Arab countries to serve the Arabic language in all respects, and that it is open to improvement, provided that such improvement is a unified collective effort in the coming stages, along the lines of what has been done so far with respect to this auspicious code.
In this research we present a new method for automatically identifying the internal organs of the human body in tomographic radiographic images acquired with a CT scanner (CT-SCANNER), and we compare this method with other known methods.
This research aims to design an interactive software package for processing geographic maps of Iraq and statistical data, with the aid of computer-graphics routines, on a personal computer assembled in the country under the name Al-Warkaa 6001, using the BASIC N69m language, as a model of constrained graphics applications in education and learning or in the fields of regional and urban planning, among others.
Game theory is of great importance in applied fields involving competitive situations in which multiple parties take part. Perhaps the most prominent of these fields are those involving situations of military conflict, which require a fast and accurate solution to determine the optimal winning strategy. All the ready-made computer systems available in this field go no further than using linear programming as a method for handling such situations, and they were originally designed to handle matters not directly related to game theory.
The urgent need to provide Arabic-language sources on electronic computers is one of the most important reasons that led me to take up this topic, the first part of which is now available, hoping that it will be a starting point for accomplishing this kind of translation in the near future. (a) The topic at hand is an article translated from the book CH3 Peripheral Devices by Ivan Flores. The article covers the definition of the electronic signal channel, its operation within the computer, the units that supervise it, and the units to and from which it transfers information. The article is supported by simplified illustrative figures that clearly explain the ideas and information it contains. If completed in all its parts, this article could serve as good academic material for workers in the field of electronic computers who wish to obtain deeper details about computer components; it may serve beginning maintenance engineers, systems engineers, and programmers alike.
This paper presents, in an introductory way, fuzzy set theory and its logic as an integrated system through which machines with superior artificial intelligence can be designed. The concept of the graded membership function is explained, and the basic fuzzy-logic operations are defined, along with the fuzzy implication connective and its relatives. We then examine the principles of fuzzy inference through the example of a controller that uses fuzzy logic. Finally, we point to the method of transferring the meaning of linguistic terms into a fuzzy algorithm using the linguistic-approximation method.
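The basic notions the paper introduces, a graded membership function and the fuzzy counterparts of AND, OR, and NOT, can be written down in a few lines. This is a generic textbook formulation (min/max operators over a triangular membership function), not the paper's own system, and the parameter names are illustrative.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x = b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge

# Classical fuzzy-logic connectives (Zadeh operators).
def fuzzy_and(u, v):
    return min(u, v)

def fuzzy_or(u, v):
    return max(u, v)

def fuzzy_not(u):
    return 1.0 - u
```

A fuzzy controller of the kind mentioned in the abstract evaluates such membership degrees for each linguistic term (e.g. "warm", "fast"), combines them with these connectives, and defuzzifies the result into a crisp control action.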