International Journal of Computational Intelligence Systems

Published by Springer Nature

Online ISSN: 1875-6883

·

Print ISSN: 1875-6891

Articles


Fig. 1: Intelligent control model of coke oven heating.
Intelligent control of coke oven
  • Conference Paper

February 2010

·

2,967 Reads

Guozhang Jiang

·

Tingting He

·

Jianyi Kong
A coke oven is a complex plant characterized by large time delay, strong nonlinearity, multivariable coupling and changeable parameters. Because the longitudinal temperature is affected by many factors, a control principle combining intermittent heating control with heating gas flow adjustment was adopted. Intelligent control methods, namely fuzzy control and neural networks, were used to establish an intelligent control strategy and model for the coke oven, which combines two feedback controls, one feedforward control and intelligent control. The initial gas flow is given by a heat-supply feedforward model derived from the coking mechanism, and a carbonization index feedback model is introduced to revise the target temperature and control the coking schedule of the coke oven. A flue temperature soft-measurement model based on linear regression and a neural network was built to provide temperature feedback control. A fuzzy controller was designed according to manual operating experience and actual conditions. Intelligent control methods were used to adjust the stopping-heating time and the heating gas flow. Practical running results indicate that the system achieves intelligent heating control of the coke oven, stabilizes production, effectively improves coke quality, decreases energy consumption, and has great practical value.
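
The combination of a feedforward gas-flow setting with fuzzy feedback correction can be illustrated with a small sketch. The membership functions, rule table and numeric ranges below are illustrative assumptions, not the authors' actual controller design.

```python
# Minimal sketch of a Mamdani-style fuzzy correction of heating gas flow,
# in the spirit of the controller described above. The universes, membership
# functions and rule table are illustrative assumptions, not the authors'
# actual design.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for temperature error (target minus measured flue temperature, deg C)
ERROR_SETS = {"NEG": (-60, -30, 0), "ZERO": (-30, 0, 30), "POS": (0, 30, 60)}
# Linguistic terms for the error change rate (deg C per control cycle)
DERROR_SETS = {"NEG": (-20, -10, 0), "ZERO": (-10, 0, 10), "POS": (0, 10, 20)}

# Rule table: (error term, change term) -> gas-flow correction (crisp consequents, m^3/h)
RULES = {
    ("NEG", "NEG"): -80, ("NEG", "ZERO"): -50, ("NEG", "POS"): -20,
    ("ZERO", "NEG"): -20, ("ZERO", "ZERO"): 0, ("ZERO", "POS"): 20,
    ("POS", "NEG"): 20, ("POS", "ZERO"): 50, ("POS", "POS"): 80,
}

def gas_flow_correction(error, derror):
    """Weighted-average (Sugeno-style) defuzzification of the rule table."""
    num, den = 0.0, 0.0
    for (e_term, d_term), delta in RULES.items():
        w = min(tri(error, *ERROR_SETS[e_term]), tri(derror, *DERROR_SETS[d_term]))
        num += w * delta
        den += w
    return num / den if den > 0 else 0.0

# Example: flue temperature 25 deg C below target and still drifting slightly.
print(gas_flow_correction(error=25.0, derror=5.0))
```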

A Resource Search Model Based on Semantically Enabled P2P Grid
This paper presents a four-layered resource search model with good expansibility, autonomy, modularity and loose coupling. Based on this model, a bootstrap mechanism built on a distributed hash table (DHT) is added to the P2P Grid, and a Peer Service is implemented in every node. The semantic-based resource management architecture and RPC protocols are discussed, and a scalable vertical search engine prototype with data extraction functionality is realized. This model could be used in P2P Grid systems and heterogeneous P2P systems as a reference for resource searching, sharing and collaboration.
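
As a rough illustration of the DHT-based bootstrap idea, the sketch below maps nodes and resource keys onto a hash ring and returns the node responsible for a key; the node names and hashing scheme are assumptions for illustration, not the paper's protocol.

```python
# A minimal sketch of DHT-style key lookup on a hash ring, of the kind a
# bootstrap layer in a P2P Grid could use. Node names and the hash scheme are
# illustrative assumptions.
import hashlib
from bisect import bisect

def ring_hash(value: str) -> int:
    return int(hashlib.sha1(value.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def lookup(self, resource_key: str) -> str:
        """Return the node responsible for a resource key (clockwise successor)."""
        h = ring_hash(resource_key)
        idx = bisect([point for point, _ in self.ring], h) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["peer-a", "peer-b", "peer-c", "peer-d"])
print(ring.lookup("dataset/climate-2007"))  # node that should index this resource
```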

New Multiobjective PSO Algorithm for Nonlinear Constrained Programming Problems

January 2008

·

19 Reads

A new approach is presented to solve nonlinear constrained programming problems (NLCPs) using particle swarm optimization (PSO). It neither uses penalty functions nor distinguishes between feasible and infeasible solutions in the swarm. The new technique treats the NLCP as a bi-objective optimization problem: one objective is the original objective of the NLCP, and the other is the degree of constraint violation. Since we prefer to keep a certain ratio of infeasible solutions so as to increase the diversity of the swarm and avoid the defect of conventional over-penalization, a new fitness function is designed based on the second objective. To help the PSO escape from local optima more easily, we also design an adaptively and dynamically changing inertia weight. Numerical experiments show that the algorithm is effective.
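
A minimal sketch of this kind of bi-objective handling is given below: particles are ranked first by their total constraint violation and then by the original objective, and the inertia weight decreases over the iterations. The toy problem, the ranking rule and the inertia schedule are assumptions standing in for the paper's exact fitness design.

```python
# Minimal PSO sketch treating a constrained problem as bi-objective:
# particles are compared first on constraint violation, then on the objective,
# so no penalty function is needed. The test problem, comparison rule and
# inertia-weight schedule are illustrative assumptions.
import random

def objective(x):                       # minimize f(x) = x0^2 + x1^2
    return x[0] ** 2 + x[1] ** 2

def violation(x):                       # constraint g(x): x0 + x1 >= 1
    return max(0.0, 1.0 - x[0] - x[1])

def rank_key(x, eps=1e-6):
    """Bi-objective ranking: first by violation degree, then by objective value."""
    v = violation(x)
    return (v if v > eps else 0.0, objective(x))

dim, swarm_size, iters = 2, 30, 200
swarm = [[random.uniform(-2.0, 2.0) for _ in range(dim)] for _ in range(swarm_size)]
velocity = [[0.0] * dim for _ in range(swarm_size)]
pbest = [p[:] for p in swarm]
gbest = min(pbest, key=rank_key)

for t in range(iters):
    w = 0.9 - 0.5 * t / iters                      # inertia weight decreases over time
    for i, p in enumerate(swarm):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            velocity[i][d] = (w * velocity[i][d]
                              + 2.0 * r1 * (pbest[i][d] - p[d])
                              + 2.0 * r2 * (gbest[d] - p[d]))
            p[d] += velocity[i][d]
        if rank_key(p) < rank_key(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=rank_key)

print("best:", gbest, "f =", objective(gbest), "violation =", violation(gbest))
```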

Similarity Measuring Approach For Engineering Materials Selection

January 2013

·

209 Reads

Advanced engineering materials design involves the exploration of massive multidimensional feature spaces and the correlation of materials properties with processing parameters derived from disparate sources. The search for alternative materials or processing-property strategies, whether through analytical, experimental or simulation approaches, has been a slow and arduous task, punctuated by infrequent and often unexpected discoveries. Few systematic efforts have been made to analyze trends in the data as a basis for classification and prediction. This is largely due to the lack of large amounts of organized data and, more importantly, the challenge of sifting through them in a timely and efficient manner. The application of recent advances in data mining to materials informatics represents the state of the art of computational and experimental approaches for materials discovery. In this paper, a similarity-based engineering materials selection model is proposed and implemented to select engineering materials subject to composite material constraints. The results retrieved from this model are suitable for effective decision making in advanced engineering materials design applications.
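
A toy sketch of similarity-based materials selection is shown below: candidate materials and a target property profile are min-max normalized and ranked by cosine similarity. The property names, values and normalization choice are illustrative assumptions, not data or the exact measure from the paper.

```python
# A small sketch of similarity-based ranking of candidate materials against a
# target property profile. The property names, values and the min-max
# normalization are illustrative assumptions, not data from the paper.
import math

materials = {
    "Al-6061":       {"density": 2.70, "tensile_MPa": 310, "cost_idx": 3.0},
    "Ti-6Al-4V":     {"density": 4.43, "tensile_MPa": 950, "cost_idx": 9.0},
    "CFRP-laminate": {"density": 1.60, "tensile_MPa": 600, "cost_idx": 8.0},
    "Steel-4340":    {"density": 7.85, "tensile_MPa": 745, "cost_idx": 2.5},
}
target = {"density": 1.8, "tensile_MPa": 650, "cost_idx": 5.0}   # design constraints

props = list(target)
lo = {p: min(m[p] for m in materials.values()) for p in props}
hi = {p: max(m[p] for m in materials.values()) for p in props}

def normalize(rec):
    return [(rec[p] - lo[p]) / (hi[p] - lo[p]) for p in props]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

t = normalize(target)
ranking = sorted(materials, key=lambda name: cosine(normalize(materials[name]), t), reverse=True)
print(ranking)  # candidates ordered by similarity to the target profile
```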

Cryptographic Secrecy Analysis of Matrix Embedding

January 2010

·

46 Reads

Matrix embedding has been used to improve the embedding efficiency of steganography and is an effective way to enhance concealment security. The privacy security of matrix embedding has also been studied under the known-stego-object attack condition. However, with the development of steganalysis, an attacker can obtain an estimated cover through cover-restoration techniques; consequently, security under this stronger attack condition should be considered. In this paper we study the secrecy security of matrix embedding using information theory under the known-cover attack condition, from the viewpoint of key equivocation. The relation among the wet ratio of covers, the embedding rate and the key equivocation is presented. We also propose a new differential attack on matrix embedding under the chosen-stego attack condition.
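
For readers unfamiliar with matrix embedding itself, the sketch below shows the classic Hamming-code construction, in which three message bits are embedded into seven cover bits by changing at most one of them; the paper's key-equivocation analysis is not reproduced here.

```python
# A minimal sketch of matrix embedding via the binary Hamming code (the classic
# scheme embedding 3 message bits into 7 cover bits with at most one change).
# This illustrates the embedding-efficiency idea the paper analyses; the
# secrecy analysis itself is not reproduced.

# Parity-check matrix H of the [7,4] Hamming code; column i is the binary form of i+1.
H = [[(i + 1) >> b & 1 for i in range(7)] for b in range(3)]

def syndrome(bits):
    return [sum(h * x for h, x in zip(row, bits)) % 2 for row in H]

def embed(cover, message):
    """Modify at most one of the 7 cover bits so the stego syndrome equals the message."""
    s = [a ^ b for a, b in zip(syndrome(cover), message)]
    stego = cover[:]
    if any(s):
        pos = sum(bit << b for b, bit in enumerate(s)) - 1   # column index matching s
        stego[pos] ^= 1
    return stego

def extract(stego):
    return syndrome(stego)

cover = [1, 0, 1, 1, 0, 0, 1]
message = [1, 0, 1]
stego = embed(cover, message)
assert extract(stego) == message
print("cover:", cover, "stego:", stego,
      "changed bits:", sum(c != s for c, s in zip(cover, stego)))
```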

Text Categorization Based on Topic Model

May 2008

·

48 Reads

In the text literature, many topic models have been proposed to represent documents and words by topics or latent topics in order to process text effectively and accurately. In this paper, we propose LDACLM, a Latent Dirichlet Allocation Category Language Model for text categorization, and estimate the model parameters by variational inference. As a variant of the Latent Dirichlet Allocation model, LDACLM regards the documents of a category as a language model and uses variational parameters to estimate the maximum a posteriori probability of terms. Experiments show the LDACLM model to be effective for text categorization, outperforming the standard Naive Bayes and Rocchio methods.
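
The sketch below illustrates the general topic-model-based categorization idea (latent topic mixtures used as document features for a classifier) with scikit-learn's standard LDA; it is not the LDACLM model itself, and the toy corpus and library choice are assumptions.

```python
# A generic topic-model-based text categorization pipeline (LDA topic features
# feeding a linear classifier). This is not the LDACLM model of the paper, only
# an illustration of using latent topics as document representations.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

docs = [
    "the striker scored a goal in the final match",
    "the team won the cup after extra time",
    "the referee booked the defender for a foul",
    "the central bank raised interest rates again",
    "markets fell as inflation data surprised investors",
    "the company reported quarterly earnings growth",
]
labels = ["sport", "sport", "sport", "finance", "finance", "finance"]

counts = CountVectorizer(stop_words="english").fit_transform(docs)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)
clf = LogisticRegression().fit(topics, labels)   # classify documents by topic mixture
print(clf.predict(topics))                        # predictions on the training documents
```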

Transitive Closure of Interval-valued Fuzzy Relations
In this paper we define interval-valued fuzzy relations. The reflexive, symmetric and T-transitive properties of interval-valued relations are defined, as well as the transitive closure of an interval-valued relation. Finally, we propose an algorithm to compute the transitive closure. Some examples are given and some properties are studied.
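
A generic way to compute such a closure is repeated composition and union until the relation stops changing; the sketch below does this for the min t-norm, applying the operations end-point-wise to the interval bounds. The sample relation and the choice of t-norm are assumptions, not the paper's algorithm.

```python
# A small sketch of computing the min-max transitive closure of an
# interval-valued fuzzy relation: each entry is an interval (l, u), the t-norm
# is the minimum applied end-point-wise, and R is composed with itself until it
# stops changing. The 3x3 relation below is an illustrative assumption.

def compose(R, S):
    """Max-min composition of two interval-valued relations."""
    n = len(R)
    return [[(max(min(R[i][k][0], S[k][j][0]) for k in range(n)),
              max(min(R[i][k][1], S[k][j][1]) for k in range(n)))
             for j in range(n)] for i in range(n)]

def union(R, S):
    """Entry-wise union (end-point-wise maximum)."""
    return [[(max(r[0], s[0]), max(r[1], s[1])) for r, s in zip(Ri, Si)]
            for Ri, Si in zip(R, S)]

def transitive_closure(R):
    while True:
        R_next = union(R, compose(R, R))
        if R_next == R:
            return R
        R = R_next

R = [[(1.0, 1.0), (0.2, 0.4), (0.0, 0.1)],
     [(0.2, 0.4), (1.0, 1.0), (0.6, 0.8)],
     [(0.0, 0.1), (0.6, 0.8), (1.0, 1.0)]]
for row in transitive_closure(R):
    print(row)
```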

Fixed point relational fuzzy clustering.

March 2009

·

31 Reads

The proposed relational fuzzy clustering method, called FRFP (fuzzy relational fixed point), is based on determining a fixed point of a function of the desired membership matrix. The method is compared to other relational clustering methods. Simulations show the method to be very effective and less computationally expensive than other fuzzy relational data clustering methods. The membership matrices produced by the proposed method are less crisp than those produced by NERFCM and more representative of the proximity matrix that is used as input to the clustering process.

Table 1. Centroid and Standard Deviations of Clusters in Different Variables
Feature-Weighted Mountain Method with Its Application to Color Image Segmentation
In this paper, we propose a feature-weighted mountain clustering method. The proposed method can work well when there are noisy feature variables and could be useful for obtaining initial estimates of cluster centers for other clustering algorithms. Results from color image segmentation illustrate that the proposed method produces better segmentation than previous methods.
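
A minimal sketch of the feature-weighting idea in the mountain method is given below: mountain values are computed with a weighted distance so that down-weighted (noisy) features contribute little, and peaks are destructed before the next center is picked. The data, weights and constants are illustrative assumptions, not those of the paper.

```python
# Minimal sketch of a feature-weighted mountain method: the mountain value of
# each candidate point uses a weighted distance, so noisy features with small
# weights contribute little. Data, weights and the alpha/beta constants are
# illustrative assumptions.
import math

data = [(1.0, 0.2), (1.1, 0.9), (0.9, 0.1), (5.0, 0.8), (5.2, 0.3), (4.9, 0.5)]
weights = (1.0, 0.1)            # second feature is treated as nearly irrelevant
alpha, beta, n_centers = 2.0, 3.0, 2

def wdist2(a, b):
    return sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b))

# Candidate centers are taken at the data points themselves (a common shortcut
# that avoids building a full grid over the feature space).
mountain = {v: sum(math.exp(-alpha * wdist2(v, x)) for x in data) for v in data}

centers = []
for _ in range(n_centers):
    c = max(mountain, key=mountain.get)
    peak = mountain[c]
    centers.append(c)
    # Destruct the mountain around the selected center before picking the next one.
    mountain = {v: m - peak * math.exp(-beta * wdist2(v, c)) for v, m in mountain.items()}
print(centers)
```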

A Region-Based Image Segmentation by Watershed Partition and DCT Energy Compaction
An image segmentation approach based on improved watershed partitioning and DCT energy compaction is proposed in this paper. The proposed energy compaction, which expresses the local texture of an image area, is derived by exploiting the discrete cosine transform. The algorithm is a hybrid segmentation technique composed of three stages. First, the watershed transform, supported by preprocessing techniques (edge detection and markers), partitions the image into several small disjoint patches, while three features (region size, mean and variance) are used to calculate region energy for merging. In the second, merging stage, the DCT is used for energy compaction, which serves as a criterion for texture comparison and region merging. Finally, the image is segmented into several partitions. The obtained results show good segmentation robustness and efficiency when compared to other state-of-the-art image segmentation algorithms.
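
The energy-compaction measure itself can be sketched as the fraction of a block's DCT energy concentrated in the low-frequency coefficients, as below; the block size and the 4x4 low-frequency window are assumptions, not the paper's exact criterion.

```python
# A small sketch of DCT "energy compaction" as a texture descriptor for a
# region: the fraction of block energy captured by the low-frequency DCT
# coefficients. The block size and the 4x4 low-frequency window are
# illustrative assumptions, not the exact criterion of the paper.
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """2-D type-II DCT with orthonormal scaling."""
    return dct(dct(block, norm="ortho", axis=0), norm="ortho", axis=1)

def energy_compaction(block, k=4):
    coeffs = dct2(block.astype(float))
    total = np.sum(coeffs ** 2)
    low = np.sum(coeffs[:k, :k] ** 2)        # energy in the low-frequency corner
    return low / total if total > 0 else 0.0

rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0, 255, 8), (8, 1))       # smooth gradient patch
noisy = rng.uniform(0, 255, size=(8, 8))               # texture-like random patch
print(energy_compaction(smooth), energy_compaction(noisy))
# Smooth regions concentrate energy in few coefficients; textured regions do not.
```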

Research of Spatio-temporal Similarity Measure on Network Constrained Trajectory Data

October 2010

·

84 Reads

Ying Xia

·

·

·

[...]

·

Similarity measurement between trajectories is considered a pre-processing step for trajectory data mining, and many shape-based and time-based trajectory similarity measures have been proposed recently. However, these methods do not perform well on network-constrained trajectories because Euclidean distance is inappropriate in a road network. In this paper, we study spatio-temporal similarity measures for trajectories in road networks. We first partition network-constrained trajectories into segments by considering both temporal and spatial properties, and then propose a spatio-temporal similarity measure for trajectory similarity analysis. Experimental results demonstrate the performance of the proposed method and its applicability to trajectory clustering.

Gait Recognition Based on Outermost Contour

November 2010

·

57 Reads

Gait recognition aims to identify people by the way they walk. In this paper, a simple but effective gait recognition method based on the Outermost Contour is proposed. For each gait image sequence, an adaptive silhouette extraction algorithm is first used to segment the images, and a series of post-processing steps is applied to obtain normalized silhouettes with less noise. Then a novel feature extraction method based on the Outermost Contour is proposed. Principal Component Analysis (PCA) and Multiple Discriminant Analysis (MDA) are adopted to reduce the dimensionality of the feature vectors and, at the same time, to optimize the class separability of different gait image sequences. Two simple pattern classification methods are used on the low-dimensional eigenspace for recognition. Experimental results on a gait database of 100 people show that our algorithm achieves an accuracy of 97.67%.
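
The dimensionality-reduction and classification stage can be sketched generically as PCA followed by linear discriminant analysis and a nearest-neighbour classifier, as below; the synthetic feature vectors stand in for the Outermost Contour features, which are not reproduced here.

```python
# A generic sketch of the PCA + discriminant-analysis + simple-classifier
# pipeline used for the gait features (here on synthetic vectors, since the
# gait database and the Outermost Contour features are not reproduced).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
n_subjects, seqs_per_subject, dim = 5, 6, 200
# Synthetic "gait feature vectors": one cluster per subject in a 200-D space.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(seqs_per_subject, dim))
               for i in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), seqs_per_subject)

pca = PCA(n_components=10).fit(X)                 # reduce dimensionality
Z = pca.transform(X)
lda = LinearDiscriminantAnalysis(n_components=n_subjects - 1).fit(Z, y)
Zd = lda.transform(Z)                             # optimize class separability
clf = KNeighborsClassifier(n_neighbors=1).fit(Zd, y)
print("training accuracy:", clf.score(Zd, y))
```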

From Fuzzy Models to Granular Fuzzy Models

August 2011

·

22 Reads

Fuzzy models occupy one of the dominant positions on the research agenda of fuzzy sets, exhibiting a wealth of conceptual developments and algorithmic pursuits as well as a plethora of applications. Granular fuzzy modeling, dwelling on the principles of fuzzy modeling, opens new horizons of investigation and augments the existing design methodology exploited in fuzzy modeling. In a nutshell, granular fuzzy models are constructs built upon a fuzzy model or a family of fuzzy models. We elaborate on a number of compelling reasons behind the emergence of granular fuzzy modeling, and granular modeling in general; the information granularity present in such models plays an important role. Given a fuzzy model M, the associated granular model incorporates granular information to quantify the performance of the original model and to facilitate collaborative pursuits of knowledge management and knowledge transfer. We discuss several main categories of granular fuzzy models, where the categories depend on the formalism of information granularity, giving rise to interval-valued fuzzy models, fuzzy fuzzy models (fuzzy² models, for short), and rough-fuzzy models. The design of granular fuzzy models builds upon two fundamental concepts of Granular Computing: the principle of justifiable granularity and an optimal allocation (distribution) of information granularity. The first supports the construction of the information granules of a granular fuzzy model; the second emphasizes the role of information granularity treated as an important design asset. The underlying performance indexes guiding the design of granular fuzzy models are discussed, and the multiobjective nature of the construction of these models is stressed.

Fig. 1. Grid Components Interact with Replica Manager in Grid
Fig. 2. A grid organization with 81 nodes, each of the node has a data file a, b,…, and y, respectively
Fig. 3. Algorithm of DR2M protocol
Quorum Based Data Replication in Grid Environment
Replication is a useful technique for distributed database systems and can be implemented in a grid computing environment to provide high availability and fault tolerance and to enhance the performance of the system. This paper discusses a new protocol named Diagonal Data Replication in 2D Mesh structure (DR2M), whose data availability is compared with that of previous replication protocols: Read-One-Write-All (ROWA), Voting (VT), Tree Quorum (TQ), Grid Configuration (GC), and Neighbor Replication on Grid (NRG). The DR2M protocol organizes nodes in a logical two-dimensional mesh structure and uses quorums and voting techniques to improve the performance and availability of replication by reducing the number of data copies required for read or write operations. The data file is copied at a selected node of the diagonal sites in a quorum. The selection of a replica depends on the diagonal location in the structured two-dimensional mesh quorum, where the middle node is selected because it is the best location from which to obtain a copy of the data if every node has an equal number of requests and data accesses in the network. The algorithm also calculates the best number of nodes in each quorum and how many quorums are needed for N nodes in a network. The DR2M protocol ensures data consistency for read and write operations by proving that quorums have a non-empty intersection. To evaluate the DR2M protocol, we developed a simulation model in Java. Our results show that DR2M improves data availability compared to the previous data replication protocols ROWA, VT, TQ, GC and NRG.
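
As a rough illustration of the diagonal-quorum idea, the sketch below lists the diagonal nodes of a square logical mesh and picks the middle one as the replica site; the exact quorum-sizing rules of DR2M are not reproduced, and a perfect-square number of nodes is assumed.

```python
# A small sketch of picking the diagonal quorum and its middle node in a
# logical 2-D mesh, in the spirit of the DR2M description above. The exact
# quorum-sizing rules of the protocol are not reproduced; a single square mesh
# is assumed for illustration.
import math

def diagonal_quorum(n_nodes):
    """Return the diagonal nodes of a sqrt(N) x sqrt(N) mesh and the middle node."""
    side = int(math.isqrt(n_nodes))
    if side * side != n_nodes:
        raise ValueError("illustration assumes a perfect-square number of nodes")
    diagonal = [(i, i) for i in range(side)]       # (row, column) positions
    middle = diagonal[side // 2]                   # replica is placed here
    return diagonal, middle

quorum, replica_site = diagonal_quorum(81)         # the 81-node organization of Fig. 2
print("quorum nodes:", quorum)
print("replica placed at:", replica_site)
```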

Formulas for the controller parameters in the Ziegler-Nichols closed loop method.
A). Analysis of Variance for MSE (sum of squares type III).
B). Analysis of Variance for EC (sum of squares type III).
The MSE obtained under several types of perturbations.
(A) Method of 95.0% LSD for MSE of control.
A New Adaptive and Self Organizing Fuzzy Policy to Enhance the Real Time Control Performance

May 2014

·

647 Reads

In this paper, temperature control of a real-time control process is presented using several control algorithms. A quantitative comparison of these controllers, based on real power consumption, precision and robustness during the same control process and under the same conditions, is carried out. The proposed Adaptive and Self-Organizing Fuzzy policy proves its superiority over the remaining controllers. The new Adaptive and Self-Organizing Fuzzy Logic Controller starts the control with very limited information about the controlled process (the delay and the sign of the monotonicity) and without any kind of offline pre-training; the adaptive controller acts online to collect the necessary background, adapting its rule consequents and self-organizing its membership functions from the real behavior of the controlled process. For 200 minutes and under the same conditions, all the controllers were used to control the room temperature, and each experiment was repeated five times with two different sets of set points. This collection of results was used as a sample for the statistical tool ANOVA (Analysis of Variance), which can establish and illustrate the validity and extrapolability of the conclusions drawn from the several stages of this work.

MAS test bed
Local Semantic Indexing for Resource Discovery on Overlay Network Using Mobile Agents

May 2014

·

47 Reads

One of the most crucial problems in a peer-to-peer system is locating resources shared by various nodes. Techniques suggested in the literature suffer from drawbacks such as network saturation and the inability to locate resources defined by multiple keywords or by semantics. We present a solution that is more efficient and effective for discovering resources shared on a network, driven by the content shared by nodes. To reduce the search load on nodes with uncorrelated content, an efficient migration route for the mobile agent is proposed, based on the cosine similarity between the content shared by nodes and the user query, together with a minimum support. Results show a reduction in search load and communication traffic, and an increase in the location of resources defined by multiple keys that are logically similar to the user query. Furthermore, the results indicate that with our technique the relevance of search results is higher and is obtained with minimal traffic generation/communication and few hops by the mobile agent.
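
The content-similarity step can be sketched as below: each node's shared content and the user query are turned into term-frequency vectors, and the agent's route visits only nodes whose cosine similarity to the query exceeds a threshold, most similar first. The node contents, query and threshold are illustrative assumptions.

```python
# A minimal sketch of ordering a mobile agent's migration route by the cosine
# similarity between each node's shared-content term vector and the user query.
# Node contents, the query and the similarity threshold are illustrative
# assumptions.
import math
from collections import Counter

node_content = {
    "node-1": "machine learning tutorials and lecture notes",
    "node-2": "holiday photos and family videos",
    "node-3": "deep learning papers and machine learning datasets",
}
query = "machine learning datasets"

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

q_vec = Counter(query.split())
scores = {n: cosine(Counter(text.split()), q_vec) for n, text in node_content.items()}
# Visit only nodes whose content is sufficiently correlated with the query,
# most similar first, to keep uncorrelated nodes out of the search load.
route = [n for n, s in sorted(scores.items(), key=lambda kv: kv[1], reverse=True) if s > 0.2]
print(route)
```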

Extended 2-tuple linguistic hybrid aggregation operators and their application to multi-attribute group decision making

July 2014

·

116 Reads

The aim of this paper is to develop some new 2-tuple linguistic hybrid aggregation operators, called the extended 2-tuple linguistic hybrid arithmetical weighted (ET-LHAW) operator, the extended 2-tuple linguistic hybrid geometric mean (ET-LHGM) operator, the induced ET-LHAW (IET-LHAW) operator and the induced ET-LHGM (IET-LHGM) operator. These operators not only consider the importance of the elements but also reflect the importance of their ordered positions. Meanwhile, some desirable properties, such as idempotency and boundedness, are studied. When the information about the linguistic weight vectors is only partly known, models for the optimal linguistic weight vectors on an expert set, on an attribute set and on their ordered sets are established, respectively. Moreover, an approach to multi-attribute group decision making in a linguistic environment is developed. Finally, a numerical example is offered to verify the developed method and to demonstrate its practicality and feasibility.
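
As background, the underlying 2-tuple linguistic representation and a plain weighted aggregation of 2-tuples can be sketched as follows; the label set and weights are illustrative, and the paper's extended and induced hybrid operators are not reproduced.

```python
# A minimal sketch of the 2-tuple linguistic representation (Herrera-Martinez
# style) and a weighted arithmetic aggregation of 2-tuples. The label set and
# weights are illustrative assumptions; the extended/induced hybrid operators
# of the paper are not reproduced.
LABELS = ["none", "very_low", "low", "medium", "high", "very_high", "perfect"]  # s_0..s_6

def to_2tuple(beta):
    """Delta: numeric value in [0, 6] -> (label, symbolic translation in [-0.5, 0.5))."""
    i = min(round(beta), len(LABELS) - 1)
    return LABELS[i], beta - i

def to_value(label, alpha):
    """Delta^-1: 2-tuple -> numeric value."""
    return LABELS.index(label) + alpha

def weighted_2tuple_mean(tuples, weights):
    beta = sum(w * to_value(lab, a) for (lab, a), w in zip(tuples, weights)) / sum(weights)
    return to_2tuple(beta)

assessments = [("high", 0.0), ("medium", 0.2), ("very_high", -0.3)]
weights = [0.5, 0.3, 0.2]
print(weighted_2tuple_mean(assessments, weights))   # aggregated linguistic 2-tuple
```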

Consistency and Stability in Aggregation Operators: An Application to Missing Data Problems

May 2014

·

56 Reads

An aggregation operator [1, 5, 7, 8, 9, 12] is usually defined as a real function A_n such that, from n data items x_1, …, x_n in [0,1], it produces an aggregated value A_n(x_1, …, x_n) in [0,1] [4]. This definition can be extended to consider the whole family of operators for any n instead of a single operator for a specific n. This has led to the current standard definition [4, 15] of a family of aggregation operators (FAO) as a set {A_n : [0,1]^n → [0,1], n ∈ N}, providing instructions on how to aggregate collections of items of any dimension n. This sequence of aggregation functions {A_n}, n ∈ N, is also called extended aggregation functions (EAF) by other authors [15, 5].
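
A concrete instance of such a family is the arithmetic-mean operator defined for every dimension n, sketched below; this only illustrates the definition above and is unrelated to the stability analysis or missing-data treatment of the paper.

```python
# A small illustration of a family of aggregation operators {A_n}: one rule
# that works for any dimension n (here the arithmetic mean), mapping [0,1]^n
# to [0,1]. This is only a concrete instance of the definition, not the
# paper's analysis.
def A(xs):
    """Arithmetic-mean family: defined for every n >= 1."""
    if not xs or any(not 0.0 <= x <= 1.0 for x in xs):
        raise ValueError("inputs must be a non-empty collection of values in [0, 1]")
    return sum(xs) / len(xs)

print(A([0.4, 0.8]), A([0.4, 0.8, 0.6]))   # the same family applied at n = 2 and n = 3
```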

Software Fault Estimation Framework based on aiNet

July 2014

·

212 Reads

Software fault prediction techniques are helpful for developing dependable software. In this paper, we propose a novel framework that integrates the testing and prediction processes for unit-testing prediction. Because highly fault-prone metric data are widely scattered and multiple centers can represent the whole dataset better, we use the artificial immune network (aiNet) algorithm to extract and simplify data from the modules that have already been tested, and then generate multiple centers for each network by hierarchical clustering. The proposed framework acquires information in a timely manner as testing proceeds and dynamically adjusts the network generated by the aiNet algorithm. Experimental results show that higher accuracy can be obtained by using the proposed framework.

Associative data mining for alarm groupings in chemical processes

October 2007

·

146 Reads

Complex industrial processes such as nuclear power plants, chemical plants and petroleum refineries are usually equipped with alarm systems capable of monitoring thousands of process variables and generating tens of thousands of alarms, which are used to alert operators to take actions that alleviate or prevent an abnormal situation. Over-alarming and a lack of configuration management practices have often led to the degradation of these alarm systems, resulting in operational problems such as the Three Mile Island accident. To aid alarm rationalization, this paper proposes an approach that combines context-based segmentation with a data mining technique to find sets of correlated alarms from historical alarm event logs. Before the results extracted by this automated technique are used, they can be evaluated by a process engineer with an understanding of the process. The proposed approach is evaluated initially using simulation data from a Vinyl Acetate model. The approach is cost-effective, as manual alarm analysis of event logs for identifying primary and consequential alarms can be very time- and labour-intensive.
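
A much simplified sketch of the underlying idea (segment the event log, then count alarms that co-occur within the same segment) is given below; the alarm tags, window length and support threshold are illustrative assumptions, not the paper's context-based segmentation or mining algorithm.

```python
# A minimal sketch of mining co-occurring alarm pairs from an event log by
# segmenting it into time windows and counting pairwise co-occurrence (a much
# simplified stand-in for the context-based segmentation plus association
# mining described above). The log, window length and support threshold are
# illustrative assumptions.
from itertools import combinations
from collections import Counter

# (timestamp in seconds, alarm tag)
log = [(2, "TI101_HI"), (5, "PI203_HI"), (8, "TI101_HI"), (61, "FI310_LO"),
       (63, "TI101_HI"), (66, "PI203_HI"), (150, "LI420_HI"), (155, "FI310_LO")]
window, min_support = 30, 2

# Group alarms into fixed-length windows (the "context segments" here are just time slices).
segments = {}
for t, tag in log:
    segments.setdefault(t // window, set()).add(tag)

pair_counts = Counter()
for tags in segments.values():
    pair_counts.update(combinations(sorted(tags), 2))

correlated = [(pair, c) for pair, c in pair_counts.items() if c >= min_support]
print(correlated)   # candidate alarm groupings for review by a process engineer
```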

Application of Artificial Capital Market in Task Allocation in Multi-robot Foraging

May 2014

·

113 Reads

Because of the high speed, efficiency, robustness and flexibility of multi-agent systems, there has been increasing interest in these systems in recent years. Artificial market mechanisms are among the well-known negotiation protocols for multi-agent systems. In this paper, an artificial capital market, a new variant of the market mechanism, is introduced and employed in a multi-robot foraging problem. In this artificial capital market, the robots benefit by investing in assets, defined as the performance of foraging tasks. Each investment has a cost and an outcome, and the limited initial capital of the investors constrains their investments. A negotiation protocol is proposed for the decision making of the agents. Qualitative analysis reveals the speed of convergence, near-optimal solutions and robustness of the algorithm. Numerical analysis shows the advantages of the proposed method over two previously developed heuristics in terms of four performance criteria.

Motion Key-frames extraction based on amplitude of distance characteristic curve

May 2014

·

236 Reads

Key-frame extraction selects key postures that describe an original motion sequence and has been widely used in motion compression, motion retrieval, motion editing and so on. In this paper, we propose a method based on the amplitude of a characteristic curve to find key frames in a motion-capture sequence. First, we select a group of joint-distance features to represent the motion and adopt Principal Component Analysis (PCA) to obtain the one-dimensional principal component as the feature curve to be used. Then we obtain initial key frames by extracting the local optimum points of the curve. Finally, we obtain the final key frames by inserting frames based on the amplitude of the curve and merging key frames that are too close together. A number of experimental examples demonstrate that our method is practicable and efficient, not only in visual performance but also in terms of compression ratio and error rate.
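
The reduction-and-extrema step can be sketched generically as below: per-frame features are projected to one principal component and the local extrema of that curve are taken as initial key frames. The synthetic motion features and the use of scipy.signal.argrelextrema are assumptions for illustration.

```python
# A small sketch of the key-frame selection idea: reduce per-frame joint-distance
# features to one principal component and take local extrema of that curve as
# initial key frames. The synthetic motion data and the choice of
# scipy.signal.argrelextrema are illustrative assumptions.
import numpy as np
from scipy.signal import argrelextrema
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_frames, n_features = 200, 15
t = np.linspace(0, 4 * np.pi, n_frames)
# Synthetic "joint distance" features: a cyclic motion plus noise.
features = np.sin(t)[:, None] * rng.uniform(0.5, 1.5, n_features) \
           + 0.05 * rng.normal(size=(n_frames, n_features))

curve = PCA(n_components=1).fit_transform(features).ravel()      # 1-D characteristic curve
maxima = argrelextrema(curve, np.greater, order=5)[0]
minima = argrelextrema(curve, np.less, order=5)[0]
key_frames = np.sort(np.concatenate([maxima, minima]))
print("initial key frames:", key_frames)
```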

Statistical analysis for tradeoff points
Integrated ANN-HMH Approach for Nonlinear Time-Cost Tradeoff Problem

October 2014

·

95 Reads

This paper presents an integrated Artificial Neural Network-Hybrid Meta-Heuristic (ANN-HMH) method to solve the nonlinear time-cost tradeoff (TCT) problem of real-life engineering projects. ANN models help capture the existing nonlinear time-cost relationships in project activities and are then integrated with the HMH technique to search for the optimal TCT profile. HMH is a proven evolutionary multiobjective optimization technique for solving TCT problems. The study has implications for the real-time monitoring and control of project scheduling processes.

A Hybrid MCDM for Private Primary School Assessment Using DEMATEL Based on ANP and Fuzzy Cognitive Map

July 2014

·

209 Reads

Primary school selection is a decision-making problem that should be supported by several sources of information. Parents usually handle this problem in an unstructured manner, which is in effect a fuzzy mental ranking of schools with respect to their effectiveness. In this study, an analytical hybrid multiple criteria decision making (MCDM) model is proposed and explained in detail for the private primary school selection problem. A case study on the primary school choices of parents in Turkey is also presented.

Top-cited authors