# Computational Intelligence - Science topic

Computational methodologies inspired by naturally occurring phenomena.

Questions related to Computational Intelligence

QSARpro webinar covers:

1) Understanding of the innovative Group QSAR (GQSAR) method and its application

2) How to unleash the strength of your molecule dataset

3) How to customise and build smart QSAR models

4) How to enhance the scope of your predicted results by addressing Inverse QSAR problem

The webinar lasts about 30 minutes, followed by an interactive Q&A session.

Come prepared with your questions, because you can put them to the well-known QSAR expert Dr. Subhash Ajmani, PhD, Scientist & Senior Management staff at NovaLead Pharma.

Has anybody used RevoDeployR? I would like to make web applications that run my R code. I came across RevoDeployR, but I am not sure whether it is a good tool.

Are artificial immune system (AIS) algorithms considered as population-based? How about the newly developed chemical reaction optimization (CRO) algorithm? What are the similarities and differences between them? How do they differ from the robust and well-known Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Simulated Annealing (SA), Tabu Search (TS) and many more in terms of their behavior and operator characteristics?

By categorization, I mean assigning text to a particular class.

Here 'R^-1' is the inverse of the autocorrelation matrix, 'K' is a very large number, and 'I' is the identity matrix. Why does 'K' have to be very large?
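The setup in this question looks like the standard initialisation of recursive least squares (RLS), where the inverse-correlation estimate starts as P(0) = K·I; if that reading is right, a small numpy sketch (synthetic data, illustrative constants only) shows why a small K would bias the result, while a large K amounts to "almost no prior information":

```python
import numpy as np

# Sketch: P(0) = K*I corresponds to assuming R(0) = (1/K)*I, i.e. almost no
# prior information about the signal. A small K injects a strong artificial
# bias into the recursive estimate of the inverse autocorrelation matrix.

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))      # synthetic input vectors
R = X.T @ X / len(X)                   # sample autocorrelation matrix

def rls_P(X, K):
    """Recursively build the inverse-correlation estimate P via the
    matrix inversion lemma, starting from P = K * I."""
    P = K * np.eye(X.shape[1])
    for x in X:
        x = x.reshape(-1, 1)
        P = P - (P @ x @ x.T @ P) / (1.0 + x.T @ P @ x)
    # rescale so that P approximates inv(R) rather than inv(sum x x^T)
    return P * len(X)

P_large = rls_P(X, K=1e6)
P_small = rls_P(X, K=1e-2)

err_large = np.linalg.norm(P_large - np.linalg.inv(R))
err_small = np.linalg.norm(P_small - np.linalg.inv(R))
print(err_large < err_small)  # True: the large K gives the less biased estimate
```

Algebraically, the recursion yields P_n = ((1/K)·I + Σ xᵢxᵢᵀ)⁻¹, so the (1/K)·I term vanishes as K grows; that is why K "has to be very large".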

PCA

Hi,
Any ideas about the implementation of a Business Continuity Plan ("Plan de Continuité d'Activité", PCA in French)?
Thanks

Emotional influence on the integration of sensory modalities in cognitive architectures.

Why is it said that a neural network can't be overtrained?

I have heard that something is wrong with overtraining!

AIS algorithms are a subset of swarm intelligence, also known as "collective" intelligence, which is focused on decentralization, self-organization, and learning ability. In contrast, SA is an algorithm with a more focused search, in which single-point iteration is adopted. However, I cannot see any way to hybridize these two algorithms, given the differences in their nature. Any suggestions?

Regarding my Neural Network training.

I want to know an effective way to analyze population-based algorithms such as GA and PSO in terms of time and space complexity.

Are you aware of this nice conference track: Knowledge Discovery and Business Intelligence (KDBI - EPIA2013) http://lnkd.in/HHeYgc ?

Papers in Springer LNAI (ISI indexed). Best papers in Journal Expert Systems (ISI JCR). Deadline: 15/3/2013

Related to properties of data sets

I am working on classifying mammogram images using computational intelligence. Is there a database with images that can be opened in Windows 7? There are a few, but they require a Unix environment. If anyone has experience working in the field, please do share.

To be a programmer who can come up with novel solutions means a certain amount of creativity is required, but how do we enable students to develop it?

Conference Paper Problem solving and creativity for undergraduate engineers: ...

When you read an article, journal paper, or book, it refers to information from original sources. The references point to the source, origin, or basis of the research work. If a common structure existed, the literature-collection time (the search time) for research would be reduced. Otherwise the literature search itself becomes a research problem, because of the sheer volume of data.

With reference to the perception-action cycle, which NN models are suitable?

What are the other uses of a mathematical model for acute pain?

Could a model for Acute pain have uses outside of medicine or biology? Could it have applications in robotics?

I have a text classification task. I used the sliding-windows method to populate my data set. The problem is that the data set is huge and its data points are very similar. I would like to reduce the data set without losing informative data points. I am aware of variable selection techniques such as "kruskal.test", "limma", "rfe", "rf", "lasso", etc. But how can I choose a suitable method for my problem without doing computationally intensive operations?

What are appropriate computational intelligence models for improving user satisfaction with m-government services?

Are you aware of this nice workshop (2nd CFP): Knowledge Discovery and Business Intelligence (KDBI - EPIA2013) ? :)

Papers in Springer LNAI (ISI indexed). Best papers in Journal Expert Systems (ISI JCR). Deadline: 15/3/2013, see more at: http://lnkd.in/HHeYgc

Robots, due to cost, are a limited resource for teaching, but a useful one. They engage students and make principles concrete, but it is not possible to have one robot per student, for both cost and space reasons. How can I get the same benefits of robots for teaching AI through other methods?

Article Neural nets

Problem solving is central to computing and engineering education, but it is not clear what is the best way to engage students with developing problem-solving skills. Are there better ways in other disciplines?

Do you need an exciting congress to publish your recent work on Computational Intelligence?

Next 8-11th September, at the stunning beach of 'Porto de Galinhas', Brazil, researchers from several approaches to Computational Intelligence will attend the 2013 BRICS-CCI and 11th CBIC. Website => http://brics-cci.org The *NEW* submission deadline is 20th May. Send us your latest work!

Collecting data samples from a power plant requires time and funds. To make sure that you make good use of the limited time you spend at the plant capturing data, a specific data selection and preparation scheme would be very helpful as a guideline. Can anyone suggest papers that could serve as a reference for a data preparation method?

I am a bit confused by the "number of clusters" and "number of seeds" in the K-means clustering algorithm. Kindly provide an example to clarify the point. What is the effect if we change either?
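One way to see the distinction is a bare-bones Lloyd's algorithm on synthetic data (all numbers below are illustrative): the seeds are only the initial centroid positions, while the number of seed points you supply is what fixes k, the number of clusters in the final partition.

```python
import numpy as np

def kmeans(X, seeds, iters=50):
    """Plain Lloyd's algorithm. `seeds` are the initial centroids;
    their count implicitly fixes the number of clusters k."""
    C = np.asarray(seeds, dtype=float)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(len(C)):
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(axis=0)
    return labels, C

rng = np.random.default_rng(1)
# three well-separated synthetic blobs around (0,0), (5,5), (10,10)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0.0, 5.0, 10.0)])

# supplying 3 seeds vs 2 seeds changes how many clusters you get back;
# moving the seeds around (for fixed k) only changes which local optimum
# the algorithm converges to.
labels3, _ = kmeans(X, seeds=[[0, 0], [5, 5], [10, 10]])
labels2, _ = kmeans(X, seeds=[[0, 0], [10, 10]])
print(len(np.unique(labels3)), len(np.unique(labels2)))  # 3 2
```

So changing the number of seeds changes k itself; changing only their positions changes the starting point of the search (and hence, possibly, which local optimum is found).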

What is the aim of the AI community?

What does the AI community want to achieve in terms of development?

Which machine learning algorithm is able to gauge a student's domain level (Beginner, Intermediate, Advanced) in an online quiz? Once this has been established, is there a metric/algorithm that would be able to ascertain a smooth transition between domain levels, with both supervised and unsupervised learning? Which one is more suitable? What are some features to use to determine each domain level?

Computing with discrete compositional data structures in analogue computers

Slides & video of my talk "Vector Symbolic Architectures: Computing with discrete compositional data structures in analogue computers" are available from the Talks section of my home page: http://bit.ly/RossGayler
This is relevant to cognitive scientists interested in how neural networks might handle complex data structures (as may occur in linguistic processing). It should also be relevant to computer scientists interested in unconventional computation methods.

In the English language there are many tools, such as CharGer, to represent the knowledge from a document as a conceptual graph; see http://conceptualgraphs.org/.

Conceptual graphs serve as a knowledge representation with a semantic notion. In the English language there are many tools, such as CharGer and Amine, to represent the knowledge from a document as a conceptual graph; see http://conceptualgraphs.org/.

Can we make theory classes as interactive as practical or lab sessions by using the following techniques?

1. Teach using a simple equivalent model.

2. Overlapping/"sandwich" learning: alternate teaching and practice, using simulation tools such as MATLAB, PSIM, etc.

3. Use MATLAB instead of a calculator to solve problems; this also saves paper.

CFP NICSO 2013 September 2nd - 4th, 2013 Canterbury, UK (http://www.nicso2013.org)

The VI International Workshop on Nature Inspired Cooperative Strategies for Optimization
http://www.nicso2013.org
Full paper submission (extended): May 5, 2013
Acceptance notification: May 25, 2013
Final camera ready: June 5, 2013
NICSO: September 2-4, 2013

Are you aware of this nice R tool package: New rminer v1.3 (R package that eases data mining classification and regression) ?

Package available at CRAN: http://cran.r-project.org/web/packages/rminer/
Feedback about the package is welcome.

It would be interesting to know what the most promising prospect is in the attempt to develop AI.

Details of the research work will follow if there are capable contributors.

Experimental work is identified as one of the best practices for students to gain knowledge and to develop skills with interest. It is observed that attendance is also better for labs than for theory classes.

The lab helps students test and implement concepts, principles, and new ideas, and to some extent, or at some times, it acts like an incubation centre for solving real-world problems.

What is the difference between Knowledgeable and Knowledge as a term?

If the algorithm is very greedy (e.g. local search), then restarts of that algorithm might resemble a memetic algorithm. Let's consider a memetic algorithm with a very expensive local optimization technique -- we would like to run the local optimizer less often, so we might want to filter the starting points. This can be difficult since a start point with a good fitness evaluation might not have that much more room for improvement. So, what we need are ways to estimate the room for improvement for a given (e.g. completely random or somewhat random) solution.

I'm primarily interested in continuous search spaces. Given two solutions x1 and x2, is there a way to find out/estimate the probability that the (local) optimum near x1 is better than the local optimum near x2 when f(x1) is worse than f(x2)?
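One cheap heuristic for the question above (offered as a sketch, not an established method): instead of comparing f(x1) and f(x2) directly, run a tiny "probe" descent of a few steps from each candidate and compare where the probes land; the early rate of improvement is a rough proxy for the quality of the nearby local optimum. The toy 1-D objective below is purely illustrative.

```python
import numpy as np

def f(x):
    # multimodal toy objective: several local minima of different depths
    return np.sin(3 * x) + 0.1 * x**2

def probe(x, steps=5, h=1e-4, lr=0.05):
    """A few finite-difference gradient steps: a cheap look into the basin
    around x, far cheaper than a full local optimization."""
    for _ in range(steps):
        g = (f(x + h) - f(x - h)) / (2 * h)
        x = x - lr * g
    return x

# two candidate restart points
x1, x2 = 2.0, -1.0
p1, p2 = probe(x1), probe(x2)

# rank the candidates by probe value rather than by f(x1) vs f(x2):
# a start point with the worse raw fitness can still lead to the
# deeper basin, which is exactly the filtering problem described above.
better_start = x1 if f(p1) < f(p2) else x2
print(better_start)
```

The probe budget (here 5 steps) is the knob: it trades estimation accuracy against the cost you were trying to avoid by filtering start points in the first place.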

Can the human brain understand itself fully? Just as in mathematics, a function cannot be used to define itself.

Just as in mathematics a function or theoretical concept cannot be used to define itself (we have to use some other understanding, apart from the function or concept in question, to make sense of it), so, I suppose, should be the case with the brain. But the brain is also the highest, and only, tool for understanding. What else can be used to understand it? Or will it be a case of different components of the brain understanding each other? But then, can there ever be a full understanding of the brain?

Bayesian Regularization in MATLAB

I've got an assignment to plot eigenenergy on the x-axis against the energy interval on the y-axis using trainbr in MATLAB. In my case, the matrices N and E are equivalent, but when I try to create the plot of E against N using trainbr I get stuck. Can anyone please explain to me how to use trainbr in MATLAB? Your answer will be much appreciated, thanks.

I'm currently considering two optimization algorithms (artificial immune system & simulated annealing), mainly for solving scheduling problems in manufacturing. However, I would like some opinions on whether to consider hybrids of two or more algorithms or a non-hybridized algorithm. Basically, to what degree must two or more algorithms be combined to be considered a hybrid algorithm? How is hybridization actually conducted, and what approaches are most commonly adopted for hybridizing two or more algorithms successfully? In what respects is hybridization actually needed?

I want to know how random number generators work (their algorithms).
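As a starting point, here is a minimal sketch of one classic pseudo-random generator, the linear congruential generator (LCG), defined by x_{n+1} = (a·x_n + c) mod m; the constants below are the well-known "Numerical Recipes" parameters (real libraries typically use stronger generators such as the Mersenne Twister or PCG).

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: yields floats in [0, 1).
    The whole state is the single integer x, so the same seed
    always reproduces the same sequence."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale the integer state to [0, 1)

gen = lcg(seed=42)
sample = [next(gen) for _ in range(5)]
print(all(0.0 <= u < 1.0 for u in sample))  # True
```

The key point: the output is entirely deterministic given the seed, which is why such generators are called "pseudo"-random.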

How would I apply user behaviour in library e-services with computational intelligence?

For example, assume we have a frame representation of a castle; how can we tell the computer to draw the castle from that knowledge?

What do you think about publication?

Prof. Dr. David Parnas (a pioneer in software engineering) has joined the group of scientists who openly criticize the publication-count-based approach to ranking academic production. In his November 2007 paper Stop the Numbers Game, he elaborates several reasons why the current number-based academic evaluation system used in many fields by universities all over the world (whether oriented to the number of publications or the number of citations each of those gets) is flawed and, instead of generating more scientific advancement, leads to knowledge stagnation.

Genetic Lifeform and Disk Operating System 2 is Aperture Science's second mind.

Hi.

I am working on a specific evolutionary algorithm improvement.

Can anybody suggest a paper or book about evolutionary algorithms comparison rules?

I have seen some approaches used for comparison in other papers, but I'd like to have a comprehensive reference on these rules (if any written rules exist).

Thanks

Hi everybody, can anyone help me learn how to develop a small intelligent system using any programming language, perhaps Java? The basic problem is: how can we add intelligence to software?

I am looking for modern platforms that support automated defeasible reasoning and decision support for a medical application. My graduate work (back in the mid-90s) helped build an agent-based defeasible reasoner and implemented a primitive "Medical Diagnostic Advisor". I'm now working with a company that needs a *defeasible* rule-based engine to support decision making in real time, based on input from a monitoring system. I was hoping someone could point me in the right direction for the latest developments in defeasible reasoners, preferably LISP-based, although C/C++/Objective-C is fine too. Thanks!

Integrated statistical techniques that can curb credit card and debit card fraud are in the process of being launched in India. The techniques are a combination of Baum-Welch and multivariate Gaussian distributions, besides many other integrated technologies. We are looking at international tie-ups with agencies that are already working in this sphere. You may write to arijayc@gmail.com.

Hi all,

Would anybody recommend a book on how to select a good fitness (cost) function for classification purposes?

Thanks

Use of Computational Intelligence / Artificial Intelligence/ Artificial Life To Solve Millennium Prize Mathematical Problems.

If Computational Intelligence / Artificial Intelligence / Artificial Life could be used to solve the Millennium Prize Mathematical Problems, please send me feedback at ian.ajzenszmidt@alumni.unimelb.edu.au. Success in this endeavor would be a great public relations and prestige coup.

http://www.claymath.org/millennium/ is the source of the following:

In order to celebrate mathematics in the new millennium, The Clay Mathematics Institute of Cambridge, Massachusetts (CMI) has named seven Prize Problems. The Scientific Advisory Board of CMI selected these problems, focusing on important classic questions that have resisted solution over the years. The Board of Directors of CMI designated a $7 million prize fund for the solution to these problems, with $1 million allocated to each. During the Millennium Meeting held on May 24, 2000 at the Collège de France, Timothy Gowers presented a lecture entitled The Importance of Mathematics, aimed for the general public, while John Tate and Michael Atiyah spoke on the problems. The CMI invited specialists to formulate each problem.

One hundred years earlier, on August 8, 1900, David Hilbert delivered his famous lecture about open mathematical problems at the second International Congress of Mathematicians in Paris. This influenced our decision to announce the millennium problems as the central theme of a Paris meeting.

The rules for the award of the prize have the endorsement of the CMI Scientific Advisory Board and the approval of the Directors. The members of these boards have the responsibility to preserve the nature, the integrity, and the spirit of this prize.

Hi all, can anyone give a suggestion for appropriate soft computing principles for finding the role of metacognition?

Support Vector Machines can play an important role in protein fold recognition. HMMs and ANNs also play their role in it.

Here we are faced with global economic challenges. Businesses are indeed struggling to cope with different aspects, including unreliable labor.


Hey, all wonderful people here,

I just need your views on the best artificial neural network approach, if we are using a neural network for strategic decision making in our application.

thanks in advance,

warm regards

Chetan

Hi everyone,

I'm working on a software sensor (biomass). Due to the complexity of the bioprocess, I'm using a black-box model (neural networks) to predict the biomass concentration during the culture. I'm using MATLAB to implement my RBF-NN (Radial Basis Function NN) and PCA for pre-processing the data. But here is my problem: I cannot figure out how to do this, because I'm new to MATLAB and I don't want to use the MATLAB NN module. Is there a better NN or pre-processing method for modeling bioprocesses? I need some help, please.

Best regards.
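For readers facing the same problem outside MATLAB: an RBF network with fixed centres reduces to a linear least-squares fit, so the whole PCA + RBF pipeline can be written from scratch in a few lines. The numpy sketch below uses synthetic data (not a real bioprocess) and arbitrary hyperparameters (2 components, 20 centres, width 1.0), purely to show the structure of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))                      # raw process measurements
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)   # toy "biomass" target

# --- PCA preprocessing: centre the data, project onto top-2 components ---
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# --- RBF layer: Gaussian activations around fixed, randomly chosen centres ---
centres = Z[rng.choice(len(Z), size=20, replace=False)]
width = 1.0

def rbf_features(Z):
    # squared distances from every sample to every centre
    d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width**2))

# --- output weights by ordinary linear least squares ---
Phi = rbf_features(Z)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
print(pred.shape)  # (200,)
```

In practice the centres are often chosen by k-means and the width by cross-validation; the same structure translates line by line to MATLAB (`svd`, `exp`, and the backslash operator) without the NN toolbox.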

Hey all, I am working on building a firewall for the prevention of SQL injection in websites. I am planning to do this through an artificial neural network. I need your comments and suggestions on this.

If anybody is working on something similar, guidance is needed.

thanks in advance

Chetan

Dear group members

What are the main advantages of LOLIMOT (the Locally Linear Model Tree algorithm) in comparison to MLP neural networks?

VLifeSCOPE (SCOPE) is a Structure-Based Compound Optimization, Prioritization & Evolution computational method. SCOPE brings together two powerful approaches: a comparative binding energy analysis based method for lead optimization, and a score-based approach for activity prediction.

Comparative binding energy analysis is a receptor-dependent analogue method that enables a better understanding of ligand-receptor interactions. For each of the ligands under consideration, intermolecular and intramolecular energies are calculated for the ligand-receptor complexes, the unbound ligands, and the receptor.

Advantages of VLifeSCOPE:

1. Identification of residues that are the key to modulating the ligand activity in a target

2. Predicting activity of newly designed compounds docked into a target

3. Prioritization of docked compounds based on their predicted activity

VLifeSCOPE is now available with VLifeMDS 3.5 as an advanced module.

Access the VLifeSCOPE webinar archives here: http://www.vlifesciences.com/webinar/Webinar.php

Nowadays we have hexa-core microprocessors on the market, but even after so much improvement in microprocessors we do not see a correspondingly remarkable change in overall performance (the speed should be six times that of a Pentium).

Don't you think we should look at hardware improvements, or at ideas for interfacing the hard drive to the microprocessor in such a way as to minimize misses? The miss rate is currently very high (as hard disks are comparatively quite slow).

All suggestions and comments are most welcome, as I want to work on this and am looking for new ideas.

Partners in this project are also welcome.

Thanks in advance..

Warm Regards,

Chetan

Hi all, I am working on fuzzy logic implementation in PL/SQL or SQL queries. I need help from someone who is working on the same, or who has worked on it.

Thanks a lot

Chetan

UniCSE CFP


The basic structuring methods presented are the array, the record, the set, and the sequence. More complicated structures are not usually defined as static types, but are instead dynamically generated during the execution of the program, when they may vary in size and shape. Such structures include lists, rings, trees, and general finite graphs.

Variables and data types are introduced in a program in order to be used for computation. To this end, a set of operators must be available. For each standard data type a programming language offers a certain set of primitive, standard operators, and likewise with each structuring method a distinct operation and notation for selecting a component.

The task of composition of operations is often considered the heart of the art of programming. However, it will become evident that the appropriate composition of data is equally fundamental and essential. The most important basic operators are comparison and assignment, i.e., the test for equality (and for order in the case of ordered types), and the command to enforce equality.

The fundamental difference between these two operations is emphasized by the clear distinction in their denotation throughout this text.

Test for equality : x = y (an expression with value TRUE or FALSE)

Assignment to x : x: = y (a statement making x equal to y)

These fundamental operators are defined for most data types, but it should be noted that their execution may involve a substantial amount of computational effort if the data are large and highly structured. For the standard primitive data types, we postulate not only the availability of assignment and comparison, but also a set of operators to compute new values. Thus we introduce the standard operations of arithmetic for numeric types and the elementary operators of propositional logic for logical values.
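The distinction drawn above maps directly onto most modern languages; in Python, the text's `x = y` (equality test) is written `==` and its `x := y` (assignment) is written `=`:

```python
# `==` is the test for equality: an expression with a Boolean value.
# `=` is assignment: a statement that *makes* the left side equal.
x = 3
y = 7
print(x == y)   # False: the test leaves x and y unchanged
x = y           # assignment: enforce equality
print(x == y)   # True
```

The text's warning about computational effort also applies here: for large structured values (long lists, deep trees), `==` recursively compares every component, so an innocent-looking equality test can be far from constant-time.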

Mind modeling, relevant knowledge base, knowledge representation, cognition, computation

For example, the input values can take only a finite set of states, say 10 symbols {A, B, C, D, E, F, G, H, I, J}.

If I have 1000 instances of the false class and 50 instances of the true class, which practice gives a better result: selecting false and true instances in an equal ratio, or in equal numbers?
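The two options can be made concrete with a small sketch on toy labels (stdlib only; the 1000/50 counts come from the question, everything else is illustrative): "equal numbers" means undersampling the majority class for training, while "equal ratio" means stratified splitting that preserves the 20:1 imbalance, which is usually what you want for evaluation.

```python
import random

random.seed(0)
# toy data set: 1000 negatives, 50 positives, as in the question
data = [(i, False) for i in range(1000)] + [(i, True) for i in range(50)]
neg = [d for d in data if not d[1]]
pos = [d for d in data if d[1]]

# Option 1: equal numbers -- random undersampling of the majority class,
# so the classifier is not dominated by the negatives during training.
balanced = random.sample(neg, len(pos)) + pos
n_true = sum(1 for _, t in balanced if t)
n_false = sum(1 for _, t in balanced if not t)
print(n_true, n_false)  # 50 50

# Option 2: equal ratio -- a stratified 80/20 split keeps the 20:1
# imbalance in both halves, so the test set reflects the true distribution.
random.shuffle(neg)
random.shuffle(pos)
train = neg[:800] + pos[:40]
test = neg[800:] + pos[40:]
print(len(train), len(test))  # 840 210
```

A common compromise is to balance (or class-weight) only the training data and evaluate on a stratified, ratio-preserving test set.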

I've read a couple of papers from the WWW conference on how to predict click-through rates of text ads. Does anyone know any good research on image ads?

Something that covers PSO, DE, ES, EDA, etc.

*****************************************************************************
Call for Voting for Top 10 Questions in Intelligent Informatics/Computing (Top10Qi)
*****************************************************************************
Sixty years ago, Alan M. Turing raised the essential question "Can machines think?" in his article "Computing Machinery and Intelligence", which can be regarded as one of the seeds of AI and other intelligent computing.
The Top10Qi open forum is trying to offer a common platform for all of us to work together to think about the basic questions and to figure out the top 10 questions in Intelligent Informatics/Computing (Top10Qi).
Thanks to the great support for Top10Qi from many people, we have received 128 questions from all over the world, which can be viewed at the Top10Qi web site:
http://wi-consortium.org/blog/top10qi/index.html
We are now in the 2nd stage, i.e., voting for the top 10 questions which are to be discussed in the panel session at WIC 2012. You are cordially invited to vote for up to 10 questions from the question list in the Top10Qi voting system at
http://top10qi.org/
After clicking the <Non-TC Member> button, you will enter a sign-in page in which you should provide the following information:
Email Address: (used to send an acknowledgment message including your voted questions)
Name: Your Name (optional but preferred)
--------------------------------------------------
The vote deadline is November 24, 2012.
--------------------------------------------------
We are looking forward to having your strong support and contribution!
If you have any questions, please send an email to <Top10Qi@gmail.com>.
Best Regards
Top10Qi Organizing Committee

Noam Chomsky:

Ref1: "Noam Chomsky on Where Artificial Intelligence Went Wrong"

Peter Norvig:

Ref2: "On Chomsky and the Two Cultures of Statistical Learning"

Why are computer scientists and AI researchers attracted to the idea of an oracle?

'Oracle' is a misleading word. In Ancient Greece, oracles were people who knew the meanings of events because of their connections to the gods. In today's computer science, 'oracle' is a fancy word for a random number generator. But random numbers do not create meanings. Maybe mathematicians should learn psychology and try to develop mathematical models of meaning and creativity.

What are your ideas about the Danger Theory algorithm? There are several algorithms to model and implement artificial immune systems, like negative selection, clonal selection, and, recently, danger theory. I want to work on the Dendritic Cell Algorithm (DCA), but it is a little ambiguous. Any recommendations?

Algorithms to deal with unbalanced clusters for classification?

Data Mining of Big Data using tools like SVM, clusters, trees, MCMC, and NN have emerged in place of conventional statistics in order to handle large size and complexity. These are well-suited to *spot patterns in large, static data sets*.

But, the inevitable demand for *rapid analysis of streaming data* reveals the limitations of Data Mining methods, especially where data streams are unstable, chaotic, non-stationary or have concept drift. And that covers many important areas (!), like human behavior (economic, social, commercial, etc.) My focus is mostly computational finance.

Data Mining methods lag in adjusting to changes in the observed behavior. A key problem is the uncertainty whether estimates calculated from past data still are good enough to apply in the often changing future. How frequently and when does the model for prediction or classification need to be updated? How responsive to incoming data should the estimating procedure be to achieve the needed reliability, without getting whipsawed or lagging amidst shifts?

Having a machine learning tool that self-corrects to minimize prediction and classification errors is the challenge. A forgetting factor, as in the dynaTree R package, could be effective if it adjusted automatically. A gain factor, as in Kalman Filtering, can be set pretty well for steady systems (physics), but is sluggish in chaotic settings. GARCH and its relatives provide particularly clumsy structures. Many other approaches exist like Dynamic Model Averaging, adaptive ensembles. Some models must work well, like real-time demand estimators within Google’s Borg, which load-balances its servers.

Have you had success in this area? Can you cite methods, sources, examples or software? I would be glad to discuss this more if you have interest. Thanks.

Tom
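Not an answer to Tom's full question, but the "forgetting factor that adjusts automatically" idea can be prototyped in a few lines: track the prediction error on two time scales and shorten the memory when the short-term error overtakes the long-term one (a crude drift detector in the spirit of adaptive filtering). All constants and the synthetic drifting stream below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic stream with an abrupt concept drift at t = 500
stream = np.concatenate([rng.normal(0.0, 1.0, 500),
                         rng.normal(5.0, 1.0, 500)])

lam_slow, lam_fast = 0.999, 0.90   # long and short effective memories
est = 0.0                          # exponentially weighted mean estimate
fast_err, slow_err = 1e-6, 1e-6    # error magnitude on two time scales
estimates = []

for x in stream:
    e = x - est
    fast_err = 0.9 * fast_err + 0.1 * abs(e)       # short-horizon error
    slow_err = 0.999 * slow_err + 0.001 * abs(e)   # long-horizon error
    # crude drift signal: recent errors much larger than the baseline
    lam = lam_fast if fast_err > 2.0 * slow_err else lam_slow
    est = lam * est + (1.0 - lam) * x
    estimates.append(est)

print(estimates[-1])  # close to the post-drift mean of 5.0
```

This is the simplest member of the family Tom mentions (dynaTree's forgetting factor, Kalman gains, Dynamic Model Averaging all generalize the same trade-off); the open problem is choosing the thresholds so the estimator neither whipsaws on noise nor lags after genuine shifts.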

Here is a link, http://www.eann.org.uk/eann2012, for the Engineering Applications of Neural Networks conference (EANN 2012), which will take place in London on 20-23 September 2012.

Considering the embodiment process of an organism, in which autopoiesis plays its role across all the body's cells, for Varela and Maturana "a cognitive system is a system whose organization defines a domain of interactions in which it can act with relevance to the maintenance of itself." (This domain of interactions seems to be the sufficient condition for a system to be considered a cognitive system, so "neurality" seems not to be necessary...)

I'm looking at graphical models for solving some 2D vision problems (specifically, energy-minimization models such as Boltzmann machines), and I've been thinking about the problem of translation invariance. Obviously, there are many machine vision algorithms that have translation invariance, and there are many graphical models for vision, but I have not yet seen any purely graphical models that are also translation invariant (without clever 'tricks' such as translating the image over and over and applying the whole network again each time). Any pointers or links would be appreciated, as I really don't know where to look at this point. By 'graphical models' I preferably mean neural networks or Bayesian networks, though other types of networks would also be interesting to study.

Even assuming the strong relationship between form and context, would it be possible for a computer system to take into account that context is subjective?

What is the latest technology in advanced machine learning (AML), besides natural language processing and neural networks?

Can anyone suggest some basic papers where neural networks and rough set theory are combined?

Can I get more details on Hidden Markov Models and their equations for recognizing images?

I am in need of free texture analysis software or MATLAB code. Can you recommend one?

I am working on an application in which I am using a similar pattern of vectors.

Can anybody tell me how to use SOM-plotted data in MATLAB to find the similarity? It only shows the relative input vectors in every neuron.

Please suggest, with an example, a data mining technique: how can I identify the attacks in a wireless sensor network from a dataset?

I need to classify raw data, and I hope I can use the particle swarm optimization method to classify part of the data.

I hope to predict the remaining data.

When one is handling a sequence of equations, what about one's semantic perspective? How is one really "aware" of those symbols (and so on)?