Science topic
Computational Intelligence - Science topic
Computational methodologies inspired by naturally occurring phenomena.
Questions related to Computational Intelligence
The 2025 4th International Conference on Computer, Artificial Intelligence and Control Engineering (CAICE 2025) will be held in Hefei, China on January 10-12, 2025.
Conference Website: https://ais.cn/u/YNfu22
---Call for papers---
The topics of interest for submission include, but are not limited to:
(I) Computer
· Edge Computing and Distributed Cloud
· Architecture, Storage and Virtualization
· Cloud Computing Technologies
· Deep Learning and Big Data
· Computer networks
......
(II) Artificial Intelligence
· Artificial Intelligence Applications
· Pattern Recognition and Machine Learning
· AI Languages and Programming Techniques
· Cybersecurity and AI
· Artificial Intelligence and Evolutionary Algorithms
......
(III) Control Engineering
· Automatic Control Principles and Technology
· Design, modeling and control of precision motion systems
· Vibration analysis and control
· Fuzzy control and its applications
· Fractional order system and control
· Flexible robotics, soft robotics
· Smart automation
---Publication---
All papers will be reviewed by two or three expert reviewers from the conference committees. After a careful reviewing process, all accepted papers will be published in the ACM conference proceedings and submitted to EI Compendex and Scopus for indexing.
---Important Dates---
Full Paper Submission Date: December 19, 2024
Notification Date: December 25, 2024
Registration Date: December 29, 2024
Conference Dates: January 10-12, 2025
--- Paper Submission---
Please send the full paper (Word + PDF) to the Submission System:
* apologies for cross-posting*
*** LAST CFP ***
IEEE Symposium on Explainable, Responsible, and Trustworthy CI (IEEE CITREx)
2025 IEEE Symposium Series on Computational Intelligence (SSCI)
17-20 March 2025 - Trondheim, Norway
***************************************************
The development of explainable, responsible, and trustworthy computational intelligence (CI) and artificial intelligence (AI) is crucial for transparency, accountability, and user confidence. By ensuring AI/CI systems are understandable, monitored, and reliable, we can make decisions that comply with regulations and ethical principles, thereby reducing risks and biases. This empowers users to engage with these technologies confidently, promoting a harmonious integration of AI into daily life.
This symposium aims to discuss the ethical principles governing AI/CI technology in light of current regulatory efforts and their impact on operators, users, and stakeholders. It will explore how to make AI/CI decisions clear and interpretable to enhance transparency and trust. Additionally, the symposium will address balancing technological ecological footprints with economic benefits, managing automation's workforce impact, protecting privacy, and the legal implications of AI/CI in autonomous systems, among other topics of interest.
IEEE SSCI is widely recognized for cultivating the interchange of state-of-the-art theories and sophisticated algorithms within the broad realm of Computational Intelligence applications. The symposia provide for cross-pollination of research concepts, fostering an environment that facilitates future inter- and intra-community collaborations.
IMPORTANT DATES UPDATED
Title and Abstract Submission: NEW DEADLINE 22 September 2024
Full/Short Paper Submission: NEW DEADLINE 1 October 2024
SYMPOSIUM KEYNOTE SPEAKER
- Federico Cabitza (Università degli studi di Milano-Bicocca)
CONFERENCE KEYNOTE SPEAKERS
- Keeley Crockett (Manchester Metropolitan University; Chair of the IEEE Technical Committee SHIELD: Ethical, Legal, Social, Environmental and Human Dimensions of AI/CI)
- Metin Sitti (President and Professor, Koç University, Istanbul) - Physical Intelligence of Small-scale Robots
- Nadia Magnenat Thalmann (Research Director of MIRALab, University of Geneva, Switzerland) - Social Robots and LLMs: A Step Towards Behaving More Human-like
We are looking forward to seeing you in Trondheim!
Call for Papers: The 4th International Conference on Computer, Internet of Things and Control Engineering (CITCE 2024)
Call for papers: 2024 4th International Conference on Computer, Internet of Things and Control Engineering (CITCE 2024) will be held on November 1-3, 2024 in Wuhan, China as a hybrid meeting.
Conference website(English):https://ais.cn/u/IJfQVv
Important Information
Conference website (submission link): https://ais.cn/u/IJfQVv
Dates: November 1-3, 2024
Location: Wuhan, China
Indexing: EI Compendex, Scopus
Conference Details
The 4th International Conference on Computer, Internet of Things and Control Engineering (CITCE 2024) will be held on November 1-3, 2024 in Wuhan, China. Centered on the latest research in computer science, the Internet of Things, and control engineering, CITCE 2024 offers an international platform for experts, professors, scholars, and engineers from universities, research institutes, and enterprises at home and abroad to share professional experience, expand professional networks, and present research results. The conference aims to promote the development and application of theory and technology in these fields in academia and industry, and to help participants establish business and research contacts and find global partners for future collaboration. Experts and scholars from universities and research institutions, industry representatives, and other interested parties are cordially invited to attend and exchange ideas.
Topics
1. Computer Science: algorithms, image processing, computer vision, machine learning, intelligent data analysis and data mining, mathematical and computer modeling, artificial intelligence, neural networks, system security, robotics and automation, information systems, high-performance computing, network communications, human-computer interaction, computer modeling, etc.
2. Internet of Things: AI technologies and applications, CPS technology and intelligent information systems, multi-network resource sharing in IoT environments, IoT architecture, cloud computing and big data in IoT, edge intelligence and blockchain, smart cities, IoT wearable devices, smart homes, IoT and sensor technology, etc.
3. Control Engineering: systems and automation, electrical systems, process control, industrial control technology, computer science and engineering, electronic engineering, software engineering, control technology, sensor networks, mobile Internet, wireless networks and systems, computer control systems, adaptive and optimal control, intelligent control, electrical automation, intelligent control and intelligent systems, intelligent management and decision-making, distributed control, drive motor and control technology, power battery management and maintenance technology, micro-sensors and actuators, mobile robots, etc.
** Other related topics are also welcome.
Publication
After strict review by 2-3 committee experts, accepted papers will be published in the ACM International Conference Proceeding Series (ACM ICPS, ISBN: 979-8-4007-1184-8) and submitted to the ACM Digital Library, EI Compendex, and Scopus for indexing. Indexing of this conference's proceedings has so far been stable.
Participation
1. Author registration: one free author place per accepted paper.
2. Participation types:
(1) Oral presentation: a 10-15 minute presentation in English with slides.
* Open to all submitting authors and self-funded attendees. Give a 10-15 minute English presentation on your paper or the research behind it; prepare your own slides (no template required) and submit them before the conference as instructed by email. Contact the conference secretary for details.
(2) Poster presentation: a self-made electronic poster displayed at the conference.
* Open to all submitting authors and self-funded attendees. Format: English, A1 size, portrait, self-made. Send the poster image to the conference email IC_CITCE@163.com with the subject and file name in the format: Poster + Name + Order number.
(3) Attendance only: for non-authors attending as audience.
* Open to self-funded attendees only.
(4) Submission and registration link: https://ais.cn/u/IJfQVv
Dear fellow language educators, linguistic researchers, computational intelligence scientists and any interested peers:
I would like to introduce you to the International Council of Academics for Progressive Education (I.C.A.P.E.). We are a global initiative engaged in research, publication, and networking through proactive dialogue and discussion among interested peers, in order to improve language education in secondary and higher education, challenge archaic curricula and learning models, and identify, communicate, and eventually implement necessary changes, novel approaches, and brilliant ideas in educational science, with a special focus on language education and its intersections with related research fields, such as the integration of A.I. in curriculum development.
We are happy to welcome you on board, publish your ideas, and put you in contact with interested language schools, researchers, and scientists around the globe. We maintain close relationships with journals, university faculties, research projects, and much else.
For more information, please visit www.icape-edu.com. Self-evidently, there is no spam, no fee structure, and all of our endeavors are not-for-profit. Our only mission is contributing to better education and accelerating the implementation of changes.
Best regards!
2024 3rd International Conference on Biomedical and Intelligent Systems (IC-BIS 2024) will be held from April 26 to 28, 2024, in Nanchang, China.
It is a comprehensive conference which focuses on Biomedical Engineering and Artificial Intelligent Systems. The main objective of IC-BIS 2024 is to address and deliberate on the latest technical status and recent trends in the research and applications of Biomedical Engineering and Bioinformatics. IC-BIS 2024 provides an opportunity for the scientists, engineers, industrialists, scholars and other professionals from all over the world to interact and exchange their new ideas and research outcomes in related fields and develop possible chances for future collaboration. The conference also aims at motivating the next generation of researchers to promote their interests in Biomedical Engineering and Artificial Intelligent Systems.
Important Dates:
Registration Deadline: March 26, 2024
Final Paper Submission Date: April 22, 2024
Conference Dates: April 26-28, 2024
---Call For Papers---
The topics of interest for submission include, but are not limited to:
- Biomedical Signal Processing and Medical Information
· Biomedical signal processing
· Medical big data and machine learning
· Application of artificial intelligence to biomedical signal processing
......
- Bioinformatics & Intelligent Computing
· Algorithms and Software Tools
· Algorithms, models, software, and tools in Bioinformatics
· Biostatistics and Stochastic Models
......
- Gene regulation, expression, identification and network
· High-performance computational systems biology and parallel implementations
· Image Analysis
· Inference from high-throughput experimental data
......
For More Details please visit:
Dear Colleague,
We are currently accepting submissions for our upcoming Special Issue entitled "Applications of Computational Intelligence in Electrical Power Systems", which will be published in the Learning and NonLinear Models (LNLM) journal in the 2nd semester of 2024. The Special Issue is open to both original research articles and review articles, and the deadline for submission is March 30, 2024. More details can be found in the Call for Papers attached to this e-mail.
Learning and NonLinear Models (LNLM, http://abricom.org.br/lnlm, ISSN 1676-2789) is the official journal of the Brazilian Society for Computational Intelligence (SBIC) and has been published online since 2003. The journal publishes papers reporting theoretical and practical advances in several areas of Computational Intelligence, and all papers are Open-Access and indexed with DOI. Since LNLM is in the process of internationalization, all papers included in this Special Issue must be written in English. More details can be found at http://abricom.org.br/lnlm/special-issues/.
Best regards,
=============
Guest Editors
=============
André E. Lazzaretti. Federal University of Technology Paraná (Curitiba, Brazil). https://orcid.org/0000-0003-1861-3369
Wesley Angelino de Souza, Federal University of Technology Paraná (Cornélio Procópio, Brazil). https://orcid.org/0000-0002-3431-6359
Do you work with computational intelligence/ML/AI and big data? If so, the deadline for submissions to the IEEE CIBD track (https://lnkd.in/eUkKcBZr) of SSCI has been extended until the 15th of August. Submit your papers here: https://lnkd.in/eQVDk8gy
Greetings, everyone! I have a small question. I'm currently working on my bachelor's thesis, titled "Estimation of the wind potential through computational intelligence tools for the production of electrical energy in the district of Ocucaje, Ica". I really wanted to combine wind energy and AI in one thesis, but I'm kind of stuck right now. The region of Ica as a whole (Ocucaje is part of the region of Ica) already has a wind potential estimation made by the government in 2016. So I'm unsure whether I'm repeating what the government did; should I shape my title differently or add anything else? I would appreciate any comments. Have a good day.
As technology continues to advance and improve, could artificial intelligence equipped with new Big Data analytics solutions and full access to data on the Internet be equipped in the future with solutions for self-learning and self-improvement, and perhaps also with a kind of artificial consciousness, and thereby ultimately prove to be more intelligent than humans?
Year after year, microprocessors are developed to process ever larger data sets ever faster. Ever more capable artificial intelligence solutions are also being built, and successive generations of artificial intelligence will be created, much more capable than, for example, the technology on which ChatGPT was developed, which is offered on the Internet in open access. However, the ever-increasing computational capabilities of ICT and the ever-larger, multi-terabyte datasets that can be efficiently processed do not by themselves ensure a highly intelligent outcome. An important issue is the use of increasingly sophisticated, multi-criteria, complex analytical models, including models composed of multiple algorithms and equipped with the capacity for self-learning and self-improvement using large amounts of data and information retrieved from, for example, the Internet.

Still, it cannot be ruled out that in the future a constantly improved artificial intelligence equipped with new Big Data Analytics solutions and full access to data on the Internet may prove more intelligent in this sense, yet not be smarter than humans, even if it is equipped with a kind of artificial awareness. Perhaps citizens will in the future receive on their smartphones much more capable personal advisors than those currently in use, which will provide highly professional answers to users' questions on the basis of data available on the Internet, processed in real time on Big Data Analytics platforms using new generations of machine learning, deep learning, and artificial intelligence technology. Beyond that, there is the question of whether such solutions will significantly deepen the problem of new cybercrime techniques and the need for new cybersecurity systems.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Within the framework of the technological advances that are taking place and which are constantly being improved, could artificial intelligence equipped with new Big Data Analytics solutions and full access to data on the Internet in the future be equipped with solutions for self-learning and self-improvement and perhaps also with a kind of artificial consciousness and therefore ultimately prove to be more intelligent than humans?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
Are there any online summer schools designed for post-docs and young faculty who are willing to deepen their skills in computational intelligence and related areas, or to earn credit transfer points? The preferred areas are engineering, science, and intellectual property rights. The objective is to stimulate involvement in rapidly evolving fields and to foster participation in the adventure of research.
Quantum computing is fundamentally a synergistic combination of quantum physics, classical information theory, and computer science. As the discipline evolved, computationally intensive quantum systems came into existence in the late 1990s. Quantum-inspired computational intelligence is an emergent field of research that concentrates on applying the principles of quantum computing to computational intelligence methods.
source: Applied Quantum Computing (bonviewpress.com)
Hi,
Most researchers know the R Views website, which is:
Please, I am wondering whether this website lists all R packages available for researchers.
Thanks & Best wishes
Osman
Hi,
Does anyone know how to deal with unresponsive production editors at Hindawi, especially those handling the journal "Computational Intelligence and Neuroscience"? A manuscript was accepted and the APC was paid some 4 months ago; since then, nothing has happened.
Also, does anyone know how to contact Hindawi with a complaint? I have contacted academic editors, production editors, research integrity specialists, help, journal, and press addresses, and all of these attempts over 4 months have been fruitless.
Is Hindawi considered a prestigious publishing house?
I have submitted my article to the "Computational Intelligence and Neuroscience" journal from the publisher Hindawi.
Is it worth publishing an article here?
How can we support innovation management through computational intelligence tools?
Dear researchers,
I would appreciate it if you let me know your opinions about the disadvantages of the SOM clustering algorithm.
I am programming a scheduling system using simulated annealing and I want to know whether this heuristic is suitable.
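For a concrete feel for how simulated annealing applies to scheduling, here is a minimal, illustrative sketch that assigns jobs to machines to minimize makespan. All names, the neighbourhood move, and the cooling parameters are assumptions for illustration, not any particular scheduling system:

```python
import math
import random

def makespan(assignment, durations, n_machines):
    """Load of the busiest machine (the quantity to minimize)."""
    loads = [0.0] * n_machines
    for job, machine in enumerate(assignment):
        loads[machine] += durations[job]
    return max(loads)

def anneal(durations, n_machines, t0=10.0, cooling=0.995, steps=5000, seed=0):
    """Simulated annealing over job-to-machine assignments."""
    rng = random.Random(seed)
    current = [rng.randrange(n_machines) for _ in durations]
    best = current[:]
    t = t0
    for _ in range(steps):
        # Neighbour: move one random job to a random machine.
        cand = current[:]
        cand[rng.randrange(len(durations))] = rng.randrange(n_machines)
        delta = (makespan(cand, durations, n_machines)
                 - makespan(current, durations, n_machines))
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
        if makespan(current, durations, n_machines) < makespan(best, durations, n_machines):
            best = current[:]
        t *= cooling  # geometric cooling schedule
    return best

durations = [4, 3, 7, 2, 5, 1, 6]
best = anneal(durations, n_machines=3)
print(makespan(best, durations, 3))
```

Whether SA is suitable mostly depends on whether you can define a cheap neighbourhood move and objective evaluation; for schedules with hard constraints, the move operator must be designed to keep candidates feasible.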
At present, the dominant opinion is that artificial intelligence will find many applications in IT, teleinformatics, new Internet media, robotics, industrial automation, etc. in the 21st century.
There are many potential applications. On the other hand, there are research concepts suggesting that there will not be one artificial intelligence but many different versions and levels of artificial intelligence.
In view of the above, the current question is: What will be the potential uses of artificial intelligence in the future?
Please answer and comment. I invite you to the discussion.
According to the Kardashev scale, could a Type I civilization develop an artificially intelligent system that surpasses its own intelligence, matching the intelligence of higher civilizations in the hierarchy?
I have applied metaheuristic algorithms such as PSO and GA in my research field, recommender systems, but I have found that these algorithms are very time-consuming and not really practical, even though the results are better than those of existing algorithms. In recommender systems, we need fast algorithms. Thank you.
What are the feasible research areas related to coronavirus (COVID-19) in the context of dynamics of epidemiological disease transmission, networks defined by graphs, computational intelligence and machine learning?
I'm seeking for new ways of using dynamical mathematical models and evolving computational intelligence systems to forecast early warning signs that could possibly slow down pathogens spread and also finding better ways in the future to fight infectious outbreaks.
Suppose I have a network with layer widths 32x128x256x64. Consider, for example, two kernels out of the 256 in the third layer, each learning some feature.
My main question: is it possible for these two kernels to learn the same (or a nearly identical) feature while being in the same layer?
Questions:
- If yes, is there any way to avoid this, and what is the method?
- If yes, is it possible to use eigenvalue or another decomposition method to retain only independent signals (kernels) and discard the similar ones?
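On the detection side, one simple check is to flatten each kernel's weights and compute pairwise cosine similarities: near-collinear kernels are likely encoding the same feature. A minimal sketch (kernel values are invented for illustration):

```python
import math

def cosine(u, v):
    """Cosine similarity between two flattened weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def redundant_pairs(kernels, threshold=0.95):
    """Index pairs of kernels whose flattened weights are nearly collinear."""
    pairs = []
    for i in range(len(kernels)):
        for j in range(i + 1, len(kernels)):
            if abs(cosine(kernels[i], kernels[j])) >= threshold:
                pairs.append((i, j))
    return pairs

# Toy example: kernel 2 is a scaled copy of kernel 0, so the pair is flagged.
k0 = [0.2, -0.5, 0.1, 0.4]
k1 = [-0.3, 0.1, 0.9, -0.2]
k2 = [0.4, -1.0, 0.2, 0.8]   # 2 * k0
print(redundant_pairs([k0, k1, k2]))  # → [(0, 2)]
```

To discourage such redundancy during training, orthogonality or decorrelation penalties on the layer's weight matrix are one commonly cited remedy; an SVD of the stacked flattened kernels can likewise reveal how many independent directions the layer actually uses.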
I am in need of a free texture analysis software or matlab code. Can you recommend one?
- Could you please point me out to some Computer science, and Computer Engineering applications modeled, described, or analyzed using partial differential equations?
- Preferably, involving heat, reaction-diffusion, Poisson, or Wave equation.
- If possible in fuzzy environment.
Best regards
Sarmad.
Robots are, due to cost, a limited resource for teaching, but a useful one. They engage students and make principles concrete, but it is not possible to have one robot per student, for both cost and space reasons. How can I get the same benefits as robots for teaching AI through other methods?
Article Neural nets
Recently, several works have been published on predictive analytics:
- Prediction-based Resource Allocation using LSTM and Minimum Cost and Maximum Flow Algorithm by Gyunam Park and Minseok Song (https://ieeexplore.ieee.org/abstract/document/8786063)
- Using Convolution Neural Networks for Predictive Process Analytics by Vincenzo Pasquadibisceglie et al. (https://ieeexplore.ieee.org/document/8786066)
Besides, there is a paper on how to discover a process model using neural networks:
My questions for this discussion are:
- It seems that the field for machine learning approaches in process mining is not limited to prediction/discovery. Can we formulate the areas of possible applications?
- Can we use process mining techniques in machine learning? Can we, for example, mine how neural networks learn (in order to better understand their predictions)?
- If you believe that the subjects are completely incompatible, then, please, share your argument. Why do you think so?
- Finally, please share known papers in which: process mining (PM) is applied in machine learning (ML) research, ML is applied in PM research, or both PM and ML are applied to solve a problem. I believe this will be useful for any reader of this discussion.
For my project I need to analyze plant structures in 3d.
I know PCL is a good library for this, but is there perhaps another, better library that is growing faster?
Can I get more details on Hidden Markov Models and their equations for recognizing images?
I need an urgent answer; please share links or a few SCI-indexed research papers.
Hi
I want three possible research problems in Artificial Intelligence (machine learning, computational intelligence, data mining, etc.).
Any help, please?
I'm searching for a good tutorial book to understand the principles of machine learning theory, in order to apply it directly to problems in the field of bio-photonics, classifying some species. I use Matlab, so I would prefer to use it as the solver.
Hey all, I am working on building a firewall for the prevention of SQL injection in websites. I am planning to do this with an artificial neural network. I need your comments and suggestions on this.
If anybody is working on something similar, guidance is needed.
thanks in advance
Chetan
Hello.
Is there a fully functional NSGA-III implementation?
There is an implementation in Java and C++; jMetal bases its implementation on http://web.ntnu.edu.tw/~tcchiang/publications/nsga3cpp/nsga3cpp-validation.htm, the author of which says the algorithm does not scale well beyond 8 objectives.
I'm looking to use NSGA-III on many-objective problems (which it is designed to handle) i.e. 15 objectives.
Thank you
Evolutionary Algorithms: https://www.youtube.com/watch?v=L--IxUH4fac
Multi-Objective Problems: https://www.youtube.com/watch?v=56JOMkPvoKs
I have written some fuzzy rules for emotion modelling, but manually I can write only a very limited number of rules. Is there some mechanism to generate these rules automatically?
I need software capable of running cellular automata or multi-agent models to simulate urban growth, preferably free and, better yet, supported in the ArcGIS environment. The aim is to analyze their applicability to simulating urban growth in Argentine cities.
I have seen journals that used fuzzy logic algorithms for energy conservation in wireless sensor networks. But is it possible or logical to implement fuzzy set rules in underwater sensor networks, as the parameters and properties are different?
Hello Sir/Madam,
I would like suggestions about open problems/challenges in the fields of Evolutionary Computing and Computational Intelligence for thesis work.
Kindly guide me.
Thanks
I am working on computational intelligence and want to start research on innovative ideas in this area. I am interested in ACO and PSO. Can anyone suggest the best sub-domain that is active nowadays?
Global search vs local search.
There are many different kinds of soft computing methods used for the identification of complex systems, including robotic manipulators and mechatronic systems. But without an intelligent choice of method to extract the true dynamics of the system under study, we will not succeed in modelling it well.
I'm trying to write rules (if-then) in Protégé 4.2 using the rule editor, but I am unable to find out how to execute those rules using the Pellet reasoner and deduce inferences from them.
In decision-making applications based on neutrosophic logic, how should the following be sorted from best to worst:
(T, F, I) : (1,0,0) (1,0,1), (1,1,0), (1,1,1), (0,0,0), (0,1,1), (0,1,0)
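One way to order such triples is a score function. A commonly used single-valued neutrosophic score is s = (2 + T - F - I)/3, with higher scores being better; note this choice is an assumption, as several alternative score and accuracy functions exist in the literature. A small sketch:

```python
def score(t, f, i):
    """A common neutrosophic score function: higher is better.
    (An assumption; other score functions exist in the literature.)"""
    return (2 + t - f - i) / 3

# The (T, F, I) triples from the question.
triples = [(1, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1),
           (0, 0, 0), (0, 1, 1), (0, 1, 0)]
ranked = sorted(triples, key=lambda tfi: score(*tfi), reverse=True)
print(ranked)
```

Under this score, (1,0,0) ranks best and (0,1,1) worst; ties such as (1,0,1), (1,1,0), and (0,0,0), which all score 2/3, would need a secondary (accuracy) function to break.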
- Could anyone please point me to some Biochemistry, Genetics, or Molecular Biology applications modeled, described, or analyzed using partial differential equations? The model itself would be very appreciated.
- Preferably involving the heat, reaction-diffusion, Poisson, or wave equation, and if possible in a fuzzy environment.
Best regards
Sarmad.
Did anyone work on image segmentation (edge detection) using meta-heuristic swarm intelligence methods (ACO, PSO, ABC, GA) with graph theory in Matlab?
Could you cite some articles or Matlab code?
thank you
What is the relation between Machine Intelligence and Smart Networks?
I am doing my dissertation work on data streams with concept drift using the MOA tool. In one thesis I have seen, they used real data sets as well as synthetic data sets to find concept drift.
I am working on classifying mammogram images using computational intelligence. Is there a database with images that can be opened in Windows 7? There are a few, but they are supported by a Unix environment. If anyone has experience working in the field, please do share.
My friend and I have read the paper about FURIA, but we are still confused, even after going through FURIA's Java code (so many rows!).
We got an exercise from my lecturer: if we use and apply this algorithm, we must know and understand FURIA well.
After reading more papers, unfortunately, we are confused again.
Maybe, you can help us, there are some questions:
- Please explain the steps from input dataset to output rules.
- The dataset is partitioned into growing and pruning data, by default 3 folds. Is it true that 2 folds are used for growing data (training data) and 1 fold for pruning data (testing data)? Then what does the model look like with 10-fold cross-validation?
- What is the role of the a-priori distribution there? Is it about counting frequent itemsets, to prepare candidate rules?
- Then, how is purity calculated, and how is the fuzzification phase carried out?
- Finally, how is the CF computed in the last step? How is the confidence of a rule computed?
Your explanation would be very helpful.
thanks
Hi,
Active learning is based on learning from labelled and unlabelled pairs together, where all the data have the same characteristics, and its limitation is that the unlabelled pairs are guessed during the learning procedure.
Please, how can supervised learning replace active learning in learning to rank? And how can the guessing of unlabelled pairs cause bias in the ranking model?
Thanks & Best wishes
Osman
Hi,
Clustering terms or documents in a document collection is mainly based on similarity under the term-weighting scheme used.
Please, what are the shortcomings and limitations of clustering techniques in Information Retrieval when using TF-IDF and its variations?
Thanks & Best wishes,
Osman
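For concreteness, a minimal TF-IDF sketch (toy documents, illustrative only) that also hints at one frequently cited limitation: the weights are purely lexical, so two documents about the same topic that use disjoint vocabularies (synonymy) end up with no overlapping weighted terms and hence near-zero similarity:

```python
import math
from collections import Counter

def tfidf(docs):
    """Classic TF-IDF weights: tf * log(N / df), per tokenized document."""
    n = len(docs)
    df = Counter()                  # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)           # raw term frequency
        weights.append({term: tf[term] * math.log(n / df[term]) for term in tf})
    return weights

docs = [["cat", "sat", "mat"], ["dog", "sat", "log"], ["cat", "dog", "pet"]]
w = tfidf(docs)
print(w[0]["cat"])   # "cat" appears in 2 of the 3 documents
```

Other commonly discussed limitations for clustering: the bag-of-words assumption discards word order, very high dimensionality and sparsity hurt distance-based clustering, and IDF estimated on a small or skewed collection can over-weight rare noise terms.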
Projects for speech recognition; prototypes and small applications, preferably in Delphi.
Hi
I am working on sarcasm detection in Telugu text. Can anyone tell me why a rule-based approach outperforms decision trees and other classifiers?
Dear Colleagues,
We are organizing a session at the 2017 Fall AGU meeting, titled: " Applications of machine learning and novel statistical approaches to the study of weather and climate using large datasets" by Jiali Wang, Paul Loikith, Won Chang, and Roman Olson.
The motivation and goals of the session are described below.
“The increasing volume of climate data from observations, analysis products, and climate model output presents the climate science community with unprecedented data analysis challenges and opportunities. This challenge becomes greater when targeting extreme events as standard data reduction techniques like multi-model ensemble averaging reduce the magnitude of extremes. To address this, several sophisticated statistical and machine learning algorithms have been developed and applied to distill critical information from this growing trove of climate data. This session invites speakers that develop and/or apply machine learning or related novel statistical techniques to mine climate data for the study of climate projection, climate/weather extreme events, and more.”
We hope that you are interested in submitting an abstract to this session.
Thanks,
Jiali
Hi,
Please, which LETOR dataset contains user clicks and user dwell time as features?
Also, I am wondering whether any Information Retrieval tool is available that can produce a similar dataset?
Thanks & Best Wishes
Osman
Hi
I have a question about multi-objective virtual machine placement. Should we use jMetal or another multi-objective framework for the VMP problem in CloudSim, or not? Can anyone clarify this for me? jMetal is quite confusing!
How can I see the result of BFD VM placement in CloudSim?
One more thing: has anyone applied the ACO, PSO, GA, or BBO algorithm in a multi-objective or single-objective manner in CloudSim, so that I can become familiar with the procedure?
Thanks
I'm modelling a small amount of data, and the performance plot shows that the model is clearly overfitting. It's a prediction problem and I'm using a back-propagation neural network. The input data is divided across different binary systems. I was thinking about combining the systems and modelling the pooled data, since neural network performance generally increases with the amount of data; on the other hand, modelling the systems individually gives a more direct relationship for each pair of data, since each system has its own properties. The regression plot I obtained showed very good agreement between output and target data, with R values of 1. However, the performance plot showed that the test curve increased significantly before the validation curve increased, which is a clear sign of overfitting. Even after re-initializing the weights I observed the same trend, so I am looking for ways to improve the network's performance.
If the regression value is 1 for the test set, do I need to decrease the fraction of the training and validation sets to find the limit at which the results on the test set start falling below some target (e.g. 0.995)?
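Rather than shrinking the training fraction, a common remedy for the pattern described (validation/test loss rising while training loss keeps falling) is early stopping: keep the weights from the epoch with the best validation loss. A toy sketch of the stopping rule, with an invented validation-loss curve for illustration:

```python
def early_stopping(val_losses, patience=3):
    """Return the epoch whose model should be kept: training stops once the
    validation loss has not improved for `patience` consecutive epochs."""
    best, best_epoch, wait = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                return best_epoch   # roll back to the best checkpoint
    return best_epoch

# Toy validation curve: improves, then degrades as overfitting sets in.
curve = [1.0, 0.7, 0.5, 0.4, 0.45, 0.5, 0.6, 0.7]
print(early_stopping(curve))  # → 3
```

With small datasets, regularization (weight decay, fewer hidden units) and repeated cross-validation are usually more informative than a single train/validation/test split; an R of exactly 1 on a test set is itself worth double-checking for data leakage between splits.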
How do I select an evolutionary algorithm (GA, PSO, ACO, etc.) among the many existing ones for feature optimization? I mean, what are the parameters we need to consider when selecting such an algorithm for a pattern classification problem?
Please reply with Python code for training and testing agents in an FPS game AI research platform similar to ViZDoom. ViZDoom is a Doom-based AI research platform for visual reinforcement learning. Besides code, any materials such as slides, Word documents, or PDFs would be really appreciated. Thanks.
Hi,
Most machine learning and computational intelligence papers include neither a comparison in terms of computational runtime vs. accuracy nor packages for research replication. Most researchers claim that their proposed approaches outperform other machine learning or computational intelligence techniques. However, if those other techniques were given more training time, they might outperform the newly proposed approaches in terms of accuracy.
Please, I am wondering why most researchers do not include a clear comparison in terms of time vs. accuracy, together with packages available to replicate their studies. This is needed because machine learning and similar disciplines are heuristic, and replication results may vary from one researcher to another.
thanks
Osman
I would like to learn how artificial intelligence methods, such as rule-based systems, are used in document summarization.
I'm still working to understand estimation of distribution algorithms (EDA) as applied to genetic algorithms. Can the probabilistic models used by EDA for generating new solutions be used by itself? For example, in Bayesian optimization algorithms (BOA) can the Bayesian network that is produced be extracted and used separately as a Bayesian classifier?
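In the univariate case the answer is easy to see: the model an EDA such as UMDA learns is just a vector of per-bit marginal probabilities, and that vector alone can generate new solutions after the evolutionary loop ends. A minimal OneMax sketch (all parameters are illustrative):

```python
import random

def umda_onemax(n_bits=20, pop=60, elite=20, gens=30, seed=1):
    """Univariate Marginal Distribution Algorithm on OneMax.
    The 'model' is nothing but a vector of per-bit probabilities."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits
    for _ in range(gens):
        population = [[1 if rng.random() < p else 0 for p in probs]
                      for _ in range(pop)]
        population.sort(key=sum, reverse=True)
        selected = population[:elite]
        # Re-estimate the marginal model from the elite set.
        probs = [sum(ind[i] for ind in selected) / elite for i in range(n_bits)]
        # Clamp away from 0/1 so every bit pattern stays reachable.
        probs = [min(max(p, 0.05), 0.95) for p in probs]
    return probs

model = umda_onemax()
# The model alone, detached from the GA loop, generates new candidates:
rng = random.Random(2)
sample = [1 if rng.random() < p else 0 for p in model]
print(sum(sample))
```

For BOA specifically, the learned Bayesian network is likewise a standalone generative model that can be sampled or queried; whether it performs well as a classifier is a separate question, since its structure was fitted to describe high-fitness solutions rather than class boundaries.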
Hello Friends,
I recently learnt that there is a knowledge discovery in databases (KDD, "data mining") software framework developed for use in research and teaching, named ELKI, which includes implementations of many clustering algorithms; additionally, they are far more memory-efficient.
My intention is to use those implementations from Python, and yet ELKI implemented everything in Java. Has anyone tried to do a similar thing? If yes, please share your thoughts on my approach.
Thanks in advance
Abhishek
In my artificial intelligence application, I am working in a high-dimensional space with the following metric: The set of dimensions is partitioned into domains and the distance between two points is computed by measuring the Euclidean distance with respect to each of these domains and then summing up these group-wise distances. This is basically a combination of the Euclidean (within-domain) and the Manhattan (between-domains) distances.
For my application, I need to compute the hypervolume of a hyperball in this space (i.e., the set of all points with a distance of less than r to the origin). I managed to find a formula for this hypervolume and also a mathematical proof for it. As I spent quite some time on developing this, I would like to publish my findings somewhere.
The problem is that the proof itself is not really related to artificial intelligence any more, so I highly doubt that any of the typical AI venues would publish this. It seems that this is more of a mathematical problem than anything else, but I don't know what could be an appropriate journal/conference/workshop for something like this.
If anyone can point me towards something, that would be greatly appreciated. Please let me also know if I need to elaborate more. Thanks in advance!
I am reading a paper in which I found the phrase "The reference template histogram used in histogram matching was generated by pooling all the image data used in this study, and subsequently scaled". I am not able to understand how a histogram is generated from multiple images. Can anyone explain? Here is the reference of the paper:
Chang, P. D. "Fully convolutional neural networks with hyperlocal features for brain tumor segmentation." Proceedings MICCAI-BRATS Workshop. 2016.
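My reading of "pooling" here (an assumption, not confirmed by the paper): treat the pixels of all images as one large sample and histogram them together, which is equivalent to summing the per-image histograms bin by bin. A toy sketch with invented pixel values:

```python
def pooled_histogram(images, n_bins=8, max_val=255):
    """Reference histogram built from all pixels of all images together,
    i.e. the bin-wise sum of the individual image histograms."""
    hist = [0] * n_bins
    for img in images:
        for px in img:
            b = min(px * n_bins // (max_val + 1), n_bins - 1)
            hist[b] += 1
    return hist

# Toy "images" as flat pixel-intensity lists.
img_a = [0, 10, 40, 200, 255]
img_b = [5, 100, 130, 250]
print(pooled_histogram([img_a, img_b]))
```

The "subsequently scaled" part would then normalize this pooled histogram (e.g. to a probability distribution) before using it as the matching template.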
I am trying to initialize the particles' velocity and position with the structure of a Cellular Neural Network in Matlab.
Weka had an add-on on the topic, but details of the algorithm are not available.
The basic model for morphogenesis proposed by Turing is not difficult to grasp mathematically, but my problem is how to create the actual algorithm, in pseudocode, that will fit in every cell of a swarm of agents.
dG/dt = -m1G + f(P)
dP/dt = -m2P + f(G)
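A per-cell update rule for these two equations can be sketched as an explicit-Euler step; note the saturating production term f and all parameter values below are illustrative assumptions, and actual Turing pattern formation additionally requires diffusion coupling between neighbouring cells:

```python
import math

def f(x):
    """Illustrative saturating production term (an assumption; Turing's
    original analysis used linearized kinetics)."""
    return 1.0 / (1.0 + math.exp(-x))

def step(G, P, m1, m2, dt):
    """One explicit-Euler step of  dG/dt = -m1*G + f(P),  dP/dt = -m2*P + f(G)."""
    dG = -m1 * G + f(P)
    dP = -m2 * P + f(G)
    return G + dt * dG, P + dt * dP

# Per-cell loop: every agent in the swarm runs the same local rule.
cells = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]
for _ in range(1000):
    cells = [step(G, P, m1=1.0, m2=1.0, dt=0.01) for G, P in cells]
print(cells[0])
```

With these decay rates each isolated cell relaxes to the fixed point G = f(P), P = f(G); adding a diffusion term over each cell's neighbours (with different diffusion rates for G and P) is what destabilizes this uniform state into spatial patterns.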
How are classifiers like KNN or Naive Bayes, or models like K-Means or GMM, from machine learning implemented in practice in the field of software engineering? What is the basic intuition behind the practical approach?
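As one concrete intuition, here is a tiny k-NN sketch in a software-engineering flavour: predicting whether a module is defect-prone from code metrics of past modules. The features (lines of code, cyclomatic complexity) and labels are invented purely for illustration:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbour vote: there is no training phase, the 'model'
    is simply the stored labelled data."""
    neighbours = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Past modules as ((lines_of_code, cyclomatic_complexity), label).
train = [((120, 3), "clean"), ((900, 25), "buggy"), ((150, 4), "clean"),
         ((700, 30), "buggy"), ((200, 6), "clean")]
print(knn_predict(train, (800, 28)))  # → "buggy"
```

The practical intuition is the same across such applications: encode artifacts (modules, commits, bug reports) as feature vectors, then let similarity to previously labelled cases drive the prediction; in production, the raw loops above are replaced by library implementations with indexing for speed.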
Hello,
I read the article "Utilizing Association Rules for Improving the Performance of Collaborative Filtering", but I did not understand one phase of its algorithm: given A = {a1, a2, ...} and B = {b1, b2, ...}, how can we generate association rules from these two sets? I understand association rules in general, and I know the Apriori algorithm, for example, but here I cannot follow.
In the article Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy (Peng, Long, Ding, 2005), section 2.3 presents a theorem on the equivalence of first-order incremental search for mRMR and the Max-Dependency criterion (using mutual information and MID).
I have created a dataset (details in the attachment) for which the greedy mRMR and Max-Dependency algorithms select different two-feature subsets. What is wrong with my example?