Science topic

Scientific Computing - Science topic

Explore the latest questions and answers in Scientific Computing, and find Scientific Computing experts.
Questions related to Scientific Computing
  • asked a question related to Scientific Computing
Question
2 answers
Will the development of intelligent chatbots available on the Internet and based on generative artificial intelligence affect the development of science negatively or rather positively, including the conduct of scientific research, the analysis of research data, the description of the results obtained, and the writing and publishing of scientific texts?
Recently, ICT and Industry 4.0/5.0 technologies have been developing rapidly, including Big Data, the Internet of Things, cloud computing, digital twins, multi-criteria simulation models, machine learning, deep learning and generative artificial intelligence. Generative artificial intelligence is being developed through the use of, among other things, artificial neural networks. New applications of generative artificial intelligence are determined by the prior training of the GAI system, i.e. teaching it specific skills: performing complex tasks, carrying out new functions, and solving specific problems intelligently using deep learning technology. Increasingly, generative artificial intelligence is being trained to carry out complex research and analytical processes intelligently. One such application is business analytics performed on large sets of data and information, i.e. analytics carried out on computerized Business Intelligence and Big Data Analytics platforms. This type of analytics is applied in various fields of knowledge, various sectors of the economy, and various companies, enterprises, and financial and public institutions. It is also increasingly used to improve research processes and to increase the efficiency of the complex analytical processes carried out in ongoing research across scientific disciplines.
Since OpenAI's ChatGPT chatbot was made available on the Internet in November 2022, more similar intelligent chatbots created by other leading technology companies have successively appeared. The intelligent chatbots available on the Internet are used, among other things, to develop the results of scientific research, to carry out certain stages of analytical processes, and to process the results obtained from research. The growing use of intelligent chatbots in research and analytical processes is due to their simplicity of operation, their open-access availability on the Internet, and their ability to carry out complex research processes, multi-criteria analyses, and intelligent problem solving. On the other hand, the applicability of these chatbots in scientific research is still severely limited by the many imperfections of the databases on which particular generative artificial intelligence systems were trained. It still happens that the data and information on which these GAI systems were trained are in many respects outdated or incomplete, and in the course of the "work" of these tools, data and information can be "creatively" combined, so that the output of a given intelligent chatbot often contains "fictitious facts": newly generated, factually inconsistent content, factual errors, misrepresentations and falsehoods, presented within phraseologically, syntactically and stylistically correct essays, papers, articles, etc. written by generative artificial intelligence.
Besides, the texts and other studies created by these tools often do not show all the data sources, source publications and materials that the chatbot used when drawing on certain data and information to create the text or graphics commissioned by a human. Even when the sources of data and information are partially shown, they are often shown incompletely, inconsistently with current standards for compiling source and bibliographic footnotes. Perhaps, in the future, the intelligent chatbots currently available on the Internet will be sufficiently improved, corrected and supplemented that researchers and scientists will be able to use them in specific research and analytical processes more fully and without the currently existing risks. Therefore, the development of intelligent chatbots available on the Internet and based on generative artificial intelligence can currently affect, both negatively and positively, the development of science, the development of scientific research, the analysis of research data, the description of research results, and the writing and publishing of scientific texts. Whether serious risks are generated, or rather the positive aspects prevail, when currently available intelligent chatbots are applied to particular aspects of research and analytical processes depends on a number of factors. On the one hand, it depends on whether the technology companies developing these intelligent chatbots keep improving and enhancing them and expanding them with new functions and skills. On the other hand, it also depends on whether such research and analytical tools are used prudently by researchers and scientists aware of the drawbacks and limitations associated with their use.
I described the key issues of opportunities and threats to the development of artificial intelligence technology in my article below:
OPPORTUNITIES AND THREATS TO THE DEVELOPMENT OF ARTIFICIAL INTELLIGENCE APPLICATIONS AND THE NEED FOR NORMATIVE REGULATION OF THIS DEVELOPMENT
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Will the development of intelligent chatbots available on the Internet and based on generative artificial intelligence affect the development of science negatively or rather positively, including the conduct of scientific research, the analysis of data derived from research, the description of the results obtained, and the writing and publishing of scientific texts?
Will the development of chatbots based on generative artificial intelligence affect the development of science negatively or rather positively?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
Relevant answer
Answer
We have the tools ready, but the way we use them, guided by our experience and ethics, is what really influences the outcome.
  • asked a question related to Scientific Computing
Question
1 answer
Dear Master's, MPhil, and PhD Research Scholars,
We are thrilled to extend an invitation to the AESIM School on Computational Mathematics, focusing on "Differential Equations, Numerical Methods, and Their Applications with Scientific Computing." This event is organized by the Central Department of Mathematics at Tribhuvan University, Nepal and supported by CIMPA.
Key Details:
Venue: Central Department of Mathematics, Tribhuvan University, Nepal.
Program Dates: May 12-23, 2024
Registration Deadline: June 1, 2024
This unique school offers an excellent opportunity for Master's, MPhil, and PhD research scholars to delve into the realms of Computational Mathematics, specifically exploring Differential Equations, Numerical Methods, and their real-world applications through Scientific Computing.
Best regards,
Jeevan Kafle,
Relevant answer
Answer
Dear Dr. Jeevan Kafle,
Does this program have funding? If yes, what does the funding include?
Thank you in advance.
  • asked a question related to Scientific Computing
Question
4 answers
I am working on a research project involving a system of differential equations with one ordinary differential equation (ODE) and two partial differential equations (PDEs).
I would like to discuss methods and approaches for solving this system efficiently using numerical methods and machine learning.
Can you recommend Python/MATLAB code using numerical techniques and ML?
Any guidance or references would be greatly appreciated.
Relevant answer
Answer
That's a good question. I rarely do this myself, so I am not the best person to answer, but you can find some nice tutorials from experts in this field.
Tutorials:
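As a starting point while looking for tutorials, here is a minimal method-of-lines sketch in Python; the reaction-diffusion-type coupling below is a hypothetical stand-in, since the actual equations are not given:
import numpy as np
from scipy.integrate import solve_ivp

N = 100                                  # spatial grid points
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

def rhs(t, y):
    u, v, s = y[:N], y[N:2*N], y[2*N]    # two PDE fields and one ODE variable
    def lap(w):                          # 1-D Laplacian, crude boundary handling
        return np.concatenate(([0.0], np.diff(w, 2) / dx**2, [0.0]))
    du = 0.01 * lap(u) - u * v + s       # assumed coupling terms, replace with your own
    dv = 0.05 * lap(v) + u * v
    ds = -s + u.mean()                   # ODE coupled to the mean of the field u
    return np.concatenate((du, dv, [ds]))

y0 = np.concatenate((np.exp(-100 * (x - 0.5)**2), np.zeros(N), [1.0]))
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="BDF")   # implicit solver suits stiff diffusion
The spatial derivatives are discretized on the grid, which turns the PDEs into a large ODE system that a standard stiff integrator can handle; an ML-based approach (e.g. a physics-informed network) could then be benchmarked against this classical baseline.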
  • asked a question related to Scientific Computing
Question
3 answers
Can the conduct of analyses and scientific research be significantly improved through the use of Big Data Analytics, artificial intelligence and quantum computers?
Can the possibilities of Big Data Analytics applications supported by artificial intelligence increase significantly when these technologies are applied to the processing of large data sets obtained from the Internet and run on the most powerful quantum computers?
Can the conduct of analyses and scientific research be significantly improved, its efficiency increased, and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and run on the most powerful quantum computers?
What are the analytical capabilities for processing large data sets extracted from the Internet on the most powerful quantum computers, also applying Industry 4.0/5.0 technologies, including generative artificial intelligence and Big Data Analytics?
Can the scale of data processing carried out by the most powerful quantum computers be compared to the data processing carried out in the billions of neurons of the human brain?
In recent years, the digitization of data and archived documents and of data transfer processes has been progressing rapidly.
The progressive digitization of data and archived documents, the digitization of data transfer processes, and the Internetization of communications and economic processes, as well as of research and analytical processes, are becoming typical features of today's developed economies. Accordingly, developed economies in which information and computer technologies develop rapidly and find numerous applications across economic sectors are called information economies, and the societies operating in them information societies. Increasingly, it is said that another technological revolution is currently under way, described as the fourth and, in some respects, already the fifth. Technologies classified as Industry 4.0/5.0 are developing particularly rapidly and finding more and more applications. These technologies, which support research and analytical processes carried out in various institutions and business entities, include Big Data Analytics and artificial intelligence, including generative artificial intelligence built on artificial neural networks and trained through deep learning. The computational capabilities of microprocessors, which process data ever faster, are gradually increasing, and ever larger sets of data and information are being processed. The number of companies, enterprises, and public, financial and scientific institutions that create large data sets, i.e. massive databases of data and information generated in the course of their activities or obtained from the Internet and processed in specific research and analytical processes, is growing. In view of the above, the opportunities for applying Big Data Analytics supported by artificial intelligence to improve research techniques, to increase the efficiency of existing research and analytical processes, and to improve the scientific research being conducted are also growing rapidly. By combining Big Data Analytics with other Industry 4.0/5.0 technologies, including artificial intelligence and quantum computers, in the processing of large data sets, the analytical capabilities of data processing, and thus of analyses and scientific research, can be significantly increased.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the conduct of analyses and scientific research be significantly improved, its efficiency increased, and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and run on the most powerful quantum computers?
Can the applicability of Big Data Analytics supported by artificial intelligence increase significantly when these technologies are applied to the processing of large data sets obtained from the Internet and run on the most powerful quantum computers?
What are the analytical capabilities for processing large data sets extracted from the Internet on the most powerful quantum computers?
And what is your opinion about it?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
Relevant answer
Answer
All three areas are still in development, and they are helping in every research area. But the development of quantum computers would solve all these problems, because the universe follows quantum physics, not classical physics.
  • asked a question related to Scientific Computing
Question
4 answers
What are the possibilities for applying Big Data Analytics backed by artificial intelligence technology to improve research techniques, to increase the efficiency of existing research and analytical processes, and to improve the scientific research being conducted?
The progressive digitization of data and archived documents, the digitization of data transfer processes, and the Internetization of communications and economic processes, as well as of research and analytical processes, are becoming typical features of today's developed economies. Another technological revolution is currently under way, described as the fourth and, in some respects, already the fifth. Technologies categorized as Industry 4.0/5.0 are developing particularly rapidly and finding more and more applications. These technologies, which support research and analytical processes carried out in various institutions and business entities, include Big Data Analytics and artificial intelligence. The computational capabilities of microprocessors, which process data ever faster, are successively increasing, and ever larger sets of data and information are being processed. Databases of data and information extracted from the Internet and processed in specific research and analytical processes are being created. In connection with this, the possibilities for applying Big Data Analytics supported by artificial intelligence technology to improve research techniques, to increase the efficiency of existing research and analytical processes, and to improve the scientific research being conducted are also growing rapidly.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the possibilities for applying Big Data Analytics supported by artificial intelligence technology to improve research techniques, to increase the efficiency of existing research and analytical processes, and to improve the scientific research being conducted?
What are the possibilities for applying Big Data Analytics backed by artificial intelligence technology to improve research techniques?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
The above text is entirely my own work written by me on the basis of my research.
Copyright by Dariusz Prokopowicz
On my profile of the Research Gate portal you can find several publications on Big Data issues. I invite you to scientific cooperation in this problematic area.
Dariusz Prokopowicz
Relevant answer
Answer
In today's digital era, AI is the hot topic, but that does not mean that AI will be able to replace human intelligence.
  • asked a question related to Scientific Computing
Question
3 answers
With ever-growing and ever-improving scientific validation techniques, are rules of thumb still valid? Are they still useful?
Relevant answer
Answer
Rules of thumb, or heuristics, will always be useful, since they are quick and do not require many resources to apply.
  • asked a question related to Scientific Computing
Question
3 answers
I am trying to use the header files from NR3 (Numerical Recipes: The Art of Scientific Computing) with C/C++. Are there any tips?
Relevant answer
Answer
Anselme Russel Affane Moundounga I would recommend you look at the source code of the numerical algorithms in the GNU Scientific Library; it contains state-of-the-art implementations in C/C++ of the algorithms found in NR3, optimized and adapted for edge and real-world cases. It's a great exercise to compare your own code against it as well.
Hope this helps,
Ben
  • asked a question related to Scientific Computing
Question
53 answers
What kind of scientific research dominates in the era of Industry 4.0?
Please provide your suggestions for a question, problem or research thesis on the topic of scientific research in the era of Industry 4.0.
Please reply.
I invite you to the discussion
Best wishes
Relevant answer
Answer
Industry 4.0 as component of the current context of the business:
"Impact of Industry 4.0 on Business Management Style".
  • asked a question related to Scientific Computing
Question
30 answers
There is probably no other science portal that offers researchers as many functions as the ResearchGate portal.
Do you agree with me on the above?
In the context of the above issues, I ask you the following question:
Does the ResearchGate portal offer most of the information services that researchers and scientists need?
Please reply
I invite you to the discussion
Thank you very much
Best wishes
Relevant answer
Answer
A bit too much of what is presented on ResearchGate is final results, with little of the groundwork and the preparation process behind the research. It would be good to also see the kitchen in which the researchers' dinner is prepared.
  • asked a question related to Scientific Computing
Question
8 answers
Many scientists suggest that a good way to analyze the level of innovation in operation, and the generation of innovation in financial institutions, e.g. in banks, is to conduct surveys among managers and directors of departments in these institutions.
How should such surveys be carried out? Which survey method is the most effective? Are online questionnaire forms an effective instrument for carrying out surveys?
What other research techniques can be used to investigate the level of innovation in operation and the generation of innovation in financial institutions?
Please reply
Best wishes
Relevant answer
Answer
Dear Mohammed Jaafar Ali Alatabe,
Thank you, and I invite you to join the discussions on the subject of innovation in financial institutions.
Regards,
Dariusz Prokopowicz
  • asked a question related to Scientific Computing
Question
18 answers
Dear Friends and Colleagues from RG,
I wish You all the best in the New Year. I wish you a successful continuation and successes in scientific work, achieving interesting results of scientific research in the New Year 2019 and I also wish you good luck in your personal life, all the best.
In the New Year, I wish You success in personal and professional life, fulfillment of plans and dreams, including successes in scientific work, All Good.
In the ending year, we often ask ourselves:
Have we successfully implemented our research plans in the ending year? We usually answer this question that a lot has been achieved, that some of the plans a year ago have been realized, but not all goals have been achieved.
I wish You that the Next Year would be much better than the previous ones, that each of us would also achieve at least some of the planned most important goals to be achieved in personal, professional and scientific life.
I wish You dreams come true regarding the implementation of interesting research, I wish You fantastic results of research and effective development of scientific cooperation.
I wish You effective development of scientific cooperation, including international scientific cooperation, implementation of interesting research projects within international research teams and that the results of scientific research are appreciated, I wish You awards and prizes for achievements in scientific work.
I wish You many successes in scientific work, in didactic work and in other areas of your activity in the New Year, and I also wish you health, peace, problem solving, prosperity in your personal life, all the best.
Thank you very much.
Best wishes.
I wish you the best in New Year 2019.
Happy New Year 2020.
Dariusz Prokopowicz
Relevant answer
Answer
Dear Colleagues and Friends from RG,
Hello Dear Everyone,
In the New Year of 2021, I wish all researchers, scientists and users of the ResearchGate portal the fulfillment of their plans and dreams, success in their professional work and personal life, success in the field of research and in the publication of their results, all the best. I wish all my colleagues and friends from the ResearchGate portal that the New Year 2021 will be better than the previous years, that the SARS-CoV-2 (Covid-19) coronavirus pandemic ends as soon as possible, and that there is a quick return to "normal", to the pre-pandemic state. The year 2021 begins a new decade of many new challenges related to solving key problems of the development of civilization, solving problems resulting from the pandemic crisis, and economic, social and climate crises, etc. Let us hope that the development of science, further scientific research, new technologies, and the current fourth technological revolution will enable the solution of key problems of civilization development.
Happy New Year 2021
Best regards, Stay healthy!
Dariusz Prokopowicz
  • asked a question related to Scientific Computing
Question
2 answers
Does anybody know anything about the INFORMATICA journal (SCI-expanded), which publishes an article at a cost of 423 USD (or the equivalent in Bitcoin)? Is this journal fake or genuine?
Relevant answer
Answer
The journal with website http://informaticajournal.com/ (or http://www.informaticajournal.com/ ) is fake, it is the hijacked version of the authentic one.
The real journal does indeed exist:
- The SCIE listing is real (https://mjl.clarivate.com/search-results, use the keyword informatica), and you will find the journal also linked from https://www.iospress.nl/journal/informatica/
Both sites redirect you to the authentic journal, https://informatica.vu.lt/journal/INFORMATICA, which warns about the fake version: https://informatica.vu.lt/journal/INFORMATICA/information/attention
The real (open access) version clearly states that it is free of charge: https://informatica.vu.lt/journal/INFORMATICA/information/article-processing-charges-apc
Your information seems to confirm that the one you looked at is fake, since:
- the costs you mention are not stated anywhere on their site (so no transparency)
- they present themselves as subscription-based, so why would you need to pay to publish in a subscription-based journal?
So, well spotted: your gut feeling told you that there might be something odd going on here. Stay away from the fake one and consider the free-of-charge authentic one.
Best regards.
PS. If I have some time I will add your find to my list of hijacked journals not yet included in the Beall’s list of hijacked journals (https://beallslist.net/hijacked-journals/ ): https://www.researchgate.net/post/New_very_misleading_type_of_scam_Anyone_with_recent_examples
  • asked a question related to Scientific Computing
Question
5 answers
It seems that using machine/deep learning to solve PDEs is very popular (not only in scientific computing, but in many fields). I want to know the reasons behind this. And is the outlook promising?
Relevant answer
Answer
For a variety of machine learning approaches, differential equations are often important, mostly through comparison with mathematical models from physics.
Differential equations are a crucial method in physics for modelling a system's dynamics: they essentially tie the rate of change of one quantity to other properties of the system (with many variations on this theme).
So you are effectively dealing with differential equations whenever you model the learner as a dynamical system (for example, a network of neural weights that change according to some rule).
Unfortunately, solving differential equations is not as easy as solving an (ordinary) algebraic equation. Rather, you must know some theory about how different forms of differential equations generate different types of solutions.
There is also a large body of literature on numerical differential equations, which can be overwhelming. Most examples I know are from computer vision, for example when the optical flow of an image is to be calculated from image sequences. But differential equations can pop up any time a learning algorithm arises from a comparison with a physical system.
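Physics-informed neural networks (PINNs) are one popular instance of the ML-for-PDEs trend the question mentions. Here is a minimal sketch in Python/PyTorch for the 1-D Poisson problem u''(x) = -pi^2 sin(pi x) with u(0) = u(1) = 0; the architecture and training settings are arbitrary illustrative choices:
import torch
torch.manual_seed(0)

# network u_theta(x) and optimizer
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.rand(256, 1, requires_grad=True)                   # interior collocation points
xb = torch.tensor([[0.0], [1.0]])                            # boundary points
src = (-(torch.pi ** 2) * torch.sin(torch.pi * x)).detach()  # right-hand side f(x)

for step in range(3000):
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    loss = ((d2u - src) ** 2).mean() + (net(xb) ** 2).mean()  # PDE residual + boundary terms
    opt.zero_grad(); loss.backward(); opt.step()

print(net(torch.tensor([[0.5]])).item())   # exact solution sin(pi*x) gives 1.0 here
The appeal is that the loss is just the PDE residual evaluated by automatic differentiation, so no mesh is needed; the open question, as the discussion above suggests, is whether this ever beats classical discretizations in accuracy per unit of compute.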
  • asked a question related to Scientific Computing
Question
13 answers
Dear Colleagues,
I have recently graduated with a BSc in Mechanical Engineering. During my BSc, I assisted with research and projects in a variety of fields, ranging from the nanomechanics of advanced materials (experimental), predictive analysis of stochastic data inputs for control (MATLAB), human balance control (theoretical), and dynamical modeling of fluid/solid coupling problems with the corresponding CFD in OpenFOAM, to computational aerodynamics with HPC. Upon graduation, I joined a research team at ETH Zurich as a scientific assistant to work on vortex kinematics (theoretical and computational).
My main interest areas are:
  • Nonlinear Dynamics and Chaos, Stochastic Systems, Machine Learning of Dynamical Systems and Fluid Dynamics, Prediction, Nonlinear Control
  • Computational Finance, Financial Analytics
  • Numerical Methods, Computing and Algorithm Development
Clearly, all of the fields mentioned above require a decent knowledge of mathematical modeling, analysis, and computation (mostly by parallel computing over HPCs). One can also argue that these areas are not really far from each other as they can be all classified into an umbrella field of Dynamical Systems Theory.
I will soon start my MSc in Computational Science and Engineering at ETH Zurich. However, I am struggling to decide which specialization area I should choose.
As a part of the program I have to enroll at least in two of the following CORE SUBJECTS:
  • Advanced Numerical Methods for CSE
  • Optimization for Data Science
  • Computational Statistics
  • Advanced Systems Lab (Fast Numerical Codes)
Of these, I am planning to take all four, as they are rich in content, relevant to my multidisciplinary taste, and beneficial for my future plans. They are also fairly complementary to one another.
I will also have to take two mandatory subjects as a part of the admission requirement:
  • Numerical Methods for CSE
  • High-Performance Computing Lab for CSE
*The program requires me to take 5 courses in my selected specialization area. The rest of the credits necessary to graduate can be chosen freely from any department.
ETH is a top-notch institute for education and research in all three of Control & Robotics, Fluid Dynamics, and Applied/Computational Mathematics. This at least ensures that whatever I choose I will still get a quality education and have a chance to do quality research.
As we all know, modern areas such as robotics, data science, software engineering, neuroscience, computational biology, etc. have rather well-defined career paths. People in those fields do not have as much trouble as someone multidisciplinary (e.g. in my MSc program) in deciding what subjects to take and what to focus on.
I lost two years between high school and university, and I believe this has reduced my flexibility in this kind of decision, especially given that I am in a long-distance relationship which I also have to take care of. It is likely that I will prefer to stay at ETH for my Ph.D., or work here for some time before it. I may also choose to do my Ph.D. at one of the other top schools.
I really appreciate your opinions and advice!
Thank you for your time and patience!
Kind Regards
Relevant answer
Answer
Dear Mirlan,
My congratulations on your graduation. Regarding your question about future studies at ETH, I have looked at the outline of the courses mentioned in your question.
You are probably familiar with these sites but I have included the links just for the documentation:
Advanced numerical methods
Optimization for data science
Computational statistics
Advanced systems lab
Based on the above, I would probably choose Advanced Numerical Methods and the Advanced Systems Lab. (I really like these courses, and I think they are very useful regardless of the future specialization.)
I wonder how the current situation (related to Covid-19) with online courses is evolving at ETH? Will this constraint change schedules and plans?
In any case, my best wishes for the success of your program.
Kind Regards
  • asked a question related to Scientific Computing
Question
11 answers
What kind of scientific research dominates on the ResearchGate knowledge and science portal?
Please provide your suggestions for a question, problem or research thesis on the topic of the ResearchGate knowledge and science portal.
Please reply.
I invite you to the discussion
Thank you very much
Best wishes
Relevant answer
Answer
... a questioner's intention (i.e., seeking information or discussion) is greater than disciplinary factors in some circumstances... responses to questions provide various resources, including experts' contact details, citations, links to Wikipedia, images, and so on... implications for the understanding of scholarly information exchange and the design of better academic social networking interfaces, which should stimulate scholarly interactions by minimizing confusion, improving the clarity of questions, and promoting scholarly content management... Jeng, W., DesAutels, S., He, D., & Li, L. (2017). Information exchange on an academic social networking site: a multidiscipline comparison on ResearchGate Q&A. Journal of the Association for Information Science and Technology, 68(3), 638-652.
  • asked a question related to Scientific Computing
Question
4 answers
Approximation theory of interpolation is of foundational importance in numerical analysis especially for various scientific computing problems.
A considerable amount of literature has accumulated on Lagrange, Hermite, lacunary and Pal-type interpolation in the past few years. Interpolation on the real line has seen numerical justification by many researchers, but on the complex plane, particularly the unit disk, not much numerical justification has been done using the various programming platforms.
I would ask the other researchers who are part of this discussion to help me find some useful papers in this direction.
I am also currently working through programming platform MATHEMATICA to view out numerical aspects of my research works.
Hope to see you guys with some good results in future discussion.
Relevant answer
Answer
See also: J. M. Montaner and M. Alfaro, "On five-diagonal Toeplitz matrices and orthogonal polynomials on the unit circle", Numerical Algorithms, 1995.
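As a side note, a quick numerical experiment on the unit circle is easy to set up in Python: interpolation at the n-th roots of unity reduces to an FFT, so the degree n-1 interpolant and its accuracy inside the unit disk can be checked in a few lines (the test function exp(z) and the evaluation point are arbitrary choices):
import numpy as np

n = 16
z = np.exp(2j * np.pi * np.arange(n) / n)     # nodes: the n-th roots of unity
f = np.exp                                     # arbitrary analytic test function
c = np.fft.fft(f(z)) / n                       # coefficients of the degree n-1 interpolant
p = lambda w: np.polyval(c[::-1], w)           # p(w) = c_0 + c_1 w + ... + c_{n-1} w^{n-1}
w = 0.3 + 0.2j                                 # a point inside the unit disk
print(abs(p(w) - f(w)))                        # interpolation error, small for analytic f
The same check can be translated directly into Mathematica; the FFT identity holds because the roots of unity make the Lagrange interpolation matrix a scaled DFT matrix.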
  • asked a question related to Scientific Computing
Question
12 answers
Hello, I would like someone to tell me how to test a trained artificial neural network in MATLAB for linear predictions.
Relevant answer
Answer
Testing the neural network requires the use of the simulation function (sim).
First, if you normalized your data before training, you need to do the same for the testing data, using one of the normalization functions (mapminmax, mapstd, etc.).
Second, simulating the network with the new set of data requires calling the saved network (if you saved it before). You can do so with the command "load NetworkName".
Third, to simulate, use the following syntax:
[output] = sim(net, TestData);
The output can then be un-normalized, using the previous normalization function in reverse together with the settings structure for the output data.
NB: all of this is done in the MATLAB environment.
I hope this helps.
  • asked a question related to Scientific Computing
Question
8 answers
Second call for papers for the Havana chapter of the III Simposio Internacional Ciencia e Innovación Tecnológica 2019
It includes:
  • Second workshop of researchers in distance education
  • First scientific workshop "Informatics and Sport" (cidep2019)
The call for papers (second announcement) and the template for preparing the papers are attached.
Relevant answer
Answer
Congratulations on the objective of the event, namely "to create a space of exchange between professionals who study different fields of science and technology and participate actively in generating more humane spaces for life"; this not only defines us, it makes us especially important. We are aware of the importance of scientific communication for today's social challenges.
  • asked a question related to Scientific Computing
Question
29 answers
What indexing databases for scientific publications do you recommend in addition to ResearchGate?
To which other databases indexing scientific publications or scientific journals do you submit your publications, in addition to ResearchGate?
Which databases for scientific publications index many citations, articles and other scientific publications?
The ResearchGate portal is an excellent platform for exchanging scientific experience, establishing scientific cooperation, forming research teams and indexing scientific publications.
But do you also use other databases indexing scientific publications or scientific journals when looking for additional materials on scientific issues?
Please reply
Thank you very much for all the information
Best wishes
Relevant answer
Answer
I prefer RG, but there is also Google Scholar, among others.
  • asked a question related to Scientific Computing
Question
6 answers
Director of UGIVIA
We have an agreement with the local government to create a new science museum in Mallorca, and we would like to apply the ideas of the VIMM project in this area, mixing Cultural Heritage, Usability and Accessibility, VR/AR, and Tourism.
Many thanks in advance.
Dr. Francisco José Perales López
University Professor of Computer Science and Artificial Intelligence
Director, Graphics, Computer Vision and AI Unit
Department of Mathematics and Computer Science, UIB
Ed. A. Turmeda. EPS. Crta Valldemossa Km.7.5, 07122
Palma de Mallorca, Illes Balears
España
Relevant answer
Answer
  • asked a question related to Scientific Computing
Question
41 answers
It seems that using machine/deep learning to solve PDEs is very popular (not only in scientific computing, but in many fields). I want to know the reasons behind this. And is the outlook promising?
Relevant answer
Answer
I think the applicability of deep learning/AI/neural networks is being overhyped these days, and the main reason behind this hype is the availability of funding for research on these technologies. Everybody, irrespective of their field of research, wants to grab the fruit before it disappears. Many people I have seen (in engineering) are applying these technologies without actually thinking much; they are doing it just because of the ease of getting research grants compared with conventional research methodologies.
The success of research grant applications these days is largely decided by fancy buzzwords. The more fancy words you put in your proposal, the higher the chances of success. Gone are the days when research panels evaluated proposals based on the quality of the methodology and its applications rather than on how fancy it sounds.
  • asked a question related to Scientific Computing
Question
3 answers
Could anyone share his/her experience with Haskell in scientific computing? I'm interested in the state-of-the-art usage and applications to computational fluid dynamics and similar domains.
Relevant answer
Answer
In the past, most programmers were skeptical about the use of Haskell in large-scale scientific computing because of its poor performance with arrays. For more details please see:
The recent examples are more favorable,
and it is possible that more complicated codes will be analyzed in the future.
  • asked a question related to Scientific Computing
Question
8 answers
I'm looking for opinions (or actual examples) on using Spark (R, Scala or Python) compared to Fortran for scientific computing.
I did some searches but most of the conversations I found are several years old.
Relevant answer
Answer
It all depends on what you want to do.
For really computationally heavy problems, it's clearly either Fortran or C, because they are by far the fastest languages out there and are easily parallelized for HPC via MPI.
If you are more interested in prototyping and testing of various approaches, Python + packages are very useful since one can write code much faster.
Another option would be Julia, which tries to combine the simplicity of syntax and the speed of Fortran/C. However I am not sure about its flexibility and package support.
Otherwise, I would recommend orienting yourself by your peers in the field. If 90% of them are using Fortran, then it is probably a wise choice to go down that path.
  • asked a question related to Scientific Computing
Question
6 answers
Could anybody share the source code of a mathematical model of biofiltration, biosorption or adsorption? Usually, such a model is represented by a system of PDEs that includes mass-transfer kinetic and dynamic equations and a sorption isotherm, and, in the case of biological treatment, equations for biofilm degradation and microbiological growth.
I will appreciate any help provided. Thank you!
Relevant answer
Answer
Dr Sheth, Yes, if I achieve some results, I will make it open source.
  • asked a question related to Scientific Computing
Question
43 answers
For me, I am very sure it is solved. If you are interested, first download the program and run it. Then read my paper and think; then you may also be sure.
How to use the program
1. I believe that most people who download my program will be professionals. So please leave a message with your contact details, and your opinions are welcome if you download my program. You can leave your message here or email me at: edw95@yahoo.com. Thanks a lot.
2. This program is an informal one, and it is not the quickest one. But it includes my algorithm, and it works correctly and very well. No failures.
3. How to use: if you have a 0-1 matrix standing for a simple undirected graph with n vertices which has at least one Hamilton path from vertex 0 to vertex n-1, press the "ReadMatrix" menu item to read and calculate it, then press the "Write the result" menu item to write the result to a new file; you will find a Hamilton path from vertex 0 to vertex n-1 in the new file.
4. How to use: if you have an edge matrix standing for a simple undirected graph with n vertices which has at least one Hamilton path from vertex 1 to vertex n, press the "ReadEdges" menu item to read and calculate it, then press the "Write the result" menu item to write the result to a new file; you will find a Hamilton path from vertex 1 to vertex n in the new file. If there is no such path, you get a message "no...". The input file format is one edge per row: "1,3" or "1 3" means an edge from vertex 1 to vertex 3.
5. The maximum degree is 3. Although I am very sure my algorithm can handle undirected graphs of any degree, this program cannot. The maximum vertex number is 3000, because PC memory is limited.
6. I would like to thank Professor Alexander Chernosvitov very much. He and one of his students took a long time to write a program (different from mine) implementing my algorithm, and he gave me and my work a good comment (see codeproject.com and researchgate.net). Thanks also to Mr. Xiaolong Wang. Before them, nobody trusted me. Some editors and reviewers rejected me on this logic alone: for such a hard problem, Lizhi Du is not a famous man, so he cannot have solved it. Some editors or reviewers did not use their brains, saying: your paper is apparently wrong, or, your paper cannot be understood. "Apparently wrong", funny! I have studied this for many years, and it is "apparently wrong"! If a reviewer is really capable and uses his brain and takes his time, he can certainly understand my paper. If you think I am wrong, tell me where, and I will explain why it is not wrong. If you think my paper cannot be understood, tell me what cannot be understood, and I will explain. In my paper, in the Remarks, I explain how to understand my algorithm and proof. I think it is very clear.
7. I have studied this problem for many years. I have put many versions of my paper on arXiv. Though the former versions had various problems, I am very sure the newest version of my paper is the final one and that it is correct. It may contain some small bugs due to my English, but these do not affect the correctness, and I can explain or revise them easily.
8. Surely I think I have proved NP=P and have solved the problem NP vs. P.
9. Thank you for paying your attention and time to my algorithm!
Relevant answer
Answer
Apparently, Norbert Blum is convinced of a negative solution to the question. See "A Solution of the P versus NP Problem" up on arXiv.
  • asked a question related to Scientific Computing
Question
5 answers
I have come across a fair few books and talks on C++ which teach us how to write good, maintainable code using good programming practices. Scott Meyers, Bjarne Stroustrup, Chandler Carruth, all have great ideas of efficiency, error-proofing, et al.
However, books on scientific C++ use raw pointers instead of smart pointers, C-style arrays instead of C++ vectors/arrays, and many more such constructs. This leads to a situation where books on scientific C++ programming teach very bad programming practices/styles, while books on good practices/style don't really focus on scientific computing (PDE solving, nonlinear optimization, linear solvers...).
Welcoming all suggestions for books, talks, videos, tutorials, whatever, on good scientific C++. Even general advice on the process of learning good C++ will be greatly appreciated.
  • asked a question related to Scientific Computing
Question
6 answers
Hi,
I am doing a computationally demanding time series analysis in R with a lot of for-loops which perform the same analysis several times (e.g. for 164 patients, for 101 different time series per patient, or for different time lags). In the end, the results of these analyses are summarized into one score per patient, but up to that point they are absolutely independent of each other. To shorten the computing time, I would like to parallelize the analysis: the independent parts could be analyzed in parallel, using more than one of the 8 cores of my processor.
I have read some postings about running functions like apply on more than one core, but I am not sure how to implement the approaches.
Does anybody know a simple and comprehensible way of translating a classical sequential for-loop into a procedure which uses several cores simultaneously to run the analyses in parallel?
Thank you very much for every comment!
Best,
Brian
Relevant answer
Answer
The easiest way to take advantage of multiprocessors is the multicore package which includes the function mclapply(). mclapply() is a multicore version of lapply(). So any process that can use lapply() can be easily converted to an mclapply() process. However, multicore does not work on Windows. I wrote a blog post about this last year which might be helpful. The package Revolution Analytics created, doSMP, is NOT a multi-threaded version of R. It's effectively a Windows version of multicore.
If your work is embarrassingly parallel, it's a good idea to get comfortable with the lapply() style of structuring your code. That will give you an easy segue into mclapply() and even into distributed computing using the same abstraction.
Things get much more difficult for operations that are not "embarrassingly parallel".
  • asked a question related to Scientific Computing
Question
7 answers
I hope this is the right place to ask; I've read a lot of good comments on other topics here, so I'll just ask. At the moment I'm searching for a topic/idea for my undergraduate thesis. I am doing parallel programming with OpenMP right now, and I am new to parallel programming. What are interesting topics/ideas in parallel programming? I'd greatly appreciate any help pointing me to them. Best regards, Litu
Relevant answer
Answer
The key point of parallelism is either and both of:
  • solving a given problem faster
  • solving a bigger problem (using more elements)
Search for Speedup and Scaleup as key concepts here.
In practice, the concepts mentioned in the previous answers are perfect fields of engagement. I would start with image analysis or mathematical analysis. However, keep in mind that many of the things you will find are so-called "regular" problems (using dense matrices etc.). If you drill deeper, you may find very interesting but also complex aspects of irregular problems (such as using very sparse matrices efficiently, graph problems, or ocean-water simulations where the density of the input values varies). A small measurement sketch follows below.
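To make "speedup" concrete, here is a minimal sketch in Python (used here only for brevity; the same measurement applies to OpenMP code), timing the same embarrassingly parallel toy workload serially and with a process pool. The workload size and pool size are arbitrary:
import time
from multiprocessing import Pool

def work(n):
    # CPU-bound toy task
    s = 0
    for i in range(n):
        s += i * i
    return s

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    t0 = time.perf_counter(); serial = [work(n) for n in jobs]; t_serial = time.perf_counter() - t0
    t0 = time.perf_counter()
    with Pool(4) as pool:                        # 4 worker processes
        parallel = pool.map(work, jobs)
    t_parallel = time.perf_counter() - t0
    print("speedup:", t_serial / t_parallel)     # compare against the number of workers
Measured speedup divided by the worker count gives the parallel efficiency; for irregular problems this efficiency usually drops, which is exactly what makes them interesting thesis material.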
  • asked a question related to Scientific Computing
Question
48 answers
I am going to start finite element programming for watershed runoff analysis, so I was wondering whether Fortran is a good choice?
Relevant answer
Answer
The only reason to use Fortran, in my opinion, is because of legacy code. I do not think that it is a good idea to start new code in Fortran.
Sure, as others have mentioned, Fortran became a modern language. There is a lot of object orientation going on in Fortran nowadays. However, I have been shown in a Fortran course taught by a member of the standardization committee that using these new features will slow down your numerical code by a factor of up to 100. In C++, however, the same features usually do not incur any overhead in computation times (at least this is true for GCC, not so much for the Intel C++ compiler).
One disadvantage of Fortran is the lack of general purpose libraries (or I just don't know of them). There is nothing like a vector class so that I do not have to care for the size of arrays and reallocation. And even if there were something like this using them would be a real pain because of the syntax. In my opinion, a modern language needs a good standard library (or everything needs to be built-in as in scripting languages). So, while Fortran is good for the numerical part, everything else is better left to other languages.
I am always pro C++. With the right libraries the syntax for formulas is as good (and sometimes better) as with Fortran. And a core concept of C++ is to put as much as possible into libraries instead of the language itself. This is sufficient as you cannot design a language feature for handling matrices and vectors in better way within the language than it is with a library. Maybe compilation is slower because of this, but not runtime.
Since somebody mentioned debugging I'd like to add my own experience. One huge disadvantage of Fortran is that it does not have static type checking. This is really bad for subroutine calls. A while ago we turned on interface checking in Fortran. It took us quite long to get our software to compile again with interface checking turned on. Some subroutine calls were totally wrong (somehow the software still worked correctly). You should have as much static checking as possible. Fortran as a language does not have enough, in my opinion.
  • asked a question related to Scientific Computing
Question
5 answers
Some works in the literature (Goldberg and Bridges, for instance) have demonstrated that standard Genetic Algorithms (GAs) usually disrupt the building blocks of solutions. On the other hand, some recent papers have shown that a special GA implementation may be a viable alternative to overcome this issue. In this sense, I would like to know what you think about that. Please try to answer my question with a short, clear response if possible.
Thank you very much in advance!
Relevant answer
Answer
Dear Lauro Cássio:
The building blocks problem depends on the nature of the concrete task you are solving. But there is a variety of algorithms that evolve populations (one or more) of solution codes with the help of a diversity of operators, and GAs can be included in this variety. Some of these algorithms can be more effective for a diversity of tasks.
The first article I wrote was devoted to this method, the "Integration of Variables" method. Additional information can be found in other works you can download from my ResearchGate page, such as Analysis and Synthesis of Engineering Systems.
Except for the GA, the algorithms included in the method do not present the building blocks problem.
Best wishes,
José Arzola
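To see the disruption effect the question attributes to Goldberg and Bridges, here is a minimal sketch in Python (the 8-bit strings and the schema are arbitrary illustrations): one-point crossover breaks a schema whose defining positions lie at opposite ends of the string on every cut:
import random
random.seed(0)

def one_point_crossover(a, b):
    cut = random.randint(1, len(a) - 1)           # crossover point
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

# A "building block" here is the schema 1******1 (fit alleles at positions 0 and 7).
p1 = [1, 0, 0, 0, 0, 0, 0, 1]                     # carries the schema
p2 = [0, 1, 1, 1, 1, 1, 1, 0]                     # does not
trials, disrupted = 10000, 0
for _ in range(trials):
    c1, c2 = one_point_crossover(p1, p2)
    if not any(c[0] == 1 and c[7] == 1 for c in (c1, c2)):
        disrupted += 1
print(disrupted / trials)   # prints 1.0: every cut point separates the two defining loci
Schemata with short defining lengths survive crossover far more often, which is the intuition behind the building block hypothesis and behind operators designed to preserve such blocks.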
  • asked a question related to Scientific Computing
Question
9 answers
Hi all, I have difficulty writing a mix of strings and string arrays in Fortran, for example (the following two lines):
write (cout,1052) (cd(ii),ii=1+1,9),(cy(ib),ib=10,(inmt*2))
1052 format ('c=[c1',8(';c',a1),79(';c',a2),'];')
This piece of code works; cd(ii) and cy(ib) are string arrays.
I need to write a variable number of the string array cells (cy(ib)) followed by a bracket, so I need to replace "79" in the format line with "(inmt*2)-9", because each case has a different number of cells to be written. When I do that, it no longer works.
Any ideas please?
Thanks,
Relevant answer
Answer
It looks like you are coming from a C++ background (or at least the first implementers of your code were); the 'cout' is giving you away. Most of the time you can just use
write(*,*) 'Some text.'
Furthermore, in modern Fortran you would rather provide the format string directly instead of using labels (note that the I and E edit descriptors need a width, e.g. I0 or ES12.4):
write(*,'(I0)') 1
You can also provide the format string as a string variable. In this case you can first compose the format string, which also solves your original problem: write the value of (inmt*2)-9 into the format string at run time. For example:
write(format_string,'(A,I0,A)') '(A,', 3, 'ES12.4)'  ! builds '(A,3ES12.4)': some text and 3 reals
write(*,format_string) 'x=', x(1:3)
I see that you have now used a version without formatting. But if you ever want to introduce formatting again, this might be an easy option.
  • asked a question related to Scientific Computing
Question
11 answers
I've developed a molecular dynamics simulation approach for laser-material interaction using the direct simulation Monte Carlo algorithm, and my code runs very slowly on the computer of the research group I belong to. I am wondering: are there any free supercomputers one can connect to through the internet? Any other suggestions for solving this problem?
Relevant answer
Answer
Hi,
If you have a current collaboration with French colleagues, you may ask for an account on the Grid'5000 network, which is hosted in several French cities. Even though this network is located in France, support is provided in English. Registration is free but required for accessing the resources.
Here is the homepage of the project:
I recommend reading the user charter (see the section 'Get an account') before asking for an account.
Hope it helps...
Alexandre
  • asked a question related to Scientific Computing
Question
93 answers
I need software which enables me to change, connect, and make my own figures but in a professional way.
Relevant answer
Answer
I create graphs in R or SciDavis and export them to PDF (vector format). Then I do post-editing using Inkscape.
FYI:
R, SciDavis, and Inkscape are open-source tools.
  • asked a question related to Scientific Computing
Question
3 answers
I want to do parallel computing in Scilab or Octave. I want to know which is better, and how many cores, at most, each allows me to use.
Relevant answer
Answer
Greetings, I am also doing my project in Scilab (image processing) and I want to do parallel processing (because it takes 2 hours to complete one image), so if you know the answer, please share it with me.
  • asked a question related to Scientific Computing
Question
10 answers
A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.
I am studying this paper, and I would like to see the details of the matrix system, even solved for just 2 Chebyshev nodes.
Thanks in advance
Relevant answer
Answer
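The paper's exact matrix system is not reproduced here, but the building block of such collocation schemes is the Chebyshev differentiation matrix. A minimal Python sketch (following the cheb.m program in Trefethen's "Spectral Methods in MATLAB"), shown for the smallest nontrivial grid:
import numpy as np

def cheb(N):
    # Chebyshev differentiation matrix on the N+1 Gauss-Lobatto nodes
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)              # nodes on [-1, 1]
    c = np.hstack(([2.0], np.ones(N - 1), [2.0])) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                           # diagonal: negative row sums
    return D, x

D, x = cheb(2)          # smallest nontrivial case: 3 nodes
print(D @ x**2, 2 * x)  # differentiating x^2 exactly: D @ x^2 equals 2x
In a quasilinearization scheme, each iteration assembles a linear system from such D matrices (for example Kronecker products of them in the bivariate case) plus the linearized coefficients, and solves it for the update.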
  • asked a question related to Scientific Computing
Question
6 answers
In Mathematica, my matrix output has symbolic expressions like 2D + 3E^2. First I want to extract them, and then the output should be shown in FortranForm, i.e. 2*D + 3*E**2. I want to use the matrix elements in Fortran code, so they have to be in FortranForm. Please, can anyone provide the syntax for this purpose? Anyone who knows the Mathematica software could help me with this.
I think there should be a single-line statement that does both: extracting the elements and converting them to FortranForm.
Relevant answer
Answer
Thanks, Brian G Higgins. This is what I wanted.
  • asked a question related to Scientific Computing
Question
1 answer
It has been shown that WENO-Z is less dissipative than WENO-JS.
Do the conclusions and numerical results of the attached article still hold if one replaces the baseline WENO-JS scheme with the WENO-Z scheme? Is the Compact Reconstructed WENO scheme still substantially faster and more accurate than the pure WENO-Z scheme? And what about a higher-order scheme, say the 9th-order WENO scheme?
Note: please take the sensitivity parameter epsilon = O(dx^3) and the power parameter p = 2 in the definition of the WENO-Z nonlinear weights.
For reference, see my list of publications on WENO-Z scheme.
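For readers unfamiliar with the notation, here is a sketch in Python of the WENO-Z nonlinear weights referenced above, for the 5th-order case with ideal weights d = (0.1, 0.6, 0.3), tau5 = |beta0 - beta2|, epsilon = dx^3 and p = 2 as specified in the question (see Borges et al. for the original definition):
import numpy as np

def wenoz_weights(f, dx, p=2):
    # Nonlinear WENO-Z weights for the 5th-order reconstruction at one cell face.
    # f = (f[i-2], f[i-1], f[i], f[i+1], f[i+2]); epsilon = O(dx^3), p = 2 as above.
    b0 = 13/12*(f[0] - 2*f[1] + f[2])**2 + 1/4*(f[0] - 4*f[1] + 3*f[2])**2
    b1 = 13/12*(f[1] - 2*f[2] + f[3])**2 + 1/4*(f[1] - f[3])**2
    b2 = 13/12*(f[2] - 2*f[3] + f[4])**2 + 1/4*(3*f[2] - 4*f[3] + f[4])**2
    d = np.array([0.1, 0.6, 0.3])                     # ideal (linear) weights
    tau5 = abs(b0 - b2)                               # global smoothness indicator
    alpha = d * (1.0 + (tau5 / (np.array([b0, b1, b2]) + dx**3))**p)
    return alpha / alpha.sum()                        # on smooth data these approach d

print(wenoz_weights(np.array([1.0, 1.1, 1.2, 1.3, 1.4]), dx=0.1))   # smooth: (0.1, 0.6, 0.3)
Because tau5 vanishes on smooth data, the nonlinear weights recover the ideal weights there, which is the source of WENO-Z's lower dissipation compared with WENO-JS.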
Relevant answer
Answer
This question has been answered.
  • asked a question related to Scientific Computing
Question
5 answers
I am currently working on a complex network in MATLAB R2010b, because that is the only license we have in the college, but it only allows execution on 8 cores. I am a little familiar with Scilab, but on one page (http://julialang.org/) comparing Octave, MATLAB, Julia, Python, etc. (though not Scilab), Octave was significantly slower than MATLAB. So what is the execution speed of a program in Scilab? Can anyone point me to some sample parallelized code in Scilab, since its website lacks comprehensive information on its usage?
Also, can anyone suggest how to learn Julia, with example codes, as it is very fast compared with the others?
Relevant answer
Rohan, I have not worked with Scilab yet, but I suggest you use MATLAB. It is, and has been, a very helpful tool in many areas of study, and it may be considered a complete tool. If you have an NVIDIA graphics card, you can parallelize portions of your code just by using MATLAB's built-in functions. You do not have to worry about the details of parallelism; the Parallel Computing Toolbox makes everything implicit for you. Try it and you'll see.
Best regards!
  • asked a question related to Scientific Computing
Question
11 answers
I'm working on a tracking problem where the true target follows the red path and the noise follows the blue path.
How can I parametrize these two paths to differentiate between them mathematically?
Relevant answer
Answer
What is the primary goal of distinguishing between the red and the blue path?
In object-tracking tasks, observations (e.g. GPS readings or positions established from image processing) are usually disturbed by noise. The red trajectory seems to be an observation of a moving object with very little noise, while the blue one rather represents positions of a still object with quite high noise. Do you want to distinguish between still and moving objects, or to estimate the noise level?
A typical solution that allows one to predict the real position of a tracked object is to apply filtering. I suggest using a Kalman filter. In https://www.researchgate.net/publication/262567914_An_Incremental_Map-Matching_Algorithm_Based_on_Hidden_Markov_Model?ev=prf_pub there is an illustration of trajectory smoothing with a Kalman filter.
I think applying a Kalman filter to the blue trajectory will reveal a still object, and the noise level can be estimated from the updated covariance matrix.
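As an illustration of the suggestion above, here is a minimal constant-velocity Kalman filter in Python/NumPy; the model matrices and noise levels are illustrative assumptions, not values from the thread:

import numpy as np

def kalman_track(zs, dt=1.0, q=1e-3, r=1.0):
    # State [x, y, vx, vy]; zs is an (N, 2) array of noisy position measurements.
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                         # constant-velocity transition
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0  # we observe positions only
    Q = q * np.eye(4)                              # process noise (assumed)
    R = r * np.eye(2)                              # measurement noise (assumed)
    x = np.zeros(4); x[:2] = zs[0]                 # initialize at the first measurement
    P = np.eye(4)
    track = []
    for z in zs:
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                        # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track), P

# A still object with high noise, like the blue trajectory (true position: origin).
rng = np.random.default_rng(0)
zs = rng.normal(scale=2.0, size=(100, 2))
track, P = kalman_track(zs)
print(track[-1], np.diag(P))  # the estimate collapses toward the still position

For the blue path, the final covariance P gives a handle on the noise level, as suggested above.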
  • asked a question related to Scientific Computing
Question
1 answer
GridSim and SimGrid are two frameworks/toolkits widely used for research in Grid computing. Apart from those, which simulators directly support the simulation of workflow scheduling?
Relevant answer
Answer
WorkflowSim supports simulation of workflow scheduling.
  • asked a question related to Scientific Computing
Question
6 answers
Share your views in light of the performance tradeoffs, resource requirements, and monetary costs of creating and deploying applications on each platform.
Relevant answer
Answer
In my opinion, cloud computing should be used for storage only. While I have reservations about privacy and viability, most IT technicians have the idea that cloud computing can take the place of a true server... it cannot. Systems such as NAS should be used for what they are and nothing more; NAS is more cost-effective that way. My thoughts.
  • asked a question related to Scientific Computing
Question
3 answers
Does anyone know how I can assign, on a dual-core computer, the combination of Simpson's rule (y[n+2]-y[n]=(h/3)*(f[n+2]+4f[n+1]+f[n])) and the two-step Adams-Moulton method (y[n+2]-y[n+1]=(h/12)*(5f[n+2]+8f[n+1]-f[n])) to the two processors, so that solving an ordinary differential equation produces results for y[n+1] and y[n+2] simultaneously, one from each processor?
Relevant answer
Answer
I may be wrong, but I do not think this task is suitable for multi-threading. Your task has very heavy dependencies. The resulting very frequent transfers of data would almost surely slow the program down. (If you used MPI, you would need to call MPI_Barrier immediately after a single line of code in the loop; the faster thread would always have to wait for the slower one, and the overhead would be excessive.)
Instead, I would seek instruction-level parallelism within a single thread by writing the loops cleanly enough for the compiler to generate packed-double (SIMD) instructions. That may be the best we can do.
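For what it's worth, the two formulas in the question can also be solved together serially as one implicit 2x2 block for y[n+1] and y[n+2]. A minimal Python sketch using fixed-point iteration (the predictor, tolerance, and test problem are illustrative):

import math

def block_step(f, t, y0, h, tol=1e-12, max_iter=50):
    # Solve the Simpson / two-step Adams-Moulton pair from the question
    # as one implicit block for y[n+1] and y[n+2].
    f0 = f(t, y0)
    y1 = y0 + h * f0       # Euler predictor for y[n+1]
    y2 = y0 + 2 * h * f0   # Euler predictor for y[n+2]
    for _ in range(max_iter):
        f1, f2 = f(t + h, y1), f(t + 2 * h, y2)
        y2_new = y0 + h / 3.0 * (f2 + 4.0 * f1 + f0)             # Simpson
        y1_new = y2_new - h / 12.0 * (5.0 * f2 + 8.0 * f1 - f0)  # Adams-Moulton
        if max(abs(y1_new - y1), abs(y2_new - y2)) < tol:
            break
        y1, y2 = y1_new, y2_new
    return y1_new, y2_new

# Test on y' = -y, y(0) = 1 (exact solution: exp(-t)).
f = lambda t, y: -y
t, y, h = 0.0, 1.0, 0.1
while t < 1.0 - 1e-12:
    _, y = block_step(f, t, y, h)
    t += 2 * h
print(y, math.exp(-t))  # should agree to a few digits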
  • asked a question related to Scientific Computing
Question
5 answers
I'm using a workstation with an 8-core processor. When I try to do a double or triple integration in Mathematica using Integrate[], it seems to take a very long time (sometimes more than an hour). I think it is using only one core to integrate the equation. Is it possible to use all 8 cores for such an integration to get the result quicker? If so, how can I do parallel integration for my equation?
NB: I'm using Mathematica 8.
Relevant answer
Answer
Mathematica cannot parallelize Integrate or NIntegrate just by wrapping them in Parallelize. You have to cut the integration domain into sub-domains and then use ParallelTable or ParallelCombine to sum the sub-integrals over the pieces.
How to cut the domain is the tricky part: you have to know the integrand very well, for instance whether it has any singularities.
I remember there is a supercomputing engine for Mathematica which has built-in functions like a parallel NIntegrate, but I've never tried it.
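The same domain-splitting idea can be sketched in Python with SciPy to make the pattern concrete; the integrand, limits, and number of pieces below are purely illustrative:

from concurrent.futures import ProcessPoolExecutor
import numpy as np
from scipy import integrate

def sub_integral(bounds):
    a, b = bounds
    # Integrate exp(-x*y) over x in [a, b], y in [0, 1] (illustrative integrand).
    val, _err = integrate.dblquad(lambda y, x: np.exp(-x * y), a, b, 0.0, 1.0)
    return val

if __name__ == "__main__":
    edges = np.linspace(0.0, 10.0, 9)        # cut x in [0, 10] into 8 sub-domains
    pieces = list(zip(edges[:-1], edges[1:]))
    with ProcessPoolExecutor(max_workers=8) as pool:  # one worker per core
        total = sum(pool.map(sub_integral, pieces))
    print(total)

As in the Mathematica case, how you cut the domain matters: the pieces should isolate any singular or rapidly varying regions of the integrand.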
  • asked a question related to Scientific Computing
Question
5 answers
Recently, the submission of a scientifically bogus paper to a large number of peer-reviewed journals (http://www.cbc.ca/news/technology/bogus-science-paper-reveals-peer-review-s-flaws-1.2054004) revealed how the current peer-review process is broken. As one commentator simply put it: "... First, and foremost, we need to get past the antiquated idea that the singular act of publication – or publication in a particular journal – should signal for all eternity that a paper is valid, let alone important. Even when people take peer review seriously, it still just represents the views of 2 or 3 people at a fixed point in time. To invest the judgment of these people with so much meaning is nuts. And it's far worse when the process is distorted – as it so often is – by the desire to publish sexy papers, or to publish more papers, or because the wrong reviewers were selected, or because they were just too busy to do a good job. ..." This is a very serious problem which should be addressed by the scientific community at a time when the proliferation of papers is omnipresent and the assessment of their validity overwhelming.
Relevant answer
Answer
A possible answer, which is more of a provocation, could be:
Could open peer review be a possible answer? I personally believe it isn't, because together with the poor performance of the peer-review process we also have an extraordinary proliferation of scientific papers, which would be impossible to handle with an open peer-review process. At the bottom of this proliferation is the need for many researchers to publish in order to enhance their career chances. In the best circumstances this means that many people fragment their results across many papers, each one containing only a tiny fraction more than their previous publications. Perhaps we should start to revolutionize the way we value the productivity of a scientist by no longer looking at the sheer number of publications. Perhaps we should start peer-reviewing people rather than their publications...
  • asked a question related to Scientific Computing
Question
1 answer
There are many algorithms based on the FFT.
Relevant answer
  • asked a question related to Scientific Computing
Question
5 answers
I am currently working on a problem where I have to write out 3 dependent variables and 1 independent variable. I am using the Manipulate and Plot commands for interactive graphs. Now I want to write these dependent and independent values as a table and as an .xls file. Is there anybody who can help me with this problem? Thanks in advance.
Relevant answer
Answer
Thanks, dear Eric Hall. Your suggestions are very good.
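For anyone landing here with the same question: in Mathematica, Export["data.xls", table] writes a list of rows to a spreadsheet. If you post-process in Python instead, here is a pandas sketch of the same idea (the variable names and values are illustrative):

import numpy as np
import pandas as pd

t = np.linspace(0, 10, 101)  # independent variable
df = pd.DataFrame({
    "t": t,
    "y1": np.sin(t),   # three dependent variables (illustrative)
    "y2": np.cos(t),
    "y3": np.exp(-t),
})
df.to_excel("results.xlsx", index=False)  # requires the openpyxl package
df.to_csv("results.csv", index=False)     # plain-text fallback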
  • asked a question related to Scientific Computing
Question
21 answers
Is Matlab the only language that can be used for this? If so, how can individuals use it?
Relevant answer
Answer
R is very similar to Matlab and freely available. Python with the SciPy and NumPy packages is very powerful and will give you better performance. IPython will give you a Python environment that functions similarly to the interactive consoles in Matlab and R. For even better performance you can use C++ with the Eigen library, or something else.
It depends entirely on what you are trying to do, but Matlab is by no means the only, or even the best, choice.
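To make the Python suggestion concrete, a small NumPy sketch of some typical Matlab-style operations:

import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # like A = [4 1; 2 3]
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)    # like x = A \ b
w, v = np.linalg.eig(A)      # like [V, D] = eig(A)
t = np.linspace(0, 1, 11)    # like linspace(0, 1, 11)
y = t**2 + 3 * t             # elementwise, like t.^2 + 3*t

print(x, w, y)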
  • asked a question related to Scientific Computing
Question
3 answers
I am thinking about how to capture 3D data procedures (e.g., creating as-built 3D models of buildings for construction quality analysis) from multiple people, and how to explore automated approaches that can synthesize these procedures into optimal data procedures for given 3D data processing tasks. I know some basics about 3D data processing algorithms, and some of the challenges of decomposing workflows into sections and optimizing the execution of workflows in a web or cloud computing environment, but I would like to solicit suggestions from friends here about the specific challenges related to this problem.
Relevant answer
Answer
The biggest problem I always find is the plethora of APIs, data structures, and software control interfaces that you need to cover to glue many tools together. If you are using commercial tools, you are beholden to the APIs that are designed for you. If you are building all the tools yourself, then you have the option to 'normalize' the data and have the tools use a common access API. That is much easier on the integration side, but now you are responsible for the whole stack.
If you have a system integrator you work with, letting them create the normalization interface might be a reasonable approach, but it depends a bit on the proximity of your normalized data structure to the commercial tool chain's data structures. As most commercial CAE software is really old, this normalization task is typically the most time-consuming and most error-prone phase of the integration.
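A sketch of that 'common access API' idea in Python, with hypothetical names, just to show the shape of the normalization layer:

from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PointCloud:
    # Minimal normalized 3D data structure (hypothetical).
    points: List[Tuple[float, float, float]]

class ScanSource(ABC):
    # Common access API that every tool adapter implements.
    @abstractmethod
    def load(self, path: str) -> PointCloud: ...

class CsvScanSource(ScanSource):
    # Adapter for a simple x,y,z text format; each commercial tool
    # would get its own adapter behind the same interface.
    def load(self, path: str) -> PointCloud:
        pts = []
        with open(path) as fh:
            for line in fh:
                x, y, z = map(float, line.split(","))
                pts.append((x, y, z))
        return PointCloud(points=pts)

def count_points(source: ScanSource, path: str) -> int:
    # Downstream steps see only the normalized structure, never a tool's native API.
    return len(source.load(path).points)

The cost described above then concentrates in the adapters, which is exactly where the integration effort usually goes.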