Science topic

# Scientific Computing - Science topic

Explore the latest questions and answers in Scientific Computing, and find Scientific Computing experts.

Questions related to Scientific Computing

I am working on a research project involving a system of differential equations with one ordinary differential equation (ODE) and two partial differential equations (PDEs).

I would like to discuss methods and approaches for solving this system efficiently using numerical methods and machine learning.

Can you recommend Python/MATLAB code for applying numerical techniques and ML?

Any guidance or references would be greatly appreciated.
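One common numerical route for such a mixed system is the method of lines: discretize the PDEs in space and integrate the resulting large ODE system together with the original ODE. Below is a minimal Python sketch with SciPy; the toy equations and coupling terms are illustrative placeholders, not your model.

```python
# Method-of-lines sketch for a coupled ODE + 2 PDE system (toy example).
# Assumed system (purely illustrative):
#   da/dt = -a + mean(u)
#   du/dt = D_u * u_xx - a*u
#   dv/dt = D_v * v_xx + a*u - v
import numpy as np
from scipy.integrate import solve_ivp

N = 50                      # spatial grid points
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
Du, Dv = 0.01, 0.005        # diffusion coefficients

def laplacian(w):
    """Second derivative with zero-flux (Neumann) boundaries."""
    lap = np.empty_like(w)
    lap[1:-1] = (w[2:] - 2*w[1:-1] + w[:-2]) / dx**2
    lap[0] = 2*(w[1] - w[0]) / dx**2
    lap[-1] = 2*(w[-2] - w[-1]) / dx**2
    return lap

def rhs(t, y):
    # unpack: one scalar ODE variable, then the two discretized PDE fields
    a, u, v = y[0], y[1:N+1], y[N+1:]
    da = -a + u.mean()
    du = Du*laplacian(u) - a*u
    dv = Dv*laplacian(v) + a*u - v
    return np.concatenate(([da], du, dv))

y0 = np.concatenate(([1.0], np.exp(-100*(x - 0.5)**2), np.zeros(N)))
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="BDF", rtol=1e-6)
print(sol.success, sol.y.shape)
```

For the machine-learning side, physics-informed neural networks (PINNs) are the most common approach; libraries such as DeepXDE build directly on that idea.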

**Can the conduct of analysis and scientific research be significantly improved through the use of Big Data Analytics, artificial intelligence and quantum computers?**

**Can the possibilities of Big Data Analytics applications supported by artificial intelligence technology increase significantly in this field when the aforementioned technologies are applied to the processing of large data sets obtained from the Internet and realized by the most powerful quantum computers?**

Can the conduct of analysis and scientific research be significantly improved, its efficiency increased, and the execution of research work significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and realized by the most powerful quantum computers?

What are the analytical capabilities of processing large data sets extracted from the Internet and realized by the most powerful quantum computers, which also apply Industry 4.0/5.0 technologies, including generative artificial intelligence and Big Data Analytics technologies?

Can the scale of data processing carried out by the most powerful quantum computers be comparable to the data processing that is carried out in the billions of neurons of the human brain?

In recent years, the digitization of data and archived documents, digitization of data transfer processes, etc., has been progressing rapidly.

The progressive digitization of data and archived documents, of data transfer processes, and the Internetization of communications, economic processes, and research and analytical processes are becoming typical features of today's developed economies. Accordingly, developed economies in which information and computer technologies are developing rapidly and finding numerous applications across economic sectors are called information economies, and the societies operating in them information societies. It is increasingly said that another technological revolution is currently under way, described as the fourth and, in some aspects, already the fifth. Technologies classified as Industry 4.0/5.0 are developing particularly rapidly and finding ever more applications. Those that support research and analytical processes in institutions and business entities include Big Data Analytics and artificial intelligence, including generative artificial intelligence built on artificial neural networks subjected to deep learning. The computational capabilities of microprocessors are steadily increasing, and ever larger sets of data and information are being processed. The number of companies, enterprises, and public, financial and scientific institutions that build large data sets, generated in the course of their own activities or obtained from the Internet, and process them in research and analytical work is growing.
In view of the above, the opportunities to apply Big Data Analytics backed by artificial intelligence technology to improve research techniques, increase the efficiency of existing research and analytical processes, and improve the scientific research conducted are also growing rapidly. By combining Big Data Analytics with other Industry 4.0/5.0 technologies, including artificial intelligence and quantum computers, in the processing of large data sets, the analytical capabilities of data processing, and thus of conducting analysis and scientific research, can be significantly increased.

In view of the above, I address the following question to the esteemed community of scientists and researchers:

*Can the conduct of analysis and scientific research be significantly improved, its efficiency increased, and the execution of research work significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and implemented by the most powerful quantum computers?*

*Can the applicability of Big Data Analytics supported by artificial intelligence technology in the field significantly increase when the aforementioned technologies are applied to the processing of large data sets obtained from the Internet and realized by the most powerful quantum computers?*

*What are the analytical capabilities of processing large data sets extracted from the Internet and realized by the most powerful quantum computers?*

And what is your opinion about it?

What do you think about this topic?

Please answer,

I invite everyone to join the discussion,

Thank you very much,

Best regards,

Dariusz Prokopowicz

*The above text is entirely my own work written by me on the basis of my research.*

*In writing this text I did not use other sources or automatic text generation systems.*

*Copyright by Dariusz Prokopowicz*

**What are the possibilities for the applications of Big Data Analytics backed by artificial intelligence technology in terms of improving research techniques, in terms of increasing the efficiency of the research and analytical processes used so far, in terms of improving the scientific research conducted?**

The progressive digitization of data and archived documents, of data transfer processes, and the Internetization of communications, economic processes, and research and analytical processes are becoming typical features of today's developed economies. Currently, another technological revolution is taking place, described as the fourth and, in some aspects, already the fifth. Technologies categorized as Industry 4.0/5.0 are developing particularly rapidly and finding ever more applications. Those that support research and analytical processes in institutions and business entities include Big Data Analytics and artificial intelligence. The computational capabilities of microprocessors are successively increasing, ever larger sets of data and information are being processed, and databases of data and information extracted from the Internet are being created and processed in the course of specific research and analysis processes. In connection with this, the possibilities of applying Big Data Analytics supported by artificial intelligence technology to improve research techniques, increase the efficiency of existing research and analytical processes, and improve the scientific research being conducted are also growing rapidly.

In view of the above, I address the following question to the esteemed community of scientists and researchers:

*What are the possibilities of applications of Big Data Analytics supported by artificial intelligence technology in terms of improving research techniques, in terms of increasing the efficiency of the research and analytical processes used so far, in terms of improving the scientific research conducted?*

*What are the possibilities of applications of Big Data Analytics backed by artificial intelligence technology in terms of improving research techniques?*

What do you think on this topic?

What is your opinion on this issue?

Please answer,

I invite everyone to join the discussion,

Thank you very much,

Best wishes,

The above text is entirely my own work written by me on the basis of my research.

Copyright by Dariusz Prokopowicz

On my Research Gate profile you can find several publications on Big Data issues. I invite you to scientific cooperation in this problem area.

Dariusz Prokopowicz

With growing and ever-improving scientific validation techniques, are rules of thumb still valid? Are they still useful?

I am trying to use the header files from nr3 (*Numerical Recipes: The Art of Scientific Computing*) with C/C++. Are there any tips?

What kind of scientific research dominates in the field of **Scientific research in the era of Industry 4.0**?

Please provide your suggestions for a **question**, problem or **research thesis** in the issues of *Scientific research in the era of Industry 4.0*.

Please reply.

I invite you to the discussion

Best wishes

There is probably no other science portal that offers all the same functions for researchers as the **Research Gate** portal.

Do you agree with me on the above matter?

In the context of the above issues, I am asking you the following question:

*Does the Research Gate research portal offer the most information services for researchers that researchers and scientists need?*

Please reply

I invite you to the discussion

Thank you very much

Best wishes

Many scientists suggest that a good way to analyze the level of **innovation** in action and to generate innovation in **financial institutions**, e.g. in banks, is conducting surveys among managers and department directors in these institutions.

How should such surveys be carried out? What method of surveys is the most effective? Are **online questionnaire forms** an effective instrument for carrying out surveys?

What other **research techniques** can be used to investigate the level of innovation in operation and to generate **innovation in financial institutions**?

Please reply

Best wishes

Dear Friends and Colleagues from RG,

I wish You all the best in the **New Year**. I wish You a **successful continuation** and **successes in scientific work**, achieving interesting results of **scientific research** in the **New Year 2019**, and I also wish you good luck in your personal life, all the best.

In the **New Year**, I wish You success in personal and professional life, fulfillment of plans and dreams, including successes in scientific work. **All Good.**

In the ending year, we often ask ourselves: have we successfully implemented our **research plans** in the ending year? We usually answer that a lot has been achieved, that some of the plans from a year ago have been realized, but not all goals have been achieved.

I wish You that the **Next Year** would be much better than the previous ones, and that each of us would achieve at least some of the planned most important goals in personal, professional and **scientific life**.

I wish You dreams come true regarding the implementation of interesting research, **fantastic results of research** and effective development of **scientific cooperation**.

I wish You effective **development of scientific cooperation**, including international scientific cooperation, implementation of interesting research projects within **international research teams**, and that the results of **scientific research** are appreciated; I wish You awards and prizes for achievements in scientific work.

I wish You many successes in **scientific work**, in didactic work and in other areas of your activity in the **New Year**, and I also wish you health, peace, problem solving, prosperity in your personal life, **all the best**.

Thank you very much.

Best wishes.

I wish you the best in New Year 2019.

**Happy New Year 2020.**

Dariusz Prokopowicz

Does anybody know anything about INFORMATICA JOURNAL (SCI expanded), which publishes an article at a cost of 423 USD (or the equivalent in bitcoin)? Is this journal fake or genuine?

It seems that using machine/deep learning to solve PDEs is very popular (actually, not only in scientific computing but in all fields). I want to know the reasons behind this. And are the prospects promising?

Dear Colleagues,

I have recently graduated with a BSc in Mechanical Engineering. During my BSc, I assisted with research and projects in a variety of fields, ranging from nanomechanics of advanced materials (experimental), predictive analysis of stochastic data input for control (MATLAB), human balance control (theoretical), and dynamical modeling of fluid/solid coupling problems with the corresponding CFD in OpenFOAM, to computational aerodynamics with HPC. Upon graduation, I joined a research team at ETH Zurich as a scientific assistant to work on vortex kinematics (theoretical and computational).

**My main interest areas are:**

- Nonlinear Dynamics and Chaos, Stochastic Systems, Machine Learning of Dynamical Systems and Fluid Dynamics, Prediction, Nonlinear Control
- Computational Finance, Financial Analytics
- Numerical Methods, Computing and Algorithm Development

Clearly, all of the fields mentioned above require a decent knowledge of mathematical modeling, analysis, and computation (mostly by parallel computing over HPCs). One can also argue that these areas are not really far from each other as they can be all classified into an umbrella field of Dynamical Systems Theory.

I will soon start my MSc in Computational Science and Engineering at ETH Zurich. However, I am struggling to decide which specialization area I should choose.

**As a part of the program I have to enroll in at least two of the following CORE SUBJECTS:**

- Advanced Numerical Methods for CSE
- Optimization for Data Science
- Computational Statistics
- Advanced Systems Lab (Fast Numerical Codes)

Of these, I am planning to take all four, as they are rich in content, relevant to my multidisciplinary taste, and beneficial for my future plans. They are also fairly complementary to one another.

**I will also have to take two mandatory subjects as a part of the admission requirement:**

- Numerical Methods for CSE
- High-Performance Computing Lab for CSE

The program requires me to take 5 courses in my selected specialization area. The rest of the credits necessary to graduate can be chosen freely from any department.

ETH is a top-notch institute for education and research in all three of Control & Robotics, Fluid Dynamics, and Applied/Computational Mathematics. This at least ensures that whatever I choose I will still get a quality education and have a chance to do quality research.

As we all know, modern areas such as robotics, data science, software engineering, neuroscience, computational biology, etc. have rather well-defined career paths. People in those areas do not face as much trouble as someone multidisciplinary (e.g. in my MSc program) in deciding what subjects to take and what to focus on.

Now, I lost 2 years between high school and university, and I believe this has removed some of my flexibility in this kind of decision, especially given that I am in a long-distance relationship which I also have to take care of. It is likely that I will prefer to stay at ETH for my Ph.D., or work here for some time before it. I may also choose to do my Ph.D. at one of the other top schools.

I really appreciate your opinions and advice!

Thank you for your time and patience!

Kind Regards

What kind of scientific research dominates in the field of the **Research Gate knowledge and science portal**?

Please provide your suggestions for a **question**, problem or **research thesis** in the issues of the *Research Gate knowledge and science portal*.

Please reply.

I invite you to the discussion

Thank you very much

Best wishes

The approximation theory of interpolation is of foundational importance in numerical analysis, especially for various scientific computing problems.

A considerable amount of literature has accumulated on Lagrange, Hermite, lacunary and Pal-type interpolation in the past few years. Interpolation on the real line has seen numerical justification by many researchers, but interpolation *on the complex plane, particularly the unit disk, has not seen much numerical justification using different programming platforms*.

I would request the other researchers who are part of this discussion to help me find some useful papers in this direction.

I am also currently using the programming platform **MATHEMATICA** to work out the numerical aspects of my research.

Hope to see you with some good results in future discussions.

Hello, I would like someone to tell me how to test a trained artificial neural network in MATLAB for linear predictions.
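In case it helps frame the question: testing usually means evaluating the trained network on data held out from training and reporting an error metric (in MATLAB, passing the test inputs to the trained network object and comparing against the known targets). Here is a language-agnostic sketch of that workflow in Python, with a simple least-squares linear model standing in for the trained network; all names and data here are illustrative.

```python
# Hold-out evaluation sketch: "train" on one split, test on the other.
# A least-squares linear model stands in for the trained network.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

# split: first 150 samples for training, last 50 held out for testing
X_train, y_train = X[:150], y[:150]
X_test, y_test = X[150:], y[150:]

w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)   # "training"
y_pred = X_test @ w                                     # "testing"
mse = np.mean((y_pred - y_test) ** 2)                   # test-set error
print(mse)
```

The key point is that the test inputs were never seen during fitting; the reported MSE therefore estimates generalization, not memorization.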

**Second call for papers, Havana chapter of the III International Symposium on Science and Technological Innovation 2019**

**It includes**:

- Second workshop of researchers in distance education
- First scientific workshop on "informatics and sport" (cidep2019)

The call for papers (second call) and the template for preparing submissions are attached.

What **indexing bases of scientific publications** do you recommend in addition to Research Gate?

To which other **databases indexing scientific publications or scientific journals** do you submit your publications in addition to the Research Gate research base?

What are the bases for scientific publications in which there are many *indexed citations or articles and other scientific publications*?

The **Research Gate scientific base** is an excellent platform for the exchange of scientific experiences, establishing scientific cooperation, forming research teams and indexing scientific publications.

But do you also use other databases for the indexation of **scientific publications or scientific journals** in situations where you are looking for additional materials on scientific issues?

Please reply

Thank you very much for all the information

Best wishes

Director of UGIVIA

We have a local government agreement to create a new science museum in Mallorca, and we would like to apply the ideas of the VIMM project in this area, mixing Cultural Heritage, Usability and Accessibility, VR/AR and Tourism.

Many thanks in advance.

Dr. Francisco José Perales López

Full Professor (Catedrático de Universidad), CCeIA

Director, Graphics, Computer Vision and AI Unit

Dept. of Mathematics and Computer Science, UIB

Ed. A. Turmeda. EPS. Crta Valldemossa Km.7.5, 07122

Palma de Mallorca, Illes Balears

Spain


Could anyone share his/her experience with Haskell in scientific computing? I'm interested in state-of-the-art usage and applications to computational fluid dynamics and similar domains.

I'm looking for opinions (or actual examples) on using Spark (R, Scala or Python) compared to Fortran for scientific computing.

I did some searches but most of the conversations I found are several years old.

Could anybody share the source code of a mathematical model of biofiltration, biosorption or adsorption? Usually the model is represented by a system of PDEs that includes mass-transfer kinetic and dynamic equations, a sorption isotherm and, in the case of biological treatment, equations for biofilm degradation and microbial growth.

I will appreciate any help provided. Thank you!

For me, I am very sure it is solved. If you are interested, first download the program and run it. Then read my paper and think; then you may also be sure.

How to use the program

1. I believe that most people who download my program will be professionals. So please leave a contact message and your opinions if you download my program. You can leave your message here or at my email: edw95@yahoo.com. Thanks a lot.

2. This program is an informal one, and it is not the quickest one. But it includes my algorithm, and it works correctly and very well. It never fails.

3. How to use: if you have a 0-1 matrix standing for a simple undirected graph with n vertices which has at least one Hamilton path from vertex 0 to vertex n-1, press the “ReadMatrix” menu item to read and calculate it, then press the “Write the result” menu item to write the result to a new file; you will find a Hamilton path from vertex 0 to vertex n-1 in the new file.

4. How to use: if you have an edge matrix standing for a simple undirected graph with n vertices which has at least one Hamilton path from vertex 1 to vertex n, press the “ReadEdges” menu item to read and calculate it, then press the “Write the result” menu item to write the result to a new file; you will find a Hamilton path from vertex 1 to vertex n in the new file. If there is no such path, you get a “no...” message. The input file format is one edge per row: “1,3” or “1 3” means an edge from vertex 1 to vertex 3.

5. The maximum degree is 3. I am very sure my algorithm can handle undirected graphs of any degree, but this program cannot. The maximum vertex number is 3000, because PC memory is limited.

6. I would like to thank Professor Alexander Chernosvitov very much. He and one of his students took a long time to write a program (different from mine) implementing my algorithm, and he gave me and my work a good comment (see codeproject.com and researchgate.net). Mr. Wang Xiaolong did as well. Before them, nobody trusted me. Some editors and reviewers rejected me on this logic alone: for such a hard problem, Lizhi Du is not a famous man, so he cannot solve it. Some editors or reviewers did not use their brains, saying: your paper is apparently wrong, or, your paper cannot be understood. “Apparently wrong”: funny! I have studied it for many years, and it is “apparently wrong”! If a reviewer is really capable, uses his brain and spends the time, he surely can understand my paper. If you think I am wrong, tell me where, and I will explain why it is not wrong. If you think my paper cannot be understood, tell me where, and I will explain. In my paper, in the Remarks, I explained how to understand my algorithm and proof. I think it is very clear.

7. I have studied this problem for many years and have put many versions of my paper on arXiv. Though the former versions had this or that problem, I am very sure the newest version of my paper is the final version and it is surely correct. It may contain some little bugs due to my English, but these do not affect the correctness, and I can explain or revise them easily.

8. Surely I think I have proved NP=P and have solved the NP vs. P problem.

9. Thank you for spending your attention and time on my algorithm!

I have come across a fair few books and talks on C++ which teach us how to write good, maintainable code using good programming practices. Scott Meyers, Bjarne Stroustrup, Chandler Carruth, all have great ideas of efficiency, error-proofing, et al.

However, books on scientific C++ use naked pointers instead of smart pointers, C-style arrays instead of C++ vectors/arrays, and much more of the kind. This leads to a situation where books on scientific C++ programming teach *very* bad programming practices/styles, while books on good practices/style don't really focus on scientific computing (PDE solving, nonlinear optimization, linear solvers...).

Welcoming all suggestions for books, talks, videos, tutorials, whatever, on good scientific C++. Even general advice on the process of learning good C++ will be greatly appreciated.

Hi,

I am doing a computationally demanding time-series analysis in R with a lot of for-loops which run the same analysis several times (e.g. for 164 patients, for 101 different time series per patient, or for different time lags). In the end, the results of these analyses are summarized into one score per patient, but up to that point they are completely independent of each other. To shorten the computing time, I would like to parallelize the analysis so the independent parts can run on more than one of the 8 cores of my processor.

I read some postings about running functions like apply on more than one core, but I am not sure how to implement the approaches.

Does anybody know a simple and comprehensible way of translating a classical sequential for-loop into a procedure which uses different cores simultaneously to run a few of the analyses parallel?

Thank you very much for every comment!

Best,

Brian

I hope this is the right place to ask, but I've read a lot of good comments on other topics here, so I'll just ask. At the moment I'm searching for a topic/idea for my undergraduate thesis. I am doing parallel programming with OpenMP right now, and I am new to parallel programming. What are some interesting topics/ideas in parallel programming? I'd greatly appreciate any help in pointing me to them. Best regards, Litu

I am going to start finite element programming for watershed runoff analysis, so I was wondering whether Fortran is a good choice?

When I ask this question, I understand that it is quite general and there may be many possible answers depending on the data type, the desired accuracy, smoothness, computational time, etc. I just want to know: which one do you prefer, and why? Conditional answers will also be appreciated.

Some works in the literature (Goldberg/Bridges, for instance) have demonstrated that standard Genetic Algorithms (GAs) usually cause disruption of the solutions' building blocks. On the other hand, some current papers have shown that a special GA implementation may be a viable alternative to overcome this issue. In this sense, I would like to know what you think about that. Please try to answer my question in a short, clear response if possible.

Thank you very much in advance!

Conference Paper Variable Selection for Multivariate Calibration in Chemometr...

Hi all, I have difficulty writing a mix of strings and string arrays in FORTRAN, for example (the following two lines):

write (cout,1052) (cd(ii),ii=1+1,9),(cy(ib),ib=10,(inmt*2))

1052 format ('c=[c1',8(';c',a1),79(';c',a2),'];')

This piece of code works, where cd(ii) and cy(ib) are string arrays.

I need to write a variable number of the string-array cells (cy(ib)) followed by a bracket, so I need to replace "79" in the format line with "(inmt*2)-9", because each case has a different number of cells to write. When I do that, it no longer works.

Any ideas please?

Thanks,
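For what it's worth: a repeat count in a literal Fortran format must be a constant, not an expression, so the usual remedy is to write the format itself into a character variable at run time (e.g. using an `i0` edit descriptor to embed the computed count) and pass that variable to `write`. The same build-the-output-at-run-time idea, sketched in Python purely for illustration (`inmt` and the cell names mirror the question):

```python
# Build the output record and its repeat count at run time instead of
# hard-coding "79" (inmt and the labels are stand-ins from the question).
inmt = 50
n_cy = inmt * 2 - 9                      # number of cy cells; varies per case
cd = [str(i) for i in range(2, 10)]      # 8 one-character labels, like cd(2..9)
cy = [str(i) for i in range(10, 10 + n_cy)]

# assemble the line programmatically; the "repeat count" is just len(cy)
line = "c=[c1" + "".join(f";c{s}" for s in cd + cy) + "];"
print(line[:40])
```

In Fortran the analogous step would be something like building the string `"('c=[c1',8(';c',a1),91(';c',a2),'];')"` in a character variable first, with the 91 written by an `i0` descriptor, and then using that variable as the format.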

I've developed a molecular dynamics simulation approach for laser-material interaction using the direct simulation Monte Carlo algorithm, and my code runs very slowly on the computer of the research group I belong to. I am wondering: are there any free supercomputers one can connect to over the internet? Any other suggestions to solve this problem?

I need software which enables me to change, connect, and make my own figures but in a professional way.

I want to do parallel computing in Scilab or Octave. I want to know which is better and what is the maximum number of cores each allows access to?

I am studying the paper "A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations", and I just want the details of the matrix system, even solved for only 2 Chebyshev nodes.

Thanks in advance

In Mathematica, my matrix output has symbolic expressions like 2D+3E^2. First I want to extract them, and then the output should be shown in FortranForm, i.e. 2*D+3*E**2. I want to use the matrix elements in Fortran code, so they have to be in FortranForm. Please, can anyone who knows Mathematica provide the syntax for this?

I think there should be a single-line statement that will do both: extract the elements and convert them to FortranForm.
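In Mathematica itself, `FortranForm[expr]` prints a single expression in Fortran syntax, and applying it elementwise over the matrix (e.g. with `Map` at level 2) should give the one-liner you describe. If a Python toolchain is also an option, SymPy offers the same conversion through `fcode`; a small sketch with an illustrative matrix:

```python
# Converting symbolic matrix elements to Fortran-syntax strings with SymPy
# (an alternative route when moving symbolic results into Fortran code).
from sympy import symbols, Matrix, fcode

d, e = symbols("d e")
M = Matrix([[2*d + 3*e**2, d*e],
            [e - d,        d**2]])

# fcode renders each element with '**' powers and explicit '*' products
fortran_elems = [fcode(M[i, j], source_format="free")
                 for i in range(2) for j in range(2)]
print(fortran_elems[0])
```

Each resulting string (e.g. the `2*d + 3*e**2` element) can be pasted directly into Fortran source.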

It has been shown that WENO-Z is less dissipative than WENO-JS.

Do the conclusions and numerical results in the attached article still hold if one replaces the baseline WENO-JS scheme with the WENO-Z scheme; is the Compact Reconstructed WENO scheme still substantially faster and more accurate than the pure WENO-Z scheme? How about for a higher-order scheme, say a 9th-order WENO scheme?

Note, please take the sensitivity parameter epsilon = O(dx^3) and power parameter p=2 in the definition of WENO-Z nonlinear weights.

For reference, see my list of publications on WENO-Z scheme.

I am currently working on a complex network in MATLAB 2010b, because that is the only license the college has, but it only allows execution on 8 cores. I am a little familiar with Scilab, but on one page (http://julialang.org/) comparing Octave, MATLAB, Julia, Python, etc. (Scilab excepted), Octave was significantly slower than MATLAB. What is the execution speed of a program in Scilab? Can anyone suggest some sample parallelized code in Scilab, as its website lacks comprehensive information on its usage?

Also, can anyone suggest how to learn Julia with example codes, as it is very fast compared to the others?

I'm working on a tracking problem and I have the true target with the red path and the noise with the blue path.

How can I parametrize these two paths to differentiate between them mathematically?

GridSim and SimGrid are two frameworks/toolkits widely used for research in Grid. Apart from those, Which simulators directly support simulation of workflow scheduling?

I am using software packages for mathematical modelling (Mathematica and Matlab), data analysis (R) and spatial information processing (Idrisi and ArcGIS). I often use the output from a function in one program as input for a function in another one.

Is there an easy way to use the functions from all packages in one main script? What programming language is recommended for this? c++, Python, or better a bash script?
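Python is a common glue layer for exactly this: each package writes its output to a plain interchange file (CSV for tables; formats like NetCDF or GeoTIFF for spatial data), and a Python driver script calls each tool in turn and passes the files along. A minimal sketch of the file-based hand-off; the external call is shown only as a comment, since the exact command depends on your installation:

```python
# File-based hand-off between heterogeneous tools, driven from Python.
import csv, os, tempfile

workdir = tempfile.mkdtemp()
step1_out = os.path.join(workdir, "model_output.csv")

# Step 1 would normally be an external call, e.g. (hypothetical script name):
#   subprocess.run(["Rscript", "fit_model.R", step1_out], check=True)
# Here we write the file directly so the sketch stays self-contained.
with open(step1_out, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["site", "value"])
    writer.writerows([["A", 1.5], ["B", 2.5]])

# Step 2: the next tool (or Python itself) reads the interchange file.
with open(step1_out, newline="") as f:
    rows = list(csv.DictReader(f))
total = sum(float(r["value"]) for r in rows)
print(total)
```

The design advantage of file-based hand-offs over in-process bridges (MATLAB Engine, rpy2, arcpy, and the like) is that each tool only needs to read and write an agreed format, so pieces can be swapped without touching the rest of the pipeline.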

Share your view in the light of performance tradeoffs, resource requirements and monetary costs of creating and deploying applications on each platform.

Does anyone know how I can assign, on a dual-core computer, one processor each to the combination of Simpson's rule (y[n+2]-y[n]=(h/3)*(f[n+2]+4f[n+1]+f[n])) and the two-step Adams-Moulton method (y[n+2]-y[n+1]=(h/12)*(5f[n+2]+8f[n+1]-f[n])) for solving ordinary differential equations, so as to produce results for y[n+1] and y[n+2] simultaneously, one from each processor?
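For the linear test equation y' = lam*y, the two formulas together form a 2x2 linear system per block, which makes the "one formula per processor" idea concrete: each processor assembles one row, and the block solve then yields y[n+1] and y[n+2] together. A sequential Python sketch of that block iteration (the test problem and step size are my own choices):

```python
# Block sketch: pair Simpson's rule with the two-step Adams-Moulton
# corrector so each block yields y[n+1] and y[n+2] together.  For
# y' = lam*y the implicit pair is a 2x2 linear system per block; the
# two rows are independent, so each core could assemble one of them.
import numpy as np

lam, h = -1.0, 0.05
blocks = 20                     # number of 2-step blocks
y, t = 1.0, 0.0                 # y(0) = 1
for _ in range(blocks):
    # unknowns z = [y_{n+1}, y_{n+2}] with f = lam*y substituted
    A = np.array([
        [-4*h*lam/3,       1 - h*lam/3],    # Simpson row
        [-1 - 2*h*lam/3,   1 - 5*h*lam/12], # Adams-Moulton row
    ])
    b = np.array([(1 + h*lam/3) * y,
                  -(h*lam/12) * y])
    z = np.linalg.solve(A, b)
    y = z[1]                    # advance the block by two steps
    t += 2*h

print(y, np.exp(lam*t))         # numerical vs exact e^{lam*t}
```

For a nonlinear f the same structure appears, except the 2x2 system becomes a small nonlinear system solved by Newton iteration within each block.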

I'm using a workstation with an 8-core processor. When I try to do double or triple integration in Mathematica using 'Integrate[]', it seems to take a very long time (sometimes an hour). I think it is using only one core. So, is it possible to use all 8 cores for such integration to get the result quicker? If possible, how can I do parallel integration for my equation?

NB: I'm using Mathematica 8.
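If a symbolic antiderivative is not actually needed, switching from exact `Integrate` to numerical `NIntegrate` is usually the first step; the region can then be split into independent subregions evaluated on separate kernels (e.g. with `ParallelTable`). The decomposition idea, sketched in Python with SciPy on a toy integrand (sequential here; each chunk is independent, so the chunks are exactly what you would farm out to your 8 cores):

```python
# Domain-decomposition sketch: split the outer variable's range into
# chunks; each chunk is an independent double integral.
import numpy as np
from scipy.integrate import dblquad

def integrand(y, x):               # dblquad expects f(y, x)
    return x * y

edges = np.linspace(0.0, 1.0, 5)   # 4 chunks of the outer (x) range
chunks = [dblquad(integrand, a, b, 0.0, 1.0)[0]
          for a, b in zip(edges[:-1], edges[1:])]
total = sum(chunks)                # exact answer over [0,1]^2 is 1/4
print(total)
```

Because the chunk integrals share nothing, summing them at the end reproduces the full integral, and the decomposition parallelizes trivially.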

Recently the submission of a scientifically bogus paper to a large number of peer-reviewed journals (http://www.cbc.ca/news/technology/bogus-science-paper-reveals-peer-review-s-flaws-1.2054004) revealed how the current peer-review process is broken. As one commentator simply put it "... First, and foremost, we need to get past the antiquated idea that the singular act of publication – or publication in a particular journal – should signal for all eternity that a paper is valid, let alone important. Even when people take peer review seriously, it is still just represents the views of 2 or 3 people at a fixed point in time. To invest the judgment of these people with so much meaning is nuts. And its far worse when the process is distorted – as it so often is – by the desire to publish sexy papers, or to publish more papers, or because the wrong reviewers were selected, or because they were just too busy to do a good job. ..". This is a very serious problem which should be addressed by the scientific community in a time where proliferation of papers is omnipresent and the assessment of their validity overwhelming.

There are many algorithms based on FFT

I am currently working on a problem where I have to write 3 dependent variables and 1 independent variable. I am using the Manipulate and Plot commands for interactive graphs. Now I want to write these dependent and independent values as a table and as a .xls file. Is there anybody who can help me with this problem? Thanks in advance.

I am just trying to compile and run the heat-transfer program from the book CUDA by Example by Sanders and Kandrot. I am using MS Visual Studio 2012 and CUDA 5.0. However, there is a significant performance difference between the program running on my PC (around 1230 ms per frame) and what's reported in the book (21 ms). The GPU I am using is a GeForce GT 555M (3 GB). I was just wondering, has anyone come across a similar problem?

Is MATLAB the only language that can be used for this? If so, how can individuals use it?

I am thinking about how to capture 3D data procedures (e.g., creating as-built 3D models of buildings for construction quality analysis) from multiple people, and how to explore automated approaches that can synthesize these procedures into optimal ones for given 3D data processing tasks. I know some basics of 3D data processing algorithms, and some of the challenges of decomposing workflows into sections and optimizing their execution in a web or cloud computing environment, but I would like to solicit suggestions from friends here about specific challenges related to this problem.