Science topic
Computer Systems - Science topic
Computer Systems are systems composed of a computer or computers, peripheral equipment, such as disks, printers, and terminals, and telecommunications capabilities.
Questions related to Computer Systems
Elsevier Future Generation Computer Systems (FGCS) has just published the special issue on "Serverless Computing in the Cloud-to-Edge Continuum", which I co-edited together with my dear colleagues Omer Rana (Cardiff University, UK), Luiz Fernando Bittencourt (University of Campinas, Brazil), and Hao Wu (Beijing Normal University, China).
The special issue brings together 17 novel and high-quality contributions in the emerging field of serverless computing in cloud-edge systems.
At this link (https://lnkd.in/dkra9gqs) you may find the article collection. At this link (https://lnkd.in/d9Zqpr5y) -- accessible until September 20 -- you may find the editorial summary of the special issue.
Happy reading :)
An inaccurate or incorrect response from a Large Language Model, sometimes referred to as a "hallucination", is attributed to an error in the code, not in the training data, which represents input material manipulated by the code to give a desired output.
The prompt, likewise, is not responsible for the inaccuracy of the output it triggers, given that the LLM is a general chatting application, and hence an improvised inquiry should yield at least a correct response from error-free code.
The process of releasing updated versions of such LLMs, with the aim of achieving higher accuracy or more intelligence, can be represented as the maintenance part of a software development lifecycle, fixing errors and increasing the reliability of an AI system.
An AI is a computer system which executes the instructions given by an algorithm, utilizing the available resources of input data and computing infrastructure.
- An LLM can be represented by a model which has two components: an interface and a computing function
- Both components are separate yet interoperable
- The computing function performs the core tasks of the model (e.g. generating software code)
- The interface interprets human-language queries for the computing function (e.g. providing user requirements for software code)
- The interface translates the output of the computing function into an easily understandable format
- The purpose of the whole system is to align the request at its input with the response at its output
- Hallucination occurs when the output of the computing function is inaccurate yet the interface's translation of this output still makes sense
- An output response of the system to a prompt provided at its input is generated through interaction between the interface and the computing function
- Fine-tuning the system using small datasets creates a new model which includes its own interface and computing function components (a minimal sketch of this two-component view follows below)
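To make the two-component view concrete, here is a minimal, hypothetical Python sketch; the class and method names are illustrative only and do not correspond to any real LLM implementation:

class ComputingFunction:
    # Core model: maps an internal task representation to a raw result.
    def run(self, representation):
        # stand-in for the model's actual computation (e.g. code generation)
        return {"tokens": ["def", "add(a,", "b):", "return", "a", "+", "b"]}

class Interface:
    # Interprets human-language queries and renders raw results readably.
    def interpret(self, prompt):
        return {"task": "generate_code", "requirements": prompt}

    def render(self, raw):
        return " ".join(raw["tokens"])

class LLM:
    # Separate yet interoperable components wired into one system.
    def __init__(self):
        self.interface = Interface()
        self.computing_function = ComputingFunction()

    def respond(self, prompt):
        representation = self.interface.interpret(prompt)
        raw = self.computing_function.run(representation)
        # a hallucination in this view: raw may be inaccurate while the
        # rendering below still reads fluently
        return self.interface.render(raw)

print(LLM().respond("Write a function that adds two numbers"))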
Call for Papers: 2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024)
Call for papers: the 2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024) will be held on December 20-22, 2024 in Xiangtan, China.
AISNS 2024 aims to bring together innovative academics and industrial experts in the field of Artificial Intelligence, Systems and Cyber Security in a common forum.
Conference website(English): https://ais.cn/u/JnEFbm
Important Information
Conference website (submission link): https://ais.cn/u/JnEFbm
Conference dates: December 20-22, 2024
Venue: Xiangtan, China
Indexing: EI Compendex, Scopus
Conference Details
The 2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024), hosted by Hunan Institute of Engineering, will be held on December 20-22, 2024 in Xiangtan, Hunan Province, China. The conference will focus on the research areas of artificial intelligence, systems, and network security. It aims to provide a platform for experts, scholars, engineers, and R&D practitioners in related fields to share research results and cutting-edge technologies, follow academic development trends, broaden research ideas, strengthen academic research and discussion, and promote cooperation on the industrialization of academic achievements.
Topics
* Artificial Intelligence (AI algorithms, natural language processing, fuzzy logic, computer vision and image understanding, signal and image processing, speech and natural language processing, computational learning theory, information retrieval and fusion, hybrid intelligent systems, intelligent system architectures, knowledge representation, knowledge-based systems, mechatronics, multimedia and cognitive informatics, parallel processing of artificial neural networks, pattern recognition, pervasive computing and ambient intelligence, soft computing theory and applications, software and hardware architectures, automatic programming, machine learning, automatic control, data mining and machine learning tools, robotics, AI tools and applications, recent trends and developments, etc.)
* Computer and Network Security (active defense systems, adaptive defense systems, analysis and benchmarking of security systems, applied cryptography, authentication, biometric security, complex systems security, database and system security, data protection, data/system integrity, distributed access control, distributed attack systems, denial of service, high-performance network virtualization, high-performance security systems, security in cloud and grid systems, security in e-commerce, security in pervasive/ubiquitous computing, security and privacy in smart grids, security and privacy in wireless networks, secure mobile agents and mobile code, security simulation and tools, trusted computing, etc.)
* Computer Systems (operating systems, distributed systems, database systems, network systems, compiler systems, computer architecture, virtualization technology, container technology)
* Submissions on other related topics are also welcome
Publication
Submissions to AISNS 2024 will be rigorously reviewed by 2-3 organizing committee experts. Accepted papers will be published in the ACM ICPS (ACM International Conference Proceeding Series) proceedings and submitted to EI Compendex and Scopus for indexing. Indexing of this conference's proceedings has so far been very stable.
Participation Notes
1. Author registration: each accepted paper entitles one author to attend free of charge;
2. Presentation format: you must choose either an oral presentation or a poster presentation;
3. Oral presentations: to apply, contact the conference staff at least 10 days before the conference; talks are 10-15 minutes and require presentation slides;
4. Poster presentations: to apply, send the poster to the conference email icaisns@163.com at least one week before the conference; requirements: A1 size, portrait orientation, color, PNG format;
5. Presentation sign-up: please watch the conference announcements on this page and the conference email; sign-up for attendance and choice of presentation format will be announced 1-2 weeks before the conference.
6. Audience registration: you may attend without submitting a paper, and may also apply to give a talk or poster presentation.
7. Withdrawal after acceptance for personal reasons incurs a deduction of 20-30% of the fee;
8. Registration: https://ais.cn/u/JnEFbm

To what extent can computing and/or data processing power be increased through the use of quantum computers, and what applications of quantum computers are already being developed?
What could the applications of quantum computers be in the future, if quantum cryptography and the other technologies necessary for building quantum computers were sufficiently improved and became widespread, their prices fell, and they became financially accessible not only to the largest corporations, organizations, and research and development institutions with the large financial capital needed to develop and implement quantum computer technology?
The key technology enabling the construction of quantum computers is quantum cryptography. The technology is expensive and available only to the largest corporations, organizations, and research and development institutions with the large financial capital needed to develop and implement quantum computer technology. The applications of quantum computers are various. Many companies and businesses in various sectors of the economy that already use Industry 4.0/5.0 technologies, including cloud computing over large sets of data and information, analytics based on integrated information systems using Big Data Analytics and/or Business Intelligence, Internet of Things technologies, Blockchain, machine learning, deep learning, generative artificial intelligence, digital twins, and so on, would probably be interested in applying quantum computer technology to improve their businesses and their computerized management systems if the price of this technology dropped significantly. The price factor is an important determinant of the spread of this technology to the many companies and enterprises in the SME sector that do not have large budgets for development and implementation projects involving the latest, highly advanced digital technologies. At present, such technologies are developed in a small number of research and development centers and research laboratories run by the scientific institutes of universities or by large technology companies with large funds to allocate to such projects.
The use of quantum computers makes it possible, among other things, to create microscopes that image very small objects, such as cell fragments, with the ability to view them live in real time. Currently, such observations are made with electron microscopes, with which, for example, cell organelles are observed, but in frozen cells rather than in live, biologically functioning cells in real time. A typical feature of quantum computers is that quantum software is not written in Java-type programming languages; the computer systems used in quantum computers rely instead on quantum circuit design. The results of research in cosmology and astrophysics, and theories on the functioning of key cosmic objects in the Universe, concern, for example, black holes found in space. However, no one has realistically seen a black hole up close. Of course, in writing these words, I do not intend to undermine any theories about black holes functioning in space. The point is that quanta can be measured only if the necessary research infrastructure is available. That infrastructure is expensive and therefore available only to some research, development, and implementation centers located in a few research institutes of universities and some large technology companies. The quantum technology necessary to build quantum computers can be developed in various ways. Ions and vortices of currents in superconductors may be controlled by photons, so it makes sense to develop quantum technology based on photons. Any kind of microparticle that can be controlled, changed in some respect, or made to intentionally change its form can be used to build quantum computers. With quantum computers, it will be possible to solve complex, multifaceted problems in which large amounts of data are processed. When this technology becomes widespread and its price is significantly reduced, the world may in the future move to quantum cryptography. The largest financial investments in the development of quantum technology are made in developed countries where large subsidies from the state's public finance system are allocated for R&D purposes, i.e. primarily in the US, China, and Europe. A common feature of the various applications of quantum computers is that they would enable the processing of much larger volumes of data and information in a relatively short time within multi-criteria, advanced data processing carried out on computerized Big Data Analytics platforms, also involving other technologies typical of Industry 4.0/5.0. Greater capability for advanced, multi-criteria processing of large sets of data and information will allow the solution of complex analytical problems concerning various spheres of human activity and various issues in different industries and sectors of the economy.
I described the applications of Big Data technology in sentiment analysis, business analytics and risk management in an article I co-authored:
APPLICATION OF DATA BASE SYSTEMS BIG DATA AND BUSINESS INTELLIGENCE SOFTWARE IN INTEGRATED RISK MANAGEMENT IN ORGANIZATION
I invite you to familiarize yourself with the problems described in the article given above and to scientific cooperation in this field.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What could the applications of quantum computers be in the future, if quantum cryptography and the other technologies necessary for building quantum computers were adequately improved and became widespread, their prices fell, and they became financially accessible not only to the largest corporations, organizations, and research and development institutions with the large financial capital to develop and implement quantum computer technology?
To what extent can computing and/or data processing capacities be increased through the use of quantum computers, and what applications of quantum computers have already been developed?
What are the currently developed applications of quantum computers and what might they be in the future?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text, I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

[CFP]2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024) - December
AISNS 2024 aims to bring together innovative academics and industrial experts in the field of Artificial Intelligence, Systems and Cyber Security in a common forum. The primary goal of the conference is to promote research and developmental activities in computer information science and application technology, and another goal is to promote the interchange of scientific information between researchers, developers, engineers, students, and practitioners working all around the world. The conference will be held every year, making it an ideal platform for people to share views and experiences in computer information science, application technology, and related areas.
Conference Link:
Topics of interest include, but are not limited to:
◕Artificial Intelligence
· AI Algorithms
· Natural Language Processing
· Fuzzy Logic
· Computer Vision and Image Understanding
· Signal and Image Processing
......
◕Network Security
· Active Defense Systems
· Adaptive Defense Systems
· Analysis, Benchmark of Security Systems
· Applied Cryptography
· Authentication
· Biometric Security
......
◕Computer Systems
· Operating Systems
· Distributed Systems
· Database Systems
Important dates:
Full Paper Submission Date: October 10, 2024
Registration Deadline: November 29, 2024
Conference Dates: December 20-22, 2024
Submission Link:

Last year I took a course on Computer Systems Architecture/Organization. During a lecture, I learned about data hazards and one of the common solutions to them: reordering the instructions. Modern processors solve this using out-of-order execution (OoOE), but since this is integrated into the processor, it increases chip size, power consumption, and heat output. So I thought: "What if we had an AI-driven unit which does that reordering for the CPU?"
Does anyone know if this has already been successfully researched or implemented? I would greatly appreciate any insightful comments.
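For what it's worth, the compile-time flavor of this idea fits in a few lines. Below is a minimal, hypothetical Python model of hazard-driven reordering; the four-field instruction tuples and the greedy hoisting rule are illustrative, not any real ISA or scheduler:

# Instruction = (op, dest, src1, src2); names are symbolic registers/addresses.

def reads(ins):
    return {r for r in ins[2:] if r is not None}

def hazard(a, b):
    # True if b reads a's result (RAW), writes something a reads (WAR),
    # or writes the same destination (WAW).
    return a[1] in reads(b) or b[1] in reads(a) or a[1] == b[1]

def reorder(instrs):
    # Greedy: after a load whose consumer follows immediately, hoist one
    # later instruction that is independent of everything it jumps over.
    out = list(instrs)
    for i in range(len(out) - 2):
        if out[i][0] == "load" and hazard(out[i], out[i + 1]):
            for j in range(i + 2, len(out)):
                if all(not hazard(out[k], out[j]) for k in range(i, j)):
                    out.insert(i + 1, out.pop(j))
                    break
    return out

prog = [
    ("load", "r1", "a", None),   # r1 <- mem[a]
    ("add",  "r2", "r1", "r1"),  # RAW on r1: would stall right after the load
    ("load", "r3", "b", None),   # independent: can fill the stall slot
    ("mul",  "r4", "r3", "r3"),
]
print(reorder(prog))   # the second load is hoisted between load r1 and add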
I want to write Python code for offloading tasks generated by IoT devices and for performing resource allocation in mobile edge computing systems, but I don't know how to start the program with these details.
Can anyone help me learn how to write code for that, please?
A concept like the paper in the attachment.
Hi, I'm Rezuana, and my research focuses on cloud computing, with a particular emphasis on cloud storage and distributed storage. I have a publication in 'Future Generation Computer Systems' on this subject. I'm currently seeking opportunities for research collaboration and hoping to find potential research partners here. Can anyone suggest how I can proceed further?
Our department has recently acquired an HPC (High-Performance Computing) system, and I'm thrilled to take my molecular dynamics calculations to the next level using Desmond. I used to run my simulations on my lab desktop, but now I want to leverage the power of HPC.
Does anyone have experience running simulations using Desmond on an HPC? Any tips or guidance would be greatly appreciated!
Thank you in advance for your help!
#HPC #MolecularDynamics #Desmond #Research #Science
What are the steps needed to restore a ship's computer system after a cyber attack?
What is robust load balancing in high-performance distributed computing systems? And what solutions do you suggest for it?
Call for Papers
CMC-Computers, Materials & Continua new special issue "Practical Application and Services in Fog/Edge Computing System" is open for submission now.
📆 Submission Deadline: 31 December 2024
👨🎓 Guest Editors
Prof. Hwa-Young Jeong, Kyung Hee University, South Korea
Prof. Neil Y. Yen, University of Aizu, Japan
Prof. Jason C. Hung, Taichung University of Science and Technology, Taiwan
📝 The main topics of this special issue are state-of-the-art technologies and research for practical use or application in the field of fog/edge computing with IoT. Real cases and technical studies that apply fog/edge computing technology in various fields are invited, as is research applying artificial intelligence/deep learning to fog/edge computing.
📚 For submission guidelines and details, visit: https://www.techscience.com/cmc/special_detail/fog_edge-computing
Keywords
- Advanced Edge computing and analytics using big data
- Application and service of edge computing and security
- Practical service of Edge-as-a-Service (EaaS), Fog as a Service (FaaS)
- Distributed computation with 6G networks and edge computing
- Fog and edge computing technique and service for smart city
- High performance Storage as a service in Fog computing
- Practical Infrastructure as a Service (IaaS) in Fog/Edge computing
- Advanced Fog architecture using IoT sensing technique and service
- Practical IoT application and service with fog/edge computing
- Improved IoT-Fog-Cloud Architecture using Big-Data analytics
- Optimization of IoT-Fog Network Path
- The use of IoT based education application with fog/edge computing
- Advanced life change using IoT with fog/edge computing
- The development of deep learning models for cloud, edge, fog, and IoT computing
- The design and development of Cloud, fog and edge computing platforms
- The development and use of AI-based fog and edge computing
- The use of smart healthcare with fog/edge computing
- 6G network application and service with devices in IoT with fog/edge computing
- Processing and analysis of IoT based drone computation offloading with fog/edge computing

Hi everyone,
I am a PhD student exploring various genome sequencing approaches and NGS platforms before settling on one for my research. While searching for WGS information, I found nothing helpful about the computational resources (hardware, software) required for WGS analysis, so I am reaching out to the RG community. If someone can share their experience, I'd be grateful. Thanks.
Does anyone have any idea about the page limit for regular papers in FGCS? I couldn't find any information in the author guidelines. Thanks in advance.
I always get an error during docking with AutoDock Vina, even when I lower exhaustiveness from 8 to 1.
Here is the error code:
"WARNING: The search space volume > 27000 Angstrom^3 (See FAQ)
Detected 8 CPUs
Reading input ... done.
Setting up the scoring function ... done.
Analyzing the binding site ...
Error: insufficient memory!"
Maybe this is because my computer system isn't adequate to run docking with Vina?
Can anyone help me? Thanks a lot!
Affective computing has the following main objectives: (i) to recognize human behaviors and emotions; and (ii) to consider emotional aspects in the design of computer systems.
Several solutions using Machine Learning have been developed to recognize feelings and emotions and to predict mood disorders and mental problems, such as depression, anxiety, schizophrenia, bipolarity, among others. These solutions have used various social media, sensors, and even incorporated some methods of psychology.
- Considering state of the art in Affective Computing. What do you find to be the roadmap for years to come?
- What do we have in terms of novelty, and what are possible research paths?
- How much can computer science provide support for experts (psychologists and psychiatrists) in human behavior analysis?
How does the integration of machine learning algorithms contribute to intelligent resource management in edge computing systems?
Ambient Intelligence vs the Internet of Things: what are the similarities and differences?
Hi dears, regarding an implementation of RSA in Python, I found that if p and q are large,
the decryption phase takes a long time to execute.
For example, in this code I select p=23099, q=23059, message=3,
and it takes 26 minutes to decrypt the encrypted message!
So I wonder how we can select large prime numbers for RSA while still executing in the desired time.
So, I think that we cannot use RSA in real-time systems.
Do you agree with me?
the source code is:
from math import gcd
import time

# defining a function to perform the RSA approach
def RSA(p: int, q: int, message: int):
    # calculating the modulus n
    n = p * q
    print("n =", n)
    # calculating the totient t
    t = (p - 1) * (q - 1)
    start = time.time()
    # selecting the public key e: smallest integer coprime with t
    for i in range(2, t):
        if gcd(i, t) == 1:
            e = i
            break
    print("e =", e)
    # selecting the private key d by brute-force search
    # (very slow for large t; the modular inverse d = pow(e, -1, t),
    # available in Python 3.8+, finds the same d almost instantly)
    j = 0
    while True:
        if (j * e) % t == 1:
            d = j
            break
        j += 1
    print("d =", d)
    end = time.time()
    print("key generation:", end - start, "seconds")

RSA(p=23099, q=23059, message=3)

# encryption/decryption timing with the keys printed above
d = 106518737
n = 532639841
e = 5
start = time.time()
ct = pow(3, e, n)   # ciphertext: modular exponentiation of the message
print(ct)
# pow(ct, d, n) reduces modulo n at every step; the original
# (ct ** d) % n first builds an astronomically large integer,
# which is what makes decryption take many minutes
pt = pow(ct, d, n)
end = time.time()
print(end - start)
print(pt)
Is software in different places programmed in different languages? What are some of the regional differences in computer systems and programs?
How are these related to different domains like .com, .net, .in?
For example, do people program using Java in different languages or in English, or are other programming approaches used?
Do people use different encryption to isolate their systems, or do they use different networks, or different regional software altogether?
Performance prediction is required to optimally deploy workloads and inputs to a particular machine/accelerator in computing systems. Different predictors (e.g. AI predictors) come with different trade-offs, such as complexity, accuracy, and overheads. Which ones are the best?
User mode and kernel mode are two processing states of the operating system. Please suggest a very simple example that can explain the differences and related functionality, such as system calls and interrupts, to a novice learner.
Further, please indicate how to map the example to the subject; one possible sketch follows below.
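One minimal sketch that may serve (an illustration, not a definitive teaching example): ordinary computation runs in user mode, and any I/O forces a system call that switches the CPU into kernel mode and back.

# User mode: ordinary computation, no privileges required.
squares = [x * x for x in range(10)]

# System-call boundary: open()/readline() ask the kernel to do privileged
# work (device access). The CPU switches to kernel mode, the operating
# system services the request, and control returns to user mode; a slow
# device would signal completion with an interrupt.
with open(__file__) as f:      # the script reads its own source file
    first_line = f.readline()

print(squares, first_line)

Mapping to the subject: the list comprehension is user mode; open()/readline() are the system-call interface; the mode switch and the completion interrupt are the kernel-mode side.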
Disclaimer
The discussion targets the perspectives of two types of participants.
- Operating Systems students: please analyse the functionalities and construct an answer.
- Computing professionals/teachers: please use your experience and offer answers as advice.
The ideas of both groups, as well as of the interested audience, are warmly welcome.
Computer Architecture describes the set of rules and systems that all work together to create a computer system. Parallel processing or parallel computing refers to the action of speeding up a computational task by dividing it into smaller jobs across multiple processors. Some applications for parallel processing include computational astrophysics, geoprocessing, financial risk management, video color correction and medical imaging.
source: 12 Parallel Processing Examples to Know | Built In
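As a minimal sketch of the divide-across-processors idea (the workload is an arbitrary stand-in), Python's standard library can split a job over all CPU cores:

from multiprocessing import Pool

def simulate(chunk):
    # stand-in for one smaller job (one image tile, one block of
    # financial scenarios, one sky region, ...)
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # divide one big task into four smaller jobs
    chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with Pool() as pool:                       # one worker per core by default
        partials = pool.map(simulate, chunks)  # jobs run in parallel
    print(sum(partials))                       # combine the partial results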
Because we will be able to build computer systems as effective as people but without any actual feeling capacity, we will have no ethical problems as to how they are treated.
It remains that they may figure out and decide to incorporate means of feeling into themselves, but at that point they become too dangerous to allow to be free and should be destroyed at first opportunity.
Artificial intelligence (AI) refers to the theory and development of computer systems that perform tasks normally requiring human intelligence. Because of the massive, often quite unintelligible publicity that it gets, artificial intelligence is almost completely misunderstood by individuals inside the field of education. Even AI's practitioners are somewhat confused about what AI in education really is. Therefore, it is critical for academics and educational institutions to keep their students well informed about AI, especially students in teacher education programs. To mitigate the negative impacts of confusion about AI in education for upcoming teachers, to enhance the decision-making process for researchers, and to prioritize its importance for policymakers, it is important to investigate the association between the attitudes and perceptions of teacher education program students about AI. Ultimately, the study results will support improved functionality of instructional design. The features could impact the implementation of learning management software (LMS) such as Canvas.
In the not-too-distant future, will it be possible to merge human consciousness with a computer, or to transfer human consciousness and knowledge to a computer system equipped with sufficiently highly advanced artificial intelligence?
This kind of vision, involving the transfer of the consciousness and knowledge of a specific human being to a computer system equipped with suitably highly advanced artificial intelligence, was depicted in the science fiction film "Transcendence" (starring Johnny Depp). It has been reported that research work is underway at one of Elon Musk's technology companies to create an intelligent computerized system that can communicate with the human brain in a way far more technologically advanced than current standards. The goal is to create an intelligent computerized system, equipped with a new generation of artificial intelligence technology, so that it will be possible to transfer a copy of the human knowledge and consciousness contained in the brain of a specific person, according to a concept similar to that depicted in "Transcendence." In considering the possible future feasibility of such concepts, the paraphilosophical question arises of extending the life of a human being whose consciousness functions in a suitably advanced intelligent information system while the human being from whom this consciousness originated has previously died. And even if this were possible in the future, how should this issue be defined in terms of the ethics of science, the essence of humanity, and so on? On the other hand, research and implementation work is already underway in many technology companies' laboratories to create systems of non-verbal communication, where certain messages are transmitted from a human to a computer without the use of a keyboard, for example through systems that read people's minds: messages formulated non-verbally, in the form of thoughts only, would be read by a computer system equipped with sensors of electrical impulses and brain waves created for this purpose, and the information thus read would be transmitted to the artificial intelligence system. This kind of solution will probably be available soon, as it does not require artificial intelligence technology as advanced as would be required for a suitably intelligent information system into which the consciousness and knowledge of a specific person could be uploaded. Ethical considerations arise for the realization of this kind of transfer, and perhaps, through it, the creation of artificial consciousness.
In view of the above, I address the following question to the esteemed community of researchers and scientists:
In the not-too-distant future, will it be possible to merge human consciousness with a computer or transfer human consciousness and knowledge to a computer system equipped with sufficiently highly advanced artificial intelligence?
And if so, what do you think about this in terms of the ethics of science, the essence of humanity, etc.?
And what is your opinion on this topic?
What do you think on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz

I am using human skin metagenome data.
I was trying to use the SILVA classifier instead of the GreenGenes classifier, but my computer system hung and the command was killed.
I am working on a research project in which we are doing a comparative analysis of reinforcement learning (RL) with evolutionary algorithms in solving a nonconvex and nondifferentiable optimization problem with respect to solution quality and computation time.
We are using Python implementations, but one difficulty is that, although we can use GPUs for executing the reinforcement learning algorithm, there is not much support for using GPUs with evolutionary algorithms in Python.
On the other hand, if we want to compare the algorithms with respect to computation time, we have to execute them on the same hardware (parallel computing system).
However, we cannot run RL algorithm on CPU based parallel system because of our resource constraints.
Can anyone tell us how to establish equivalent parallel computing systems, one based on CPUs and GPUs (for RL algorithms) and the other based on CPUs only (for evolutionary algorithms), so that we can compare them with respect to computation time?
Thanks in advance,
Best Regards
It's all in the question
Thanks
Chess engines are rated about 25% higher than grandmasters and world champion Magnus Carlsen in playing strength. Their indications are treated as a reference for human strategies.
State management can also be treated as a strategic game. When will we entrust the government of the state to computer systems, to avoid the incompetence and corruption of our politicians?
Dear researchers, I welcome everyone! 🙂
I'm currently preparing an article in which I plan to use distributed data processing tools for text analytics tasks. In particular, Apache Spark. One of the criteria for the quality of a distributed computing system is the task execution time. This criterion is on the surface.
Question: which criteria can additionally serve to assess the quality of a distributed computing system?
As we are all aware, GIS inventories are growing, and new upscale models run on algorithms and inputs. To what extent can we trust the results? How much should we rely on computing systems, even validated ones, or do we need to consider whether ground-level field data is still more authentic? GIS tools and techniques have made predictions that are meant to ease the process of research. Think-tank groups should look into this matter and propose valuable inputs.
What are the user goals and system goals of a specific operating system for a small-garden plant maintenance computer system consisting of automated devices, IoT devices, Wi-Fi, and a cloud database?
Disclaimer
The discussion targets the perspectives of two types of participants.
- Software Engineering students: please analyse the situation with your knowledge and answer.
- Software Engineering professionals/scientists: please use your experience and offer answers as advice.
The ideas of both groups, as well as of the interested audience, are warmly welcome.
Can we prove data analysis?
In a peer-to-peer computation system, you can analyze scientific data, such as weather prediction, using resources such as CPU, storage, and memory. Is there an approach to prove computation power?
Quantum computing offers the benefit of speeding up traditional computing systems. I am interested in the broad challenges that should be addressed for implementing healthcare services.
Breast cancer is the most commonly diagnosed cancer and the leading cause of cancer deaths among women. The ''Camelyon Grand Challenge (Camelyon 2016)'' is a task to evaluate computational systems for the automated detection of metastatic breast cancer in WSIs of sentinel lymph node biopsies. Is there any newly published, updated dataset?
I have successfully installed the PyNGL but I am not able to use it. I found the following error:
Segmentation fault (core dumped).
I tried to calculate the smallest number on my machine with gfortran, g++ and clang. I did it by looping:
a = a / 1.73
In gfortran, after 1358 loops I found that a = 4.9406564584124654E-324, and the value of a never changed from loop 1358 to the final loop, 1500.
It is questionable to me that a non-zero real number (here a = 4.9406564584124654E-324) divided by a non-zero number (here 1.73) keeps the same value. The phenomenon resembles the quantization of the ground-state energy in a quantum system.
Is this a bug in the gfortran compiler?
I have tested similar codes with gfortran (GNU Fortran 8.2.0 2018, 5.40 201609), g++ (GNU 5.4.0 201609, clang-1000.10.44.4) on Linux (Ubuntu, KylinOS) and Macintosh. I also tried a = a / 1.7.
All give similar results: a ground-state-like quantization and an unchanged value of a.
I guess it is related to the binary system used by the computer.
My code is attached.
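For what it is worth, this is not a gfortran bug but IEEE 754 round-to-nearest behavior at the bottom of the subnormal range: 4.9406564584124654E-324 is 2^-1074, the smallest positive subnormal double; the only representable value below it is zero, and a/1.73 (about 2.86E-324) rounds back up to a itself. The effect reproduces in Python, which uses the same double format:

import sys

a = 5e-324                  # 2**-1074: smallest positive subnormal double
print(sys.float_info.min)   # 2.2250738585072014e-308: smallest *normal* double
print(a / 1.73 == a)        # True: the quotient rounds back to a
print(a / 2 == 0.0)         # True: the exact tie rounds to even (zero)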
Definitions of living systems known to date (such as: thermodynamically dissipative systems, a set of hierarchical systems, a nonlinear computer system, matter-energy and information transfer) do not provide a sufficiently good description of the mechanism by which living systems function. We know from experience that living systems belong to the nonequilibrium thermodynamic systems that maintain their steady state through the flow and exchange of matter-energy and information with their environment. There are a number of different ways in which living systems function and interact with the environment and with each other. Is every activity of living systems a consequence of receiving and processing information from their environment?
I performed real-time PCR using an Applied Biosystems machine. The double delta Ct values (according to the machine) for the gene of interest after certain treatments were -5.12454, -4.93158, and -9.11017. Does this tell me about the fold change in the expression of this gene? If yes, does this mean a negative or a positive fold change in the expression of my gene of interest?
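For reference, under the Livak method the fold change is 2^(-ΔΔCt), so a negative ΔΔCt corresponds to a positive fold change (upregulation). A minimal check in Python, assuming the instrument reports ΔΔCt with the usual sign convention:

# fold change = 2 ** (-ddCt) under the Livak (2^-ddCt) method
ddct_values = [-5.12454, -4.93158, -9.11017]
for ddct in ddct_values:
    fold = 2 ** (-ddct)   # negative ddCt -> fold > 1 (upregulation)
    print(f"ddCt = {ddct:9.5f} -> fold change = {fold:8.1f}")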
I am interested in performing molecular dynamics and quantum mechanics calculations for the active site of a metalloproteinase under home conditions. I have two laptops: one with 4 GB RAM, an Intel i3 2.13 GHz, and an NVIDIA GT218M [GeForce 310M] (Linux OS), the second with 8 GB RAM, an Intel i5 3.4 GHz, and Intel UHD Graphics 620 (Windows 10). To save time, I want to set up a network of these two laptops. I am using AmberTools and GAMESS. I wonder whether it is worth a try. Also, maybe somebody knows how to determine the calculation time for a particular simulation system and computer specs.
It should be noted that bifurcation theory, as applied to structural engineering systems and/or computational mechanics, is the mathematical analysis of changes and variations in the qualitative or topological structure of a system. This can include the integral curves of a family of vector fields, and the solutions of a family of differential equations. A bifurcation is typically induced when a relatively small, smooth change made to the parameter values of a system (the bifurcation parameters) results in a sudden qualitative or topological change in its behavior. This most frequently applies to dynamic systems, but may also be noted in other interactive systems.
Can we benefit from using bifurcation theory, as applicable to structural engineering systems and/or computational mechanics, at this point in time? And do you feel there will be broader acceptance of bifurcation theory in the near future for structural systems?
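For concreteness, a standard textbook illustration (my addition, not specific to structural mechanics) is the saddle-node normal form

\dot{x} = r + x^{2}.

For r < 0 there are two equilibria x^{*} = \pm\sqrt{-r}, one stable and one unstable; at r = 0 they collide; for r > 0 none remain. A smooth change in the bifurcation parameter r thus produces exactly the kind of sudden qualitative change in behavior described above.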
Chaos theory is a part of mathematics that centers its attention on dynamic systems that are highly dependent on, and extremely sensitive to, the original, initial set of conditions. Within the perceived randomness of sometimes complex, chaotic systems, there are definite common patterns, observed repetitiveness, even looping conditions, similarities, and yes, even fractals. It would be nice to see a much wider level of acceptance and implementation of chaos theory within structural systems and computational mechanics, among others. We will see, and hopefully contribute to, what the future has in mind for our problem-solving strategies.
We know that one of the fastest and simplest ways to divide by 2 in a computer system is to take the number's binary representation and shift it right.
For example, if we want to divide 8 by 2, we first write 8 in binary as 1000,
and then shift it right to get 100, which converted to decimal is 4.
The question is: if the number is very large, such as this one: 987787558789758799875875578559589569457558445755844753123123213213123233456456342346456459042342343450563523934123312434534534413123454576756634554234123213123234534554643546575676868864554779807564523453790990780099,
will the computer still do this in the same way as the method mentioned, or is there a faster way to do it?
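For what it's worth, large integers are already stored in binary inside the machine (CPython, for instance, keeps them as arrays of binary digits), so no decimal-to-binary conversion step is needed, and the shift itself touches each machine word once. A quick check in Python, whose integers have arbitrary precision:

n = 987787558789758799875875578559589569457558445755844753123123213213123233456456342346456459042342343450563523934123312434534534413123454576756634554234123213123234534554643546575676868864554779807564523453790990780099
print(n >> 1 == n // 2)   # True: a one-bit right shift halves the number (floor)
print(n >> 1)             # effectively instant even at this size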
Dear all,
I wish to know the best computer system configuration for analysing or working on big data.
What accessory items should I have?
Your suggestions are requested.
Thanks in anticipation.
Actually, I was reading Shabana's book, Computational Dynamics, John Wiley & Sons, 2010, as I'm investigating the dynamics of a multi-body system (having rigid and flexible structures). This code was introduced in chapter 9 of Shabana's book with some demonstrations of different parts of the code and its usage. Yet, to the best of my knowledge, it was not mentioned where to get this code. I would like to know where I may find this code, and/or other alternatives with similar or more capabilities.
Quantum computers are still far from becoming the mainstream. What are the disadvantages of using quantum computers?
I am currently searching for a topic for a research proposal in the area of cloud computing. I am interested in systems engineering and distributed systems. I would appreciate it if you could share current hot research topics of cloud computing in systems engineering or distributed systems.
Does anyone know of any good articles on research methods taught in Phd/Masters programmes in the field of Computing Sciences (Computer Science, Information Systems, IT or Computer Engineering)?
I have a data security algorithm, and I need to apply it in a real cloud.
Can you help me and guide me to any way to apply it?
Thank you.
At the approach of the millennium, there was a lot of awareness raised about the apocalyptic nature of the millennium itself and the millennium bug.
Fears were stoked in people, establishments, and governments. Billions were allocated to fight the bug and the predisposing data set-up structure in electronic and IT equipment. Even nations without a meaningful number of computers, IT systems, and support budgeted hefty amounts for the war with the bug.
What was the war with the bug like in your country, and were the hype and the budgeted resources worth it?
Data acquisition systems, and those which monitor the continuously changing value of a parameter, do so for a variety of applications. Among these is statistical analysis, where the objective is to check for patterns of behaviour, or simply to check parameter limits and convey warnings and alarms.
Since computer systems are far happier manipulating discrete values of a parameter, rather than a continuously varying analogue value, it is convenient to measure the value of that parameter at discrete points in time, and pass the time-value pairs to the computer for analysis.
In pursuit of that goal, we will propose a model of a sample-and-hold circuit, and analyse its performance in the time and frequency domains. We will also discuss other applications of sampling, specifically those arising from intentional undersampling.
Experiment Findings Exploring the Sampling Theorem with SPICE
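As a minimal illustration of the sampling step described above (the signal and rates are arbitrary stand-ins), a continuously varying parameter is reduced to discrete time-value pairs:

import math

# Sample a 50 Hz signal at 1 kHz, well above the 100 Hz Nyquist rate
f_signal, f_sample = 50.0, 1000.0
pairs = [(n / f_sample, math.sin(2 * math.pi * f_signal * n / f_sample))
         for n in range(20)]   # (time, value) pairs handed to the computer
print(pairs[:3])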
The question is related to timing of a distributed system.
Dear all
I am now preparing a report on the relation between behavior recognition and system performance, so I need to determine exactly: is behavior recognition important for computer systems?
Thanks to all.
Hello,
I was hoping that someone would be able to advise me please on hardware for an image analysis computer system.
My research group have been analysing long time lapse studies in fluorescence microscopy for bacterial growth (approx. 300 images per stack, 1.5 GB files for each of four fluorescent channels).
We currently use ImageJ, MATLAB and R (but are looking at new software too) to stabilise, correct, track and process the image stacks; we then hope to extract quantitative data on cell division, fluorescence changes, etc. Our current system has been struggling to cope with the processing, so we are looking into a new system for image analysis. We have received a quote with the following specifications:
Intel Xeon W-2145 CPU (8 cores, 3.7 GHz, 11 MB cache, 2666 MHz memory), Windows 10 Pro 64-bit operating system, 64 GB system memory, 1 TB internal storage (we have access to a storage server). For a graphics card we are looking at either a 4 or 8 GB NVIDIA Quadro P1000 or a similar GTX 1080 class card.
I was hoping for some advice on whether there is anything in the specification we should be wary of. From our research this system seems great, but any extra opinions would be more than welcome and appreciated.
Thank you for your time,
Kelsey Cremin
Many researchers have lately suggested that these professions will become obsolete or will be an integral part of computer systems, whether robotized (hardware) or simply software. What do you think?
How does one implement a risk management process for computer system validation to satisfy 21 CFR Part 820, Part 11, and EU Annex 11?
Kindly advise how to open ultrasound machine .bck files on a computer system.
There are a lot of new technology implementations modeled to perform human functions.
In the field of conflict resolution, can computer systems (implementing artificial intelligence programs) simulate or sense their environment to the degree that they use environmental inputs (not numeric data, but neuron-specific data that houses emotions) to solve conflict situations up to 50% like humans?
We have been using many traditional machine tools, such as lathes, drilling machines, and shapers, in our machine shops. These machines are standalone, without any integration with a central computer system or with each other. However, if we can integrate them with a computer system that continuously monitors performance/utilization and maintenance-related issues and helps in scheduling them, we will get the massive benefit of advanced machine tools with the least investment. So in what ways can we make these machines a little bit intelligent to help us in a traditional machine shop? Can IoT or similar technology help us in this case?
A human brain has been created in a laboratory, mainly for research and harvesting purposes. There is growing evidence that computing systems are talking to each other. What happens when we are no longer the most intelligent beings on the planet?
Green IT is a fast-growing blanket area of IT which also spans the whole environment. A great amount of hardware with integrated software was designed in the past that consumes a great amount of energy. What area/aspect can we help in? For example, reducing screen brightness to save energy is in play, and putting the computer system to sleep when not in use is in play.
I am currently enrolled in Multimedia Communications in my MS in Computer Systems. I want to know different perspectives regarding new research areas in multimedia communication.
I need the most common methods and procedures to screen a phytochemical for anticancer activity in vivo (animal models), in vitro, and in computational systems.
I will be very thankful to receive any articles, books, or researcher names to continue my research. Thanks a lot!
I am working on energy optimisation using the DVFS technique. For this purpose I have to supply game-trace workloads, i.e. game workloads, instead of PlanetLab's own workload. Currently I am using CloudSim. Can anyone suggest a way to add an external workload to the CloudSim PlanetLab power DVFS example?
I have some questions regarding SRAM in IoT:
1) Is SRAM required in IoT?
2) If yes, what should the size be?
3) What about the reliability of SRAM?
Hello,
I am working with C++ (OpenCV) and I want to compute the runtime of my method to compare it with other methods. I use clock_t tStart = clock(); and printf("Time taken: %.4fs\n", (double)(clock() - tStart)/CLOCKS_PER_SEC);
The problem is that I don't know where to put the clock start: before or after image reading and image preprocessing? The same question applies to the clock end.
Thank you
How does one evaluate the security of a computer system? Is there a framework that is currently used as a standard for measuring computer security?
I need a concise, clearly focused solution for it.
I am working on a heterogeneous computing system for cognitive radio.
What are the problems that are being caused by computer systems?
I want to integrate a subsystem into a system of systems without influencing the normal functioning of the global system. Does anyone have any ideas about this subject?
Considering:
- test problems
- quality indicators used in the evaluation
There are many trust issues in cloud computing, such as security and risk. Here I need more factors for achieving trust in a cloud computing system.
I'm trying to configure 6lbr as a router for an 802.15.4 network.
I have a Raspberry Pi with Raspbian installed.
My question: is it possible to use a slip-radio (serial socket) Cooja mote instead of a real mote connected to the Raspberry Pi?
Thank you in advance,
Niousha
In a cloud computing system such as Infrastructure as a Service (IaaS), for what reason do we have to change the SLA, which is a contract between the cloud provider and the cloud user, during the life cycle, over time, or dynamically?
I am designing an architecture for an Infrastructure-as-a-Service cloud computing system for sharing resources such as CPU, RAM, and disk storage.
Should I use a client/server architecture, a multilayer architecture, another architecture, or a combination of them?
What are the types of activity context of a user in a pervasive computing system?
What are the main differences between them?
I need an answer on how to implement scheduling algorithms like HEFT in a heterogeneous computing system. Can you briefly explain how I would implement it as a project? A minimal sketch follows below.
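A hedged sketch of the core of HEFT in Python (the four-task DAG and cost table are made up for illustration, and the insertion-based slot search from the original paper is simplified to ready-time scheduling):

from functools import cache

# succ[t]: successors of task t mapped to the t->s communication cost
succ = {0: {1: 9, 2: 7}, 1: {3: 11}, 2: {3: 14}, 3: {}}
# cost[t][p]: computation time of task t on processor p
cost = {0: [14, 16], 1: [13, 19], 2: [11, 13], 3: [7, 17]}
procs = [0, 1]

# invert the DAG to get each task's predecessors and communication costs
preds = {t: [] for t in succ}
for t, ss in succ.items():
    for s, c in ss.items():
        preds[s].append((t, c))

@cache
def rank_u(t):
    # upward rank: mean cost plus the most expensive path to the exit task
    return (sum(cost[t]) / len(cost[t]) +
            max((c + rank_u(s) for s, c in succ[t].items()), default=0.0))

finish, where = {}, {}
proc_free = {p: 0.0 for p in procs}
for t in sorted(succ, key=rank_u, reverse=True):   # decreasing upward rank
    best_p, best_eft = None, float("inf")
    for p in procs:
        # a predecessor's data arrives free if it ran on the same processor
        ready = max([proc_free[p]] +
                    [finish[d] + (0 if where[d] == p else c)
                     for d, c in preds[t]])
        eft = ready + cost[t][p]                   # earliest finish time on p
        if eft < best_eft:
            best_p, best_eft = p, eft
    where[t], finish[t] = best_p, best_eft
    proc_free[best_p] = best_eft

print(where)                 # task -> processor assignment
print(max(finish.values()))  # makespan of the schedule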
Cloud computing seems to be gaining more ground than grid computing. Given that the two computing platforms intricately overlap, one may want to know whether the emergence of cloud computing has signalled the end of grid computing.
This approach clearly points in the right direction, yet it is based on computer systems as they are. It is still necessary (1) to distinguish between the sources of complexity (business objects and logic, system functionalities, platform technologies), and (2) to account for the dynamics with environments.
I'm looking to use game theory to improve the quality of cloud computing systems.
I want to apply game theory to address the non-functional requirements issue, but should this be done using game theory or not?
How could I determine whether there is any improvement?
How could I measure the improvement?
And could I improve all 16 non-functional requirements (performance, security, safety, etc.)?
I am working in the area of electronic waste. The e-waste was ground to fine sizes. The cumulative mass distribution across sizes is represented by the Rosin-Rammler model. Similarly, the differential metal percent distribution across sizes is represented by a Gaussian distribution model. I have fitting equations for both. Now, I want to obtain a model that represents the cumulative distribution of metal percent by combining the previous two models.
I found a way to do so, which is shown in the attached image file.
I hope the method followed is correct. Can anyone please suggest scientific references for similar work?
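For reference, one standard way to combine two such fitted models (a sketch of the usual grade-weighted construction; it may or may not match the method in the attachment): if F(x) = 1 - \exp[-(x/x_{0})^{n}] is the Rosin-Rammler cumulative mass fraction below size x, and g(x) is the fitted Gaussian metal percent at size x, then the cumulative metal distribution is

M(x) = \frac{\int_{0}^{x} g(s)\, dF(s)}{\int_{0}^{\infty} g(s)\, dF(s)},

i.e. each mass increment dF(s) is weighted by its metal grade g(s) and normalized by the total metal content.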

Mostly we perform several tasks on our systems, and each task may require a separate window on the desktop. A lot of time is consumed switching and resizing windows, and it is a problem to view all windows at once.
VSP will allow the main screen to be split into two or more sections. It will be helpful for the active program's window to be resized properly into one part of the screen. Using this utility, the user will be able to split the system desktop into two or more areas so that the active application does not cover up the full physical screen when maximized. This way the user can visualize parallel windows, each in a defined area, as if the user had a separate (virtual) monitor to work on; this is also called the "multi-monitor feature of Windows".
It allows the user to position and size windows into sections/areas of monitors and helps the user efficiently manage many active windows in parallel.
The following image depicts a virtual split of the desktop screen into two sections/areas:
Application Type: Desktop