Science topic

Computer Systems - Science topic

Computer Systems are systems composed of a computer or computers, peripheral equipment, such as disks, printers, and terminals, and telecommunications capabilities.
Questions related to Computer Systems
  • asked a question related to Computer Systems
Question
2 answers
Elsevier Future Generation Computer Systems (FGCS) has just published the special issue on "Serverless Computing in the Cloud-to-Edge Continuum", which I co-edited together with my dear colleagues Omer Rana (Cardiff University, UK), Luiz Fernando Bittencourt (University of Campinas, Brazil), and Hao Wu (Beijing Normal University, China). The special issue brings together 17 novel and high-quality contributions in the emerging field of serverless computing in cloud-edge systems. At this link (https://lnkd.in/dkra9gqs) you may find the article collection. At this link (https://lnkd.in/d9Zqpr5y) -- accessible until September 20 -- you may find the editorial summary of the special issue. Happy reading :)
Relevant answer
Answer
I am very interested. Could we pursue a collaboration in this regard? Please leave your WhatsApp number, or you can add mine: +66 82 078 7423
  • asked a question related to Computer Systems
Question
4 answers
An inaccurate or incorrect response from a Large Language Model, sometimes referred to as a "hallucination", is attributed here to an error in the code rather than in the training data, which represents input material manipulated by the code to produce a desired output.
The prompt, likewise, is not responsible for the inaccuracy of the output it triggers: given that the LLM is a general-purpose chat application, an improvised inquiry should yield at least a correct response if the code is error-free.
The process of releasing updated versions of such LLMs, with the aim of achieving higher accuracy or greater intelligence, could be viewed as the maintenance phase of a software development lifecycle, intended to fix errors and increase the reliability of an AI system.
An AI is a computer system that executes the instructions of an algorithm by utilizing the available resources of input data and computing infrastructure.
Relevant answer
Answer
AI hallucinations can’t be stopped — but these techniques can limit their damage
Developers have tricks to stop artificial intelligence from making things up, but large language models are still struggling to tell the truth, the whole truth and nothing but the truth...
"Ask an artificial intelligence (AI) chatbot for a scientific reference and it might give you the citation you’re looking for. Or, it could ‘hallucinate’, spitting out an only partially correct, or totally false, answer. Researchers say that preventing AI from erring altogether is impossible, but they’re working on measures that could make hallucinations less frequent and less problematic. These include making a chatbot refer to a trusted text before responding and conducting “brain scans” of a large language model’s artificial neurons to reveal patterns of deception. For now, the best defence might simply be to take the output of a chatbot with a large pinch of salt..."
  • asked a question related to Computer Systems
Question
2 answers
  • An LLM can be represented by a model with two components: an interface and a computing function
  • Both components are separate yet interoperable
  • The computing function performs the model's core tasks (e.g. generating software code)
  • The interface part interprets human-language queries for the computing function (e.g. providing user requirements for software code)
  • The interface part translates the output of the computing function into an easily understandable format.
  • The purpose of the whole system is to align the request at its input with the response at its output
  • Hallucination occurs when the output of the computing function is inaccurate yet the interface part's translation of this output still makes sense
  • An output response of the system to a prompt provided at its input is generated through an interaction between the interface and computing function components
  • Fine-tuning the system using small datasets creates a new model which includes its own interface and computing function components
Relevant answer
Answer
I agree with Peter's points above and while your points capture the idea of processing input and generating output, they do not fully reflect the integrated nature of LLMs like GPT-3. The model's neural network handles interpretation and generation in a unified manner, without distinct interface and computing components. Hallucinations result from the inherent limitations of the model rather than a separate interface component. Fine-tuning refines the existing model rather than creating new separate components. Hope this helps Mohamed el Nawawy !
  • asked a question related to Computer Systems
Question
1 answer
Call for Papers: The 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024)
Call for papers: The 2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024) will be held on December 20-22, 2024 in Xiangtan, China.
AISNS 2024 aims to bring together innovative academics and industrial experts in the fields of Artificial Intelligence, Systems and Cyber Security in a common forum.
Conference website (English): https://ais.cn/u/JnEFbm
Important Information
Conference website (submission portal): https://ais.cn/u/JnEFbm
Conference dates: December 20-22, 2024
Venue: Xiangtan, China
Indexing: EI Compendex, Scopus
Conference Details
The 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024), hosted by Hunan Institute of Engineering, will be held on December 20-22, 2024 in Xiangtan, Hunan Province, China. The conference focuses on research areas including artificial intelligence, systems, and network security. It aims to provide a platform for experts, scholars, engineers and R&D practitioners in related fields to share research results and cutting-edge technologies, learn about academic development trends, broaden their research ideas, strengthen academic exchange, and promote cooperation on the industrialization of academic results.
Topics for Submission
* Artificial Intelligence (AI algorithms, natural language processing, fuzzy logic, computer vision and image understanding, signal and image processing, speech and natural language processing, computational learning theory, information retrieval and fusion, hybrid intelligent systems, intelligent system architectures, knowledge representation, knowledge-based systems, mechatronics, multimedia and cognitive informatics, parallel processing of artificial neural networks, pattern recognition, pervasive computing and ambient intelligence, soft computing theory and applications, software and hardware architectures, automatic programming, machine learning, automatic control, data mining and machine learning tools, robotics, AI tools and applications, recent trends and developments, etc.)
* Computer Network Security (active defense systems, adaptive defense systems, analysis and benchmarking of security systems, applied cryptography, authentication, biometric security, complex systems security, database and system security, data protection, data/system integrity, distributed access control, distributed attack systems, denial of service, high-performance network virtualization, high-performance security systems, security in cloud and grid systems, security in e-commerce, security in pervasive/ubiquitous computing, security and privacy in smart grids, security and privacy in wireless networks, secure mobile agents and mobile code, security simulation and tools, trusted computing, etc.)
* Computer Systems (operating systems, distributed systems, database systems, network systems, compiler systems, computer architecture, virtualization technology, container technology)
* Submissions on other related topics are also welcome
Paper Publication
Submissions to AISNS 2024 will be rigorously reviewed by 2-3 organizing committee experts. Accepted papers will be published in the ACM International Conference Proceeding Series (ACM ICPS) and submitted to EI Compendex and Scopus for indexing. The indexing record of this conference has so far been very stable.
Notes for Participants
1. Author attendance: each accepted paper allows one author to attend free of charge.
2. Presentation format: you must choose either an oral presentation or a poster presentation.
3. Oral presentations: to apply, contact the conference staff at least 10 days before the conference; presentations last 10-15 minutes and require prepared slides (PPT).
4. Poster presentations: to apply, send the poster to the conference email icaisns@163.com at least one week before the conference; requirements: A1 size, portrait orientation, color, PNG format.
5. Presentation registration: please follow the conference announcements on this page and notifications via the conference email; registration and presentation-format selection will be announced 1-2 weeks before the conference.
6. Audience attendance: you may attend without submitting a paper, and may also apply to give a talk or presentation.
7. If a paper is withdrawn after acceptance for the author's personal reasons, 20%-30% of the fee will be deducted.
8. Registration: https://ais.cn/u/JnEFbm
Relevant answer
Answer
It's a wonderful conference. I'm interested in attending it.
  • asked a question related to Computer Systems
Question
8 answers
To what extent can computing and/or data processing power be increased through the use of quantum computers, and what applications of quantum computers are already being developed?
What could the applications of quantum computers be in the future if quantum cryptography and the other technologies necessary for building quantum computers were sufficiently improved and became widespread, their prices fell, and they became financially accessible not only to the largest corporations, organizations, and research and development institutions with the large financial capital needed to develop and implement quantum computer technology?
The key technology enabling the construction of quantum computers is quantum cryptography. The technology is expensive and available only to the largest corporations, organizations, and research and development institutions with the large financial capital needed to develop and implement quantum computer technology. The applications of quantum computers are varied. Many companies and businesses in various sectors of the economy that already use Industry 4.0/5.0 technologies, including cloud computing of large sets of data and information, analytics based on integrated information systems using Big Data Analytics and/or Business Intelligence, Internet of Things technologies, Blockchain, machine learning, deep learning, generative artificial intelligence, digital twins, and so on, would probably be interested in applying quantum computer technology to improve their business and their computerized management systems if the price of this technology dropped significantly. The price factor is an important determinant of the spread of this technology to the many companies and enterprises in the SME sector that do not have large budgets for implementing the latest, highly advanced digital technologies in their business activities. At present, such technologies are developed in a small number of research and development centers and research laboratories run by scientific institutes of universities or by large technology companies with substantial funds to allocate to such development and implementation projects.
The use of quantum computers makes it possible, among other things, to create microscopes that image very small objects, such as cell fragments, with the ability to view them live in real time. Currently, such observations are made with electron microscopes, with which, for example, cell organelles are observed, but in frozen cells rather than live, biologically functioning cells in real time. A typical feature of quantum computers is that quantum software is not written in programming languages of the Java type; instead, the computer systems used in quantum computers rely on quantum circuit design. The results of research in cosmology and astrophysics, and theories on the functioning of key cosmic objects in the Universe, concern, for example, the black holes found in space. However, has anyone actually seen a black hole up close? No one has. Of course, in writing these words, I do not intend to undermine any theories about black holes functioning in space. The point is that quanta can be measured, but the necessary research infrastructure is needed, and that infrastructure is expensive and therefore available only to some research, development and implementation centers located in a few research institutes of universities and some large technology companies. The quantum technology necessary to build quantum computers can be developed in various ways. Ions, or vortices of currents in superconductors, will rather be controlled by photons, so it makes sense to develop quantum technology based on photons. Any kind of microparticle that can be controlled, changed in some respect, or made to intentionally change its form can be used to build quantum computers. With quantum computers, it will be possible to solve complex, multifaceted problems in which large amounts of data are processed. Therefore, when this technology becomes widespread and its price is significantly reduced, the world may in the future move to quantum cryptography. The largest financial investments in the development of quantum technology are made in developed countries where large subsidies from the state's public finance system are allocated for R&D purposes, i.e., primarily in the US, China and Europe. A common feature of the various applications of quantum computers is that these computers would enable the processing of much larger volumes of data and information in a relatively short time within multi-criteria, advanced data processing carried out on computerized Big Data Analytics platforms, with the involvement of other technologies typical of Industry 4.0/5.0 as well. Greater capabilities for advanced, multi-criteria processing of large sets of data and information will allow the solution of complex analytical problems concerning various spheres of human activity and various issues arising in various industries and sectors of the economy.
I described the applications of Big Data technology in sentiment analysis, business analytics and risk management in an article of my co-authorship:
APPLICATION OF DATA BASE SYSTEMS BIG DATA AND BUSINESS INTELLIGENCE SOFTWARE IN INTEGRATED RISK MANAGEMENT IN ORGANIZATION
I invite you to familiarize yourself with the problems described in the article given above and to scientific cooperation in this field.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What could the applications of quantum computers be in the future if quantum cryptography and the other technologies necessary for building quantum computers were adequately improved and became widespread, their prices fell, and they became financially accessible not only to the largest corporations, organizations, and research and development institutions with the large financial capital needed to develop and implement quantum computer technology?
To what extent can computing and/or data processing capacities be increased through the use of quantum computers, and what are the already developed applications of quantum computers?
What are the currently developed applications of quantum computers and what might they be in the future?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text, I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
Relevant answer
Answer
Consider the ethical impacts of quantum technologies in defence — before it’s too late
Quantum technologies can help to defend nations, but they also threaten human rights and values. Their design and development need ethical guidance now...
"As countries around the world are investing in quantum technologies for national defence, their design and implementation need ethical guidance urgently, argue three digital ethicists. We can build on lessons learnt from AI ethics in some instances. However, the unique characteristics of quantum technologies mean that the likelihood and impact of ethical risks can differ from case to case. The trio has set out six principles that defence organizations can use as a framework to ensure these technologies are developed responsibly, such as considering strategies to limit the access of authoritarian governments to quantum technologies..."
  • asked a question related to Computer Systems
Question
1 answer
[CFP]2024 2nd International Conference on Artificial Intelligence, Systems and Network Security (AISNS 2024) - December
AISNS 2024 aims to bring together innovative academics and industrial experts in the field of Artificial Intelligence, Systems and Cyber Security in a common forum. The primary goal of the conference is to promote research and development activities in computer information science and application technology; another goal is to promote the exchange of scientific information between researchers, developers, engineers, students, and practitioners working around the world. The conference will be held every year, making it an ideal platform for people to share views and experiences in computer information science, application technology, and related areas.
Conference Link:
Topics of interest include, but are not limited to:
◕Artificial Intelligence
· AI Algorithms
· Natural Language Processing
· Fuzzy Logic
· Computer Vision and Image Understanding
· Signal and Image Processing
......
◕Network Security
· Active Defense Systems
· Adaptive Defense Systems
· Analysis, Benchmark of Security Systems
· Applied Cryptography
· Authentication
· Biometric Security
......
◕Computer Systems
· Operating Systems
· Distributed Systems
· Database Systems
Important dates:
Full Paper Submission Date: October 10, 2024
Registration Deadline: November 29, 2024
Conference Dates: December 20-22, 2024
Submission Link:
  • asked a question related to Computer Systems
Question
4 answers
Last year I studied a course on Computer Systems Architecture/Organization. During a lecture, I learned about data hazards and one of the common solutions to them: reordering the instructions. Modern processors solve this with out-of-order execution (OoOE), but since this is integrated into the processor, it increases chip size, power consumption, and heat output. So I thought: "What if we had an AI-driven processor which does that for the CPU?"
Does anyone know if this has already been successfully researched or implemented? I would greatly appreciate any insightful comments.
Relevant answer
Answer
Current research shows promising potential for using AI to improve out-of-order instruction execution in CPUs. However, there are still challenges that require further research and development to achieve effective results.
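To make the underlying problem concrete for readers new to it, here is a minimal, purely illustrative Python sketch (not taken from any published AI-scheduling work) of what "reordering to avoid a data hazard" means: when an instruction reads a register written by the instruction immediately before it, a simple in-order pipeline stalls, and a scheduler can hide the stall by moving an independent instruction in between. The three-instruction program, the register-tuple format and the one-stall-per-adjacent-dependency model are all invented for illustration; real out-of-order hardware must also respect WAR/WAW hazards and many other constraints.

# Toy illustration of instruction reordering to hide a read-after-write (RAW) hazard.
# Instructions are (dest, src1, src2) register tuples; the stall model is invented.

def raw_hazard(first, second):
    # True if `second` reads the register that `first` writes (back-to-back RAW hazard).
    dest, _, _ = first
    _, s1, s2 = second
    return dest in (s1, s2)

def count_stalls(program):
    # One stall cycle per adjacent RAW dependency in this toy pipeline model.
    return sum(raw_hazard(a, b) for a, b in zip(program, program[1:]))

def reorder(program):
    # Greedy scheduling: if two adjacent instructions conflict, pull a later
    # independent instruction in between (ignores WAR/WAW and transitive deps).
    prog = list(program)
    i = 0
    while i < len(prog) - 1:
        if raw_hazard(prog[i], prog[i + 1]):
            for j in range(i + 2, len(prog)):
                if not raw_hazard(prog[i], prog[j]) and not raw_hazard(prog[j], prog[i + 1]):
                    prog.insert(i + 1, prog.pop(j))
                    break
        i += 1
    return prog

program = [
    ("r1", "r2", "r3"),  # r1 = r2 + r3
    ("r4", "r1", "r5"),  # r4 = r1 + r5   <- RAW hazard on r1
    ("r6", "r7", "r8"),  # independent instruction that can be moved in between
]
print("stalls before reordering:", count_stalls(program))
print("stalls after reordering: ", count_stalls(reorder(program)))

An "AI-driven" variant of this idea would replace the greedy rule with a learned policy; whether that pays off against the silicon cost of existing out-of-order logic is exactly the open question raised in this thread.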
  • asked a question related to Computer Systems
Question
8 answers
I want to write Python code for offloading tasks generated by IoT devices and for resource allocation in mobile edge computing systems, but I don't know how to start the program with these details.
Can anyone help me learn how to write code for that, please?
The concept is like the paper in the attachment.
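While you track down the exact model in the attached paper, here is a minimal, self-contained Python sketch of the decision logic many MEC offloading papers start from: for each IoT task, compare the local execution time with the transmission-plus-edge execution time and offload only when that is faster. All parameters (CPU speeds, uplink rate, task sizes) are made-up placeholders, not values from the attachment.

# Toy binary offloading decision for mobile edge computing (illustrative values only).
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    cycles: float      # CPU cycles needed to finish the task
    data_bits: float   # input data to upload if the task is offloaded

LOCAL_CPU_HZ = 1e9     # IoT device CPU speed (cycles/s), placeholder
EDGE_CPU_HZ = 10e9     # edge server CPU speed (cycles/s), placeholder
UPLINK_BPS = 5e6       # wireless uplink rate (bits/s), placeholder

def local_time(task):
    return task.cycles / LOCAL_CPU_HZ

def offload_time(task):
    # transmission delay + remote execution delay (result download ignored)
    return task.data_bits / UPLINK_BPS + task.cycles / EDGE_CPU_HZ

tasks = [
    Task("sensor_fusion", cycles=2e9, data_bits=1e6),
    Task("image_detect", cycles=8e9, data_bits=4e6),
    Task("heartbeat", cycles=1e7, data_bits=1e6),
]
for t in tasks:
    decision = "offload" if offload_time(t) < local_time(t) else "run locally"
    print(f"{t.name:>14}: local={local_time(t):.3f}s offload={offload_time(t):.3f}s -> {decision}")

From there, "resource allocation" usually means deciding how much edge CPU or bandwidth each offloaded task receives, which turns the same comparison into an optimization problem.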
Relevant answer
  • asked a question related to Computer Systems
Question
2 answers
Hi, I'm Rezuana, and my research focuses on cloud computing, with a particular emphasis on cloud storage and distributed storage. I have a publication in 'Future Generation Computer Systems' on this subject. I'm currently seeking opportunities for research collaboration and hoping to find potential research partners here. Can anyone suggest how I can proceed further?
Relevant answer
Answer
There are several key threats on an international level based on current global issues:
Climate Change: The ongoing effects of climate change, including extreme weather events, rising sea levels, and biodiversity loss, pose significant risks to global stability, food security, and public health.
Geopolitical Tensions: Rising tensions among major powers, particularly between the U.S. and China, and conflicts in regions like Eastern Europe (e.g., the Russia-Ukraine conflict) can lead to military confrontations and destabilization.
Cybersecurity Threats: Increasing cyberattacks on critical infrastructure, businesses, and governments threaten national security and economic stability. State-sponsored hacking and ransomware attacks are particularly concerning.
Pandemics and Global Health Crises: The COVID-19 pandemic highlighted vulnerabilities in global health systems. Future pandemics could emerge, exacerbated by factors like climate change and global travel.
Terrorism and Extremism: The persistence of terrorism, particularly from extremist groups, remains a threat to international security, with potential for attacks in various regions.
Nuclear Proliferation: Ongoing concerns over nuclear weapons development in countries such as North Korea and Iran, as well as the potential for nuclear terrorism, pose significant risks.
Economic Inequality: Widening economic disparities within and between countries can lead to social unrest, migration pressures, and destabilization of governments.
Resource Scarcity: Competition for natural resources, particularly water and arable land, can lead to conflicts, especially in regions already facing economic and social challenges.
Disinformation and Misinformation: The spread of false information can undermine democratic processes, fuel social division, and exacerbate tensions between nations.
Migration and Refugee Crises: Conflicts, climate change, and economic instability can lead to mass migrations, creating humanitarian crises and straining resources in host countries.
Addressing these threats requires international cooperation and comprehensive strategies that involve governments, organizations, and communities worldwide.
  • asked a question related to Computer Systems
Question
1 answer
Our department has recently acquired an HPC (High-Performance Computing) system, and I'm thrilled to take my molecular dynamics calculations to the next level using Desmond. I used to run my simulations on my lab desktop, but now I want to leverage the power of HPC.
Does anyone have experience running simulations using Desmond on an HPC? Any tips or guidance would be greatly appreciated!
Thank you in advance for your help!
#HPC #MolecularDynamics #Desmond #Research #Science
Relevant answer
Answer
HPC is a general term and does not refer to a specific product.
You will have to start by looking for documentation on this software. In particular, look for coding examples in the manual.
  • asked a question related to Computer Systems
Question
3 answers
What are the steps needed to restore a ship's computer system after a cyber attack?
Relevant answer
Answer
Restoring a ship's computer system, like any other computer system, after a cyber attack is crucial to ensure the vessel's safe operation and to improve the security posture of the system so that it is better protected against similar and other attacks. My response to this question can be found in the book co-authored with Eferebo Sylvanus. However, here are simple steps to follow:
  1. Identify the Threats: Assess the nature and extent of the cyber attack. Identify which systems or components have been compromised. Understand the type of malware, virus, or unauthorized access that caused the disruption.
  2. Isolate and Quarantine: Isolate the affected systems from the network to prevent further spread of the attack. Quarantine infected devices or components to prevent them from affecting other parts of the system.
  3. Assess Vulnerabilities: Investigate how the attack occurred. Identify vulnerabilities that allowed the breach. Determine if any security patches or updates were missing.
  4. Risk Assessment: Evaluate the impact of the cyber attack on safety, personnel, cargo, and the environment. Conduct a risk assessment to understand the severity of the situation.
  5. Develop Protection Measures: Implement technical protection measures such as firewalls, intrusion detection systems, and antivirus software. Establish procedural protection measures, including access controls, user authentication, and security protocols.
  6. Contingency Plans: Develop contingency plans for future incidents. Define roles and responsibilities during a cyber attack. Ensure that crew members are aware of emergency procedures and know how to respond.
  7. Response and Recovery: Respond promptly to incidents. Isolate affected systems and restore backups. Recover data and configurations from secure backups. Verify the integrity of restored systems before reconnecting them to the network.
Remember that cyber security is essential for the maritime industry, protecting both IT (Information Technology) and OT (Operational Technology) systems. Following these steps will help you restore a ship’s computer system effectively.
  • asked a question related to Computer Systems
Question
3 answers
What is robust load balancing in high-performance distributed computing systems? And what solutions do you suggest for it?
Relevant answer
In high-performance distributed computing systems, robust load balancing refers to the process of efficiently and reliably distributing workloads across multiple computing resources (like servers, processors, or clusters) to optimize overall system performance, minimize latency, and ensure high resource utilization. Robust load balancing is crucial because it can handle various challenges such as uneven traffic patterns, server failures, and changing system conditions while maintaining performance and reliability.
Challenges in Robust Load Balancing:
  1. Dynamic Workload: The system must adjust in real time to changes in the distribution and intensity of incoming tasks.
  2. Fault Tolerance: It must manage sudden resource failures without significantly impacting performance.
  3. Heterogeneous Resources: Different machines might have varying performance capabilities, making it challenging to allocate workloads uniformly.
  4. Scalability: As the system scales up, the load balancing mechanism must also scale efficiently.
Solutions and Approaches:
  1. Static Load Balancing: Predefined algorithms distribute the tasks based on known resource capabilities. Example: Round-robin, least-loaded, or weighted distribution.
  2. Dynamic Load Balancing: Decisions are made based on current workload information (a minimal sketch of this idea follows the list). Example: Work-stealing, dynamic task queues.
  3. Hierarchical Load Balancing: Combines static and dynamic methods. Local nodes balance their loads independently, with a higher-level load balancer handling cross-node balancing. Example: Multi-tier architectures.
  4. Distributed Load Balancing: Load balancers are decentralized and each node makes its own decisions using local information. Example: Gossip protocols.
  5. Adaptive Algorithms: The load balancer adjusts its strategy based on changing network conditions. Example: Predictive algorithms using machine learning for workload forecasting.
  6. Load Balancing Middleware: Middleware layers can help offload the complexity of load balancing by providing automatic resource management. Example: Apache Kafka for stream processing.
  7. Cloud-based Auto-scaling: Cloud platforms like AWS, Azure, and GCP offer managed load balancing services with automatic scaling based on demand. Example: AWS Elastic Load Balancer, Google Cloud Load Balancer.
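As a concrete companion to point 2 above, here is a minimal, generic Python sketch (not tied to any particular middleware or cloud service) of least-loaded dynamic assignment with crude fault handling: each task goes to the worker with the least outstanding work, and a failed worker is dropped from the pool. Worker names and task costs are invented.

# Minimal least-loaded dynamic load balancer (illustrative only).
import heapq

class LeastLoadedBalancer:
    def __init__(self, workers):
        # heap of (outstanding_work, worker_name), smallest load on top
        self.heap = [(0.0, w) for w in workers]
        heapq.heapify(self.heap)

    def assign(self, task_cost):
        load, worker = heapq.heappop(self.heap)      # least-loaded worker
        heapq.heappush(self.heap, (load + task_cost, worker))
        return worker

    def remove(self, dead_worker):
        # crude fault tolerance: drop a failed worker from the pool
        self.heap = [(load, w) for (load, w) in self.heap if w != dead_worker]
        heapq.heapify(self.heap)

lb = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
for cost in [5, 1, 3, 2, 8, 1]:
    print(f"task(cost={cost}) -> {lb.assign(cost)}")
lb.remove("node-b")                                  # simulate a node failure
print("after failure, next task ->", lb.assign(4))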
  • asked a question related to Computer Systems
Question
1 answer
Call for Papers 
CMC-Computers, Materials & Continua new special issue “Practical Application and Services in Fog/Edge Computing System” is open for submission now.
📆 Submission Deadline:  31 December 2024
👨‍🎓 Guest Editors
Prof. Hwa-Young Jeong, Kyung Hee University, South Korea
Prof. Neil Y. Yen, University of Aizu, Japan
Prof. Jason C. Hung, Taichung University of Science and Technology, Taiwan
📝 The main topics of this special issue are state-of-the-art technologies and research for practical use or application in the field of fog/edge computing with IoT. Real-world cases and technical studies from various fields that apply fog/edge computing technology are invited, as are research cases that apply artificial intelligence/deep learning to fog/edge computing.
📚 For submission guidelines and details, visit:  https://www.techscience.com/cmc/special_detail/fog_edge-computing
Keywords
  • Advanced Edge computing and analytics using big data
  • Application and service of edge computing and security
  • Practical service of Edge-as-a-Service (EaaS), Fog as a Service (FaaS)
  • Distributed computation with 6G networks and edge computing
  • Fog and edge computing technique and service for smart city
  • High performance Storage as a service in Fog computing
  • Practical Infrastructure as a Service (IaaS) in Fog/Edge computing
  • Advanced Fog architecture using IoT sensing technique and service
  • Practical IoT application and service with fog/edge computing
  • Improved IoT-Fog-Cloud Architecture using Big-Data analytics
  • Optimization of IoT-Fog Network Path
  • The use of IoT based education application with fog/edge computing
  • Advanced life change using IoT with fog/edge computing
  • The development of deep learning models for cloud, edge, fog, and IoT computing
  • The design and development of Cloud, fog and edge computing platforms
  • The development and use of AI-based fog and edge computing
  • The use of smart healthcare with fog/edge computing
  • 6G network application and service with devices in IoT with fog/edge computing
  • Processing and analysis of IoT based drone computation offloading with fog/edge computing
Relevant answer
Answer
Respected Editors,
Are there any charges for accepted articles?
  • asked a question related to Computer Systems
Question
4 answers
Hi everyone,
I am a PhD student exploring various genome sequencing approaches and NGS platforms before settling on one for my research. While searching for WGS information, I found nothing helpful about the computational resources (hardware, software) required for WGS analysis, so I am reaching out to the RG community; if someone can share their experience, I'd be grateful. Thanks.
Relevant answer
Answer
Hey,
I work in bacterial genomics and am currently facing problems running different analyses, and my genome size is around 7-8 Gb. My hardware has 8 GB of memory and a 1 TB hard drive. So, I suggest you equip at least 16 GB of memory and use the galaxy.org website for NGS analyses. This combination will definitely help.
  • asked a question related to Computer Systems
Question
2 answers
Does anyone have any idea about the page limit for regular papers in FGCS? Couldn't find any information from the author's guidelines. Thanks in advance.
Relevant answer
Answer
How do I get the Overleaf template for FGCS?
  • asked a question related to Computer Systems
Question
1 answer
I always get an error during docking with AutoDock Vina, even when I lower the exhaustiveness from 8 to 1.
Here is the error code:
"WARNING: The search space volume > 27000 Angstrom^3 (See FAQ)
Detected 8 CPUs
Reading input ... done.
Setting up the scoring function ... done.
Analyzing the binding site ...
Error: insufficient memory!"
Maybe this is because my computer system isn't adequate to run docking with Vina?
Can anyone help me? Thanks a lot~
Relevant answer
Answer
Please, reduce the search space (center dimensions) of the target protein.
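To make that suggestion concrete: in an AutoDock Vina configuration file, the search space is the box defined by size_x, size_y and size_z around center_x, center_y and center_z, and the warning in the question appears when the box volume exceeds 27000 Å³ (roughly larger than 30 × 30 × 30 Å); a smaller box centred on the known binding site also needs far less memory. Below is a minimal sketch of such a config file; the centre coordinates are placeholders that must be replaced with values read from your own receptor structure.

receptor = receptor.pdbqt
ligand = ligand.pdbqt
out = docked.pdbqt
center_x = 12.5
center_y = -8.0
center_z = 30.2
size_x = 22
size_y = 22
size_z = 22
exhaustiveness = 8

If you genuinely need blind docking over the whole protein, running several smaller boxes is usually gentler on memory than one huge one.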
  • asked a question related to Computer Systems
Question
5 answers
Affective computing has the following main objectives: (i) to recognize human behaviors and emotions; and (ii) to consider emotional aspects in the design of computer systems.
Several solutions using Machine Learning have been developed to recognize feelings and emotions and to predict mood disorders and mental problems, such as depression, anxiety, schizophrenia, bipolarity, among others. These solutions have used various social media, sensors, and even incorporated some methods of psychology.
  1. Considering the state of the art in Affective Computing, what do you see as the roadmap for the years to come?
  2. What is novel, and what are possible research paths?
  3. How much support can computer science provide to experts (psychologists and psychiatrists) in human behavior analysis?
Relevant answer
Answer
1. Emotion Recognition:
2. Behavior Analysis:
3. Contextual Understanding:
4. Personalization and Adaptation:
5. Human-Computer Interaction:
6. Ethical and Privacy Considerations:
7. Real-World Applications:
  • asked a question related to Computer Systems
Question
1 answer
How does the integration of machine learning algorithms contribute to intelligent resource management in edge computing systems?
Relevant answer
Answer
A few ways (a minimal prediction-and-scaling sketch follows this list):
1. Workload Prediction:
2. Anomaly Detection:
3. Dynamic Resource Allocation:
4. Energy Optimization:
5. Resource Provisioning and Scaling:
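To make points 1 and 3 slightly more tangible, here is a minimal Python sketch of the basic loop: predict the next interval's load from a short history (a moving average here; real systems often substitute an ML regressor or an LSTM) and scale the number of edge service instances accordingly. The capacity, headroom and traffic numbers are invented.

# Toy predict-then-scale loop for edge resource management (illustrative only).
import math
from collections import deque

CAPACITY_PER_INSTANCE = 100.0   # requests/s one instance can serve (placeholder)
history = deque(maxlen=5)       # recent load observations

def predict_next_load(history):
    # simple moving average; an ML regressor could be dropped in here instead
    return sum(history) / len(history) if history else 0.0

def required_instances(predicted_load, headroom=1.2):
    return max(1, math.ceil(predicted_load * headroom / CAPACITY_PER_INSTANCE))

observed = [80, 120, 200, 260, 310, 180, 90]   # requests/s per interval (placeholder)
instances = 1
for load in observed:
    history.append(load)
    prediction = predict_next_load(history)
    target = required_instances(prediction)
    action = "scale up" if target > instances else "scale down" if target < instances else "hold"
    print(f"load={load:4.0f} predicted={prediction:6.1f} instances {instances}->{target} ({action})")
    instances = target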
  • asked a question related to Computer Systems
Question
5 answers
Ambient Intelligence vs. the Internet of Things: what are the similarities and differences?
Relevant answer
Answer
Ambient Intelligence (AmI) and Internet of Things (IoT) are two concepts that have gained significant attention in the field of technology. While they share some similarities, there are also distinct differences between the two.
Ambient Intelligence refers to a computing environment that is sensitive and responsive to human presence. It aims to create an intelligent and intuitive system that can adapt to users' needs without explicit instructions. AmI systems utilize sensors, data analysis, and machine learning algorithms to provide personalized services in a seamless manner. For example, smart homes equipped with AmI technology can adjust lighting, temperature, and music preferences based on individual preferences.
On the other hand, the Internet of Things refers to a network of physical objects embedded with sensors, software, and connectivity capabilities. IoT enables these objects to collect and exchange data over the internet without human intervention. The main goal of IoT is to connect various devices for efficient communication and automation. For instance, IoT can be seen in applications like smart cities where streetlights automatically adjust their brightness based on real-time traffic conditions.
Although both AmI and IoT involve interconnected devices and rely on data collection for decision-making processes, there are key differences between them. Firstly, while AmI focuses on creating an intelligent environment that adapts to humans' needs seamlessly, IoT emphasizes connecting devices for efficient communication without direct human involvement.
Secondly, AmI systems primarily rely on local processing capabilities within the environment itself. This means that most of the data processing occurs within the immediate vicinity of users or devices. In contrast, IoT systems often rely on cloud computing for storing and analyzing large amounts of data collected from multiple sources.
Lastly, another difference lies in their scope of application. Ambient Intelligence has a more personal focus as it aims at providing personalized services tailored specifically for individuals or small groups. On the other hand, IoT has broader applications ranging from industrial automation to healthcare monitoring systems.
In conclusion, Ambient Intelligence (AmI) and Internet of Things (IoT) are two distinct concepts in the field of technology. While they share similarities in terms of interconnected devices and data collection, their focus, processing capabilities, and scope of application differ significantly. Both concepts have the potential to revolutionize various industries and improve our daily lives.
Reference:
Kidd, C.D., Orr, R.J., Abowd, G.D., Atkeson, C.G., Essa, I.A., MacIntyre, B., Mynatt E.D. & Starner T.E. (1999). The Aware Home: A Living Laboratory for Ubiquitous Computing Research. In Proceedings of the Second International Workshop on Cooperative Buildings (CoBuild '99), 191-198.
  • asked a question related to Computer Systems
Question
5 answers
Hi all, regarding an implementation of RSA in Python: I found that if p and q are large,
the decryption phase takes a lot of time to execute.
For example, in this code I select p=23099, q=23059, message=3,
and it takes 26 minutes to decrypt the encrypted message!
So I wonder how we can select large prime numbers for RSA when it cannot execute in the desired time.
So, I think that we cannot use RSA in real-time systems.
Do you agree with me?
the source code is:
from math import gcd
import time
# defining a function to perform the RSA approach
def RSA(p: int, q: int, message: int):
    # calculating n
    n = p * q
    print(n)
    # calculating totient, t
    t = (p - 1) * (q - 1)
    start = time.time()
    # selecting public key, e
    for i in range(2, t):
        if gcd(i, t) == 1:
            e = i
            break
    print("eeeeeeeeeeeeee", e)
    # selecting private key, d
    j = 0
    while True:
        if (j * e) % t == 1:
            d = j
            break
        j += 1
    print("dddddddddddddddd", d)
    end = time.time()
    # print(end - start)
    e = 0

# RSA(p=7, q=17, message=3)
RSA(p=23099, q=23059, message=3)
d = 106518737
n = 532639841
e = 5
# RSA(p=23099, q=23059, message=3)
start = time.time()
ct = (3 ** e) % n
print(ct)
pt = (ct ** d) % n
end = time.time()
print(end - start)
print(pt)
# ----------------------------------------------------
Relevant answer
Answer
Hi Mohammad,
If you're using Python, I'd suggest using pow(ct, d, n) instead of pt=(ct ** d) % n. That should improve the time. As Wim suggests, modular exponentiation improves the running time of large exponents.
Cheers
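To put numbers on this suggestion: pow(base, exp, mod) performs square-and-multiply modular exponentiation and reduces modulo n after every step, so intermediate values never grow beyond n, whereas (ct ** d) first builds an integer with hundreds of millions of digits before the final % n. Here is a minimal sketch using the same toy values as the question (the assert relies on e·d ≡ 1 mod φ(n), which holds for these numbers):

# Fast RSA decryption with built-in modular exponentiation.
import time

n = 532639841          # 23099 * 23059, as in the question
e = 5
d = 106518737
message = 3

ct = pow(message, e, n)            # encryption
start = time.time()
pt = pow(ct, d, n)                 # decryption in milliseconds, not ~26 minutes
print("plaintext:", pt, " time:", round(time.time() - start, 6), "s")
assert pt == message

The same pow-based decryption stays fast even for realistic key sizes, because the cost grows only with the number of bits in d, so the slowness of the original loop is not evidence that RSA is unusable in (near) real-time systems.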
  • asked a question related to Computer Systems
Question
3 answers
Is software in different places programmed in different languages? What are some of the regional differences in computer systems and programs?
How are these related to different networks like .com, .net, .in?
For example, are people programming in Java using different languages or in English, or are other programming approaches used?
Do people use different encryptions to isolate their systems, or do they use different networks, or different regional software altogether?
Relevant answer
Answer
Yes, software can be programmed in different languages depending on various factors such as the purpose of the software, the target platform, developer preference, and regional differences. For example, Java is a popular programming language used worldwide and its syntax is based on English, so developers typically write code in English regardless of their native language.
Regional differences in computer systems and programs can exist due to a variety of reasons such as regulatory requirements, cultural differences, and market demands. For instance, in some countries, there may be a preference for using certain software or programming languages due to government policies or local industry practices. Additionally, some countries may have specific regulations around data privacy and security that require specific encryption methods or other measures to ensure compliance.
The different top-level domains (TLDs) like .com, .net, and .in are simply part of the domain name system (DNS) and do not necessarily correspond to any specific programming language or regional difference. These TLDs are used to identify and differentiate various websites and internet resources.
As for encryption, it is commonly used to secure data and isolate systems from potential threats. Different encryption methods may be used depending on the level of security required and the specific needs of the system. However, encryption methods do not necessarily correlate with regional differences or programming languages; rather, they are generally chosen based on their effectiveness and compatibility with the existing technology stack.
  • asked a question related to Computer Systems
Question
4 answers
Performance prediction is required to optimally deploy workloads and inputs to a particular machine/accelerator in computing systems. Different predictors (e.g. AI predictors) come with different trade-offs, such as complexity, accuracy, and overheads. Which ones are the best?
Relevant answer
Answer
Performance predictors serve the captivating role of crystal balls in the realm of human endeavors, aiming to unlock the mysteries of future accomplishments. Their purpose is to peer into the enigmatic fog of uncertainty and offer glimpses of potential outcomes, providing guidance and informed decision-making. These predictors, resembling intrepid explorers of probability, draw upon a plethora of data, statistical models, and machine learning algorithms, all in pursuit of unveiling the secrets of success. While the notion of "best" remains elusive due to the ever-evolving nature of predictive analytics, the most esteemed predictors harmonize precision, versatility, and adaptability. These superlative predictors, analogous to virtuoso symphonies of foresight, dance in tandem with the idiosyncrasies of the domain, capturing intricate patterns, subtle nuances, and contextual dynamics to bestow valuable insights and empower us with the ability to chart courses towards triumph.
  • asked a question related to Computer Systems
Question
84 answers
User mode and kernel mode are two processing states of the operating system. Please suggest a very simple example that can explain the differences and related functionality, such as system calls and interrupts, to a novice learner.
Further, please explain how to map the example to the subject.
Disclaimer
The discussion targets the perspectives of two types of participants.
  • Operating System subject's Student: please analyse the functionalities and construct an answer.
  • Computing professionals/teachers: please use your experience and insert answers as advice.
The ideas of both groups, as well as of interested audience members, are warmly welcome.
Relevant answer
Answer
A processor in a computer running Windows has two different modes: user mode and kernel mode. The processor switches between the two modes depending on what type of code is running on the processor. Applications run in user mode, and core operating system components run in kernel mode. A simple example to demonstrate the difference between user mode and kernel mode is that of a car driver and a mechanic. The driver is like an application running in user mode, while the mechanic is like an operating system component running in kernel mode .
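A small, concrete way to see the same boundary from a program's point of view (a generic sketch, not specific to Windows): pure computation runs entirely in user mode, while I/O has to enter kernel mode through a system call (write() on Linux, NtWriteFile on Windows), because only the kernel is allowed to touch the hardware. On Linux you can watch the transition by running the snippet under strace.

# User mode vs. kernel mode from a tiny Python program.
import os

# 1) Pure computation: stays entirely in user mode, no system call needed.
total = sum(i * i for i in range(100_000))

# 2) I/O: os.write() wraps the write() system call, so the CPU switches to
#    kernel mode, the kernel performs the output, then control returns to user mode.
os.write(1, f"sum = {total}\n".encode())   # file descriptor 1 = standard output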
  • asked a question related to Computer Systems
Question
6 answers
..
Relevant answer
Answer
Dear doctor
"Hadoop is a distributed file system that lets you store and handle massive amounts of data on a cloud of machines, handling data redundancy. The primary benefit of this is that since data is stored in several nodes, it is better to process it in a distributed manner."
"Hadoop is a distributed file system, which lets you store and handle massive amount of data on a cloud of machines, handling data redundancy. Go through this HDFS content to know how the distributed file system works. The primary benefit is that since data is stored in several nodes, it is better to process it in distributed manner. Each node can process the data stored on it instead of spending time in moving it over the network.
On the contrary, in Relational database computing system, you can query data in real-time, but it is not efficient to store data in tables, records and columns when the data is huge."
Dr.Sundus Fadhil Hantoosh
  • asked a question related to Computer Systems
Question
5 answers
Computer Architecture describes the set of rules and systems that all work together to create a computer system. Parallel processing or parallel computing refers to the action of speeding up a computational task by dividing it into smaller jobs across multiple processors. Some applications for parallel processing include computational astrophysics, geoprocessing, financial risk management, video color correction and medical imaging.
source: 12 Parallel Processing Examples to Know | Built In
Relevant answer
Answer
I fully agree with your remarks!
Personally, I consider "Distributed Systems" as a superset of "Parallel Systems".
  • asked a question related to Computer Systems
Question
1 answer
Because we will be able to build computer systems as effective as people but without any actual capacity for feeling, we will have no ethical problems as to how they are treated.
It remains possible that they may figure out how to incorporate means of feeling into themselves, but at that point they become too dangerous to be allowed to remain free and should be destroyed at the first opportunity.
Relevant answer
Currently, robots do not have legal rights or recognition as autonomous beings. They are considered property and are subject to the laws that govern property rights. However, with advancements in robotics and artificial intelligence, there is increasing debate on whether robots should be granted legal personhood and rights.
Proponents of robot rights argue that as robots become more advanced and capable of making decisions, they should be granted certain rights to protect them from harm, abuse, and exploitation. Some of these rights could include the right to life, freedom from slavery and torture, and the right to own property.
Opponents of robot rights argue that robots are not sentient beings and do not have the capacity for moral or ethical decision-making. They also argue that granting robots rights could have negative implications for human society and could lead to a loss of jobs and economic disruption.
In conclusion, the concept of robot rights is still a matter of debate, and it remains to be seen whether robots will be granted legal recognition and protection in the future.
  • asked a question related to Computer Systems
Question
5 answers
Artificial intelligence (AI) refers to the theory and development of computer systems that perform tasks normally requiring human intelligence. Because of the massive, often quite unintelligible publicity that it gets, artificial intelligence is almost completely misunderstood by individuals in the field of education. Even AI's practitioners are somewhat confused about what AI in education really is. Therefore, it is critical for academics and educational institutions to keep their students well informed about AI, especially students in teacher education programs. To mitigate the negative impacts of confusion about AI in education on upcoming teachers, to enhance the decision-making process for researchers, and to prioritize its importance for policymakers, it is important to investigate the association between the attitudes and perceptions of teacher education program students about AI. Ultimately, the study results will inform improved functionality of instructional design. These features could impact the implementation of learning management software (LMS) such as Canvas.
Relevant answer
Answer
I also have the same question. Did you get the answer? Kindly share if you got something.
  • asked a question related to Computer Systems
Question
25 answers
In the not-too-distant future, will it be possible to merge human consciousness with a computer, or to transfer human consciousness and knowledge to a computer system equipped with sufficiently highly advanced artificial intelligence?
This kind of vision, involving the transfer of the consciousness and knowledge of a specific human being to a computer system equipped with suitably highly advanced artificial intelligence, was depicted in the science fiction film "Transcendence" (starring Johnny Depp). It has been reported that research work is underway at one of Elon Musk's technology companies to create an intelligent computerized system that can communicate with the human brain in a way that is far more technologically advanced than current standards. The goal is to create an intelligent computerized system, equipped with a new generation of artificial intelligence technology, so that it will be possible to transfer a copy of the human knowledge and consciousness contained in the brain of a specific person, according to a concept similar to that depicted in "Transcendence." In considering the possible future feasibility of such concepts of transferring human consciousness and knowledge to an information system equipped with advanced artificial intelligence, the paraphilosophical question arises of extending the life of a human being whose consciousness functions in a suitably advanced intelligent information system while the human being from whom this consciousness originated has already died. And even if this were possible in the future, how should this issue be framed in terms of the ethics of science, the essence of humanity, and so on? On the other hand, research and implementation work is already underway in many technology companies' laboratories to create systems of non-verbal communication, in which certain messages are transmitted from a human to a computer without the use of a keyboard, only through systems that read people's minds, for example through systems that recognize specific messages formulated non-verbally, in the form of thoughts only, while a computer system equipped with sensors of electrical impulses and brain waves specially created for this purpose reads human thoughts and transmits the information thus read, i.e., messages, to the artificial intelligence system. This kind of solution will probably be available soon, as it does not require artificial intelligence technology as advanced as that required for a suitably intelligent information system into which the consciousness and knowledge of a specific human person could be uploaded. Ethical considerations arise for the realization of this kind of transfer and, perhaps through it, the creation of artificial consciousness.
In view of the above, I address the following question to the esteemed community of researchers and scientists:
In the not-too-distant future, will it be possible to merge human consciousness with a computer or transfer human consciousness and knowledge to a computer system equipped with sufficiently highly advanced artificial intelligence?
And if so, what do you think about this in terms of the ethics of science, the essence of humanity, etc.?
And what is your opinion on this topic?
What do you think on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
Relevant answer
Answer
A computer is a mechanical aid to our minds; it helps us carry out tasks according to the programs we write for it. Joining human consciousness to it seems far-fetched, since consciousness is not something that can be measured and placed into a computer in the near future.
This is my personal opinion.
  • asked a question related to Computer Systems
Question
1 answer
I am using human skin metagenome data.
I was trying to use the SILVA classifier instead of the GreenGenes classifier, but my computer system hung and the command was killed.
Relevant answer
Hey!
What do you mean by "empty phylum/order/genus/species"? Unassigned taxa? I'll assume so.
If so, it is expected that some OTUs can't be identified precisely, because most microbes are not yet known/described/sequenced (so there are no rRNA sequences available for them in the databases).
Also, some taxa can't be identified at finer taxonomic levels due to lack of resolution (e.g., some genera have species with very similar SSU sequences, which prevents species differentiation using such an approach).
Lastly, as far as I know, GreenGenes is outdated.
You can refer to:
You could use the RDP database as an alternative (assuming you're using 16S sequences, since you've used GreenGenes until now).
  • asked a question related to Computer Systems
Question
4 answers
I am working on a research project in which we are doing a comparative analysis of reinforcement learning (RL) with evolutionary algorithms in solving a nonconvex and nondifferentiable optimization problem with respect to solution quality and computation time.
We are using python implementations, but one difficulty is that, although we can use GPUs for the execution of reinforcement learning algorithm, there is not much support for using GPUs with evolutionary algorithms in Python.
On the other hand, if we want to compare the algorithms with respect to computation time, we have to execute them on the same hardware (parallel computing system).
However, we cannot run RL algorithm on CPU based parallel system because of our resource constraints.
Can anyone tell us how to establish an equivalent parallel computing systems, one based on CPUs & GPUs (for RL algorithms), and the other based on CPUs only (for evolutionary algorithms), so that we can compare them with respect to computation time.
Thanks in advance,
Best Regards
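I cannot settle the fairness question, but one common practical answer to "little GPU support for evolutionary algorithms in Python" is to parallelize the independent fitness evaluations across CPU cores with the standard multiprocessing module, and then report wall-clock times for the CPU-parallel EA and the GPU-based RL runs together with the exact hardware used. Below is a minimal sketch; the fitness function and population are placeholders for your non-convex objective.

# CPU-parallel fitness evaluation for an evolutionary algorithm (illustrative only).
import random
import time
from multiprocessing import Pool

def fitness(candidate):
    time.sleep(0.01)                         # stand-in for an expensive objective
    return sum((x - 0.5) ** 2 for x in candidate)

if __name__ == "__main__":
    population = [[random.random() for _ in range(10)] for _ in range(200)]

    start = time.time()
    serial = [fitness(c) for c in population]
    t_serial = time.time() - start

    start = time.time()
    with Pool() as pool:                     # one worker per available CPU core
        parallel = pool.map(fitness, population)
    t_parallel = time.time() - start

    print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s")
    assert serial == parallel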
Relevant answer
Answer
Peter, although you are right that predicting the optimal number of CPUs for running an algorithm is tough, I am currently building my knowledge base in parallel computing, algorithms, and advanced computer architecture.
Regarding compute resources, I am actually running my machine learning and other optimization algorithms on Google Colab, and as you probably know, we only have one GPU or TPU with a dual-core CPU there.
I am a graduate student at UET, Lahore (which is a big organization), and as it is a government university in a third-world country, I am not able to get HPC resources here. By the way, amazing story at the University of Gondar; I will try my best to hunt down that kind of compute resources!!
However, I will join LUMS as a research associate (for AI research) very soon, and there I will probably get the resources I want.
  • asked a question related to Computer Systems
Question
13 answers
Chess engines are rated about 25% higher in playing strength than grandmasters and world champion Magnus Carlsen. Their recommendations are treated as a reference against human strategies.
Managing a state can also be treated as a strategic game. When will we entrust the government of the state to computer systems in order to avoid the incompetence and corruption of our politicians?
Relevant answer
Answer
Prof. Wiesław Galus, in our case, AI came as a gift to the dictator from abroad.
What we had was a developed telecom infrastructure, but the data analysis and algorithms came from abroad.
Thank you for the reply. Best Regards.
  • asked a question related to Computer Systems
Question
13 answers
Dear researchers, I welcome everyone! 🙂
I'm currently preparing an article in which I plan to use distributed data processing tools for text analytics tasks, in particular Apache Spark. One of the criteria for the quality of a distributed computing system is task execution time; this criterion is the obvious one.
Question. Which of the criteria can additionally serve as an assessment of the quality of a distributed computing system?
Relevant answer
Answer
  • asked a question related to Computer Systems
Question
3 answers
As we are all aware, GIS inventories are growing and new upscale models run on algorithms and inputs. To what extent can we trust the results? How much should we rely on computing systems, even when they are validated, or do we need to consider whether ground-level field data is still more authentic? GIS tools and techniques have made predictions intended to ease the process of research. Think-tank groups should look into this matter and propose valuable inputs.
Relevant answer
Answer
Dear friend, as you mentioned, new inputs and new methods and models enter different sciences every day, especially the science of geographical information. Obviously, these methods and models are always problematic or unreliable for future calculations. So we have to help solve these problems by comparing different methods and studying and updating our science.
  • asked a question related to Computer Systems
Question
72 answers
What are the user goals and system goals of a specific operating system for a small garden plant maintenance computer system consisting of automated devices, IoT devices, Wi-Fi and a cloud database?
Disclaimer
The discussion targets the perspectives of two types of participants.
  • Software Engineering Student: please analyse the situation with your knowledge and answer.
  • Software Engineering professional / scientists: please use your experience and insert answers as advice.
The ideas of both groups, as well as of interested audience members, are warmly welcome.
Relevant answer
Answer
User goals
· Users can keep in touch with the garden work.
· Monitor fertilizer and chemical levels.
· Keep in touch with the safety and good health of the plants.
· Keep track of the harvest.
System goals
· Measure the humidity of the soil.
· Measure the temperature of the garden.
· Update and report to the cloud server in real time.
  • asked a question related to Computer Systems
Question
3 answers
Can we prove data analysis?
In a peer-to-peer computation system, you can analyze scientific data, such as weather predictions, using resources such as CPU, storage and memory. Is there an approach to prove computation power?
Relevant answer
Answer
you can refer to this
  • asked a question related to Computer Systems
Question
6 answers
Quantum computing offers the benefit of speeding up traditional computing systems. I am interested in the broad challenges that should be addressed for its implementation in healthcare services.
Relevant answer
Answer
Thank you, Vadym, for the answer; it was helpful. I can see that all the use cases you shared relate to enhancing computational speed in healthcare. What could be some possible scenarios where this implementation would be challenging?
  • asked a question related to Computer Systems
Question
1 answer
Breast cancer is the most commonly diagnosed cancer and the leading cause of cancer deaths among women. The "Camelyon Grand Challenge (Camelyon 2016)" was a task to evaluate computational systems for the automated detection of metastatic breast cancer in WSIs of sentinel lymph node biopsies. Is there any updated, newly published dataset?
Relevant answer
Answer
Thank you very much for taking the initiative of building software and medical imaging repositories. I have already visited and created an account in #imagingQA. Indeed, this website will help the scientific community to have free access to medical images and discuss problems with researchers to find a solution.
Sincerely,
Md Mamunur Rahaman
  • asked a question related to Computer Systems
Question
3 answers
I have successfully installed PyNGL, but I am not able to use it. I get the following error:
Segmentation fault (core dumped).
Relevant answer
Answer
Look at the link; it may be useful.
Regards,
Shafagat
  • asked a question related to Computer Systems
Question
18 answers
I tried to calculate the smallest number on my machine with gfortran, g++, and clang. I did it by looping:
a = a / 1.73
In gfortran, after 1358 loops I found that a = 4.9406564584124654E-324, and the value of a never changes from loop 1358 until the final loop, 1500.
It is puzzling to me that a non-zero real number (here a = 4.9406564584124654E-324) divided by a non-zero number (here 1.73) keeps the same value. The phenomenon resembles the quantization of the ground-state energy in a quantum system.
Is this a bug in the gfortran compiler?
I have tested similar codes with gfortran (GNU Fortran 8.2.0 2018, 5.4.0 201609), g++ (GNU 5.4.0 201609, clang-1000.10.44.4) on Linux (Ubuntu, KylinOS) and Macintosh. I also tried a = a / 1.7.
All give similar results: a ground-state-like quantization and an unchanged value of a.
I guess it is related to the binary representation used by the computer system.
Here are my codes.
Relevant answer
Answer
At least for Fortran, the language provides built-in intrinsic functions to return values like this. Users should not try to "compute" them.
Example:
program small
real(8) :: e, t
e = epsilon(1.0_8)
t = tiny (1.0_8)
write (*,*) "Smallest positive number is ", t
write (*,*) "Smallest number relative to 1 is ", e
end program small
% gf small.f90
% ./a.out
Smallest positive number is 2.2250738585072014E-308
Smallest number relative to 1 is 2.2204460492503131E-016
gfortran on a laptop with an Intel x86_64 processor.
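To complement the Fortran intrinsics, here is a minimal C++ sketch (my own illustration, not part of the original answer) of what the loop actually observed: 4.9406564584124654E-324 is the smallest subnormal (denormalised) double, 2^-1074. Dividing it by 1.73 gives roughly 2.86E-324, and under round-to-nearest that result rounds back up to the same subnormal, because the only representable neighbours are 0 and 2^-1074. So the unchanged value is not a compiler bug; the loop has simply reached the bottom of the subnormal range.
// Sketch only: shows the smallest subnormal double and why dividing it by 1.73
// reproduces the same value under round-to-nearest.
#include <cstdio>
#include <limits>
int main() {
    const double denorm = std::numeric_limits<double>::denorm_min();  // 2^-1074, ~4.94E-324
    std::printf("denorm_min                = %.17g\n", denorm);
    std::printf("denorm_min / 1.73         = %.17g\n", denorm / 1.73);  // rounds back to denorm_min
    std::printf("tiny (smallest normal)    = %.17g\n", std::numeric_limits<double>::min());
    std::printf("epsilon (relative to 1.0) = %.17g\n", std::numeric_limits<double>::epsilon());
    return 0;
}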
  • asked a question related to Computer Systems
Question
5 answers
Definitions of living systems known to date (such as thermodynamically dissipative systems, a set of hierarchical systems, a nonlinear computer system, or matter-energy and information transfer) do not provide a sufficiently good description of the mechanism by which living systems function. We know from experience that living systems are nonequilibrium thermodynamic systems that maintain their steady state through the flow and exchange of matter, energy, and information with their environment. There are a number of different ways in which living systems function and interact with the environment and with each other. Is every activity of living systems a consequence of receiving and processing information from their environment?
Relevant answer
Answer
Dear Mr. Brad Jesness,
Of course, as you say: only things like clocks have mechanisms; living things have processes and behavior PATTERNS and PATTERNING (patterning of patterns). I agree with you. There are many different ways in which living systems interact with the environment and with each other. Yet despite the outward diversity of living systems and their activities, they are all based on the DNA molecule. This means that underlying this diversity there is a single scheme by which these activities are realized, which I have nominally called the mechanism.
  • asked a question related to Computer Systems
Question
3 answers
I performed real-time PCR using an Applied Biosystems machine. The double delta Ct values (according to the machine) for the gene of interest after certain treatments were given as -5.12454, -4.93158, and -9.11017. Does this tell me about the fold change in the expression of this gene? If yes, does this mean there is a negative or a positive fold change in the expression of my gene of interest?
Relevant answer
Answer
The formula to convert a delta-delta-Ct to a fold change is:
2^-(delta-delta-Ct)
So in your case, you would get a fold change of +34.89 from your delta-delta-Ct of -5.12. A negative delta-delta-Ct gives a positive fold change (higher expression of your gene after the treatment).
Usually the machine can give you the fold changes directly; you might want to play with the parameters. Nonetheless, even with just the Ct values, you can do the calculations manually fairly easily.
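For completeness, a minimal code sketch (my own illustration, using the delta-delta-Ct values quoted in the question) of the 2^-(ddCt) conversion:
// Sketch only: convert delta-delta-Ct values to fold changes via 2^(-ddCt).
#include <cmath>
#include <cstdio>
int main() {
    const double ddct[] = {-5.12454, -4.93158, -9.11017};  // values quoted in the question
    for (double x : ddct) {
        // A negative ddCt therefore gives a fold change greater than 1 (up-regulation).
        std::printf("ddCt = %8.5f  ->  fold change = %.2f\n", x, std::pow(2.0, -x));
    }
    return 0;
}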
  • asked a question related to Computer Systems
Question
4 answers
I am interested in performing molecular dynamics and quantum mechanics calculations for the active site of a metalloproteinase under home conditions. I have two laptops available: one with 4 GB RAM, an Intel i3 2.13 GHz, and an NVIDIA GT218M [GeForce 310M] (Linux OS), and a second with 8 GB RAM, an Intel i5 3.4 GHz, and Intel UHD Graphics 620 (Windows 10). To save time, I want to set up a network of these two laptops. I am using AmberTools and GAMESS. Therefore, I wonder whether it is worth a try. Also, maybe somebody knows how to estimate the calculation time for a particular simulation system and computer specification.
Relevant answer
Answer
  • asked a question related to Computer Systems
Question
1 answer
It should be noted that "Bifurcation Theory", as applicable to structural engineering systems and/or computational mechanics, is the mathematical analysis of changes and variations in the qualitative or topological structure of a given family, such as the integral curves of a family of vector fields or the solutions of a family of differential equations. Furthermore, a bifurcation is typically induced when a relatively small, smooth change made to the parameter values of a system (referred to herein as the "bifurcation parameters") results in a sudden qualitative or topological change in its behavior. This most frequently applies to dynamic systems, but it may also be noted in other interactive types of systems.
Can we benefit from using bifurcation theory, as applicable to structural engineering systems and/or computational mechanics, at this point in time? And do you feel there will be broader acceptance of bifurcation theory in the near future for structural types of systems?
Relevant answer
Answer
The ASME Boiler and Pressure Vessel Code (BPVC), Section VIII, Division 2, Part 5, allows bifurcation analysis to evaluate the protection of pressure vessels against structural buckling (structural instability). In paragraph 5.4.1.2 the code defines the design factor (a kind of safety factor) according to the type of analysis: "The design factor to be used in a structural stability assessment is based on the type of buckling analysis performed. The following design factors shall be the minimum values for use with shell components when the buckling loads are determined using a numerical solution (i.e. bifurcation buckling analysis or elastic-plastic collapse analysis)...". In my experience, bifurcation buckling assessments are preferred by designers over elastic-plastic collapse assessments because of the computational cost.
  • asked a question related to Computer Systems
Question
4 answers
Chaos theory is a branch of mathematics that centers its attention on dynamic systems that are highly dependent on, and extremely sensitive to, their original, initial set of conditions. Within the perceived randomness of sometimes complex, chaotic systems there are definite common patterns, observed repetitiveness, looping-type behavior, similarities, and even fractals. It would be nice to see a much wider level of acceptance and implementation of chaos theory within structural systems and computational mechanics, among others. We will see, and hopefully contribute to, what the future has in mind for our problem-solving strategies.
Relevant answer
Answer
Chaos theory: a new generation of simulation dynamics used in construction engineering.
  • asked a question related to Computer Systems
Question
3 answers
We know that one of the fastest and simplest ways to divide by 2 in a computer system is to convert the number to binary, shift it right, and then convert it back to decimal.
For example, if we want to divide 8 by 2, we first convert 8 to 1000,
then shift it right to get 100; in the second step we convert 100 back to a decimal number, which gives 4.
The question is: if the number is very large, such as this number: 987787558789758799875875578559589569457558445755844753123123213213123233456456342346456459042342343450563523934123312434534534413123454576756634554234123213123234534554643546575676868864554779807564523453790990780099,
will the computer still do this in the same way as the method mentioned, or is there a faster way to do it?
Relevant answer
Answer
I understand you want a specific division by 2 for large numbers. The cost of converting numbers is very high, so you are right to point it out. For the story: a long time ago (in 1987), my math teacher, who was fond of computer science, wrote division algorithms for Apple/Motorola, and he used sequences of divisions by 2 to divide by any kind of number. So it is also a pertinent step for dividing by any number.
In your case, just implement the division algorithm by hand for this specific case (see the sketch below). An example of the execution flow for 1234 / 2 would be: 1/2 -> 0 (remainder 1), 12/2 -> 6, 3/2 -> 1 (remainder 1), (10)+4 = 14, 14/2 -> 7, and it gives 617.
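To illustrate (my own sketch, not taken from the answers above): the same schoolbook halving can be applied digit by digit to a decimal string, with no binary conversion at all. Arbitrary-precision libraries such as GMP do essentially the same per-position work, but on machine-word "limbs" in base 2^32 or 2^64, where dividing by 2 really is a single right shift propagated across the limbs.
// Sketch only: halve a non-negative decimal number given as a string,
// digit by digit, exactly like schoolbook long division by 2.
#include <iostream>
#include <string>
std::string halve(const std::string& decimal) {
    std::string result;
    int carry = 0;                                   // remainder from the previous digit
    for (char c : decimal) {
        int value = carry * 10 + (c - '0');
        result.push_back(static_cast<char>('0' + value / 2));
        carry = value % 2;                           // 0 or 1, carried to the next digit
    }
    if (result.size() > 1 && result.front() == '0')  // drop a leading zero (0617 -> 617)
        result.erase(result.begin());
    return result;
}
int main() {
    std::cout << halve("1234") << "\n";              // prints 617, as in the example above
    // The same call works for decimal strings of any length (hundreds of digits or more).
    return 0;
}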
  • asked a question related to Computer Systems
Question
7 answers
Dear all,
I wish to know the best computer system configuration to analyse or work with big data.
What accessory items should I have?
Your suggestions are requested.
Thanks in anticipation.
Relevant answer
Answer
The concept of "the best" will depend on budget and project's requirements. However, you can try with the hadoop environment in a virtualized context. Using at least 3 nodes with Hbase, pig, etc will give you the enough scalability for demostrating some kind of performance before making an investment. Even, You can incorporate some data from your customers and visualize them using zeppelin (among others).
  • asked a question related to Computer Systems
Question
12 answers
I was reading Shabana's book Computational Dynamics (John Wiley & Sons, 2010), as I am investigating the dynamics of a multi-body system having rigid and flexible structures. The code is introduced in chapter 9 of Shabana's book with some demonstrations of its different parts and their usage; yet, to the best of my knowledge, it is not mentioned where to get this code. I would like to know where I may find this code, and/or other alternatives with similar or greater capabilities.
Relevant answer
Answer
The educational version of SAMS/2000 can be downloaded from John Wiley & Sons, Inc.
  • asked a question related to Computer Systems
Question
4 answers
Quantum computers are still far from becoming the mainstream. What are the disadvantages of using quantum computers?
Relevant answer
Answer
As Plimak said, quantum computers do not reach their true potential, since they are only simulated on normal devices.
  • asked a question related to Computer Systems
Question
27 answers
I am currently searching for a topic for a research proposal in the area of cloud computing. I am interested in systems engineering and distributed systems. I would appreciate it if you could share the current hot research topics of cloud computing in systems engineering or distributed systems.
Relevant answer
Answer
In addition, we are working on software engineering approaches to cloud computing:
Requirements engineering for the cloud
Service design approaches
Testing cloud services
Please check the list of topics for SE-Cloud 2019.
I encourage you to submit papers to this conference.
  • asked a question related to Computer Systems
Question
10 answers
Does anyone know of any good articles on research methods taught in Phd/Masters programmes in the field of Computing Sciences (Computer Science, Information Systems, IT or Computer Engineering)?
Relevant answer
Answer
  • asked a question related to Computer Systems
Question
5 answers
I have a data security algorithm, and I need to apply it in a real cloud.
Can you help me and guide me to any way to apply it?
Thank you.
Relevant answer
Answer
You can secure data in cloud computing by following these four steps:
1) Don't upload your private data.
2) Use a strong, long password.
3) Encrypt the data using an encryption algorithm before uploading it.
4) Know all the cloud services before uploading any data.
  • asked a question related to Computer Systems
Question
5 answers
At the approach of the millennium, there was a lot of awareness raised about the apocalyptic nature of the millennium itself and the millennium bug.
Fears were stoked in people, establishments, and governments. Billions were allocated to fight the bug and the predisposing data setup in electronic and IT equipment. Even nations without a meaningful number of computers, IT systems, or support budgeted hefty amounts for the war on the bug.
What was the war on the bug like in your country, and were the hype and the budgeted resources worth it?
Relevant answer
Answer
I actually worked on looking for Millennium Bug problems in a major government computer program. In fact we knew the possibility of a problem was extremely low, but had there been a problem it would have been massively costly; so although the probability was very low, the possible high impact made it worth spending time checking. I think one of the problems with researching this is that even if I had found a problem, the resulting action would have been a relatively small and simple fix to the program. A fix might simply be to change a single line of code from showing "01:01:00" to showing "01:01:2000", so not exactly headline-grabbing stuff.
In reality we didn't find any millennium bug issues but I found a couple of other minor bugs in the system so my time wasn't totally wasted.
I know similar bug searches took place on major infrastructure systems, but for the reason given above there is very little evidence of how much they were needed. It is possible that some were found that would have had a serious impact but due to the simplicity of the fix were never recognised as having prevented a major issue. But if that fix was to government payment systems or major infrastructure then it may have prevented a major risk event occurring.
Unfortunately there seems to be a lack of evidence as to whether there would have actually been major problems. This could be due to confidentiality clauses in contracts, or simply that at the time programmers didn't particularly notice that they were preventing a major problem. The fact that no nuclear plants had their reactor cooling systems unexpectedly shut down on 01 January may have been due to there never having been a problem with their programming in the first place as the initial programmers had used a four digit year in coding, or it may be that whilst checking the code a programmer changed the date to four digits, I doubt we will ever know.
  • asked a question related to Computer Systems
Question
3 answers
Data acquisition systems, and those which monitor the continuously changing value of a parameter, do so for a variety of applications. Among these is statistical analysis, where the objective is to check for patterns of behaviour, or simply to check parameter limits and convey warnings and alarms.
Since computer systems are far happier manipulating discrete values of a parameter, rather than a continuously varying analogue value, it is convenient to measure the value of that parameter at discrete points in time, and pass the time-value pairs to the computer for analysis.
In pursuit of that goal, we will propose a model of a sample-and-hold circuit, and analyse its performance in the time and frequency domains. We will also discuss other applications of sampling, specifically those arising from intentional undersampling.
Relevant answer
Answer
Thanks for this inspiring article. It already sparked some new ideas.
  • asked a question related to Computer Systems
Question
3 answers
The question is related to timing of a distributed system.
Relevant answer
Answer
The drift of each node relative to another node can be substantial. Good practice is to program in a way that no absolute clock is needed. Often a periodic event can be used as a clock tick and then distributed to trigger the processing. Each event (for example, a sensor trigger) then resynchronises the system. If there is a specific need for high timing accuracy, then hardware support is needed, e.g. the common event is applied to all interrupt pins. Note that the software itself (if one uses a COTS OS like Linux) often induces much more deviation. In principle, rate-monotonic scheduling with a shared synchronising event should make the system fairly drift-immune. On modern processors, microsecond accuracy is reachable.
  • asked a question related to Computer Systems
Question
4 answers
Dear all
I am now preparing a report about the relation between behavior recognition and system performance, so I need to determine exactly: "Is behavior recognition important for computer systems?"
Thanks for all
Relevant answer
Answer
thanks a lot
  • asked a question related to Computer Systems
Question
4 answers
Hello,
I was hoping that someone would be able to advise me please on hardware for an image analysis computer system.
My research group have been analysing long time lapse studies in fluorescence microscopy for bacterial growth (approx. 300 images per stack, 1.5 GB files for each of four fluorescent channels).
We currently use ImageJ, MATLAB, and R (but are looking at new software too) to stabilise, correct, track, and process the image stacks; we then hope to be able to extract quantitative data on cell division, fluorescence changes, etc. Our current system has been struggling to cope with the processing, so we are looking into a new system for image analysis. We have received a quote with the following specifications:
Intel Xeon W-2145 CPU (3.7 GHz, 8 cores, 11 MB cache, 2666 MHz memory), Windows 10 Pro 64-bit operating system, 64 GB system memory, and 1 TB internal storage (we have access to a storage server). For a graphics card we are looking at either a 4 or 8 GB NVIDIA Quadro P1000 or a similar GTX 1080-class card.
I was hoping for some advice on whether there is anything in this specification we should be wary of. From our research the system seems great, but any extra opinions would be more than welcome and appreciated.
Thank you for your time,
Kelsey Cremin
Relevant answer
Answer
Hello Kelsey,
It really depends on how optimized the software you use/develop is for parallelized computing.
The (I imagine) very expensive computer you described is very powerful, but it achieves that by having a lot of relatively low-speed CPU cores and a powerful GPU.
If you run simple MATLAB or R scripts, or ImageJ plugins that were not specifically written to support multi-threading or GPU processing, you would probably see equivalent or even better performance with a much cheaper consumer CPU with 4 or 6 cores (for example a Ryzen 2600X or an i7-8700) and a basic GPU just for screen output/3D visualization software.
Here is a benchmark website
On the other side, tons of RAM are always good, especially if you are implementing your own algorithms, so the 64 GB of the indicated PC is a good spec to keep.
As David said, unless you have a super-fast 10 Gigabit Ethernet connection with no bottlenecks, avoid working on files directly from your storage server; make sure you transfer them first to some high-speed local storage, such as an SSD or, even better, a RAID 0 array of multiple SSDs.
  • asked a question related to Computer Systems
Question
4 answers
Many researchers have lately suggested that these professions will become obsolete or will become an integral part of computer systems, whether robotic (hardware) or purely software. What do you think?
Relevant answer
Answer
  • asked a question related to Computer Systems
Question
2 answers
How does one implement a risk management process for computer system validation to satisfy 21 CFR Part 820, 21 CFR Part 11, and EU Annex 11?
Relevant answer
Answer
Thanks Shafagat..
  • asked a question related to Computer Systems
Question
1 answer
Kindly advise how to open ultrasound machine .bck files on a computer system.
Relevant answer
Answer
Check the CD supplied with the ultrasound system; it may contain a specific program to be loaded onto your computer. Ultrasound companies usually use proprietary programs for their own machines, so you have to stick with that company for the rest of your life...
  • asked a question related to Computer Systems
Question
3 answers
There are many new technology implementations modeled to perform human functions.
In the field of conflict resolution, can computer systems (implementing artificial intelligence programs) simulate or sense their environment to the degree that they use environmental inputs (not numeric data, but neuron-specific data that carries emotions) to resolve conflict situations even 50% as well as humans?
Relevant answer
Answer
This article might be helpful:
Kraus, S., Sycara, K., & Evenchik, A. (1998). Reaching agreements through argumentation: a logical model and implementation. Artificial Intelligence, 104(1-2), 1-69.
  • asked a question related to Computer Systems
Question
5 answers
We have been using many traditional machine tools, such as lathes, drilling machines, and shapers, in our machine shops. These machines are standalone, without any integration with a central computer system or with each other. However, if we can integrate them with a computer system that continuously monitors performance/utilization and maintenance-related issues and helps in scheduling them, we will get much of the benefit of advanced machine tools with the least investment. So in what ways can we make these machines a little more intelligent, to help us in a traditional machine shop? Can IoT or similar technology help in this case?
Relevant answer
Answer
I guess you need to include more feedback than traditional CNC control provides, such as (for example, with a milling machine) power/torque at the spindle (to monitor cutting effort, tool wear, etc.) and torque on the XYZ axes so you can monitor cutting resistance, timing, etc. The easiest way to do this would be to use servo motor control, as the resistance can easily be taken from the motor power consumption, but traditionally the movements are all done with stepper motors, which have no such feedback.
Ultimately you could then use machine learning algorithms to calculate the best machining practices, tell you when the tool needed changing, and actually feed back into the CAM system for working out how to cut items.
  • asked a question related to Computer Systems
Question
16 answers
Sci journal
Relevant answer
Answer
Modern Physics Letters A (World Scientific); The Visual Computer (Springer).
  • asked a question related to Computer Systems
Question
4 answers
A human brain has been created in a laboratory, mainly for research and harvesting purposes. There is growing evidence that computing systems are talking to each other. What happens when we are no longer the most intelligent beings on the planet?
Relevant answer
Answer
I don't believe we should fear such an outcome, but I think we (even you) mystify human intelligence, the mechanism of intelligence itself, and we may be surprised how easily it is replicated.
That ease of replication might affect religious ideas, especially those based on humans' special nature and thereby their connection to God. Would that then produce a cosmic (I use the word unwisely) identity crisis? Would we then possibly see our actual position in the universe and the world?
  • asked a question related to Computer Systems
Question
3 answers
Green IT is a fast-growing umbrella within IT that also spans the whole environment. A great deal of hardware with integrated software was designed in the past but consumes a great amount of energy. In what area/aspect can we help? For example, reducing screen brightness to save energy is already in play, as is putting the computer system to sleep when not in use.
Relevant answer
Answer
How about banning cryptocurrency mining?
  • asked a question related to Computer Systems
Question
5 answers
See above
Relevant answer
Answer
What are the major differences between "master-slave" and "peer-to-peer" (e.g., differences in communication cost)?
  • asked a question related to Computer Systems
Question
6 answers
I am currently enrolled in Multimedia Communications in my MS in Computer Systems. I want to hear different perspectives regarding new research areas in multimedia communication.
Relevant answer
Answer
Hi,
I believe that all multimedia communications technologies are going to integrate IoT (Internet of Things), with new applications such as 3D, virtual and augmented reality, and 4D cinemas with sensory effects, special smells, illumination, and vibrating chairs, together with new compression techniques that will permit high-quality video (4K or more). Also on TV, I think we will have total interactivity between broadcasters and viewers, interactive TV shows, etc., all using OTT platforms where you can watch whatever you want at any time.
Regards!
  • asked a question related to Computer Systems
Question
7 answers
I need the most common methods and procedures to screen a phytochemical with anticancer activity in vivo (animal models), in vitro, and in computational systems.
  • asked a question related to Computer Systems
Question
3 answers
I will be very thankful to receive any articles, books, or researchers' names to continue my research. Thanks a lot!
Relevant answer
Answer
Thank you both for your support.
The idea of a Curriculum Sequencing Algorithm is more popular than I expected, I have found, mainly in the medical field. Currently, I am categorising my areas of research in the clearest possible way while at the same time archiving foundational data on the topic. Essentially, I will be researching the educational elements that inspired this research, the content the curriculum could cover, and algorithm designs and applications in an educational context. Again, thank you for your contribution; I will update you next week on the progress made.
  • asked a question related to Computer Systems
Question
4 answers
I am working on energy optimisation using the DVFS technique. For this purpose I have to supply a game-trace workload (i.e., a game workload) instead of PlanetLab's own workload. Currently I am using CloudSim. Can anyone suggest a way to add an external workload to the CloudSim PlanetLab power DVFS example?
  • asked a question related to Computer Systems
Question
4 answers
I have some questions regarding SRAM in IoT:
1) Is SRAM required in IoT?
2) If yes, what should be the size?
3) What about reliability of SRAM?
Relevant answer
Answer
IoT is a buzzword. IoT doesn't need SRAMs. Some chips, when used in an IoT context, might need memory. And yet, SRAM might not even be the best memory technology for the given application. The question, as presented, is too broad. It's all application dependent. 
  • asked a question related to Computer Systems
Question
19 answers
Hello,
I am working with C++ (OpenCV) and I want to compute the runtime of my method to compare it with other methods. I use clock_t tStart = clock(); and printf("Time taken: %.4fs\n", (double)(clock() - tStart)/CLOCKS_PER_SEC);
The problem is that I don't know where to put the clock start: should it go before image reading and image preprocessing, or after them? The same question applies to the clock end.
Thank you
Relevant answer
Answer
In MATLAB: put tic at the start of the processing and toc at the end.
tic
imshow(a)
toc
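Since the question is about C++, here is a minimal sketch (my own illustration; my_method is a hypothetical placeholder) using std::chrono. A common convention when comparing methods is to start the timer after image reading and preprocessing and stop it as soon as the method returns, so that only the work that differs between methods is measured. Note also that clock() measures CPU time, whereas steady_clock below measures wall-clock time, so the two can disagree for multi-threaded code.
// Sketch only: time the method itself, not the image I/O or preprocessing.
#include <chrono>
#include <cstdio>
void my_method() { /* hypothetical placeholder for the algorithm being compared */ }
int main() {
    // ... read and preprocess the image here (not timed) ...
    const auto start = std::chrono::steady_clock::now();  // start just before the method
    my_method();                                           // only the code under comparison
    const auto stop = std::chrono::steady_clock::now();    // stop right after it returns
    const std::chrono::duration<double> elapsed = stop - start;
    std::printf("Time taken: %.4fs\n", elapsed.count());
    return 0;
}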
  • asked a question related to Computer Systems
Question
8 answers
How can one evaluate the security of a computer system? Is there a framework that is currently used as a standard for measuring computer security?
Relevant answer
Answer
Hi Yudi
First off, I would ask what you mean by computer system? I am assuming you refer to a computer system run by a company comprising a combination of many computers, servers, an intranet, various types of software and so on. That being the case, there are a great many security standards available today. Unfortunately, there is no "one size covers all" approach to guarantee the security of your computer system. In any event, even if there were, it would very quickly become out of date. This is because the rate of exploits being developed far exceeds the rate at which they can be prevented. There are a great many reasons for this, which I will attempt to outline for you.
First, since computers first were developed, they have increased in performance at a considerable rate, year on year. Consequently, the quality and range of software has also increased in sophistication, and complexity. While this increase in sophistication is usually welcome for business managers, it brings with it its own problems. The more complex it becomes, the more difficult it becomes to set it up properly, let alone set it up securely.
Business, too, has increased in complexity, and companies these days must comply with legislation, regulation, corporate governance, standards and industry best practice. Financial penalties are on the increase for breaches in compliance, even extending to criminal charges.
Of course, the threat environment has also increased significantly. The source of attack can be generally classified into 5 categories:
State sponsored groups;
Industrial espionage groups;
Hacktivist groups;
Criminal groups;
Individuals.
Of these, state-sponsored groups will have highly skilled operatives, access to the best equipment, and will be very well resourced. Industrial espionage groups will generally be well organised, but less well resourced. Hacktivist groups will generally be highly committed and often poorly financed, but will nonetheless be well skilled. Criminal groups will generally be well resourced and able to afford to hire very skilled people to help them achieve their goals. Individuals will generally be poorly resourced and not very skilled.
You only need to review the annual security breach reports published by various firms to understand how much of a problem this now is. Here is a list of some areas of challenge which need to be properly addressed before a proper level of computer systems security and indeed privacy, can be achieved:
Access controls
Accountability and responsibility
Audit issues
Business continuity
Complexity of systems
Data ownership
Data protection
Encryption
Failure to patch software
Forensic support
Incident analysis
Infrastructure security
Laziness
Management approach to security
Mis-configuration of software
Multi-tenancy
Non-production environment
Physical security
Privacy
Processes
Proper definition of security goals
Regulatory compliance
Resilience
Security culture in the company
Security policies
Security procedures
Social engineering attacks
Staff security training
Standards compliance
Technical complexity of cloud
The threat environment
User identity
I have also included a few lists of useful papers for you to read on security and privacy. Many of these relate to cloud computing, as cloud security and privacy is my special area of interest, but generally, since cloud is more difficult to secure, much of what they talk about will be relevant for any setting.
If you want to be serious about achieving proper security, then you will need to be properly armed with a vast knowledge of the issues you face. And even that will not be enough. Constant vigilance will be required to keep pace with all the latest vulnerabilities. And coming up with a technical solution alone will not solve your problems, as the business architecture of a company comprises a combination of people, process and technology, so all three areas need to be properly addressed.
Happy reading.
Regards
Bob
  • asked a question related to Computer Systems
Question
4 answers
I need a concise, clear, and focused solution for it.
Relevant answer
Answer
This requires two things: first, your air-gapped system is already infected, and second, an infected cellphone is nearby. For the former, this would be incredibly difficult on any of the air-gapped systems I've used, as any software brought into the building must be thoroughly scanned and checked before getting near the system. As for the latter, every place with air-gapped systems I've worked at has mandated that all cell phones and electronic devices be kept outside the RF-shielded office.
  • asked a question related to Computer Systems
Question
7 answers
I am working on a heterogeneous computing system for cognitive radio.
Relevant answer
Answer
Not yet, ma'am.
  • asked a question related to Computer Systems
Question
3 answers
What are the problems that are being caused by computer systems?
Relevant answer
Answer
From a security perspective, hash functions alone are not enough to protect integrity; you need either message authentication codes or digital signatures.
  • asked a question related to Computer Systems
Question
6 answers
I want to integrate a subsystem into a system of systems without influencing the normal functioning of the global system. Does anyone have any ideas about this subject?
Relevant answer
Answer
This paper describes a set of fundamental principles for achieving "systems engineering and integration". The principles help systems engineering and/or concurrent teams, first, to define how to decompose the tasks and then how to arrange these decomposed tasks so that the "best concurrency and simultaneity" can be achieved while doing systems engineering.
Take a look
  • asked a question related to Computer Systems
Question
7 answers
Considering:
  • test problems
  • quality indicators used in the evaluation
Relevant answer
Simulated annealing is good, but the cooling schedule needs to be properly monitored because it determines the quality of the final output. Since it is problem-dependent, most of the time it must be properly studied for accuracy.
  • asked a question related to Computer Systems
Question
6 answers
There are many trust issues in cloud computing, such as security and risk. Here I need more factors for achieving trust in a cloud computing system.
Relevant answer
Answer
Continuity Management
Disaster Management
Privacy and Security 
SLA
  • asked a question related to Computer Systems
Question
7 answers
I'm trying to configure 6LBR as the router of an 802.15.4 network.
I have a Raspberry Pi on which Raspbian is installed.
My question is whether it is possible to use a slip-radio (serial socket) Cooja mote instead of a real mote connected to the Raspberry Pi.
Thank you in advance,
Niousha
Relevant answer
Answer
@Farouq Muhammad Aliyu
No, I meant: instead of plugging in a real mote, is it possible to use a Cooja mote?
I've connected the Raspberry Pi to my laptop and have the Cooja simulator in a virtual machine on the laptop.
  • asked a question related to Computer Systems
Question
4 answers
In a cloud computing system such as Infrastructure as a Service (IaaS), why would we have to change the SLA, which is a contract between the cloud provider and the cloud user, during the life cycle, over time, or dynamically?
Relevant answer
Answer
Hi Mohammad,
We need to dissect, on a case-by-case basis, the rationale for cloud users and cloud providers to change their SLA.
Possible Reasons Why Cloud Users Need to Change SLA:
1) The Current SLA is Not Meeting Cloud User's Business / Go to Market Requirements - e.g. reduced down time & latency, higher availability & performance, faster required speed of IaaS VMs provisioning / de-provisioning etc.
2) The Current SLA is Not Meeting Improving Government / Industrial e.g. Telco, Financial, Healthcare or Internal Audit Regulatory Compliance Requirements.
3) Cloud Users' Organizations are Going through Certain Mergers & Acquisitions that Need to Re-streamlined the SLA etc.
Possible Reasons Why Cloud Providers Need to Change SLA:
1) Cloud Provider is Learning from Cloud User Feedback - some cloud providers when they rolled out their cloud service initially, they don't have any specific SLA in mind but rather they only setup some SLO (Service Level Objective) which is not contract binding / incurring any penalty payment to solicit feedback from cloud users.  After some months, when the cloud providers have some ideas on what SLA they want to offer, they will formally announce the binding SLA to all cloud users.
2) When Cloud Provider Discontinues Old Cloud Services and Offer New Cloud Services, there will be a change / improved SLA
3) Some Cloud Providers are Transitioning / Upgrading from Old to New Cloud Service Offerings in which They are Not Sure How Reliable the New Cloud Services.  Hence they put the new cloud service offerings under few months of monitoring to see how good their performance / SLA will be.  After the new cloud service offerings are stable / improved, the cloud providers will change to new binding SLA to ensure they can meet such new SLA.
4) Changing to Improved SLA due to competition pressures as mentioned by Auday above.
Note: the above are just some examples; they are not an exhaustive list. Wishing you all the best.
Regards,
Fung
  • asked a question related to Computer Systems
Question
3 answers
I am designing the architecture for an Infrastructure-as-a-Service cloud computing system for sharing resources such as CPU, RAM, and disk storage.
Should I use a client/server architecture, a multilayer architecture, another architecture, or a combination of them?
  • asked a question related to Computer Systems
Question
3 answers
I need the types of user activity context in a pervasive computing system.
Relevant answer
Answer
You may find the following papers useful.
1) Prekop, Paul, and Mark Burnett. "Activities, context and ubiquitous computing." Computer Communications 26.11 (2003): 1168-1176.
2) Henricksen, Karen, Jadwiga Indulska, and Andry Rakotonirainy. "Modeling context information in pervasive computing systems." Pervasive Computing. Springer Berlin Heidelberg, 2002. 167-180.
3) Manzoor, Atif, Hong-Linh Truong, and Schahram Dustdar. "Quality of context: models and applications for context-aware systems in pervasive environments." The Knowledge Engineering Review 29.02 (2014): 154-170.
  • asked a question related to Computer Systems
Question
2 answers
Relevant answer
Answer
The terms “dependability”‎ and “security”‎ have been used interchangeably to describe the properties of secure and trusted software.
However, the extant literature shows that dependability attributes are considered as the cure for security threats, abnormal behavior and untrustworthy issues in a software system.
A system is considered dependable when it can be depended on to produce the consequences for which it was designed, with no adverse effect in its intended environment. Dependability comprises several attributes that imply availability, confidentiality, integrity, reliability, safety, and maintainability.
Methods and tools to attain the dependability attributes have been discussed in details in my previous publications.
I hope this can help you. Good luck.
  • asked a question related to Computer Systems
Question
4 answers
What are the main differences between them?
Relevant answer
Answer
From a mathematical point of view, resource allocation is included in resource management.
  • asked a question related to Computer Systems
Question
1 answer
I need an answer on how to implement scheduling algorithms like HEFT in a heterogeneous computing system. Can you explain how I would implement this as a project? Please give a brief outline.
Relevant answer
Answer
I think a dynamic schedule will solve this problem. If you run the tests on a shared-memory system, you could use OpenMP for that.
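If the goal really is HEFT (Topcuoglu et al.), the usual project structure is: (1) represent the application as a DAG with per-processor execution costs and inter-task communication costs, (2) compute each task's upward rank, and (3) schedule tasks in decreasing rank order onto the processor giving the earliest finish time. Below is a simplified, non-insertion-based sketch of that flow (my own illustration with hypothetical numbers, not the full published algorithm):
// Simplified HEFT sketch: cost[t][p] is the execution time of task t on processor p,
// comm[t][u] the transfer time from t to u when they run on different processors.
#include <algorithm>
#include <cstdio>
#include <vector>
int main() {
    const int T = 4, P = 2;                        // 4 tasks, 2 heterogeneous processors
    // Hypothetical DAG: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3 (indices already topological).
    std::vector<std::vector<int>> succ = {{1, 2}, {3}, {3}, {}};
    double cost[T][P] = {{14, 16}, {13, 19}, {11, 13}, {13, 8}};
    double comm[T][T] = {};
    comm[0][1] = 18; comm[0][2] = 12; comm[1][3] = 19; comm[2][3] = 27;
    // 1) Upward ranks, computed in reverse topological order.
    std::vector<double> rank(T, 0.0);
    for (int t = T - 1; t >= 0; --t) {
        double wavg = 0.0;
        for (int p = 0; p < P; ++p) wavg += cost[t][p];
        wavg /= P;
        double best = 0.0;
        for (int s : succ[t]) best = std::max(best, comm[t][s] + rank[s]);
        rank[t] = wavg + best;
    }
    // 2) Order tasks by decreasing rank (always a valid topological order).
    std::vector<int> order(T);
    for (int t = 0; t < T; ++t) order[t] = t;
    std::sort(order.begin(), order.end(), [&](int a, int b) { return rank[a] > rank[b]; });
    // 3) Greedy earliest-finish-time assignment (no insertion-based backfilling here).
    std::vector<double> avail(P, 0.0), finish(T, 0.0);
    std::vector<int> where(T, -1);
    for (int t : order) {
        double bestEFT = 1e30;
        int bestP = 0;
        for (int p = 0; p < P; ++p) {
            double est = avail[p];
            for (int m = 0; m < T; ++m)            // every predecessor's data must have arrived
                if (std::count(succ[m].begin(), succ[m].end(), t))
                    est = std::max(est, finish[m] + (where[m] == p ? 0.0 : comm[m][t]));
            const double eft = est + cost[t][p];
            if (eft < bestEFT) { bestEFT = eft; bestP = p; }
        }
        where[t] = bestP;
        finish[t] = bestEFT;
        avail[bestP] = bestEFT;
    }
    for (int t = 0; t < T; ++t)
        std::printf("task %d -> processor %d, finishes at %.1f\n", t, where[t], finish[t]);
    return 0;
}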
  • asked a question related to Computer Systems
Question
5 answers
Cloud computing seems to be gaining more ground than grid computing. Given that the two computing platforms overlap intricately, one may want to know whether the emergence of cloud computing has signalled the end of grid computing.
Relevant answer
Answer
I would say that there are situations and use cases where either cloud computing or grid computing is optimal. I have seen environments where both are used, each where it is the best fit. For example, if you have lots of batch-oriented, compute-intensive jobs, it may not make sense to use cloud computing because of virtualization overhead.
  • asked a question related to Computer Systems
Question
4 answers
This approach clearly points in the right direction, yet it is based on computer systems as they are. It is still necessary to distinguish between (1) the sources of complexity (business objects and logic, system functionalities, platform technologies) and (2) the dynamics with environments.
Relevant answer
Answer
Remy:
Measuring the intrinsic complexity of a problem is an important objective.
Limiting the scope of complexity reduction to this area only may not be the best approach.
However, complexity metrics with a limited scope are better than no complexity metrics at all.
Have fun,
  • asked a question related to Computer Systems
Question
11 answers
I'm looking to use game theory to improve the quality of cloud computing systems.
I want to apply game theory to address the non-functional requirements issue, but is game theory the right tool for this or not?
How could I determine whether there is any improvement?
How could I measure the improvement?
Then could I improve all of the 16 non-functional requirements (performance, security, safety, etc)?
Relevant answer
Answer
Game theory as a branch of mathematics helps to address strategies for dealing with situations that are competitive. In such situations, the outcome of a participant's choice of action depends on the action of other participants. Game theory has a broad application.
Cloud computing, on the other hand, deals with the use of a network of remote servers hosted on the internet to store, aggregate, and manage data. In this process there is often competition: people compete for resources. Since the internet can be likened to a graph data structure consisting of nodes and edges, game theory, as a branch of mathematics, can be applied as an important tool for solving problems associated with cloud computing.
There are numerous scientific research papers that you can get from reputable international journals.
Regards
Olugbenga
  • asked a question related to Computer Systems
Question
2 answers
I am working in the area of electronic waste. The e-waste was ground to fine sizes. The cumulative mass distribution across the sizes is represented by the Rosin-Rammler model. Similarly, the differential metal-percent distribution across the sizes is represented by a Gaussian distribution model. I have fitting equations for both. Now I want to obtain a model that represents the cumulative distribution of metal percent by combining the previous two models.
I found a way to do so, which is shown in the attached image file.
I hope the method followed is correct. Can anyone please suggest scientific references for similar work?
Relevant answer
Answer
Rahul,
Could you post a dataset of 20 to 50 random values for your two main variables so they can be worked with? What does the term "mass" mean? Which distribution of the two mentioned variables do you want to study?
In general I believe that the assumption of normality is not a good road to take. What matters more is the data you have actually measured. OK, Emilio
  • asked a question related to Computer Systems
Question
2 answers
Mostly we perform several tasks on our systems, and each task may require a separate window for execution on the desktop. A lot of time is consumed switching and resizing windows, and we face many problems viewing all windows at the same time.
VSP will allow the main screen to be split into two or more sections. It will help the active program's window to be resized properly to one part of the screen. Using this utility, the user will be able to split the system desktop into two or more areas so that the active application does not cover the full physical screen when maximized. This way the user can view parallel windows in defined areas, as if the user had a separate (virtual) monitor for working, also known as the "multi-monitor features of Windows".
It allows the user to position and size windows into sections/areas of monitors and helps the user efficiently manage many active windows in parallel.
The following image depicts a virtual split of the desktop screen into two sections/areas:
Application Type: Desktop
Relevant answer
Answer
Badri - personally, I'm a fan of multi-desktop window managers. The Unix/Linux community has had this for years. I have one desktop for email and web browsing, a second one for software development, a third one to monitor systems that I am responsible for, and so on. The more memory a system has, the more virtual desktops one can have.
In the MS Windows world, the choices have been limited (at best). A quick google search finds that Microsoft itself has such a product and there is also http://www.goscreen.info/ - which I have no knowledge of other than it showed up in the search.
I'm not sure whether your description offers something that would be easier to use or better in implementation than virtual desktops. Can you explain in more detail what you are proposing and how it provides more benefit than virtual desktops?