Science topic
ICT - Science topic
Explore the latest questions and answers in ICT, and find ICT experts.
Questions related to ICT
Uses of ICT in Education, Business and Agriculture
I am an active researcher in Language Education (English), with a focus on technological integration in education, SDGs, ICT, AI, Gen Z, teacher education, 4IR (Fourth Industrial Revolution), digital literacy, information literacy, research productivity, gender, Webometrics rankings, pre-service teachers, and MOOCs, among other topics.
I am open to INTERDISCIPLINARY and MULTIDISCIPLINARY collaboration with national and international researchers.
You can reach me via email
Or WhatsApp +2347035044420
Or LinkedIn. http://bit.ly/4iV0yUy
In recent years, digital technologies have significantly transformed the way English as a Second Language (ESL) is taught. Tools such as adaptive learning platforms, AI-driven tutoring systems, and mobile applications have enabled more personalized learning experiences that cater to individual student needs.
However, several questions remain regarding the effective integration of these technologies:
- What specific digital tools have proven most effective in addressing diverse learner profiles in ESL?
- How do we balance the role of the teacher with the use of these tools to ensure meaningful human interaction?
- What challenges arise in terms of accessibility, especially for students in under-resourced environments?
I invite fellow educators, linguists, and researchers to share their experiences, insights, and any evidence-based findings on the impact of these technologies in personalizing language instruction. How do you foresee the future of ESL teaching evolving with these innovations?
Looking forward to a stimulating discussion!
Include "Title page and Sections: Introduction, Literature Review, Methodology, Results, Discussion, Conclusion, References" etc.
I am just wondering: how can we use agency theory to explain the advantages that ICT can bring to business processes in enterprises, and how these can benefit the enterprise economically in terms of cost savings and competitiveness?
How do ICT interactions influence science achievement?
You are invited to jointly develop a SWOT analysis for generative artificial intelligence technology: What are the strengths and weaknesses of the development of AI technology so far? What are the opportunities and threats to the development of artificial intelligence technology and its applications in the future?
A SWOT analysis details the strengths and weaknesses of the past and present performance of an entity, institution, process, problem or issue, as well as the opportunities and threats relating to its future performance over the coming months, quarters or, most often, the next several years. Artificial intelligence technology has been known conceptually for more than half a century, but its dynamic technological development has occurred especially in recent years. Currently, many researchers and scientists contribute to publications and to debates held at scientific symposiums, conferences and other events on the various social, ethical, business, economic and other aspects of the development of artificial intelligence technology and its applications across sectors of the economy and across fields of potential application in companies, enterprises, and financial and public institutions. Many of the currently considered determinants of impact and risk associated with the development of generative artificial intelligence technology may be heterogeneous, ambiguous and multifaceted, depending on the context of the technology's potential applications and on the operation of other factors. For example, the impact of the technology's development on future labor markets is not a homogeneous and unambiguous problem. On the one hand, the more critical assessments of this impact point mainly to the potentially large-scale loss of employment for people in various jobs, in a situation where it turns out to be cheaper and more convenient for businesses to employ highly sophisticated robots equipped with generative artificial intelligence instead of humans.
On the other hand, some experts analyzing the ongoing impact of AI applications on labor markets offer more optimistic visions of the future. They point out that over the next few years artificial intelligence will not, for the most part, deprive people of work; rather, work will change. AI will support employed workers in carrying out their work effectively and will significantly increase the productivity of people who use specific generative artificial intelligence solutions at work. Labor markets will also change in other ways, i.e. through the emergence of new types of professions and occupations arising from the development of AI applications. In this way, the development of AI applications may generate both opportunities and threats in the future, even in the same field of application, the same area of development of a company or enterprise, or the same economic sector. Arguably, dual scenarios of this kind for the potential development of AI technology and its applications, scenarios made up of positive and negative aspects, can be constructed for many other factors influencing this development and for many other fields of application. For example, the application of artificial intelligence in new online media, including social media sites, is already generating both positive and negative effects. The positive aspects include the use of AI in online marketing carried out on social media, among other things. The negative aspects of Internet applications using AI solutions include the generation of fake news and disinformation by untrustworthy, unethical Internet users. Further examples are the use of AI to control an autonomous vehicle or to develop the formula for a new drug against particularly life-threatening human diseases.
On the one hand, such technology can be of great help to humans; but what happens when mistakes are made that result in a life-threatening car accident, or when particularly dangerous side effects of a new drug emerge after some time? Will the payment of compensation by an insurance company solve the problem? To whom will responsibility be shifted for such possible errors and their particularly negative effects, which we cannot at present completely exclude? What other examples can you give of applications of artificial intelligence with ambiguous consequences? What are the opportunities and risks of past applications of generative artificial intelligence technology, versus the opportunities and risks of its potential future applications? These considerations can be extended if the SWOT analysis takes into account not only generative artificial intelligence, its past and prospective development and its growing number of applications, but also so-called general artificial intelligence, which may arise in the future. General artificial intelligence, if built by technology companies, will be capable of self-improvement and, with its capacity for intelligent, multi-criteria, autonomous processing of large sets of data and information, will in many respects surpass the intellectual capacity of humans.
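For contributors who want to structure their answers, the dual-scenario framing above can be collected into a simple four-quadrant SWOT template. A minimal sketch in Python; the entries are illustrative examples drawn from the discussion, not research findings:

```python
# Minimal SWOT template for generative AI. The entries below are
# illustrative examples taken from the discussion, not empirical findings.
swot = {
    "strengths": [
        "rapid productivity gains in data-intensive work",
    ],
    "weaknesses": [
        "tendency of current models to generate false or fabricated content",
    ],
    "opportunities": [
        "new professions and occupations emerging around AI applications",
    ],
    "threats": [
        "large-scale job displacement",
        "fake news and disinformation generated by unethical users",
    ],
}

def print_swot(analysis):
    """Print each SWOT quadrant followed by its entries."""
    for quadrant, items in analysis.items():
        print(quadrant.upper())
        for item in items:
            print(f"  - {item}")

print_swot(swot)
```

Contributors could extend each list, which would make the jointly developed analysis easy to merge and compare.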
The key issues of opportunities and threats to the development of artificial intelligence technology are described in my article below:
OPPORTUNITIES AND THREATS TO THE DEVELOPMENT OF ARTIFICIAL INTELLIGENCE APPLICATIONS AND THE NEED FOR NORMATIVE REGULATION OF THIS DEVELOPMENT
In view of the above, I address the following question to the esteemed community of scientists and researchers:
I invite you to jointly develop a SWOT analysis for generative artificial intelligence technology: What are the strengths and weaknesses of the development of AI technology to date? What are the opportunities and threats to the development of AI technology and its applications in the future?
What are the strengths, weaknesses, opportunities and threats to the development of artificial intelligence technology and its applications?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
Which education theory can I use for the theoretical framework of the above ICT-related topic?
"Today several adjectival phrases have been used to describe English like ‘International Language’, ‘Lingua-Franca’, ‘Language for Globally Connecting’, ‘Library Language’, ‘Official Language’, ‘Administrative Language’, ‘Queen of Languages’, ‘Employment Passport’ and ‘the most Preferred Language’ etc." (Jabir, M. 2019)
Reference:
Jabir, M. (2019). The Use of ICT in Teaching English: A Study of ELT in the Secondary Schools of Kargil District. M.Phil Dissertation, Jaipur National University, p. 5.
It is more of a research project at my school, so I would like your contributions on everything from the background of the study to the literature review.
How can the growing gap in energy production be filled in a situation where combustion energy dominates, RES are little developed and nuclear energy is still undeveloped?
How can the growing energy production gap be filled in a situation where expensive energy sources based on the combustion of fossil fuels still prevail, the price of energy produced from RES is steadily falling, and a chaotic, short-sighted energy policy either does not provide for the construction of nuclear power plants or plans to build the first ones only two decades from now?
Due to economic development, including the development of energy-intensive industries and services, the demand for electricity is gradually increasing.
In addition, the development of electromobility is becoming an important factor in the growth of electricity demand. As the economy becomes a knowledge-based, information economy, one in which the scale of implementation of new ICT and Industry 4.0/5.0 technologies is growing rapidly, including the development of data centers using Big Data Analytics, artificial intelligence, cloud computing, Blockchain and so on, the demand for energy also grows rapidly. Another factor that is already increasing, and will continue to increase, the demand for electricity is ongoing global warming, which results in greater use of cooling equipment. At the same time, the pace of development of the energy sector, above all of energy that meets climate policy guidelines based on renewable and emission-free sources, is not sufficient.
As a result, the energy deficit gap is growing every year and will unfortunately continue to grow in the coming years unless appropriate reforms are undertaken and the green transformation of the energy sector is accelerated. The issue is particularly important in countries where energy sources such as nuclear power are underdeveloped or not developed at all. Nuclear power is a type of energy source that can act as an intermediate stage in the green transformation of the economy, in which conventional energy sources based on the combustion of fossil fuels are replaced with fully emission-free, climate- and environment-friendly sources. In addition, countries where geographical, natural or geological conditions make it difficult to develop certain types of renewable energy, for example where limited terrain diversity and few rivers constrain the construction of hydroelectric power plants, face a difficult situation in implementing the green transformation of the economy. A further significant factor that does not help reduce the growing energy deficit gap is unreliable, short-sighted, haphazard, non-strategic energy, climate and environmental policy, which in some cases even limits or blocks the development of certain renewable and carbon-free energy sources. An example is the blocking of onshore wind energy development in Poland in 2016 through the introduction of the 10H rule, which resulted in a strong increase in coal imports and a significant slowdown in the green energy transition.
The result is that Poland's energy production is still dominated by conventional power generation based on the combustion of fossil fuels, mainly coal and lignite, which accounts for more than 70 percent of the country's total energy production. Paradoxically, even the relatively small share of RES generation can, under favorable natural and climatic conditions, provide more energy than usual, much of which is wasted because it is not accepted by the dominant power industry, including government-controlled energy companies operating as state-owned enterprises. The argument these large power companies give for this anachronistic, irrational situation is the years-long lack of investment in the development of electricity transmission networks. Paradoxically, over the past three decades most of the funds from the state's public finance system have been allocated to subsidizing unprofitable coal and lignite mines and to maintaining the power plants in which that coal is burned.
Because the development of renewable and emission-free energy sources has in the past been limited and even blocked, absurd situations are now increasingly common. During sunny and windy weather, the photovoltaic panels and household wind turbines installed by prosumer citizens on the roofs of their homes produce an above-average amount of electricity, but not all of the energy generated is used by energy companies, owing to inadequately developed transmission network infrastructure and the lack of energy storage facilities and batteries. A significant part of the energy generated from RES therefore goes to waste, while in other months, when growing demand causes periodic energy shortages, energy has to be bought from other countries. The paradox and economic irrationality of this situation also lie in the fact that energy prices are steadily rising while the cheapest sources of energy, wind and solar power, are being developed too slowly.
As a result, energy policy, as well as climate and environmental policy, in the country where I operate is being conducted chaotically, non-strategically and short-sightedly. The guidelines of the European Union's Green Deal are largely ignored, despite the available EU financial subsidies that should be allocated to the green transformation of the energy sector. In addition, subsidies for combustion-based power generation, mainly from coal and lignite, continue to dominate, which translates into high energy prices, poor air quality and postponement of the plan to build a sustainable, green, zero-emission closed-loop economy, an essential element of which is a zero-emission power industry based on RES.
In addition, there is almost no research, analysis or implementation work on new, innovative energy technologies such as those based on hydrogen power, cold fusion and the like.
I am conducting research in this area. I have included the conclusions of my research in the following article:
IMPLEMENTATION OF THE PRINCIPLES OF SUSTAINABLE ECONOMY DEVELOPMENT AS A KEY ELEMENT OF THE PRO-ECOLOGICAL TRANSFORMATION OF THE ECONOMY TOWARDS GREEN ECONOMY AND CIRCULAR ECONOMY
The key issues of the problematic sources of Poland's exceptionally deep energy crisis in 2022 are described in my co-authored article below:
POLAND'S 2022 ENERGY CRISIS AS A RESULT OF THE WAR IN UKRAINE AND YEARS OF NEGLECT TO CARRY OUT A GREEN TRANSFORMATION OF THE ENERGY SECTOR
I invite you to study the problems described in the above-mentioned publications and to cooperate scientifically on this issue.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How can the growing gap in electricity production be filled in a situation where expensive energy sources based on burning fossil fuels still prevail, the price of energy generated from RES is steadily falling, and a chaotic, short-sighted energy policy either does not include the construction of nuclear power plants or plans to build the first ones only two decades from now?
How can the growing gap in energy production be filled in a situation where combustion power dominates, RES are little developed and nuclear power is still undeveloped?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text, I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
How can an effective system for activating the innovativeness of business entities, financed with subsidies from the state's public finance system, be built?
What are the most effective measures for activating the innovativeness of economic entities, so that the scale of investment in technological branches of the economy increases and economic development accelerates?
An important factor determining the level of economic development in highly developed countries, one that also translates into citizens' incomes and their general economic and material well-being, is the level of innovativeness of companies and enterprises. The innovativeness of economic entities translates into labor productivity, the efficiency of manufacturing processes and the profitability of the economic processes implemented by companies and enterprises. It can be realized in various spheres and areas of the economic activity of companies, enterprises, and financial and public institutions. Business entities can create and develop technological, product, service, logistics, process, organizational, marketing and other innovations. In connection with rapid technological progress and the ICT and Industry 4.0/5.0 technologies being developed, technological innovations are particularly important for improving the efficiency of manufacturing processes, increasing labor productivity and improving the profitability of production. In recent years, the scale of implementation in business entities of Industry 4.0/5.0 technologies such as Big Data, cloud computing, the Internet of Things, blockchain, digital twins, cybersecurity instruments, machine learning, deep learning and artificial intelligence has been growing. Among the important factors in implementing new technologies in companies, enterprises, and financial and public institutions are favorable conditions for the external investment financing of the creation or purchase, implementation and development of new technologies.
External financing includes, among other instruments, investment loans offered by commercial banks, equity financing offered by investment funds, national financial subsidies and subsidies under certain European Union programs, financing raised through the issuance of securities, i.e. shares and corporate bonds, and loan funds. The innovativeness of business entities is often correlated with their level of entrepreneurship, so government programs for activating innovation are often programs that also activate entrepreneurship. In recent years, instruments for activating innovation and entrepreneurship have included changes to the tax system as part of fiscal policy, as well as increased outlays from the state finance system on funds providing systemic support for innovative business ventures initiated by startups and companies operating in the new-technology sector. By contrast, the soft monetary policy applied before and during the Covid-19 pandemic did not have significant effects on innovation and entrepreneurship. The interest rates lowered by central banks to successive historic lows did not produce a significant increase in the scale of innovation and entrepreneurship among economic agents. Lenient monetary policy improved liquidity in the financial sector, including the interbank market, activated commercial banks' lending, and improved conditions in capital markets, including by reducing the scale of the downturn in securities markets. Indirectly, therefore, an easing of monetary policy can be a factor activating innovation and entrepreneurship, if a significant portion of business loans, including investment loans, is taken out by business entities to finance ventures involving the development or purchase and implementation of new technologies.
In the context of activating innovation and entrepreneurship, it may also be important to improve, and adapt to the specific socio-economic situation, the interventionist, anti-crisis and pro-development instruments and programs implemented under specific socio-economic policies, above all budgetary, fiscal and sectoral policies, including sectoral policies supporting the development of innovation.
I am conducting research on this issue. I have included the conclusions of my research in the following article:
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the most effective measures for activating the innovativeness of business entities, so that the scale of investment in technological branches of the economy increases and economic development accelerates?
How can an effective system for activating the innovativeness of business entities, financed with subsidies from the state's public finance system, be built?
How can the innovativeness of business entities be effectively activated using financial subsidies from the state's public finance system?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text, I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
The areas of my interest are education, pedagogy, artificial intelligence, research productivity, digital literacy, Gen Z, gender and research productivity, ICT, e-learning, language teaching and learning, information literacy, and literacy, among others.
A multidisciplinary approach is welcome.
You can reach me at bolanle.apata@aaua.edu.ng, stellybola@gmail.com
+2347035044420
Most farmers in developing nations are self-taught and have limited knowledge of ICT and precision agriculture. Will this be an obstacle to adopting an IoT-monitored Alternate Wetting and Drying (AWD) rice irrigation system? Can anyone share practical information on this topic, with references?
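For readers unfamiliar with how little logic an IoT AWD node actually needs, here is a minimal sketch of the controller decision in Python. The thresholds follow the commonly cited "safe AWD" practice (irrigate when the water level falls about 15 cm below the soil surface, refill to a few centimetres of standing water), but treat them as assumptions to be calibrated locally:

```python
# Minimal sketch of the decision logic in an IoT-monitored AWD controller.
# Threshold values are assumptions based on the widely used "safe AWD"
# practice; they must be calibrated for local soil and variety conditions.

IRRIGATE_BELOW_CM = -15.0   # water level (cm relative to soil surface)
STOP_ABOVE_CM = 5.0

def awd_decision(water_level_cm, pump_on):
    """Return True if the pump should run, given the sensed water level."""
    if water_level_cm <= IRRIGATE_BELOW_CM:
        return True               # field too dry: start irrigation
    if water_level_cm >= STOP_ABOVE_CM:
        return False              # enough standing water: stop irrigation
    return pump_on                # in between: keep current state (hysteresis)

# Example: readings from a perched-water-tube sensor over one drying cycle
readings = [4.0, -3.0, -16.0, -10.0, 2.0, 6.0]
pump = False
for level in readings:
    pump = awd_decision(level, pump)
    print(f"level={level:+.1f} cm -> pump {'ON' if pump else 'OFF'}")
```

Because the logic is this simple, the adoption barrier is less the technology itself than sensor maintenance, connectivity and farmers' trust in the readings.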
I am writing to let you know that Journal of E-Learning And Knowledge Society (SCOPUS Q2, SJR Q3) is currently welcoming submissions of original research to "VOLUME 20 | SPECIAL ISSUE CALL 2024" with title “Digital Transformation in Educational Research: Competencies, Resources and Challenges in the Context of ICT”. As the Guest Editor of this CfP, I hope you will consider this as an outlet for a future research paper.
As a result of these reflections, different questions arise.
• Are teachers digitally trained in research skills?
• What skills do teachers have to use digital resources developed for the research context?
• What factors influence the digital competencies of teachers in their research work?
• How do AI tools impact the digital skills of teachers in research work?
For this reason, we welcome studies that cover topics related to:
• digital skills in research work;
• analysis of the use of digital resources and AI tools for the research process;
• factors affecting the digital competencies associated with the research process;
• infrastructure of university institutions on digital resources for the research process.
I am basically looking for a specific research topic to pursue within this vague heading.
Hello everyone! I would like to know your takes on this topic.
"I am unsure which statistical tool is more appropriate for my PhD study - PLS or AMOS. I am using the DTPB model, along with cultural moderating variables, to investigate teachers' intention to incorporate ICT in their teaching practice. My research question aims to understand the factors that influence the adoption of ICT by teachers in the classroom."
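Whichever package is chosen (PLS-SEM or covariance-based SEM in AMOS), a cultural moderating variable ultimately amounts to testing an interaction effect. A minimal sketch of that core test in Python with statsmodels, using synthetic data; the variable names are placeholders, not actual DTPB constructs:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300

# Synthetic illustration only: 'attitude' predicts 'intention', and the
# slope differs by a binary cultural moderator. Names are placeholders,
# not items from the DTPB instrument.
attitude = rng.normal(size=n)
culture = rng.integers(0, 2, size=n)          # 0/1 cultural group
intention = (0.4 * attitude
             + 0.5 * culture * attitude       # true moderation effect
             + rng.normal(scale=0.5, size=n))
df = pd.DataFrame({"attitude": attitude, "culture": culture,
                   "intention": intention})

# 'attitude:culture' is the moderation (interaction) term
model = smf.ols("intention ~ attitude * culture", data=df).fit()
print(model.summary().tables[1])
```

In a full SEM the constructs would be latent rather than observed scores, but a significant interaction coefficient is what a moderation hypothesis looks like in either tool.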
Hi There!
I am doing a thesis for an ISV (independent software vendor) company in the Netherlands that wants to broaden its horizons, set foot in Finland and appeal to MSPs (managed service providers) there. My goal is to develop a marketing communication plan. Does anyone have experience with or knowledge of this? Any help is appreciated!
The Americas Conference on Information Systems (AMCIS) 2024,
Salt Lake City, August 15-17, 2024
Track: ICT for Global Development
Mini-track: Digital technology and built environmental sustainability
Chaired by
Ewa Duda, Maria Grzegorzewska University, Warsaw, Poland, eduda@aps.edu.pl
Hanna Obracht-Prondzyńska, University of Gdańsk, Poland, hanna.obracht-prondzynska@ug.edu.pl
March 1, 2024: Submissions are due at 10 a.m. ET (New York)
I am searching for all the possible ways to measure this gap and validate it statistically.
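If the gap can be operationalized as a difference in mean scores between two groups, one simple statistical validation is an independent-samples t-test together with an effect size. A minimal sketch in Python; the group labels and synthetic scores are placeholders to be replaced with real measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Placeholder scores for two groups (e.g. with vs. without access);
# replace with the real measured data.
group_a = rng.normal(loc=70, scale=10, size=80)
group_b = rng.normal(loc=62, scale=10, size=80)

# Welch's t-test: is the mean gap significantly different from zero?
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)

# Cohen's d as an effect size for the magnitude of the gap
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_a.mean() - group_b.mean()) / pooled_sd
print(f"t={t_stat:.2f}, p={p_value:.4f}, d={cohens_d:.2f}")
```

For ordinal or non-normal data a Mann-Whitney U test would be the analogous choice, and a confidence interval around the mean difference quantifies the gap rather than merely testing it.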
When ICT is integrated into teaching and learning, learners are provided with opportunities to participate actively in their learning experience.
Can someone point me to a questionnaire tool for measuring the impact of ICT on the teaching competency of teacher educators?
How can the application of generative artificial intelligence improve existing applications of Big Data Analytics and increase the scale of application of these technologies in analysing large data sets, generating multi-criteria simulation models, and carrying out predictive analyses and projections?
The acceleration of the digitization of the economy triggered by the Covid-19 pandemic has resulted in a significant increase in computerization, Internetization and the application of ICT and Industry 4.0 technologies to various economic processes. Applications of specific Industry 4.0 technologies are growing in many industries and sectors of the economy: Big Data Analytics, Data Science, cloud computing, machine learning, the personal and industrial Internet of Things, artificial intelligence, Business Intelligence, autonomous robots, horizontal and vertical data system integration, multi-criteria simulation models, digital twins, additive manufacturing, Blockchain, cybersecurity instruments, Virtual and Augmented Reality, and other advanced Data Mining technologies. In my opinion, applications of ICT and Industry 4.0/Industry 5.0 technologies are increasing in fields such as medical therapies, communications, logistics, new online media, life science, ecology, economics and finance, as well as in predictive analytics. Artificial intelligence technologies are growing rapidly as they find applications in various industries and sectors of the economy. It is up to human beings how, and in what capacity, artificial intelligence will be implemented in the various manufacturing and analytical processes in which large data sets are processed most efficiently. In addition, various opportunities are opening up for applying artificial intelligence in conjunction with the other technologies of the current fourth industrial revolution, referred to as Industry 4.0/5.0.
It is expected that in the years to come, applications of artificial intelligence will continue to grow in various areas: in manufacturing processes, in advanced data processing, in improving production, in supporting the management of various processes, and so on.
I have been studying this issue for years and have presented the results of my research in, among others, the following article:
APPLICATION OF DATA BASE SYSTEMS BIG DATA AND BUSINESS INTELLIGENCE SOFTWARE IN INTEGRATED RISK MANAGEMENT IN ORGANIZATION
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How can the application of generative artificial intelligence improve existing applications of Big Data Analytics and increase the scale of application of these technologies in analysing large data sets, generating multi-criteria simulation models, and carrying out predictive analyses and projections?
How can the application of generative artificial intelligence improve existing applications of Big Data Analytics?
And what is your opinion about it?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
How can a Big Data Analytics system based on artificial intelligence be built that is more capable than ChatGPT and learns only true information and data?
How can we build a Big Data Analytics system that analyses information taken from the Internet, an analytics system based on artificial intelligence conducting real-time analytics, integrated with an Internet search engine, yet more capable than ChatGPT: one that, through discussion with Internet users, improves data verification and learns only true information and data?
ChatGPT is not perfect at self-learning new content and refining the answers it gives: it sometimes confirms information or data that is factually incorrect in the question formulated by the Internet user. In this way, in the course of such 'discussions', ChatGPT can learn not only new but also false information and fictitious data. Currently, various technology companies are planning to create, develop and implement computerized analytical systems based on artificial intelligence technology similar to ChatGPT, which will find application in various fields of big data analytics, business and research work, and in business entities and institutions operating in different sectors and industries of the economy. One direction of development of this kind of artificial intelligence technology is the plan to build a system for analysing large data sets, a Big Data Analytics system analysing information taken from the Internet, based on artificial intelligence conducting real-time analytics and integrated with an Internet search engine, yet more capable than ChatGPT: a system that, through discussion with Internet users, improves data verification and learns only true information and data. Some technology companies are already working on creating such technological solutions and applications.
But presumably many technology start-ups that plan to create, develop and implement business-specific technological innovations based on a particular generation of artificial intelligence technology similar to ChatGPT are also considering research in this area, and perhaps developing a start-up whose business concept has technological innovation 4.0, including the aforementioned artificial intelligence technologies, as a key determinant.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How can one build a Big Data Analytics system that analyses information taken from the Internet in real time, is based on artificial intelligence, is integrated with an Internet search engine, and is more advanced than ChatGPT: a system that improves data verification through discussion with Internet users and learns only from real, verified information and data?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
Should the intelligent chatbots that technology companies make available on the Internet be connected to the full resources of the Internet?
As part of developing the concept of universal open access to knowledge resources, should the intelligent chatbots that technology companies make available on the Internet be connected to the full resources of the Internet?
There are different types of websites and sources of data and information on the Internet. The first Internet-accessible intelligent chatbot, ChatGPT, made available by OpenAI in November 2022, performs commands, solves tasks and writes texts on the basis of knowledge resources, data and information downloaded from the Internet that were not fully up to date, having last been downloaded from selected websites and portals in January 2022. Moreover, the data and information were drawn from many selected websites of libraries, articles, books, online portals indexing scientific publications, etc., so they were selected in a certain way. In 2023, more leading Internet technology companies developed and released their own intelligent chatbots. Some of these already draw on data and information that is much more up to date than the first publicly available versions of ChatGPT. In November 2023, the social media site X (formerly Twitter) released its intelligent chatbot in the US, which reportedly works on the basis of current information entered into the site through users' posts, messages and tweets. In October 2023, OpenAI likewise announced a new version of ChatGPT that will also draw data and knowledge from updated knowledge resources downloaded from multiple websites. As a result, rival leading technology companies are constantly refining the designs of the intelligent chatbots they are building, which will use increasingly up-to-date data, information and knowledge resources drawn from selected websites, web pages and portals. The rapid technological advances currently taking place in artificial intelligence may in the future lead to the integration of the generative artificial intelligence and general artificial intelligence developed by technology companies.
Competing technology companies may strive to build advanced artificial intelligence systems that achieve a high level of autonomy and independence from humans, which could allow the development of artificial intelligence technology to slip out of human control. Such a situation may arise with the emergence of a highly advanced general artificial intelligence that becomes capable of self-improvement and, moreover, carries out that self-improvement independently of humans, i.e. while escaping human control. Before this happens, however, technologically advanced artificial intelligence may first achieve the ability to select the data and information it uses to carry out specific mandated tasks, executing them in real time using up-to-date data and online knowledge resources.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
As part of developing the concept of universal open access to knowledge resources, should the intelligent chatbots that technology companies make available on the Internet be connected to the full resources of the Internet?
Should the intelligent chatbots that technology companies make available on the Internet be connected to the full resources of the Internet?
And what is your opinion about it?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
education
ICT-based tools
Can someone recommend highly fast-track sustainability or related journals whose scope would accommodate a paper on SDG 3, "Good health and well-being", from the perspective of ICT and e-health? I really appreciate any help you can provide.
Greetings to all!
- I'm a Ph.D Research scholar in the Department of Education. Please help me to find a research topic in the area of ICT (Information and Communication Technology) and Education keeping in view the current scenario.
- Related to Online Learning and Education will get considered.
Your valuable suggestions are much appreciated and welcomed...
I need an introduction for my thesis, a study of the effects of ICT on the teaching and learning process.
Can the application of a combination of artificial intelligence technology, Big Data Analytics and quantum computers already assist in the strategic management of an enterprise?
Can the application of a combination of artificial intelligence technology, Big Data Analytics and quantum computers already assist in carrying out multi-faceted, complex strategic analyses of the business environment and the determinants of a company's development, and in predictive analyses based on the processing of large data sets, and therefore also in strategic business management?
The ongoing technological progress is characterised by the dynamic development of Industry 4.0/5.0 technologies typical of the current fourth technological revolution, including ICT information and communication technologies and technologies for advanced, multi-criteria processing of large data sets and information resources. The development of information processing in this era is determined by the growth of applications of ICT, Internet technologies and advanced data processing, which include: Big Data Analytics, Data Science, cloud computing, artificial intelligence, machine learning, deep learning, the personal and industrial Internet of Things, Business Intelligence, autonomous robots, horizontal and vertical data system integration, multi-criteria simulation models, digital twins, additive manufacturing, Blockchain, smart technologies, cybersecurity instruments, Virtual and Augmented Reality, Data Mining and other advanced data processing technologies. Advances in computing, ever faster microprocessors and ever more capacious, high-speed data storage are making it possible to process large data sets faster and more efficiently. In addition, numerous new applications of these technologies are emerging in various sectors of the economy as they are combined in different configurations, including many business applications in companies and enterprises. Implementing these technologies in the activities of companies, enterprises, and financial and public institutions contributes to more efficient execution of specific processes.
In view of the above, therefore, there is much to suggest that if not yet now, then soon the application of a combination of artificial intelligence technologies, Big Data Analytics and quantum computers may be helpful in terms of carrying out multi-faceted, complex strategic analyses of the business environment and determinants of company development, predictive analyses based on the processing of large data sets and, therefore, also in terms of strategic business management.
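The sort of predictive analysis mentioned above can be illustrated, in a deliberately simplified form, by a least-squares trend extrapolation on a short revenue series. Real strategic-management systems would run far richer models on Big Data platforms; the figures here are invented for the example.

```python
def linear_forecast(series, steps_ahead=1):
    """Fit y = a + b*t by ordinary least squares and extrapolate forward."""
    n = len(series)
    t = list(range(n))
    mean_t = sum(t) / n
    mean_y = sum(series) / n
    # Slope and intercept of the fitted trend line.
    b = sum((ti - mean_t) * (yi - mean_y) for ti, yi in zip(t, series)) \
        / sum((ti - mean_t) ** 2 for ti in t)
    a = mean_y - b * mean_t
    return a + b * (n - 1 + steps_ahead)

quarterly_revenue = [100, 104, 109, 113]     # hypothetical quarterly figures
print(round(linear_forecast(quarterly_revenue), 1))   # -> 117.5
```

Even this toy shows the basic pattern of predictive analytics: fit a model to historical data, then use it to compare forecast scenarios before committing resources.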
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the application of the combination of artificial intelligence technology, Big Data Analytics and quantum computers already be helpful in the field of strategic business management?
Can the use of a combination of artificial intelligence technology, Big Data Analytics and quantum computers assist in strategic business management?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
The fourth technological revolution currently underway is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning, deep learning, artificial intelligence, ... what's next? Intelligent thinking autonomous robots?
The fourth technological revolution currently underway is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning, deep learning and artificial intelligence. Machine learning (also called machine self-learning or machine learning systems) is a field of artificial intelligence focused on algorithms that improve themselves automatically through experience, i.e. through exposure to large data sets. Machine learning algorithms build a mathematical model of data processing from sample data, called a training set, in order to make predictions or decisions without being explicitly programmed by a human to do so. They are used in a wide variety of applications, such as spam protection, i.e. filtering Internet messages for unwanted correspondence, or image recognition, where it is difficult or infeasible to develop conventional algorithms to perform the needed tasks. Deep learning is a subcategory of machine learning that involves the creation of deep neural networks, i.e. networks with multiple layers of neurons. Deep learning techniques are designed to improve, among other things, automatic speech processing, image recognition and natural language processing. Simple neural networks can be designed manually, so that a specific layer detects specific features and performs specific data processing, while learning consists of setting appropriate weights and significance levels for the components of specific issues, defined on the basis of processing and learning from large amounts of data. In large neural networks, the deep learning process is to a certain extent automated and self-contained.
In this situation, the network is not designed to detect specific features but detects them itself from appropriately labelled data sets. Both the data sets and the operation of the neural networks should be prepared by specialists, but the features are detected by the program itself. Large amounts of data can therefore be processed, and the network can automatically learn higher-level feature representations, which means it can detect complex patterns in the input data. Accordingly, deep learning systems are built on Big Data Analytics platforms constructed so that the deep learning process is performed on a sufficiently large amount of data. Artificial intelligence (AI) is the 'intelligent', multi-criteria, advanced, automated processing of complex, large amounts of data, carried out in a way that alludes to certain characteristics of human intelligence exhibited by thought processes. As such, it is the intelligence exhibited by artificial devices, including certain advanced ICT and Industry 4.0 systems and devices equipped with these technological solutions. The concept of artificial intelligence is contrasted with natural intelligence, i.e. that of humans. Artificial intelligence thus has two basic meanings: on the one hand, a hypothetical intelligence realised through a technical rather than a natural process; on the other, the name of a technology and a research field of computer science and cognitive science that also draws on the achievements of psychology, neurology, mathematics and philosophy. In computer science and cognitive science, artificial intelligence also refers to the creation of models and programs that simulate at least partially intelligent behaviour.
Artificial intelligence is also considered in philosophy, within which a theory concerning the philosophy of artificial intelligence is being developed, and it is a subject of interest in the social sciences as well. The main task of research and development work on artificial intelligence technology and its new applications is the construction of machines and computer programs capable of performing selected functions analogously to the human mind working with the human senses, including processes that do not lend themselves to numerical algorithmisation. Such problems are sometimes referred to as AI-hard and include decision-making in the absence of complete data, analysis and synthesis of natural languages, logical (rational) reasoning, automatic theorem proving, computer logic games such as chess, intelligent robots, and expert and diagnostic systems, among others. Artificial intelligence can be developed and improved by integrating it with machine learning, fuzzy logic, computer vision, evolutionary computing, neural networks, robotics and artificial life. Artificial intelligence technologies have been developing rapidly in recent years, driven by their combination with other Industry 4.0 technologies, the use of microprocessors, digital machines and computing devices with ever-increasing capacity for multi-criteria processing of ever-larger amounts of data, and the emergence of new fields of application. Recently, the development of artificial intelligence has become a topic of discussion in various media thanks to ChatGPT, an open-access, automated, AI-enabled solution with which Internet users can hold a kind of conversation. The solution is based on, and learns from, large amounts of data extracted in 2021 from specific data and information resources on the Internet.
The development of artificial intelligence applications is so rapid that it outpaces the process of adapting regulations to the situation, and the new applications do not always generate exclusively positive effects. The potentially negative effects include the generation of disinformation on the Internet: information crafted with artificial intelligence that is not in line with the facts and is disseminated on social media sites. This raises a number of questions about the development of artificial intelligence and its new applications, the possibilities that will arise under the next generations of artificial intelligence, and the possibility of teaching artificial intelligence to think, i.e. to realise artificial thought processes in a manner analogous or similar to those of the human mind.
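The machine-learning idea described above, an algorithm improving from labelled examples instead of being explicitly programmed with rules, can be illustrated with a toy perceptron spam filter. The vocabulary and training messages are invented for the example; real systems use vastly larger feature spaces and models.

```python
VOCAB = ["free", "winner", "meeting", "report"]   # toy vocabulary (invented)

def features(text):
    """Bag-of-words presence vector over the toy vocabulary."""
    words = text.lower().split()
    return [1.0 if w in words else 0.0 for w in VOCAB]

def train(samples, epochs=20, lr=0.5):
    """Classic perceptron rule: nudge weights whenever a sample is misclassified."""
    w = [0.0] * len(VOCAB)
    b = 0.0
    for _ in range(epochs):
        for text, label in samples:          # label: 1 = spam, 0 = ham
            x = features(text)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(model, text):
    w, b = model
    return 1 if sum(wi * xi for wi, xi in zip(w, features(text))) + b > 0 else 0

model = train([("free winner prize", 1), ("quarterly report meeting", 0),
               ("winner free offer", 1), ("project meeting report", 0)])
```

No filtering rule was ever written by hand: the weights that separate spam from legitimate mail emerge from the labelled training set, which is exactly the "learning without explicit programming" the paragraph describes.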
In view of the above, I address the following question to the esteemed community of scientists and researchers:
The fourth technological revolution currently taking place is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning technologies, deep learning, artificial intelligence... what's next? Intelligent thinking autonomous robots?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best regards,
Dariusz Prokopowicz
Which new ICT information technologies are most helpful in protecting the biodiversity of the planet's natural ecosystems?
What are examples of new technologies typical of the current fourth technological revolution that help protect the biodiversity of the planet's natural ecosystems?
Which new technologies, including ICT information technologies, technologies categorized as Industry 4.0 or Industry 5.0 are helping to protect the biodiversity of the planet's natural ecosystems?
How do new Big Data Analytics and Artificial Intelligence technologies, including deep learning based on artificial neural networks, help protect the biodiversity of the planet's natural ecosystems?
New technologies, including ICT information technologies and technologies categorised as Industry 4.0 or Industry 5.0, are finding new applications. They are currently developing rapidly and are an important factor in the current fourth technological revolution. At the same time, owing to the still-high greenhouse gas emissions driving global warming, progressive climate change, increasingly frequent weather anomalies and climatic disasters, growing environmental pollution, rapidly shrinking forest areas and predatory forest management, the biodiversity of the planet's natural ecosystems is declining rapidly. It is therefore necessary to engage new technologies, including ICT, Industry 4.0/5.0 technologies and, in particular, Big Data Analytics and artificial intelligence, in order to improve and scale up the protection of the biodiversity of the planet's natural ecosystems.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How do the new technologies of Big Data Analytics and artificial intelligence, including deep learning based on artificial neural networks, help to protect the biodiversity of the planet's natural ecosystems?
Which new technologies, including ICT information technologies, technologies categorized as Industry 4.0 or Industry 5.0 are helping to protect the biodiversity of the planet's natural ecosystems?
What are examples of new technologies that help protect the biodiversity of the planet's natural ecosystems?
How do new technologies help protect the biodiversity of the planet's natural ecosystems?
And what is your opinion on this topic?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
What are the applications of machine learning, deep learning and/or artificial intelligence technologies to securities market analysis, including stock market analysis, bonds, derivatives?
ICT information technologies have been implemented in banking and in large companies operating in non-financial sectors of the economy since the beginning of the third technological revolution. Subsequently, the Internet was used to develop online and mobile banking. Perhaps virtual banking will develop in the future on the basis of the growing application of technologies typical of the current fourth technological revolution and the increasing implementation of Industry 4.0 technologies in businesses operating in both the financial and non-financial sectors of the economy. In recent years, various technologies for advanced, multi-criteria data processing have increasingly been applied in business entities to improve organisational management, risk management, customer and contractor relationship management, and the management of supply logistics, procurement and production, and to improve the profitability of business processes. To this end, and to improve marketing communications and offer products and services to customers remotely, Industry 4.0 technologies such as the Internet of Things, cloud computing, Big Data Analytics, Data Science, Blockchain, robotics, multi-criteria simulation models and digital twins, as well as machine learning, deep learning and artificial intelligence, are increasingly being used. ICT and Industry 4.0 technologies have also been used for many years to improve equity investment management, economic and financial analyses, and fundamental analyses concerning the valuation of specific categories of investment assets, including securities, i.e. the processes carried out in investment banking.
In this connection, there are also emerging opportunities to apply machine learning, deep learning and/or artificial intelligence technologies to the analysis of the securities market, including the analysis of the stock market, bonds, derivatives, etc., i.e. key aspects of business analytics carried out in investment banking. Improving such analytics through the use of the aforementioned technologies should, in addition to the issue of optimising investment returns, also take into account important aspects of the financial security of capital markets transactions, including issues of credit risk management, market risk management, systemic risk management, etc.
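As a deliberately simple stand-in for the machine-learning and AI models discussed above, here is a sketch of one of the oldest quantitative signals in securities analysis, a moving-average crossover. The prices are invented, and this is an illustration of the analytical idea, not investment advice or a production algorithm.

```python
def sma(prices, window):
    """Simple moving average over the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=3, long=5):
    """Return 'buy' when the short SMA rises above the long SMA,
    'sell' when it falls below, else 'hold'."""
    if len(prices) < long + 1:
        return "hold"
    prev_short, prev_long = sma(prices[:-1], short), sma(prices[:-1], long)
    cur_short, cur_long = sma(prices, short), sma(prices, long)
    if prev_short <= prev_long and cur_short > cur_long:
        return "buy"
    if prev_short >= prev_long and cur_short < cur_long:
        return "sell"
    return "hold"

closes = [14, 13, 12, 11, 13, 16]        # hypothetical closing prices
print(crossover_signal(closes))           # -> buy
```

ML and AI approaches replace such hand-written rules with models learned from market data, but the overall pipeline (ingest prices, compute features, emit a decision, manage the attendant credit, market and systemic risk) remains the same.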
In view of the above, I would like to address the following question to the esteemed community of scientists and researchers:
What are the applications of machine learning, deep learning and/or artificial intelligence technologies for securities market analysis, including equity, bond, derivatives market analysis?
What is your opinion on the subject?
What do you think about this topic?
Please respond,
I invite you all to discuss,
Thank you very much,
Best regards,
Dariusz Prokopowicz
It can be books, papers, articles... I will read it for a literature review.
I am looking for a postdoc opportunity in language-related areas: pedagogy, ICT in teaching, research productivity, ESL, etc.
Does anyone have an opportunity for me or know of any opportunity for me?
My doctorate thesis is titled "INFORMATION LITERACY SKILLS AND INFORMATION COMMUNICATION TECHNOLOGY AS CORRELATES OF ACADEMIC STAFF'S RESEARCH PRODUCTIVITY IN LANGUAGE EDUCATION IN ONDO STATE".
Thank you
What are some of the materials that make your work easier?
ICT in elementary school
At which levels of education (primary school, secondary school, high school, university) and for which skills should we prefer to use augmented reality applications and Web 2.0 tools in foreign language teaching?
What are the applications of digital twins in conjunction with artificial intelligence, Big Data Analytics and other Industry 4.0 technologies in creating simulations of digital models of complex macroprocesses?
Digital twin technology is used, among other things, to simulate production and logistics processes in business entities, i.e. at the microeconomic level. Creating digital twins of specific economic and financial processes carried out in economic entities supports the management of those entities. Computer simulations of, for example, production processes, service offerings, supply and procurement logistics, distribution logistics and marketing communication with customers save time and money, because possible decision errors generate much smaller negative effects when they occur not within real processes, as a kind of experimentation on a functioning enterprise, company, corporation or institution, but within a computer simulation in which various alternative scenarios for the development of a company's economic and financial situation are considered and compared as forecasts of specific processes for the following days, weeks, months, quarters or years. Since the pandemic, therefore, many companies and enterprises in Poland have been investing in IT systems equipped with digital twin technologies, within which it is possible to create multi-criteria, multi-faceted, complex simulation models of specific economic and other processes realised within the company, and/or simulations of processes realised in the company's relations with its environment, business partners, customers, cooperators, etc.
On the other hand, the possibility of creating simulations of macroprocesses, e.g. macroeconomic, natural, technological, geological, social, long-term climatic or cosmological processes, using digital twin technologies together with other Industry 4.0 technologies, including machine learning, deep learning, artificial intelligence and analytics carried out on Big Data Analytics platforms, remains a matter of debate. Year by year, thanks to technological advances in ICT, including new generations of microprocessors with ever-increasing computing power, the possibilities for efficient, multi-criteria processing of large collections of data and information are growing. Artificial intelligence can be particularly useful for the selective and precise search for specific, defined types of information and data extracted from many selected types of websites, and for the real-time transfer and processing of these data in database systems organised in cloud computing on Big Data Analytics platforms, which would be accessed by a system managing a continuously updated model of a specific macroprocess built with digital twin technologies.
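The digital-twin idea above can be sketched as a toy simulation of a single production line evaluated under alternative scenarios before any change is made to the real process. All parameters (capacity, demand, margins, downtime rates) are invented for the illustration.

```python
def simulate(days, daily_capacity, demand_per_day, unit_margin, downtime_prob=0.0):
    """Deterministic expected-profit 'twin' of one production line:
    downtime reduces effective capacity, sales are capped by demand."""
    effective_capacity = daily_capacity * (1 - downtime_prob)
    profit = 0.0
    for _ in range(days):
        sold = min(effective_capacity, demand_per_day)
        profit += sold * unit_margin
    return profit

# Compare alternative decisions on the model, not on the real factory.
scenarios = {
    "status quo":        simulate(30, 100, 120, 5.0, downtime_prob=0.10),
    "extra maintenance": simulate(30, 100, 120, 5.0, downtime_prob=0.02),
}
best = max(scenarios, key=scenarios.get)
print(best)    # -> extra maintenance
```

Scaling the same pattern from one production line to macroeconomic, climatic or social processes is precisely what the paragraph identifies as the open question: the modelling principle carries over, but the data volumes and model complexity grow enormously.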
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the applications of digital twins in conjunction with Artificial Intelligence, Big Data Analytics and other Industry 4.0 technologies for creating simulated digital models of complex macroprocesses?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
Greetings, fellow scholars! I am an academic at Deakin University Melbourne Australia who is interested in studying entrepreneurship education for students outside of the business discipline. I believe that such education can equip students with valuable skills and knowledge that can benefit them in various fields, and I would like to explore how to deliver it effectively.
- I am looking for collaborators from other universities who share my interest and would like to join me in this research endeavour. Specifically, I am interested in studying the following topics:
- The current state of entrepreneurship education for non-business students
- The learning outcomes of entrepreneurship education for non-business students
- The most effective approaches and tools to deliver entrepreneurship education to non-business students
- The impact of entrepreneurship education for non-business students on their career choices and outcomes
If you are a scholar who has experience or interest in any of these topics, please reach out to me. I believe that a collaborative effort can yield valuable insights and contributions to the field of entrepreneurship education.
I can be reached at jack.li@deakin.edu.au. Thank you for your attention, and I look forward to hearing from you soon.
Best regards,
Jack
How can cryptocurrency trading be formalised, institutionalised and made more secure?
How can formalised, institutionalised cryptocurrency trading markets with highly secure transactions be built?
In recent years, many technology startups have based their growth and competitive advantage on business, technological, product, service, marketing or other innovations. Banks are reluctant to provide investment loans to emerging startups whose growth rests on innovative technologies, so such startups are financed through external sources of funding such as investment funds, business angels, securities issuance, crowdfunding and others. Crowdfunding is likely to develop intensively in the future as an alternative to classic forms of external financing for business ventures and to the financial services offered by financial sector institutions, particularly in the segment of financing innovative start-ups. Commercial banks operating within classic deposit and credit banking often avoid lending to innovative start-ups because of difficulties in assessing credit risk; in such a situation, crowdfunding can be a good solution to the problem of finding external funding. At the same time, cryptocurrencies, which operate outside institutionalised and centralised financial systems, are growing in importance. Perhaps in the future cryptocurrencies will displace traditional currency from online financial transactions between fintechs, financial institutions, innovative start-ups, online technology companies running social media portals, their customers, and the users of these portals. In addition, it is becoming essential to improve the security of online financial transactions and settlements carried out through online and mobile banking, and blockchain technology is developing as a means of securing online transactions and data transfer. An increasing number of large companies are announcing the creation of their own cryptocurrencies; some investment banks, such as JP Morgan, have announced the creation of their own cryptocurrency for settlements with key counterparties.
The development and implementation of ICT information technologies, advanced Industry 4.0 data processing technologies and Internet technologies in the business activities of companies and enterprises facilitates the execution of financial operations on the Internet and ensures a high level of security of Internet data transfer. The development of technological innovations, ICT, advanced information processing technologies co-creating the current Industry 4.0 technological revolution, financing through crowdfunding, the securing of online transactions with blockchain technology, and the growing use of cryptocurrencies in these settlements are likely to be important determinants of the development of innovative, technological start-ups operating on the Internet and of the knowledge economy in the years to come. The development of open innovation is, in turn, correlated with innovation and entrepreneurship development in the economy: a significant proportion of innovative startups build their business models on open innovation, and in macroeconomic terms open innovation can be an important determinant of economic development both in developing countries and in developed knowledge-based economies. Research accordingly shows that the spread of open innovation and open knowledge bases is an important issue for building a sustainable economy in technologically developed and developing countries. A number of predictive studies indicate that cryptocurrencies will grow in importance in financing various transactions and settlements carried out electronically, through the Internet, on social media, in investment banking, etc.
However, for the financing of new business ventures and innovative startups with cryptocurrencies to develop, it will be necessary to increase the scale of systemic formalisation and institutionalisation of cryptocurrency transactions, to build formalised cryptocurrency markets, and to improve the security of transactions carried out with cryptocurrencies.
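The core mechanism behind the transaction security mentioned above, blockchain's hash linking of records, can be sketched in a few lines. This is a toy illustration with invented transactions, not a real cryptocurrency implementation:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic SHA-256 hash of the block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    # Each new block stores the hash of its predecessor, linking the chain.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def chain_is_valid(chain: list) -> bool:
    # Any tampering with an earlier block invalidates every later link.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
add_block(chain, [{"from": "A", "to": "B", "amount": 10}])
add_block(chain, [{"from": "B", "to": "C", "amount": 4}])
assert chain_is_valid(chain)

chain[0]["transactions"][0]["amount"] = 999  # tamper with history
assert not chain_is_valid(chain)
```

It is this property, that retroactive changes break the chain, which makes blockchain attractive for securing online settlements; formalised cryptocurrency markets would add institutional oversight on top of it.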
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the planned taxation of cryptocurrency transactions be a first step for increasing the scale of systemic formalisation and institutionalisation of cryptocurrency transactions, building formalised cryptocurrency markets and increasing the future security of cryptocurrency transactions?
How can formalised, institutionalised and highly secure cryptocurrency trading markets be built?
How can cryptocurrency transactions be formalised, institutionalised and made more secure?
What is your opinion on this topic?
Please respond,
I invite you all to discuss,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
How should schools, colleges and universities take advantage of new technologies so that they do not lag behind technological progress?
In recent years, the world has been increasingly shaped by the ongoing fourth technological revolution: Industry 4.0, biotechnologies, green technologies and other innovations are being developed. The development of information processing in the era of this revolution, termed Industry 4.0, is driven by the growth of applications of ICT, Internet technologies and advanced data processing. The current technological revolution is determined by the development of, among other technologies and analytical techniques: machine learning, deep learning, artificial intelligence, smart technologies, Big Data Analytics, Data Science, cloud computing, the personal and industrial Internet of Things, Business Intelligence, autonomous robots, horizontal and vertical data system integration, multi-criteria simulation models, digital twins, additive manufacturing, Blockchain, cyber-security instruments, Virtual and Augmented Reality, and other advanced data processing (Data Mining) technologies. New technologies and innovations emerge from research institutes, R&D centres, new technology development centres, technology companies and universities, among others, and find new practical and business applications through implementation in the various fields of activity of economic actors. Schools, colleges and universities, too, try to implement new technologies in their activities so as not to lag behind technological progress.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How should schools, colleges and universities make use of new technologies so as not to lag behind the technological advances taking place?
What do you think about this topic?
Please respond,
I invite you all to discuss,
Thank you very much,
Best regards,
Dariusz Prokopowicz
Good day all, I have just registered for a Postgraduate Diploma in ICT. What are the latest topics I could tackle for my research? I am interested in cybersecurity, but I don't know where to begin choosing a topic.
Does the application of artificial intelligence to automate the credit scoring of potential borrowers make it possible to improve credit risk management processes and increase the profitability of commercial banks' lending activities?
Does the application of specific artificial intelligence technologies to automate risk analysis, carry out credit scoring for individual credit transactions, monitor open credit transactions on an ongoing basis and analyse changes in the level of credit risk significantly improve credit risk management and help optimise credit processes, thereby improving the profitability of commercial banks' lending activities?
Commercial banks operating according to the classic deposit and credit banking model generate revenues and profits mainly from lending. The quality of a bank's loan portfolio and its financial results are determined, on the one hand, by external factors, i.e. the state of the economy and the economic environment of the bank's customers: borrowers taking out loans and depositors placing their financial surpluses on bank deposits. On the other hand, the efficiency of lending and the financial results achieved are also influenced by internal factors, chief among them the efficiency of the credit risk management process. Credit risk management is carried out on an individual basis, through the examination of the creditworthiness of a potential or current borrower (including the ongoing monitoring of loans granted) and of the credit risk the bank accepts when granting a loan, and on a portfolio basis, with regard to the entire loan portfolio and particular types of loans. In both respects, banks are seeking to improve and optimise these processes with new information technologies and Industry 4.0 solutions. Thanks to these technologies, banks can transfer part of their risk management processes to internal IT systems and increasingly offer loans via the Internet. Relatively low-value loans, i.e. consumer and instalment loans granted mainly to the public, as well as working capital loans to businesses, can already be granted entirely remotely, with communication with the customer conducted via the Internet.
In the case of housing and business loans, including investment loans, banks require the borrower to provide various documents for the creditworthiness analysis and examine a number of economic, financial, operational and investment risk factors on which that analysis can be based. As a result, the process of granting these types of loans, which are usually also for relatively higher amounts, is not yet fully feasible via the Internet if it is not to generate high operational risks for banks. However, ongoing technological progress may change this significantly in the future. At present, banks are implementing new Industry 4.0 technologies in their lending activities in order to improve processes and optimise costs. Their use is also driven by the need to continually strengthen computerised cyber-security systems. Key Industry 4.0 technologies that banks have recently been implementing include artificial intelligence, machine learning, deep learning, Big Data Analytics, cloud computing, the Internet of Things, Blockchain, multi-criteria simulation models and digital twins. Particularly new opportunities arise for improving remote marketing communication, optimising banking procedures, reducing operational risk, and improving credit procedures and the credit risk management process by involving artificial intelligence in the bank's operations.
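As a loose, purely illustrative sketch of AI-based credit scoring of the kind discussed here, the following trains a tiny logistic regression on synthetic applicant data. The features, risk formula and data are all invented; no real bank model works on two features:

```python
import math
import random

# Synthetic data only: features = (debt ratio, past payment delays),
# label = 1 if the loan defaulted. Invented for illustration.
random.seed(0)

def make_sample():
    debt_ratio = random.random()            # 0..1
    delays = random.randint(0, 6)           # past payment delays
    risk = 0.8 * debt_ratio + 0.1 * delays  # hypothetical risk rule
    default = 1 if random.random() < min(risk, 0.95) else 0
    return [debt_ratio, delays / 6.0], default

data = [make_sample() for _ in range(500)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression fitted by plain batch gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(300):
    gw, gb = [0.0, 0.0], 0.0
    for x, y in data:
        err = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    n = len(data)
    w = [w[0] - lr * gw[0] / n, w[1] - lr * gw[1] / n]
    b -= lr * gb / n

def default_probability(debt_ratio, delays):
    # The fitted model scores a new applicant automatically.
    return sigmoid(w[0] * debt_ratio + w[1] * (delays / 6.0) + b)

# A low-risk applicant should score below a heavily indebted one.
assert default_probability(0.1, 0) < default_probability(0.9, 5)
```

Production systems use far richer feature sets and model families, but the principle is the same: the model turns applicant data into a default probability that feeds the bank's approval and monitoring rules.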
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Does the application of specific artificial intelligence technologies to automate risk analysis, carry out credit scoring for individual credit transactions, monitor open credit transactions on an ongoing basis and analyse changes in the level of credit risk make it possible to significantly improve credit risk management and optimise credit processes, thereby improving the profitability of commercial banks' lending activities?
And what is your opinion on this topic?
Please respond,
I invite you all to discuss,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
Information and Communication Technologies (ICT) have played a significant role in identifying COVID-19 positive patients. Some examples include:
- Telemedicine: ICT has enabled remote consultations between patients and healthcare professionals, which has helped to reduce the spread of the virus and reduce the burden on hospitals and clinics.
- Electronic Health Records (EHRs): ICT has allowed for the digitization of health records, which has made it easier for healthcare professionals to access patient information and track the spread of the virus.
- Contact Tracing: ICT has been used to develop contact tracing apps, which use Bluetooth technology to track and alert individuals who have come into contact with a positive case.
- Remote Monitoring: ICT has been used to monitor patients remotely, using devices such as wearable sensors to track vital signs and alert healthcare professionals to potential complications.
In summary, ICT has played a vital role in identifying and tracking COVID-19 positive patients, enabling healthcare professionals to make more informed decisions and take necessary actions in a timely manner.
Telemedicine: ICT has enabled remote consultations between patients and healthcare professionals through various means such as video conferencing, phone calls and messaging. This has been particularly useful during the COVID-19 pandemic, as it has allowed patients to receive medical advice and treatment without having to visit a hospital or clinic in person, which reduces the risk of transmission. Telemedicine has also allowed healthcare professionals to triage patients remotely, identifying those who are most at risk and need to be seen in person, and enabling others to receive care and advice from the safety of their own homes.
Electronic Health Records (EHRs): ICT has allowed for the digitization of health records, which has made it easier for healthcare professionals to access patient information and track the spread of the virus. This has been particularly useful in identifying patients who have been in close contact with a positive case, as well as tracking the spread of the virus within communities. EHRs can also be used to identify patterns in the spread of the virus, which can help healthcare professionals and policymakers to better understand the virus and develop effective strategies to combat it.
Contact Tracing: ICT has been used to develop contact tracing apps, which use Bluetooth technology to track and alert individuals who have come into contact with a positive case. These apps have been used in many countries to track the spread of the virus and alert individuals who may have been exposed, enabling them to take necessary precautions and get tested.
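The exposure-matching step at the heart of such contact tracing apps can be sketched as follows. This is a toy model with invented names and dates; real apps exchange rotating anonymous Bluetooth identifiers rather than plain IDs:

```python
from datetime import datetime, timedelta

# Hypothetical contact log: (device A, device B, time of proximity event).
contacts = [
    ("alice", "bob", datetime(2023, 1, 10, 9, 0)),
    ("alice", "carol", datetime(2023, 1, 12, 14, 0)),
    ("dave", "bob", datetime(2022, 12, 20, 11, 0)),
]

def exposed_devices(contacts, positive_id, reported_on, window_days=14):
    # Return every device that met `positive_id` within the look-back window.
    cutoff = reported_on - timedelta(days=window_days)
    exposed = set()
    for a, b, when in contacts:
        if not (cutoff <= when <= reported_on):
            continue
        if a == positive_id:
            exposed.add(b)
        elif b == positive_id:
            exposed.add(a)
    return exposed

# "bob" reports positive on 14 Jan: "alice" (contact on 10 Jan) is alerted,
# while "dave" (contact on 20 Dec) falls outside the 14-day window.
print(exposed_devices(contacts, "bob", datetime(2023, 1, 14)))
```

Deployed systems add privacy protections (rotating keys, on-device matching) on top of exactly this kind of window-limited lookup.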
Remote Monitoring: ICT has been used to monitor patients remotely, using devices such as wearable sensors to track vital signs and alert healthcare professionals to potential complications. This has been particularly useful for patients who are self-isolating or who have been discharged from hospital but are still recovering. Remote monitoring enables healthcare professionals to monitor patients remotely, which can help to identify and intervene early if any complications arise.
In summary, telemedicine and EHRs have improved accessibility, contact tracing has helped to track the spread of the virus, and remote monitoring has improved patient outcomes by enabling healthcare professionals to intervene early if complications arise.
How do new ICT and Industry 4.0 technologies contribute to improving the efficiency of business entities?
Which areas of the economic activity of companies and enterprises are improved by the implementation of ICT and/or Industry 4.0 technologies?
The pandemic accelerated the implementation of ICT in business operations and the use of the Internet to promote and sell products and services. The SARS-CoV-2 (Covid-19) coronavirus pandemic indirectly accelerated the development of ICT applications in various aspects of business. During the pandemic, lockdowns were imposed on selected, mainly service, sectors of the economy, and, as part of anti-pandemic measures, citizens infected with the coronavirus had to stay at home in quarantine for several to a dozen or so days. As a result, quarantined consumers began to order and buy products and services via the Internet, while companies under lockdown conducted their business online. Many companies increased the scale of online sales of their products and services and built or developed online sales platforms; in many enterprises, online turnover increased severalfold. Consequently, e-commerce grew, the turnover of courier companies increased, e-logistics developed, and the use of ICT and Industry 4.0 technologies in various aspects of business expanded. When ordering and purchasing products, services, semi-finished products, components and raw materials via the Internet, consumers and entrepreneurs increasingly made their payments online as well, which accelerated the development of e-banking, especially online and mobile banking.
In enterprises, computerised systems supporting management and decision-making processes are being developed, and they are currently being improved by increasing the use of Internet and Industry 4.0 technologies. Examples include investments in computerised management and decision-support systems in which so-called digital twins of economic and business processes, or of specific spheres of production, services or trade, are built. During the SARS-CoV-2 (Covid-19) pandemic, the scale of investment by companies and enterprises in such computerised decision-support systems, using digital twins and other Industry 4.0 innovations, increased significantly. The development of information processing in the era of the current technological revolution, termed Industry 4.0, is driven by the growth of applications of ICT, Internet technologies and advanced data processing. This revolution is determined by the development and implementation in economic processes of the following technologies: Big Data Analytics, Data Science, cloud computing, machine learning, the personal and industrial Internet of Things, artificial intelligence, Business Intelligence, autonomous robots, horizontal and vertical data system integration, multi-criteria simulation models, digital twins, additive manufacturing, Blockchain, smart technologies, cyber-security instruments, Virtual and Augmented Reality, and other advanced data processing (Data Mining) technologies. Thanks to the application of these technologies, computerised multi-criteria simulation models known as digital twins are now being created in many companies, enterprises and financial institutions.
By building digital twins of entire processes of production, service provision, procurement logistics, manufacturing and distribution in its information systems, a company obtains an excellent tool for supporting the management of the organisation. Business analytics combined with Big Data Analytics likewise supports the management of an enterprise, company, financial institution or other type of business entity. With these solutions, it is also possible to create simulation models of the future development of complex, multi-faceted economic, social, financial, pandemic, natural and climatic processes. Using Big Data Analytics, many companies already carry out market research based on the analysis of Internet users' sentiment regarding changes in the evaluation of product and service offerings, company reputation, logos, corporate branding, and so on. I have described the applications of Big Data technology in this area, including computerised market research and the analysis of Internet user sentiment in support of business management, in the publications posted on my profile on this ResearchGate portal.
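As a drastically simplified, hypothetical stand-in for the sentiment analytics described above, a lexicon-based scorer might look like this. The word lists and posts are invented for illustration; production systems use trained language models over millions of posts:

```python
# Toy lexicon-based sentiment scoring of social-media posts about a brand.
POSITIVE = {"great", "love", "reliable", "recommend"}
NEGATIVE = {"broken", "slow", "terrible", "refund"}

def post_sentiment(text: str) -> int:
    # Positive minus negative lexicon hits in the post.
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def brand_sentiment(posts: list) -> float:
    # Mean score across posts: above zero leans positive, below negative.
    return sum(post_sentiment(p) for p in posts) / len(posts)

posts = [
    "love the new model, great battery",
    "delivery was slow and support terrible",
    "reliable product, would recommend",
]
print(round(brand_sentiment(posts), 2))  # positive on balance
```

Tracked over time, such an aggregate score is one signal a company can feed into the market-research and brand-monitoring processes described above.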
In view of the above, I address the following questions to the esteemed community of scientists and researchers:
How do new ICT information technologies and Industry 4.0 contribute to improving the efficiency of business entities?
Which areas of the economic activity of companies and enterprises are improved by the implementation of ICT and/or Industry 4.0 technologies?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
Hi, Everyone,
Kindly share your input on the above questions. We all know that digital health has become more relevant to health systems than anticipated even a few years ago. Populations around the world are increasingly technologically inclined; although some gaps remain, the majority of people are now accustomed to using ICT to manage their health and wellness.
I look forward to your responses.
Thank you all.
In the not-too-distant future, will it be possible to merge human consciousness with a computer, or to transfer human consciousness and knowledge to a computer system equipped with sufficiently advanced artificial intelligence?
This kind of vision, involving the transfer of the consciousness and knowledge of a specific human being to a computer system equipped with suitably advanced artificial intelligence, was depicted in the science fiction film "Transcendence" (starring Johnny Depp). It has been reported that research is underway at one of Elon Musk's technology companies to create an intelligent computerised system that can communicate with the human brain in a way far more technologically advanced than current standards. The goal is an intelligent computerised system, equipped with a new generation of artificial intelligence technology, to which a copy of the knowledge and consciousness contained in the brain of a specific person could be transferred, according to a concept similar to that depicted in "Transcendence". In considering the possible future feasibility of such concepts, the paraphilosophical question arises of extending the life of a human being whose consciousness continues to function in a suitably advanced intelligent information system after the person from whom that consciousness originated has died. And even if this were possible in the future, how should the issue be framed in terms of the ethics of science, the essence of humanity, and so on? On the other hand, research and implementation work is already underway in many technology companies' laboratories to create systems of non-verbal communication, in which messages are transmitted from a human to a computer without the use of a keyboard: systems that recognise messages formulated non-verbally, in the form of thoughts alone, using purpose-built sensors of electrical impulses and brain waves, and transmit the information thus read to an artificial intelligence system. This kind of solution will probably be available relatively soon, as it does not require artificial intelligence as advanced as would be needed for an information system into which the consciousness and knowledge of a specific person could be uploaded. Ethical questions nevertheless arise concerning such a transfer and, perhaps through it, the creation of artificial consciousness.
In view of the above, I address the following question to the esteemed community of researchers and scientists:
In the not-too-distant future, will it be possible to merge human consciousness with a computer, or to transfer human consciousness and knowledge to a computer system equipped with sufficiently advanced artificial intelligence?
And if so, what do you think about this in terms of the ethics of science, the essence of humanity, etc.?
And what is your opinion on this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
How can new ICT and Industry 4.0 technologies help in environmental monitoring and in the conservation of the tropical Amazon rainforest and other forest and green areas?
Ongoing technological advances are improving the computerised analytical techniques applied to large data sets. The development of solutions typical of the current fourth technological revolution, including new generations of ICT and Industry 4.0 technologies, makes it possible to build multi-criteria analyses and simulation and forecasting models on large sets of information and data. Such analyses are carried out with computerised analytical tools, including Big Data Analytics, in conjunction with other Industry 4.0 technologies. When these tools are enriched with Internet of Things technologies, cloud computing, and satellite-based sensing and monitoring, it becomes possible to conduct multi-criteria analyses of large natural, climatic and other areas in real time using satellites. When artificial intelligence, machine learning, multi-criteria simulation models and digital twins are added, opportunities arise for creating predictive simulations of complex, multi-factor processes in real time. These can be natural, climatic and ecological processes concerning changes in the state of the environment, environmental pollution, the condition of ecosystems and biodiversity, the state of soils in agricultural fields, moisture levels in forest areas, and other changes caused by civilisational factors. In view of the above, new ICT and Industry 4.0 technologies can also help to monitor the state of the environment and to protect the tropical Amazon rainforest and other forest and green areas.
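One elementary building block of such monitoring, flagging anomalous drops in a sensor series against a trailing baseline, might be sketched like this. The readings and thresholds are invented for illustration; real systems fuse many satellite and IoT data streams:

```python
from statistics import mean

# Hypothetical daily soil-moisture readings for one forest plot
# (fraction of saturation), e.g. from satellite or IoT sensors.
readings = [0.42, 0.41, 0.40, 0.39, 0.38, 0.22, 0.20, 0.37, 0.38]

def drought_alerts(readings, window=5, drop=0.30):
    # Flag each day whose reading falls more than `drop` (30%)
    # below the mean of the preceding `window` days.
    alerts = []
    for i in range(window, len(readings)):
        baseline = mean(readings[i - window:i])
        if readings[i] < baseline * (1 - drop):
            alerts.append(i)
    return alerts

print(drought_alerts(readings))  # flags days 5 and 6
```

In a full monitoring system such alerts would trigger closer inspection of the flagged area, e.g. higher-resolution imaging or a field survey.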
In view of the above, I address the following question to the esteemed community of researchers and scientists:
How can new ICT and Industry 4.0 technologies help in monitoring the state of the environment and biodiversity and in protecting the tropical Amazon rainforest and other forest and green areas?
In what configuration of individual Industry 4.0 technologies should computerised environmental monitoring systems be built as essential elements of a system for protecting the tropical Amazon rainforest and other forest and green areas?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Regards,
Dariusz Prokopowicz
Can Big Data Analytics technology be helpful in forecasting complex, multi-faceted climate, natural, social, economic, pandemic and similar processes?
Industry 4.0 technologies, including Big Data Analytics, are used in the multi-criteria processing and analysis of large data sets. Advances in ICT make it possible to apply analytics carried out on large sets of data concerning various aspects of the activities of companies, enterprises and institutions operating in different sectors and branches of the economy.
Before the development of ICT, IT tools and personal computers in the second half of the 20th century, as part of the third technological revolution, the computerised, partially automated processing of large data sets was very difficult or impossible. As a result, building multi-criteria, multi-factor models of complex structures from big data, as well as simulation and forecasting models, was limited or impossible. The technological advances of the current fourth technological revolution and the development of Industry 4.0 have changed a great deal in this regard. More and more companies and enterprises are building computerised systems that allow the creation of multi-criteria simulation models in the form of so-called digital twins: computerised counterparts of the real economic and production processes taking place in the enterprise. An additional advantage of this type of solution is the ability to run simulations and study how the modelled processes change after certain impact factors are applied or certain categories of risk materialise. When large sets of historical quantitative data showing changes in specific factors over time are added to such digital-twin simulation models, it becomes possible to create complex multi-criteria forecasting models presenting potential scenarios for the development of specific processes in the future. The complex multi-criteria processes for which such forecasting models can be built include climatic, natural, social, economic and pandemic processes, which can be analysed as the environment in which specific companies, enterprises and institutions operate.
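The scenario-forecasting idea described above can be illustrated with a minimal Monte Carlo sketch: a single quantity (demand, emissions, caseload, and so on) is projected forward under random shocks and the spread of outcomes is summarised. The drift and volatility figures are arbitrary assumptions, not estimates of any real process:

```python
import random
from statistics import mean, quantiles

def simulate_path(start, drift, volatility, steps, rng):
    # One scenario: apply a random shock around the drift at each step.
    value = start
    for _ in range(steps):
        value *= 1 + drift + rng.gauss(0, volatility)
    return value

def forecast(start=100.0, drift=0.01, volatility=0.05,
             steps=12, runs=5000, seed=42):
    # Run many scenarios and summarise the distribution of outcomes.
    rng = random.Random(seed)
    outcomes = [simulate_path(start, drift, volatility, steps, rng)
                for _ in range(runs)]
    q = quantiles(outcomes, n=20)  # 5%, 10%, ..., 95% cut points
    return {"mean": mean(outcomes), "p5": q[0], "p50": q[9], "p95": q[18]}

result = forecast()
# The 5th/50th/95th percentiles bound the plausible range of scenarios.
assert result["p5"] < result["p50"] < result["p95"]
```

A digital-twin-based forecasting model follows the same pattern at far greater scale, with many interacting factors calibrated on historical data instead of a single assumed drift.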
In view of the above, I address the following question to the esteemed community of researchers and scientists:
Can Big Data Analytics technology be helpful in forecasting complex, multi-faceted climate, natural, social, economic, pandemic and similar processes?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz