Science topic
Big Data - Science topic
In information technology, big data is a loosely-defined term used to describe data sets so large and complex that they become awkward to work with using on-hand database management tools.
Questions related to Big Data
Should the intelligent chatbots created by technology companies available on the Internet be connected to the resources of the Internet to its full extent?
As part of the development of the concept of universal open access to knowledge resources, should the intelligent chatbots created by technology companies available on the Internet be connected to the resources of the Internet to their full extent?
There are many types of websites and sources of data and information on the Internet. The first Internet-accessible intelligent chatbot, ChatGPT, released by OpenAI in November 2022, performs commands, solves tasks and writes texts on the basis of knowledge resources, data and information downloaded from the Internet. These resources were not fully up to date, as they had last been downloaded from selected websites and portals in January 2022, and they came from a selection of libraries, articles, books, online portals indexing scientific publications and similar sources, so the data had been selected in a particular way. In 2023, more leading Internet technology companies developed and released their own intelligent chatbots, some of which draw on data and information that is much more current than the first publicly available versions of ChatGPT. In November 2023, the social media site X (formerly Twitter) released its intelligent chatbot in the US, which reportedly works on the basis of up-to-date information entered into the site through users' posts, messages and tweets. Also, in October 2023, OpenAI announced that it would create a new version of ChatGPT that would likewise draw data and knowledge from updated resources downloaded from multiple websites. As a result, rival leading technology firms are constantly refining their evolving chatbot designs, which will increasingly use updated data, information and knowledge resources drawn from selected websites, web pages and portals. The rapid technological advances currently taking place in artificial intelligence may in the future lead to the integration of generative artificial intelligence and general artificial intelligence developed by technology companies. Competing technology companies may strive to build advanced artificial intelligence systems that achieve a high level of autonomy and independence from humans, which could allow the development of artificial intelligence technology to slip out of human control. Such a situation may arise if a highly advanced general artificial intelligence becomes capable of self-improvement and carries out that self-improvement independently of humans, i.e. while escaping human control. Before this happens, however, technologically advanced artificial intelligence may first achieve the ability to select for itself the data and information it uses to carry out specific assigned tasks and to execute those tasks in real time using up-to-date data and online knowledge resources.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
As part of the development of the concept of universal open access to knowledge resources, should the intelligent chatbots created by technology companies available on the Internet be connected to Internet resources to their full extent?
Should the intelligent chatbots created by technology companies available on the Internet be connected to the resources of the Internet to the full extent?
And what is your opinion about it?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

Digital transformation seems to be more than just the digitization of data and processes, or digitization combined with robotisation. It leads to a special kind of socio-economic change. With digital transformation, events gain momentum and affect the functioning of organisations and many aspects of individuals' lives, with consequences such as:
• the emergence of the almost ubiquitous Internet of Things – subjectivity and objectivity become complex,
• the unreal world becomes a new reality,
• the use of smartphones – the need for continuous communication (Fear of Missing Out),
• virtual assistants,
• threats to our private lives through the unauthorized use of security cameras and surveillance equipment.
The bank must be safe but also fast, cheap, tailored to the customer's needs and smart. Today it is difficult to talk about customer loyalty or sentiment. Today's client is mobile; he comes and goes, and he stays with the bank not out of sentiment or habit but because the bank accompanies him in all phases of his life as a consumer and as an economic entity.
Please take a look at the chapter below; I would be very happy to hear your thoughts.
Is this the end of banking as we know it? Will AI be the future of banking? Will banks soon become digitized mechanisms with AI advisors?
Based on my earlier research:
(PDF) Role of digitization for German savings banks. Available from: https://www.researchgate.net/publication/344808656_Role_of_digitization_for_German_savings_banks [accessed Nov 28 2023].
What analytical tools available on the Internet and supported by artificial intelligence technology, machine learning, deep learning and artificial neural networks can be helpful in business, i.e. can be used in companies and/or enterprises to improve specific activities and areas of business and to implement economic, investment and business projects?
Since OpenAI brought ChatGPT online in November 2022, interest among business entities in the possibilities of using intelligent chatbots for various aspects of business operations has grown strongly. Intelligent chatbots originally enabled mainly conversations and discussions and answered questions using specific resources of data, information and knowledge taken from a selection of websites. In the following months, OpenAI released further intelligent applications on the Internet that allow users to generate images, photos, graphics and videos, solve complex mathematical tasks, create software for new computer applications, generate analytical reports and process various types of documents on the basis of formulated commands. In addition, in 2023 other technology companies also began to make their intelligent applications available on the Internet, through which complex tasks can be carried out that facilitate certain processes and aspects of the operations of companies, enterprises, financial institutions and so on, and thus facilitate business. The number of intelligent applications and tools available on the Internet that can support various aspects of business activity carried out in companies and enterprises is steadily increasing, and the number of new business applications of these smart applications is growing rapidly.
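As a minimal illustration of the kind of machine-learning-based analytical tool meant here, the sketch below trains a simple customer-churn model with the open-source scikit-learn library; the file name and column names are invented placeholders rather than any specific commercial product.

```python
# Illustrative sketch: a simple machine-learning model for customer churn,
# the kind of analytical task that AI-supported business tools can automate.
# "customers.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

df = pd.read_csv("customers.csv")                     # hypothetical data set
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]                                     # 1 = customer left, 0 = stayed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```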
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the analytical tools available on the Internet supported by artificial intelligence technology, machine learning, deep learning, artificial neural networks, which can be helpful in business, can be used in companies and/or enterprises for improving certain activities, areas of business activity, implementation of economic, investment, business projects, etc.?
What are the AI-enabled analytical tools available on the Internet that can be helpful to business?
And what is your opinion on this topic?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

What are the primary issues and problems in the data ingestion phase of a big data workflow, and how can businesses ensure efficient and reliable ingestion of varied data sources?
In the era of big data and artificial intelligence (AI), where aggregated data is used to learn about patterns and for decision-making, the quality of input data is of paramount importance. Poor data quality may lead not only to wrong outcomes, which would simply render an application useless, but more importantly to breaches of fundamental rights and undermined trust in the public authorities using such applications. In law enforcement, as in other sectors, the question remains of how to ensure that data used for the development of big data and AI applications meet quality standards.
In law enforcement, as in other sectors, the key element in ensuring the quality and reliability of big data and AI applications is the quality of the raw material. However, the negative effects of flawed data quality in this context extend far beyond the typical ramifications, since they may lead to wrong and biased decisions producing adverse legal or factual consequences for individuals, such as detention, being a target of infiltration, or being a subject of investigation or other intrusive measures (e.g., a computer search).
source:
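As a small illustration of the ingestion-time quality checks the question above refers to, the following sketch validates an incoming batch with pandas before it enters the pipeline; the CSV file, column names and rejection threshold are assumptions for the example.

```python
# Minimal sketch of ingestion-time data-quality checks (hypothetical schema).
import pandas as pd

df = pd.read_csv("incoming_batch.csv")        # hypothetical source file

report = {
    "rows": len(df),
    "duplicate_rows": int(df.duplicated().sum()),
    "missing_per_column": df.isna().sum().to_dict(),
    # example of a domain rule; adjust to the actual schema
    "negative_amounts": int((df["amount"] < 0).sum()) if "amount" in df else None,
}
print(report)

# Reject the batch if a basic threshold is violated (threshold is an assumption).
if report["duplicate_rows"] > 0.01 * report["rows"]:
    raise ValueError("Too many duplicate records - batch rejected")
```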
What are the biggest strategic challenges that insurance companies are facing as a result of digitalisation?
How is AI use in medical practice distinguished from big data analytics applications for health care delivery and population health?
How should the architecture of an effective computerised platform for detecting fake news and other forms of disinformation on the Internet, built using Big Data Analytics, artificial intelligence and other Industry 4.0 technologies, be designed?
The scale of disinformation on the Internet, including fake news, has been growing in recent years, mainly on social media. Disinformation develops chiefly on social media sites that are popular among young people, children and teenagers. The growing scale of disinformation is particularly socially damaging given the key objective pursued by cybercriminals and certain organisations, which use, for example, the technique of publishing posts and banners containing fake news from fake profiles of fictitious Internet users. The aim is to influence public opinion, to shape the general social awareness of citizens, to influence the assessment of specific policies of governments, national and/or international organisations, and public and other institutions, to influence the ratings, credibility, reputation and recognition of specific institutions, companies and enterprises, their product and service offerings, and individuals, and to influence the results of parliamentary, presidential and other elections. In parallel, both the scale of cybercriminal activity and the improvement of cyber security techniques have been growing on the Internet in recent years. Therefore, as part of improving techniques to reduce the scale of disinformation spread deliberately by specific national and/or international organisations, computerised platforms are being built for detecting fake news and other forms of disinformation on the Internet, using Big Data Analytics, artificial intelligence and other Industry 4.0 technologies. Since cybercriminals and organisations generating disinformation use new Industry 4.0 technologies to create fake profiles on popular social networks, new information technologies of Industry 4.0, including but not limited to Big Data Analytics, artificial intelligence, deep learning and machine learning, should also be used to reduce the scale of such activities that are harmful to citizens.
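As one hedged illustration of a single component of such a platform, the sketch below trains a supervised text classifier that flags potentially false posts; the labelled training file is hypothetical, and a real platform would combine such a classifier with source-reputation scoring, network analysis of fake profiles and human review.

```python
# Illustrative sketch: a supervised text classifier as one building block of a
# fake-news detection platform. "labelled_posts.csv" is a hypothetical file
# with columns: text, label (1 = fake, 0 = credible).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

data = pd.read_csv("labelled_posts.csv")
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=3),
                    LogisticRegression(max_iter=1000))
clf.fit(data["text"], data["label"])

# Score a new post (invented example); 1 means "potentially fake".
print(clf.predict(["Breaking: miracle cure suppressed by governments!"]))
```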
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How should the architecture of an effective computerised platform for detecting fake news and other forms of disinformation on the Internet, built using Big Data Analytics, artificial intelligence and other Industry 4.0 technologies, be designed?
And what do you think about it?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best wishes,
Dariusz Prokopowicz

How will the rivalry between IT professionals operating on two sides of the barricade, i.e. in the sphere of cybercrime and cyber security, change after the implementation of generative artificial intelligence, Big Data Analytics and other technologies typical of the current fourth technological revolution?
Almost from the very beginning of the development of ICT, a rivalry has been playing out between IT professionals operating on the two sides of the barricade, i.e. in the sphere of cybercrime and in the sphere of cyber security. Whenever technological progress produces a new technology that facilitates the development of remote communication and the digital transfer and processing of data, that new technology is also put to use in hacking and/or cybercriminal activities. Similarly, when the Internet appeared, a new sphere of remote communication and digital data transfer was created on the one hand, while on the other hand new techniques of hacking and cybercriminal activity emerged, for which the Internet became something of a perfect environment for development. Now, perhaps, the next stage of technological progress is taking place: the transition from the fourth to the fifth technological revolution and the development of Industry 5.0 technologies supported by generative artificial intelligence, built on artificial neural networks that are continuously improved through deep learning. The development of generative artificial intelligence technology and its applications will significantly increase the efficiency of business processes and raise labour productivity in the manufacturing processes of companies and enterprises operating in many different sectors of the economy. Accordingly, after the implementation of generative artificial intelligence as well as Big Data Analytics and other technologies typical of the current fourth technological revolution, the competition between IT professionals operating on the two sides of the barricade, i.e. in the sphere of cybercrime and cyber security, will probably change. But what will be the essence of these changes?
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How will the competition between IT professionals operating on the two sides of the barricade, i.e., in the sphere of cybercrime and cyber security, change after the implementation of generative artificial intelligence, Big Data Analytics and other technologies typical of the current fourth technological revolution?
How will the realm of cybercrime and cyber security change after the implementation of generative artificial intelligence?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

What are the possibilities of applying generative AI in terms of conducting sentiment analysis of changes in Internet users' opinions on specific topics?
What are the possibilities of applying generative artificial intelligence in carrying out sentiment analysis on changes in the opinions of Internet users on specific topics using Big Data Analytics and other technologies typical of Industry 4.0/5.0?
Nowadays Internet marketing is developing rapidly, including viral Internet marketing used on social media sites, for example Real-Time marketing in a viral formula. It is also marketing aimed at precisely defined groups and audience segments, i.e. potential customers of a specific advertised product and/or service offering. To improve Internet marketing, new ICT and Industry 4.0/5.0 technologies are being implemented. Marketing conducted in this form is usually preceded by market research carried out, among other things, through sentiment analysis of the preferences of potential consumers based on verification of their activity on the Internet, taking into account comments written on various websites, Internet forums and blogs and posts written on social media. In recent years, the importance of such sentiment analysis carried out on large data sets using Big Data Analytics has been growing; it makes it possible to study the psychological aspects of changes in the trends of certain processes in the markets for products, services, production factors and financial instruments. The development of this analytics makes it possible to study the determinants of specific market phenomena caused by changes in the preferences and behaviour of consumers in product and service markets, entrepreneurs in factor markets, and investors in money and capital markets, including securities markets. The results of these analyses are used to forecast changes in the behaviour of consumers, entrepreneurs and investors in the following months and quarters. In addition, sentiment analyses are conducted to determine the preferences and awareness of potential customers and consumers with regard to recognition of a company's brand, its offerings and descriptions of specific products and services, using textual data derived from comments, entries and posts published by Internet users, including social media users, on a wide variety of websites. The knowledge gained in this way can help companies plan marketing strategies, change their product and service offerings, and select or change specific distribution channels, after-sales services and so on. This is now a rapidly developing field of research, and the possibilities for companies and enterprises to use its results in marketing, but not only in marketing, are growing. Recently, opportunities have been emerging to apply generative artificial intelligence and other Industry 4.0/5.0 technologies to analyse the large data sets collected on Big Data Analytics platforms. In connection with the development of intelligent chatbots available on the Internet, there have recently been discussions about the potential applications of generative artificial intelligence, 5G and other Industry 4.0/5.0 technologies in using the information resources of the Internet to collect data on citizens, companies, institutions, etc. and to analyse them, among other things through sentiment analysis, in order to determine the opinions of Internet users on certain topics or to assess a company's brand recognition and Internet users' evaluation of its product or service offerings.
In recent years, the scope of applications of Big Data technology, Data Science and Data Analytics in economics, finance and the management of organisations, including enterprises and financial and public institutions, has been increasing. Accordingly, the implementation of analytical instruments for the advanced processing of large data sets in enterprises and in financial and public institutions, i.e. the construction of Big Data Analytics platforms to support organisational management processes in various aspects of operations, including the improvement of customer relations, is also growing in importance. ICT and Industry 4.0/5.0 technologies, including generative artificial intelligence, are developing particularly rapidly and finding application in knowledge-based economies, both in scientific research and in business applications in commercially operating enterprises and in financial and public institutions. The application of generative artificial intelligence technologies to the collection and multi-criteria analysis of Internet data can significantly improve sentiment analysis of Internet users' opinions and expand the applications of research techniques carried out on Business Intelligence, Big Data Analytics and Data Science platforms and of other research techniques using ICT, the Internet and advanced data processing typical of Industry 4.0/5.0. Most consumers of online information services available on new online media, including social media portals, are not fully aware of the level of risk involved in sharing information about themselves on these portals and of the use of this data for analytics by online technology companies. I am conducting research on this issue; I have presented the conclusions of my research in scientific publications available on ResearchGate, and I invite you to cooperate with me.
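As a minimal sketch of the sentiment analysis discussed above, the example below scores a few invented comments with the open-source VADER lexicon from NLTK; on a Big Data Analytics platform the same scoring step would be distributed over millions of posts.

```python
# Minimal sketch of lexicon-based sentiment scoring of user comments (VADER).
# The comments are invented examples; in practice they would come from scraped
# or API-collected posts stored on a Big Data platform.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

comments = [
    "I love the new banking app, transfers are instant!",
    "Worst customer service I have ever experienced.",
]
for c in comments:
    # compound score ranges from -1 (very negative) to +1 (very positive)
    print(sia.polarity_scores(c)["compound"], c)
```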
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the possibilities for the application of generative AI in terms of conducting sentiment analysis of changes in the opinions of Internet users on specific topics using Big Data Analytics and other technologies typical of Industry 4.0/5.0?
What are the possibilities of using generative AI in conducting sentiment analysis of Internet users' opinions on specific topics?
And what is your opinion on this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Dariusz Prokopowicz

How to build an intelligent computerized Big Data Analytics system that would retrieve real-time data and information from specific online databases, scientific knowledge indexing databases, domain databases, online libraries, information portals, social media, etc., and thus provide a database and up-to-date information for an intelligent chatbot, which would then be made available on the Internet for Internet users?
Almost every major technology company operating on the Internet has either already made its intelligent chatbot available online or is working on one and will soon make it available to Internet users. The general formula for building, organising and providing intelligent chatbots is analogous across technology companies, but in detailed technological aspects specific solutions differ. The differing solutions include the timeliness of the data and information contained in the created databases of digitised data, data warehouses, Big Data databases, etc., which hold data sets acquired at different times and with different information characteristics from various online knowledge bases, publication indexing databases, online publication libraries, information portals, social media and other Internet sources.
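A toy sketch of the retrieval layer of such a system is shown below: freshly fetched documents are indexed and the most relevant ones are returned as context for a chatbot. The URLs, the TF-IDF ranking and the omission of freshness weighting are simplifying assumptions, not a description of any vendor's actual architecture.

```python
# Toy sketch of a retrieval layer feeding an intelligent chatbot with
# recently fetched documents. URLs are hypothetical placeholders.
import requests
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

urls = ["https://example.org/news1", "https://example.org/news2"]   # placeholders
docs = [requests.get(u, timeout=10).text for u in urls]

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(docs)

def retrieve(query, k=1):
    """Return the k documents most similar to the query (freshness weighting omitted)."""
    sims = cosine_similarity(vec.transform([query]), doc_matrix)[0]
    return [docs[i] for i in sims.argsort()[::-1][:k]]

context = retrieve("central bank interest rate decision")
# The retrieved context would then be passed to the chatbot together with the user's question.
```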
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How to build an intelligent computerized Big Data Analytics system that would retrieve real-time data and information from specific online databases, scientific knowledge indexing databases, domain databases, online libraries, information portals, social media, etc., and thus provide a database and up-to-date information for an intelligent chatbot, which would then be made available on the Internet for Internet users?
How to build a Big Data Analytics system that would provide a database and up-to-date information for an intelligent chatbot made available on the Internet?
And what is your opinion on this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

The topic of my master's thesis is "The use of Big Data and Data Science technologies to assess the investment attractiveness of companies."
I plan to design and implement a system for market analysis based on graphs.
I will be grateful to you for links to scientific articles on this topic.
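As a toy illustration of the kind of graph-based market analysis meant here, the sketch below models companies as nodes and business relationships as edges and ranks companies by betweenness centrality; all names and links are invented.

```python
# Toy sketch: companies as nodes, business links (ownership, supply,
# co-investment) as edges; centrality as a crude proxy for a company's
# position in the market network. All data below is invented.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("CompanyA", "CompanyB"), ("CompanyA", "CompanyC"),
    ("CompanyB", "CompanyD"), ("CompanyC", "CompanyD"), ("CompanyD", "CompanyE"),
])

centrality = nx.betweenness_centrality(G)
for firm, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(firm, round(score, 3))
```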
How does AI differ from standard biostatistics?
What is "big data"? How does AI enable the analysis of big data sets? Can we argue for combining the two for effective use and efficient delivery in several diverse sectors (business, science, government)?
Dear All,
I would appreciate your kind help in completing the survey on the role of big data in cybersecurity, given below. Your answers will be a great help to my research and knowledge.
Please pass it on to anyone who is knowledgeable about big data and cybersecurity.
Sincere regards,
Maytha Alshamsi
How should AI-assisted Big Data centers be developed so that they fit in with the Sustainable Development Goals?
How should Big Data centers aided by AI technology be developed so that they fit in with sustainability goals, i.e. so that they do not consume large amounts of electricity and/or are powered by renewable and carbon-free energy sources?
Generative artificial intelligence technology, which with the help of deep learning applied to artificial neural networks is taught specific skills and activities previously performed only by humans, is finding more and more new applications in various branches of the economy and in various types of business entities. Generative artificial intelligence helps to solve complex tasks that require processing large sets of data in a relatively short time, which already far exceeds human capabilities. Therefore, more and more new tools based on generative artificial intelligence are being created and engaged in solving specific tasks in which a number of criteria must be met in order to create a precisely specified product, project or innovative solution, or to find a solution to a complex problem. This type of complex problem solving includes the creation of new solutions for green technology and eco-innovation, which can be helpful in accelerating and increasing the efficiency of the green transformation of the economy, including the green transformation of the energy sector based, among other things, on the development of renewable and emission-free energy sources. Paradoxically, however, generative artificial intelligence performing certain commissioned tasks, i.e. working on large data sets collected in data centers and using Big Data Analytics solutions, consumes large amounts of electricity. Where these large amounts of electricity are generated by burning fossil fuels in dirty combustion-based power generation, these new technological solutions, increasingly categorised as Industry 5.0, unfortunately cannot be described as green, pro-climate, pro-environment, sustainable or pursuing the Sustainable Development Goals. Accordingly, Big Data centers assisted by artificial intelligence technology should be developed to fit in with sustainability goals, i.e. not to generate high electricity consumption and/or to be powered by renewable and carbon-free energy sources. Such Big Data centers should therefore be designed and built so that power plants generating energy from renewable sources are built next to them, or above them if they are built underground, such as wind farms and/or photovoltaic installations or other emission-free power plants. In the future these may also include a new generation of nuclear power plants generating energy from the spent fuel produced by currently operating nuclear power plants based on traditional nuclear technologies. Another future solution for emission-free clean energy may be a new generation of nuclear power based on cold fusion. In addition, the technologies categorised as the energy of the future also include energy based on green hydrogen and new types of energy resources that may be extracted in space. An effective combination of these green energy technologies with ICT and Industry 4.0/5.0 information technologies may lead to the creation of AI-assisted green Big Data centers.
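A back-of-envelope calculation illustrates the scale of the problem; all figures below (IT load, PUE, grid carbon intensities) are illustrative assumptions, not measurements of any real data center.

```python
# Back-of-envelope estimate of a data center's annual energy use and CO2
# emissions, contrasting a fossil-fuel grid with renewable supply.
# All numbers are illustrative assumptions.
it_load_mw = 10              # assumed average IT load in megawatts
pue = 1.4                    # assumed power usage effectiveness (cooling, overhead)
hours_per_year = 8760
grid_intensity = 0.7         # assumed tonnes CO2 per MWh for a coal-heavy grid
renewable_intensity = 0.03   # assumed lifecycle tonnes CO2 per MWh for wind/solar

energy_mwh = it_load_mw * pue * hours_per_year
print(f"Annual energy use: {energy_mwh:,.0f} MWh")
print(f"CO2 on a fossil-fuel grid: {energy_mwh * grid_intensity:,.0f} t")
print(f"CO2 with renewable supply: {energy_mwh * renewable_intensity:,.0f} t")
```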
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How should AI-assisted Big Data centers be developed so that they fit in with the Sustainable Development Goals, i.e. so that they do not consume large amounts of electricity and/or are powered by renewable and carbon-free energy sources?
How should AI-assisted Big Data centers be developed so that they fit in with sustainability goals?
And what is your opinion on this topic?
What is your opinion on this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Dariusz Prokopowicz

Hi,
I need help with big data processing. I am trying to figure out how Twitter processes big data and what can be done with big data. Can you explain briefly and/or suggest references on this subject?
Thanks,
Best regards,
With the advent of new technologies (e.g., AI and big data), reports suggest that the shortage of technological talent may affect the operations of organizations. How, then, should human resources departments improve the retention of existing talent?
I believe that effective motivational strategies should be adopted, such as improving the employee experience through flexible working hours or remote working.
I would like to ask for your opinion on this aspect, thank you very much!
1. Malware execution and analysis on IoT and GPU-based processor devices
Given that IoT devices and GPUs have different processors, how would the same or different kinds of malware affect them, and how should a forensic investigation be conducted that covers both technologies?
2. Malware execution and analysis on IoT and GPU-based processor devices
Given that IoT devices and GPUs have different processors, how would the same or different kinds of malware affect them in a big data environment?
Dear Researchers, I am looking for open-source Gravity/Magnetic data for interpretation via Oasis montaj Software and Voxi Earth Modeling. Please specify some sources from which the data is easily accessible.
Regards,
Ayaz
Hello everyone. I have a question about obtaining data from the Internet.
In my research I will analyze comments from websites and social media platforms, and I am searching for applications, apps or other tools to download comments from the Internet to my computer.
Do you know any tools or apps to download comments for free?
There are around 10,000 comments, and if I were to copy and paste them one by one it would take a lot of time. I want to obtain the data quickly.
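For a plain HTML page, something like the minimal sketch below may be a starting point; the URL and CSS selector are placeholders, and for social media platforms the official APIs (and each site's terms of service) should be used instead of scraping.

```python
# Minimal scraping sketch for a plain HTML page. The URL and CSS class are
# placeholders; always check the site's terms of service and prefer official
# APIs where they exist.
import csv
import requests
from bs4 import BeautifulSoup

url = "https://example.com/article-with-comments"     # placeholder
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

comments = [c.get_text(strip=True) for c in soup.select(".comment-text")]  # placeholder selector
with open("comments.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerows([[c] for c in comments])
```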
Do you have any suggestions for me?
Thank you so much for your help.
Regards, Nejc
Can the applicability of Big Data Analytics backed by artificial intelligence technology be significantly enhanced when these technologies are applied to the processing of large data sets extracted from the Internet and carried out on the most powerful quantum computers?
Can the conduct of analysis and scientific research be significantly improved, its efficiency increased and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and carried out on the most powerful quantum computers?
What are the analytical capabilities of processing large data sets extracted from the Internet on the most powerful quantum computers that also apply Industry 4.0/5.0 technologies, including generative artificial intelligence and Big Data Analytics?
Can the scale of data processing carried out by the most powerful quantum computers be comparable to the processing that takes place in the billions of neurons of the human brain?
In recent years, the digitization of data and archived documents, the digitization of data transfer processes, etc., has been progressing rapidly.
The progressive digitization of data and archived documents, the digitization of data transfer processes, and the Internetization of communications, economic processes and research and analytical processes are becoming typical features of today's developed economies. Accordingly, developed economies in which information and computer technologies are developing rapidly and finding numerous applications in various economic sectors are called information economies, and the societies operating in them are referred to as information societies. Increasingly, discussions of this issue state that another technological revolution is currently taking place, described as the fourth and, in some aspects, already the fifth. Technologies classified as Industry 4.0/5.0 are developing particularly rapidly and finding more and more applications. These technologies, which support research and analytical processes carried out in various institutions and business entities, include Big Data Analytics and artificial intelligence, including generative artificial intelligence based on artificial neural networks subjected to deep learning. The computational capabilities of microprocessors, which are becoming ever more sophisticated and process data ever faster, are gradually increasing, and ever larger sets of data and information are being processed. The number of companies, enterprises and public, financial and scientific institutions that create large data sets and massive databases of data and information, generated in the course of their activities and obtained from the Internet, and process them in specific research and analytical processes, is growing. Accordingly, the opportunities for applying Big Data Analytics backed by artificial intelligence technology to improve research techniques, to increase the efficiency of the research and analytical processes used so far, and to improve the scientific research being conducted are also growing rapidly. By combining Big Data Analytics with other Industry 4.0/5.0 technologies, including artificial intelligence, and with quantum computers in the processing of large data sets, the analytical capabilities of data processing, and thus also of analysis and scientific research, can be significantly increased.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the conduct of analysis and scientific research be significantly improved, its efficiency increased and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and carried out on the most powerful quantum computers?
Can the applicability of Big Data Analytics supported by artificial intelligence technology increase significantly when these technologies are applied to the processing of large data sets extracted from the Internet and carried out on the most powerful quantum computers?
What are the analytical capabilities of processing large data sets obtained from the Internet on the most powerful quantum computers?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

In your opinion, does it make sense to create a new generation of something similar to ChatGPT, which will use databases built solely on the basis of continuously updated data, information, objectively verified knowledge resources taken from online scientific knowledge bases, online scientific portals and online indexing databases of scientific publications?
I'm curious to know what you think about this. Such a solution, based on an intelligent publication search system and an intelligent system for analysing the content of publications retrieved from online scientific portals, could be of great help to researchers and scientists. In my opinion, the creation of a new generation of something similar to ChatGPT that uses databases built solely on online scientific knowledge bases, online scientific portals and online scientific publication indexing databases makes sense, provided that basic issues of copyright are respected and that such tools use continuously updated, objectively and scientifically verified resources of knowledge, data and information. With such a solution, researchers and scientists working on a specific topic could review the literature across the millions of scientific publications collected in specific online scientific portals and publication indexing databases, and, importantly, such a partially automated literature review would probably be completed in a relatively short time. An intelligent system for searching and analysing the content of scientific publications would, in a short time, select from the millions of archived texts those publications in which other researchers and scientists have described analogous, similar, related or correlated issues and research results, within the same scientific discipline, on the same topic, or in an interdisciplinary field. It could also categorise the retrieved publications into those in which other researchers confirmed analogous conclusions from similar research, polemicised with the results of other researchers on a specific topic, obtained different results, or suggested other practical applications of research results obtained on the same or a similar topic. However, for ethical reasons and for the sake of properly conducted research, i.e. respecting the research results of other researchers and scientists, it would be unacceptable for this kind of intelligent system to enable plagiarism, i.e. to provide research results and retrieved content on specific issues and topics without accurately indicating the source of the data, a description of the source, the names of the authors of the publications and so on, with unreliable researchers taking advantage of this possibility. Such a system should therefore provide, for all retrieved publications, full bibliographic descriptions, source descriptions and footnotes containing all the data necessary to build complete source references for any citation of specific studies, research results, theses, data, etc. contained in publications written by other researchers and scientists.
So, building this kind of intelligent tool would make sense if ChatGPT-type tools were properly improved and the legal framework for their use appropriately supplemented, so that the use of such tools does not violate copyright and the tools are used ethically and do not generate misinformation. Improving these tools so that they do not generate disinformation and do not create "fictitious facts" in the form of descriptions, essays, photos, videos, etc. presenting plausibly described events that never happened requires keeping the Big Data systems up to date, i.e. updating the data sets and information on the basis of which they create answers to questions, descriptions, images and so on. This is important because current online tools like ChatGPT often create "nicely described fictitious facts", which is exploited to generate fake news and misinformation in online social media. If all of the above were corrected and the legal framework completed, and not only in some parts of the world but on a global scale, then the creation of a new generation of something similar to ChatGPT, using databases built solely on online scientific knowledge bases, online scientific portals and online indexing databases of scientific publications, would make sense and could prove helpful to people, including researchers and scientists. Besides, the current online ChatGPT-type tools are not perfect, as they do not draw data in real time directly from specific databases and from the knowledge contained in selected websites and portals, but instead draw information, knowledge and data from an offline database created some time ago. For example, the currently most popular ChatGPT still relies on a database of data and information contained in texts downloaded from selected websites and portals not today or yesterday, but back in 2021. So, on many issues, these data and information are already outdated. Hence the absurdities, inconsistencies with the facts and creation of "fictitious facts" by ChatGPT in a significant share of the answers it generates to questions asked by Internet users. In view of the above, such intelligent systems should be improved in a number of respects, technological, organisational, formal and normative, so that they can be used in open access for the applications I wrote about above.
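As a toy sketch of the requirement described above, the example below keeps the full bibliographic record attached to every indexed abstract, so that each retrieved answer is returned together with a citable source; the records, query and TF-IDF retrieval are invented simplifications of what a real scientific-literature system would do.

```python
# Toy sketch: every retrieved fragment keeps its full bibliographic record,
# so the system can always return a citable source. All records are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

papers = [
    {"authors": "Smith, J.", "year": 2022, "title": "Deep learning in finance",
     "abstract": "We study neural networks for credit risk assessment."},
    {"authors": "Kowalski, A.", "year": 2023, "title": "Sentiment and markets",
     "abstract": "Big data sentiment analysis predicts short-term market moves."},
]

vec = TfidfVectorizer()
matrix = vec.fit_transform(p["abstract"] for p in papers)

def search(query, k=1):
    sims = cosine_similarity(vec.transform([query]), matrix)[0]
    for i in sims.argsort()[::-1][:k]:
        p = papers[i]
        # The answer is always returned together with a full source reference.
        print(f'{p["abstract"]}  [Source: {p["authors"]} ({p["year"]}), "{p["title"]}"]')

search("neural networks for credit risk")
```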
In view of the above, I address the following question to the esteemed community of scientists and researchers:
In your opinion, does it make sense to create a new generation of something similar to ChatGPT, which will use databases built solely on the basis of continuously updated data, information, objectively verified knowledge resources taken from online scientific knowledge bases, online scientific portals and online indexing databases of scientific publications?
What do you think about creating a new generation of something similar to ChatGPT, which will use exclusively online scientific knowledge resources?
And what is your opinion about it?
What is your opinion on this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
Counting on your opinions, on getting to know your personal views, and on a fair approach to the discussion of scientific issues, I deliberately used the phrase "in your opinion" in the question.
The above text is entirely my own work written by me on the basis of my research.
Copyright by Dariusz Prokopowicz

What is the use of big data and artificial intelligence in Indian agriculture in achieving the Sustainable Development Goals?
Can the application of a combination of artificial intelligence technology, Big Data Analytics and quantum computers already assist in the strategic management of an enterprise?
Can the application of a combination of artificial intelligence technology, Big Data Analytics and quantum computers already today assist in carrying out multi-faceted, complex strategic analyses of the business environment and the determinants of company development, as well as predictive analyses based on the processing of large data sets, and therefore also in strategic business management?
The ongoing technological progress is characterised by the dynamic development of Industry 4.0/5.0 technologies, i.e. technologies typical of the current fourth technological revolution, including ICT information and communication technologies and technologies for the advanced, multi-criteria processing of large sets of data and information. The development of information processing technologies in the era of Industry 4.0/5.0 is determined by the development and growing application of ICT, Internet technologies and advanced data processing, which include Big Data Analytics, Data Science, cloud computing, artificial intelligence, machine learning, deep learning, the personal and industrial Internet of Things, Business Intelligence, autonomous robots, horizontal and vertical data system integration, multi-criteria simulation models, digital twins, additive manufacturing, Blockchain, smart technologies, cybersecurity instruments, Virtual and Augmented Reality, and other advanced data processing technologies such as Data Mining. Technological advances in computing, ever faster microprocessors and ever more capacious, high-speed data storage are making it possible to process large data sets faster and more efficiently. In addition, numerous new applications of these technologies, combined in various configurations, are emerging in various sectors of the economy, including numerous business applications in companies and enterprises. The implementation of these technologies in the business activities of companies, enterprises and financial and public institutions contributes to increasing the efficiency of certain processes. In view of the above, there is much to suggest that, if not now then soon, the application of a combination of artificial intelligence technologies, Big Data Analytics and quantum computers may be helpful in carrying out multi-faceted, complex strategic analyses of the business environment and the determinants of company development, and predictive analyses based on the processing of large data sets, and therefore also in strategic business management.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the application of the combination of artificial intelligence technology, Big Data Analytics and quantum computers already be helpful in the field of strategic business management?
Can the use of a combination of artificial intelligence technology, Big Data Analytics and quantum computers assist in strategic business management?
What do you think about this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

How is big data handled in the logistics process of an organization, and what advances have been made in recent years?
Dear Colleagues,
Title of Research Project: Exploring Reflective Learning Strategies in Big Data Analytics Education and Practice: A Cross-sectional Study on Adoption, Effectiveness, and Influencing Factors
I will appreciate your participation in this research study, which has been reviewed and approved by The Salomons Ethics Panel, Salomons Centre for Applied Psychology, Canterbury Christ Church University, UK. This questionnaire will take about 10-12 minutes to complete.
Here is the link to the questionnaire including the informed consent and participant information sheet: https://forms.gle/kFycqu7KXqKkzV8F9
Thanks,
Rossi A. Hassad, PhD, MPH, CStat, PStat
Canterbury Christ Church University, UK
How has Big Data analytics helped to reconstruct the history of our Earth?
Can the conduct of analysis and scientific research be significantly improved through the use of Big Data Analytics, artificial intelligence and quantum computers?
Can the possibilities of applying Big Data Analytics supported by artificial intelligence technology increase significantly when these technologies are applied to the processing of large data sets obtained from the Internet and carried out on the most powerful quantum computers?
Can the conduct of analysis and scientific research be significantly improved, its efficiency increased and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and carried out on the most powerful quantum computers?
What are the analytical capabilities of processing large data sets extracted from the Internet on the most powerful quantum computers that also apply Industry 4.0/5.0 technologies, including generative artificial intelligence and Big Data Analytics?
Can the scale of data processing carried out by the most powerful quantum computers be comparable to the data processing that is carried out in the billions of neurons of the human brain?
In recent years, the digitization of data and archived documents, digitization of data transfer processes, etc., has been progressing rapidly.
The progressive digitization of data and archived documents, the digitization of data transfer processes, and the Internetization of communications, economic processes and research and analytical processes are becoming typical features of today's developed economies. Accordingly, developed economies in which information and computer technologies are developing rapidly and finding numerous applications in various economic sectors are called information economies, and the societies operating in them are referred to as information societies. Increasingly, discussions of this issue state that another technological revolution is currently taking place, described as the fourth and, in some aspects, already the fifth. Technologies classified as Industry 4.0/5.0 are developing particularly rapidly and finding more and more applications. These technologies, which support research and analytical processes carried out in various institutions and business entities, include Big Data Analytics and artificial intelligence, including generative artificial intelligence based on artificial neural networks subjected to deep learning. The computational capabilities of microprocessors, which are becoming ever more sophisticated and process data ever faster, are gradually increasing, and ever larger sets of data and information are being processed. The number of companies, enterprises and public, financial and scientific institutions that create large data sets and massive databases of data and information, generated in the course of their activities and obtained from the Internet, and process them in specific research and analytical processes, is growing. Accordingly, the opportunities for applying Big Data Analytics backed by artificial intelligence technology to improve research techniques, to increase the efficiency of the research and analytical processes used so far, and to improve the scientific research being conducted are also growing rapidly. By combining Big Data Analytics with other Industry 4.0/5.0 technologies, including artificial intelligence, and with quantum computers in the processing of large data sets, the analytical capabilities of data processing, and thus also of analysis and scientific research, can be significantly increased.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the conduct of analysis and scientific research be significantly improved, its efficiency increased and the research process significantly shortened through the use of Big Data Analytics and artificial intelligence applied to the processing of large data sets and carried out on the most powerful quantum computers?
Can the applicability of Big Data Analytics supported by artificial intelligence technology increase significantly when these technologies are applied to the processing of large data sets obtained from the Internet and carried out on the most powerful quantum computers?
What are the analytical capabilities of processing large data sets extracted from the Internet on the most powerful quantum computers?
And what is your opinion about it?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

Can the application of artificial intelligence and Big Data Analytics technologies help improve system energy security management processes and enhance this security?
Probably yes, provided that new green technologies and the development of emission-free clean energy are a priority in the energy policy shaped by the government. The efficient application of artificial intelligence and Big Data Analytics technologies can help improve systemic energy security management processes and increase this security. However, it is crucial to combine the functionality of artificial intelligence and Big Data Analytics effectively and to apply these technologies efficiently to manage the risk of energy emergencies, to analyse the determinants shaping the development of energy and energy production, to analyse the factors shaping the level of energy security, and to forecast future energy production in the context of forecasting changes in the level of energy demand and in the energy that can be produced from specific types of energy sources, as determined by specific factors.
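As a minimal sketch of the forecasting side of such management, the example below fits a simple autoregressive model to an invented electricity-demand series; a real system would use far richer data (weather, generation mix, market prices) and far more sophisticated models.

```python
# Minimal sketch: predicting next-period electricity demand from recent history
# with a simple autoregressive linear model. The demand series is invented.
import numpy as np
from sklearn.linear_model import LinearRegression

demand = np.array([102, 98, 105, 110, 108, 115, 120, 118, 125, 130], dtype=float)  # GWh, invented

# Lagged features: use the previous 3 observations to predict the next one.
X = np.array([demand[i:i + 3] for i in range(len(demand) - 3)])
y = demand[3:]

model = LinearRegression().fit(X, y)
next_demand = model.predict(demand[-3:].reshape(1, -1))[0]
print(f"Forecast for next period: {next_demand:.1f} GWh")
```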
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Can the application of artificial intelligence and Big Data Analytics technologies help improve the processes of systemic energy security management and enhance this security?
Can artificial intelligence and Big Data Analytics help improve systemic energy security management processes?
And what is your opinion on this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz

Exploring the role of AI and data analytics in improving our ability to predict and manage pandemics
With the advent of new technologies (for example, AI, big data), the shortage of scientific and technological talents may affect the organization's operations. So how should the human resources department improve the retention rate of existing talents?
I believe that effective motivational strategies should be adopted, such as improving the employee experience through flexible working hours or remote working.
I would like to ask for your opinion on this aspect, thank you very much!
What are the possibilities for the applications of Big Data Analytics backed by artificial intelligence technology in terms of improving research techniques, in terms of increasing the efficiency of the research and analytical processes used so far, in terms of improving the scientific research conducted?
The progressive digitization of data and archived documents, the digitization of data transfer processes and the Internetization of communications, economic processes and also research and analytical processes are becoming typical features of today's developed economies. Currently, another technological revolution is taking place, described as the fourth and, in some respects, already the fifth technological revolution. Technologies categorized as Industry 4.0/5.0 are developing particularly rapidly and finding more and more applications. These technologies, which support research and analytical processes carried out in various institutions and business entities, include Big Data Analytics and artificial intelligence. The computational capabilities of microprocessors are successively increasing, and data is processed faster and faster. Ever larger sets of data and information are being processed. Databases of data and information extracted from the Internet and processed in the course of specific research and analytical processes are being created. In connection with this, the possibilities for applying Big Data Analytics supported by artificial intelligence technology to improve research techniques, to increase the efficiency of the research and analytical processes used so far and to improve the scientific research being conducted are also growing rapidly.
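As one small illustration of AI-assisted research analytics, the following Python sketch (assuming scikit-learn and entirely synthetic measurement data) flags anomalous observations in a data set that a researcher might then inspect manually:

import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical research measurements: mostly regular readings with a few injected outliers.
rng = np.random.default_rng(42)
readings = np.concatenate([rng.normal(0, 1, (980, 3)), rng.uniform(6, 9, (20, 3))])

# An isolation forest learns what "typical" observations look like and scores departures from them.
detector = IsolationForest(contamination=0.02, random_state=42).fit(readings)
labels = detector.predict(readings)          # -1 marks suspected anomalies

print("flagged observations:", int((labels == -1).sum()))

Automating such screening steps is one concrete way Big Data Analytics and AI can shorten routine parts of the research process.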
In view of the above, I address the following question to the esteemed community of scientists and researchers:
What are the possibilities of applications of Big Data Analytics supported by artificial intelligence technology in terms of improving research techniques, in terms of increasing the efficiency of the research and analytical processes used so far, in terms of improving the scientific research conducted?
What are the possibilities of applications of Big Data Analytics backed by artificial intelligence technology in terms of improving research techniques?
What do you think on this topic?
What is your opinion on this issue?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Best wishes,
The above text is entirely my own work written by me on the basis of my research.
Copyright by Dariusz Prokopowicz
On my profile on the ResearchGate portal you can find several publications on Big Data issues. I invite you to scientific cooperation in this problem area.
Dariusz Prokopowicz

Is it possible to build a highly effective forecasting system for future financial and economic crises based on artificial intelligence technology in combination with Data Science analytics, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies?
Is it possible to build a highly effective, multi-faceted, intelligent forecasting system for future financial and economic crises based on artificial intelligence technology in combination with Data Science analytics, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies as part of a forecasting system for complex, multi-faceted economic processes in such a way as to reduce the scale of the impact of the paradox of a self-fulfilling prediction and to increase the scale of the paradox of not allowing a predicted crisis to occur due to pre-emptive anti-crisis measures applied?
What do you think about the involvement of artificial intelligence in combination with Data Science, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies for the development of sophisticated, complex predictive models for estimating current and forward-looking levels of systemic financial, economic risks, debt of the state's public finance system, systemic credit risks of commercially operating financial institutions and economic entities, forecasting trends in economic developments and predicting future financial and economic crises?
Research and development work is already underway to teach artificial intelligence to 'think', i.e. to reproduce the conscious thought process realised in the human brain. The thinking process, awareness of one's own existence, the ability to think abstractly and critically, and the ability to separate knowledge acquired in the learning process from its processing in conscious, abstract thought are just some of the abilities attributed exclusively to humans. However, as part of technological progress and improvements in artificial intelligence technology, attempts are being made to create "thinking" computers or androids, and in the future there may be attempts to create an artificial consciousness that is a digital creation but functions in a similar way to human consciousness. At the same time, as part of improving artificial intelligence technology, creating its next generations and teaching artificial intelligence to perform work requiring creativity, systems are being developed to process the ever-increasing amount of data and information stored on Big Data Analytics platform servers and taken, for example, from selected websites. In this way, it may be possible in the future to create "thinking" computers which, based on online access to the Internet, data downloaded according to the needs of the tasks performed and the real-time processing of that data and information, will be able to develop predictive models and specific forecasts of future processes and phenomena, using models composed of algorithms resulting from previously applied machine learning processes. When such technological solutions become possible, the question arises of how to take into account, in the intelligent, multi-faceted forecasting models being built, long-known paradoxes concerning forecasted phenomena that are to appear only in the future and whose occurrence is not certain. Among the various paradoxes of this kind, two in particular can be pointed out: the paradox of the self-fulfilling prophecy and the paradox of not allowing a predicted crisis to occur due to pre-emptive anti-crisis measures applied. If these two paradoxes were taken into account within the intelligent, multi-faceted forecasting models being built, their effects could prove to be asymmetric and inversely proportional to each other. In view of the above, in the future, once artificial intelligence has been appropriately improved by teaching it to "think" and to process huge amounts of data and information in real time in a multi-criteria, creative manner, it may be possible to build a highly effective, multi-faceted, intelligent forecasting system for future financial and economic crises based on artificial intelligence technology, a system for forecasting complex, multi-faceted economic processes in such a way as to reduce the scale of the impact of the paradox of the self-fulfilling prophecy and increase the scale of the paradox of not allowing a predicted crisis to occur due to pre-emptive anti-crisis measures applied. Multi-criteria processing of large data sets conducted with the involvement of artificial intelligence, Data Science, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies makes it possible to operate effectively and increasingly automatically on large sets of data and information, thereby increasing the possibility of developing advanced, complex forecasting models for estimating current and future levels of systemic financial and economic risks, the indebtedness of the state's public finance system and the systemic credit risks of commercially operating financial institutions and economic entities, for forecasting economic trends and for predicting future financial and economic crises.
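As a rough, purely illustrative sketch of the kind of predictive model described above (with entirely hypothetical indicator names and synthetic data, not real macroeconomic series), a simple Python example of a crisis-probability classifier might look like this:

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical quarterly indicators with a binary "crisis within the next year" label.
rng = np.random.default_rng(7)
n = 200
data = pd.DataFrame({
    "credit_growth": rng.normal(5, 3, n),
    "public_debt_to_gdp": rng.normal(60, 15, n),
    "current_account_deficit": rng.normal(2, 2, n),
})
# Illustrative label: stress becomes more likely as imbalances pile up.
risk = (0.05 * data["credit_growth"] + 0.02 * data["public_debt_to_gdp"]
        + 0.1 * data["current_account_deficit"])
data["crisis_ahead"] = (risk + rng.normal(0, 0.5, n) > risk.mean()).astype(int)

model = LogisticRegression(max_iter=1000).fit(data.drop(columns="crisis_ahead"), data["crisis_ahead"])
latest = data.drop(columns="crisis_ahead").tail(1)
print("estimated crisis probability:", model.predict_proba(latest)[0, 1])

A real early-warning system would of course use far more indicators, longer histories and more sophisticated models, and would still face the two paradoxes discussed above, since publishing the forecast itself changes behaviour.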
In view of the above, I address the following questions to the esteemed community of scientists and researchers:
Is it possible to build a highly effective, multi-faceted, intelligent forecasting system for future financial and economic crises based on artificial intelligence technology in combination with Data Science, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies in a forecasting system for complex, multi-faceted economic processes in such a way as to reduce the scale of the impact of the paradox of the self-fulfilling prophecy and to increase the scale of the paradox of not allowing a forecasted crisis to occur due to pre-emptive anti-crisis measures applied?
What do you think about the involvement of artificial intelligence in combination with Data Science, Big Data Analytics, Business Intelligence and/or other Industry 4.0 technologies to develop advanced, complex predictive models for estimating current and forward-looking levels of systemic financial risks, economic risks, debt of the state's public finance system, systemic credit risks of commercially operating financial institutions and economic entities, forecasting trends in economic developments and predicting future financial and economic crises?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Warm regards,
Dariusz Prokopowicz

The fourth technological revolution currently underway is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning, deep learning, artificial intelligence, ... what's next? Intelligent thinking autonomous robots?
The fourth technological revolution currently underway is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning, deep learning and artificial intelligence technologies. Machine learning, machine self-learning and machine learning systems are synonymous terms relating to the field of artificial intelligence, with a particular focus on algorithms that can improve themselves, improving automatically through experience gained from exposure to large data sets. Algorithms operating within the framework of machine learning build a mathematical model of data processing from sample data, called a learning set, in order to make predictions or decisions without being explicitly programmed by a human to do so. Machine learning algorithms are used in a wide variety of applications, such as spam protection, i.e. filtering internet messages for unwanted correspondence, or image recognition, where it is difficult or infeasible to develop conventional algorithms to perform the needed tasks. Deep learning is a subcategory of machine learning which involves the creation of deep neural networks, i.e. networks with multiple levels of neurons. Deep learning techniques are designed to improve, among other things, automatic speech processing, image recognition and natural language processing. The structure of deep neural networks consists of multiple layers of artificial neurons. Simple neural networks can be designed manually, so that a specific layer detects specific features and performs specific data processing, while learning consists of setting appropriate weights and significance levels for the components of a given problem on the basis of processing and learning from large amounts of data. In large neural networks, the deep learning process is automated and self-contained to a certain extent. In this situation, the network is not designed to detect specific features but detects them on the basis of the processing of appropriately labelled data sets. Both such data sets and the operation of the neural networks themselves should be prepared by specialists, but the features are detected by the programme itself. Therefore, large amounts of data can be processed and the network can automatically learn higher-level feature representations, which means that it can detect complex patterns in the input data. In view of the above, deep learning systems are built on Big Data Analytics platforms designed in such a way that the deep learning process is performed on a sufficiently large amount of data. Artificial intelligence (AI) is, in turn, the 'intelligent', multi-criteria, advanced, automated processing of complex, large amounts of data carried out in a way that alludes to certain characteristics of human intelligence exhibited in thought processes. As such, it is the intelligence exhibited by artificial devices, including certain advanced ICT and Industry 4.0 information technology systems and devices equipped with these technological solutions. The concept of artificial intelligence is contrasted with the concept of natural intelligence, i.e. that which pertains to humans. Artificial intelligence thus has two basic meanings. On the one hand, it is a hypothetical intelligence realised through a technical rather than a natural process.
On the other hand, it is the name of a technology and a research field of computer science and cognitive science that also draws on the achievements of psychology, neurology, mathematics and philosophy. In computer science and cognitive science, artificial intelligence also refers to the creation of models and programmes that at least partially simulate intelligent behaviour. Artificial intelligence is also considered in the field of philosophy, within which a theory concerning the philosophy of artificial intelligence is being developed. In addition, artificial intelligence is a subject of interest in the social sciences. The main task of research and development work on artificial intelligence technology and its new applications is the construction of machines and computer programmes capable of performing selected functions analogously to those performed by the human mind working with the human senses, including processes that do not lend themselves to numerical algorithmisation. Such problems are sometimes referred to as AI-hard and include, among others, decision-making in the absence of all the data, the analysis and synthesis of natural languages, logical reasoning (also referred to as rational reasoning), the automatic proving of theorems, computer logic games such as chess, intelligent robots, and expert and diagnostic systems. Artificial intelligence can be developed and improved by integrating it with the areas of machine learning, fuzzy logic, computer vision, evolutionary computing, neural networks, robotics and artificial life. Artificial intelligence (AI) technologies have been developing rapidly in recent years, driven by their combination with other Industry 4.0 technologies, the use of microprocessors, digital machines and computing devices characterised by an ever-increasing capacity for the multi-criteria processing of ever-increasing amounts of data, and the emergence of new fields of application. Recently, the development of artificial intelligence has become a topic of discussion in various media due to the open-access, automated, AI-based solution ChatGPT, with which Internet users can hold a kind of conversation. The solution is based on, and learns from, a collection of large amounts of data extracted in 2021 from selected data and information resources on the Internet. The development of artificial intelligence applications is so rapid that it is outpacing the process of adapting regulations to the situation. The new applications being developed do not always generate exclusively positive impacts. These potentially negative effects include the generation of disinformation on the Internet, i.e. information crafted using artificial intelligence that is not consistent with the facts and is disseminated on social media sites. This raises a number of questions regarding the development of artificial intelligence and its new applications, the possibilities that will arise in the future under the next generations of artificial intelligence, and the possibility of teaching artificial intelligence to think, i.e. to realise artificial thought processes in a manner analogous or similar to the thought processes realised in the human mind.
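As a minimal illustration of the machine learning and deep learning concepts described above, the following Python sketch (assuming the PyTorch library and toy synthetic data rather than any real data set) trains a small multi-layer neural network by repeatedly adjusting its weights:

import torch
from torch import nn

# Hypothetical toy data: 256 samples with 20 features and a binary label derived from two of them.
X = torch.randn(256, 20)
y = (X[:, 0] + X[:, 1] > 0).float().unsqueeze(1)

# A small "deep" network: several stacked layers of artificial neurons.
model = nn.Sequential(
    nn.Linear(20, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)   # compare predictions with the labels
    loss.backward()               # compute gradients
    optimizer.step()              # adjust the weights: this is the "learning"

The network is never told which features matter; it learns the relevant weights from the labelled examples, which is exactly the distinction drawn above between explicit programming and machine learning.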
In view of the above, I address the following question to the esteemed community of scientists and researchers:
The fourth technological revolution currently taking place is characterised by rapidly advancing ICT information technologies and Industry 4.0, including but not limited to machine learning technologies, deep learning, artificial intelligence... What's next? Intelligent thinking autonomous robots?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best regards,
Dariusz Prokopowicz

How can artificial intelligence technologies be used effectively in universities so that the development of artificial intelligence technologies exemplified by ChatGPT does not pose a threat to universities but rather is an increase in the possibilities for the development of universities, the development of scientific research, including the improvement of the efficiency of conducted research, analytical, teaching, scientific processes using large amounts of multi-criteria data processed on computerised Big Data Analytics platforms?
The development of artificial intelligence applications today is almost limitless. Artificial intelligence technologies have been developed for many years, but it is in the last few years that this development has significantly accelerated. On the other hand, thanks to the artificial intelligence system made available on the Internet, which is the ChatGPT language model, the topic of artificial intelligence has, since the end of 2022, become one of the main topics of discussion in various fields of knowledge and in the context of different scientific disciplines, business applications, etc. ChatGPT has also become one of the most popular online platforms rapidly gaining new users at a rate comparable to the most popular and fastest growing social media sites. However, the currently developing applications of ChatGPT's intelligent language model have also started to generate negative effects and have overtaken the process of adapting systemic solutions and regulations to the situation. There has emerged a serious risk of the rapid development of disinformation in online social media, with images, videos and texts generated by various artificial intelligence solutions that present what can be described as 'fictitious facts', which present something that is difficult to distinguish from real facts, real events taking place and to diagnose who or rather what created them. There is a serious risk of non-compliance with copyright in the creation of certain types of 'works' created by artificial intelligence. This also raises the question of the ethics of the creation of new works, works in which a reliably realised creativity is or should be included. Newly created works, such as photographs, films, textual studies, literary works, paintings, graphics, sculptures, architectural designs, technical and other innovations, computer programmes, patents, etc., contain the element of new solutions, concepts, innovation, etc., which are the result of human creativity. However, in the context of thousands of years of evolution of human abilities and creativity, it is only relatively recently that man has begun to assist himself in the processes of creative creation of something new, innovative solutions, new concepts, artistic works, etc., assisted by advanced technology that does this in principle for man, but according to assumptions and rules that man determines. In recent years, the aforementioned processes of using artificial intelligence in the creation of a kind of "works" created with the application of more and more data and information and within the framework of processes that are becoming more and more automated have been taking place at an increasingly rapid pace. The development of the ChatGPT intelligent language model technology, which is available on the Internet, shows how dynamically the use of new technology is taking place in order to, as it were, cede creative work that requires multi-criteria processing of large amounts of data and in increasingly automated processes. Since, for example, ChatGPT-created texts often lack full descriptions of data sources, source publications, bibliographic descriptions and lack information on the extent of possible plagiarism, the scale of possibilities for copyright infringement is large. Therefore, in the context of thesis texts written by students, essays for course credit at university, the use of a tool such as ChatGPT for this purpose generates serious risks of unreliability of writing this type of work. 
Therefore, it is necessary to create a system of digital marking of the various types of "works" created by artificial intelligence solutions, covering not only texts but also photographs, films, innovations, patents, computer software, new drugs, technical designs, artistic works, etc. Such a system of digital marking will be helpful in distinguishing the effects of human work from the increasingly substitutable effects of advanced data processing carried out by artificial intelligence. In addition, computerised anti-plagiarism platforms and programmes should be improved so that they diagnose the borrowing of text fragments, sentences, paragraphs and phrases from other texts, publications, articles, books, etc., as well as unattributed sources of data, information, formulas, models, definitions of new concepts, projects, innovative solutions and unattributed bibliographies. The artificial intelligence solutions currently being developed, such as ChatGPT and similar tools, should therefore be improved both technically and procedurally, as well as from the formal and legal side, so that the scale of improper use of such tools and of the negative effects they generate is significantly reduced, including the scale of unreliable writing of journal articles, theses, descriptions of conducted research, results of analyses, etc. By significantly reducing the scale of these negative effects, also at universities, the possibilities of the practical application of artificial intelligence in improving the conduct of research, analytical and research and development work and the description of research results will be able to be developed in the future. In this way, artificial intelligence technologies can be used effectively in universities so that the development of artificial intelligence technologies, of which ChatGPT is an example, does not pose a threat to universities, but rather increases the opportunities for the development of universities and of scientific research, including improving the efficiency of research and analytical processes using large amounts of data processed multi-criterially on computerised Big Data Analytics platforms.
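As a simple illustration of the kind of textual-borrowing check mentioned above, the following Python sketch (using only the standard library and hypothetical example texts) flags passages whose similarity ratio is high enough to warrant manual review:

import difflib

# Hypothetical texts: a submitted paragraph and candidate source paragraphs.
submitted = ("Big Data Analytics platforms enable multi-criteria processing "
             "of large data sets collected from many online sources.")
sources = [
    "Big Data Analytics platforms enable multi-criteria processing of large data sets.",
    "Organic farming restores biodiverse agricultural ecosystems.",
]

# A high similarity ratio suggests unattributed borrowing that deserves manual review.
for source in sources:
    ratio = difflib.SequenceMatcher(None, submitted.lower(), source.lower()).ratio()
    print(f"{ratio:.2f}  {source[:60]}")

Production anti-plagiarism platforms combine many such signals (paraphrase detection, citation analysis, source indexing), but the basic idea of scoring textual overlap is the same.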
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How effectively can artificial intelligence technologies be used in universities so that the development of artificial intelligence technologies as exemplified by ChatGPT does not pose a threat to universities but rather that it is an increase in the possibilities for the development of universities, the development of scientific research, including the improvement of the efficiency of the conducted research, analytical, teaching, scientific processes using large amounts of data processed multi-criteria on computerised Big Data Analytics platforms?
And what is your opinion on this?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best wishes,
Dariusz Prokopowicz

Can Artificial Intelligence and Big Data Analytics help in the development of sustainable organic agriculture based on planning, arranging and managing biodiverse, multi-species crop agriculture?
In your opinion, can the new technologies of Industry 4.0, including, in particular, artificial intelligence, machine learning and deep learning applied in combination with the large sets of data, information and knowledge collected and processed on Big Data Analytics platforms, help the development of sustainable organic agriculture based on the planning, arrangement and management of biodiverse, multi-species agricultural crops?
The process of planning, designing and arranging sustainable agricultural crops grown according to the formula of organic agriculture, which aims to restore highly sustainable, biodiverse natural agricultural ecosystems, should take into account many factors that are a mix of natural biotic, climatic, geological and abiotic factors, as well as the changes in these factors that have taken place over recent centuries or millennia as a result of the development of an unsustainable human civilisation and of a robber economy based on intensive industrial development that ignores negative externalities towards the surrounding environment.
Considering how this should be a complex, multifaceted process of planning, designing, managing and restoring highly sustainable biodiverse forest and sustainable agricultural ecosystems, the application in this process of new generations of Industry 4.0 technologies, including, above all, artificial intelligence based on large sets of data, information and knowledge concerning many different aspects of nature, ecology, climate, civilisation, etc. collected and processed on Big Data Analytics platforms may prove to be of great help.
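As a rough illustration of how one small piece of such planning support could be framed computationally, the following Python sketch (assuming SciPy and entirely hypothetical crop scores and constraints) chooses an area allocation for a multi-species crop mix by simple linear programming:

from scipy.optimize import linprog

# Hypothetical data: per-hectare biodiversity benefit and water need of candidate crops.
crops = ["clover", "rye", "buckwheat", "lentils"]
benefit = [4.0, 2.5, 3.5, 3.0]      # arbitrary illustrative scores
water = [2.0, 1.5, 1.0, 1.8]        # arbitrary units per hectare

total_area = 100.0                   # hectares available
water_budget = 160.0                 # total water available

# linprog minimises, so the benefit is negated to maximise it.
result = linprog(
    c=[-b for b in benefit],
    A_ub=[water, [1.0] * len(crops)],
    b_ub=[water_budget, total_area],
    bounds=[(5.0, 60.0)] * len(crops),   # force a genuinely multi-species mix
)
for crop, hectares in zip(crops, result.x):
    print(f"{crop}: {hectares:.1f} ha")

In a real decision-support system, the benefit scores and constraints would themselves come from large soil, climate and ecological data sets processed on Big Data Analytics platforms, possibly with machine-learned models in place of fixed coefficients.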
In view of the above, I address the following question to the esteemed community of scientists and researchers:
In your opinion, can the new technologies of Industry 4.0, including, above all, artificial intelligence, machine learning and deep learning applied in combination with the large sets of data, information and knowledge collected and processed on Big Data Analytics platforms, help the development of sustainable organic agriculture based on the planning, arrangement and management of biodiverse, multi-species agricultural crops?
Can artificial intelligence and Big Data Analytics help in the development of sustainable organic agriculture?
What is your opinion?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Best wishes,
Dariusz Prokopowicz

In your opinion, can the new technologies of Industry 4.0, including, above all, artificial intelligence, machine learning, deep learning applied in combination with large sets of data, information and knowledge collected and processed on Big Data Analytics platforms, help in the satellite analysis of the rate of biodiversity loss of the planet's different natural ecosystems?
As part of the technological advances that have been taking place in recent years, which are also rapidly advancing as part of the development of ICT information technologies and Industry 4.0, more and more sophisticated analytical instruments and research techniques are being developed to carry out increasingly complex, multifaceted and Big Data-based analyses of the various processes taking place in nature and to obtain increasingly precise results from the research conducted. With the combination of ICT information technology and Industry 4.0 with satellite analysis technology, the analyses of changes in the biodiversity of the planet's various natural ecosystems carried out using satellites placed in planetary orbit are also being improved. Taking into account the negative human impact on the biodiversity of the planet's natural ecosystems that has been taking place since the beginning of the development of the first technological and industrial revolution, and especially in the Anthropocene epoch from the mid-20th century onwards, there is a growing need to counteract these negative processes, a need to increase the scale and outlays allocated to the improvement of nature conservation systems and instruments, including the protection of the biodiversity of the planet's natural ecosystems.
Improving nature conservation and biodiversity protection systems also requires cyclic surveys of the state of biodiversity of individual terrestrial and marine natural ecosystems of the planet and analyses of progressive environmental degradation and the rate of biodiversity loss. In the situation of obtaining more precise results of research concerning changes in the state of the natural environment and the rate of loss of biodiversity of particular terrestrial and marine natural ecosystems of the planet occurring in various climate zones, changes in the state of the climate and diagnosing key civilisational determinants generating those changes, it is possible to apply specific actions and systemic solutions within the framework of counteracting negative processes of degradation of the natural environment and loss of biodiversity within the framework of improving nature protection techniques more effectively and adapted to the specific nature of a given local biosphere, climate conditions, diagnosed processes of the aforementioned changes but also economic factors. In this connection, the technology of artificial intelligence, which has been developing particularly rapidly in recent years, can also prove helpful in the process of improving the planning, design, management and restoration of natural ecosystems, taking into account a high degree of sustainability, biodiversity and naturalness, i.e. the restoration of natural ecosystems that existed in a specific area centuries ago. In the process of the aforementioned restoration of sustainable, highly biodiverse terrestrial and marine natural ecosystems of the planet, many primary factors must also be taken into account, including geological and climatic factors as well as the modifications previously applied to the area by man concerning geology, land irrigation, drainage, microclimate, soil quality, environmental pollution, the presence of certain invasive species of flora, fauna, fungi and microorganisms. Therefore, the process of planning, design, management and restoration of biodiverse natural ecosystems should take into account many of the above-mentioned factors that are a mix of natural biotic, climatic, geological and abiotic factors and changes in these factors that have taken place over the last centuries or millennia, i.e. changes and side-effects of the development of human, unsustainable civilisation, the development of a robber economy based on intensive industrial development with ignoring the issue of negative externalities towards the surrounding natural environment.
Considering how this should be a complex, multifaceted process of planning, designing, arranging and restoring the planet's biodiverse, natural ecosystems, the application in this process of the new generations of Industry 4.0 technologies, including, above all, artificial intelligence based on large sets of data, information and knowledge concerning many different aspects of nature, ecology, climate, civilisation, etc., collected and processed on Big Data Analytics platforms, can be of great help. On the other hand, artificial intelligence technology combined with satellite analytics can also be of great help in improving research processes aimed at investigating changes in the state of the planet's biosphere, including analysis of the decline in biodiversity of individual ecosystems occurring in specific natural areas and precise diagnosis of the rate of the aforementioned negative changes resulting in environmental degradation and the key determinants causing specific changes.
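As a small illustration of one building block of such satellite analyses, the following Python sketch (with hypothetical reflectance values standing in for real satellite bands) computes the NDVI vegetation index, a common proxy used when monitoring vegetation change over time:

import numpy as np

# Hypothetical reflectance rasters (in practice clipped from satellite imagery) as NumPy arrays.
red = np.array([[0.10, 0.12], [0.30, 0.28]])
nir = np.array([[0.60, 0.58], [0.32, 0.31]])

# Normalised Difference Vegetation Index: (NIR - RED) / (NIR + RED).
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))

# Comparing NDVI rasters from two acquisition dates gives a crude proxy for vegetation loss,
# which is only one of many inputs to a fuller biodiversity assessment.

Actual biodiversity monitoring combines many such indices with field surveys, species observation data and machine-learned classifiers trained on large, labelled image archives.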
I will write more about this in the book I am currently writing. In this monograph, I will include the results of my research on this issue. I invite you to join me in scientific cooperation on this issue.
Counting on your opinions, on getting to know your personal views and on an honest approach to discussing scientific problems, rather than on ready-made answers generated in ChatGPT, I deliberately used the phrase "in your opinion" in the question.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
In your opinion, can the new technologies of Industry 4.0, including especially artificial intelligence, machine learning, deep learning applied in combination with large datasets, information and knowledge collected and processed on Big Data Analytics platforms help in the satellite analysis of the rate of biodiversity loss of the planet's various natural ecosystems?
Can artificial intelligence and Big Data Analytics help in the satellite analysis of the rate of biodiversity loss of the planet's different natural ecosystems?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Counting on your opinions, on getting to know your personal opinion, on an honest approach to discussing scientific issues and not ChatGPT-generated ready-made answers, I deliberately used the phrase "in your opinion" in the question.
The above text is entirely my own work written by me on the basis of my research.
I have not used other sources or automatic text generation systems such as ChatGPT in writing this text.
Copyright by Dariusz Prokopowicz
Thank you very much,
Warm regards,
Dariusz Prokopowicz

By combining the technologies of quantum computers, Big Data Analytics, artificial intelligence and other Industry 4.0 technologies, is it possible to significantly improve the predictive analyses of various multi-faceted macroprocesses?
By combining quantum computers, Big Data Analytics applied to data and information extracted from, for example, large numbers of websites and social media sites, cloud computing, satellite analytics and artificial intelligence in joint applications for the construction of integrated analytical platforms, is it possible to create systems for the multi-criteria analysis of large quantities of quantitative and qualitative data and thus significantly improve predictive analyses of various multi-faceted macro-processes concerning local, regional and global climate change, the state of the biosphere and natural, social, health, economic and financial processes?
Ongoing technological progress is increasing the technical possibilities of conducting research, collecting and assembling large amounts of research data and processing them multi-criterially using ICT information technologies and Industry 4.0. Before the development of ICT information technologies, IT tools, personal computers, etc. in the second half of the 20th century, as part of the third technological revolution, the computerised, semi-automated processing of large data sets was very difficult or impossible. As a result, the building of multi-criteria, multi-faceted models of complex macro-process structures, simulation models and forecasting models based on big data and information was limited or practically impossible. However, the technological advances made in the current fourth technological revolution and the development of Industry 4.0 technology have changed a lot in this regard. The current fourth technological revolution is, among other things, a revolution in the improvement of multi-criteria, computerised analytical techniques based on large data sets. Industry 4.0 technologies, including Big Data Analytics, are used in the multi-criteria processing and analysis of large data sets. Artificial intelligence (AI) can be useful in scaling up the automation of research processes and the multi-faceted processing of big data obtained from research.
The technological advances taking place are contributing to the improvement of computerised analytical techniques conducted on increasingly large data sets. The application of the technologies of the fourth technological revolution, including ICT information technologies and Industry 4.0 in the process of conducting multi-criteria analyses and simulation and forecasting models conducted on large sets of information and data increases the efficiency of research and analytical processes. Increasingly, in research conducted within different scientific disciplines and different fields of knowledge, analytical processes are carried out, among others, using computerised analytical tools including Big Data Analytics in conjunction with other Industry 4.0 technologies.
When these analytical tools are augmented with Internet of Things technology, cloud computing and satellite-implemented sensing and monitoring techniques, opportunities arise for real-time, multi-criteria analytics of large areas, e.g. nature, climate and others, conducted using satellite technology. When machine learning technology, deep learning, artificial intelligence, multi-criteria simulation models, digital twins are added to these analytical and research techniques, opportunities arise for creating predictive simulations for multi-factor, complex macro processes realised in real time. Complex, multi-faceted macro processes, the study of which is facilitated by the application of new ICT information technologies and Industry 4.0, include, on the one hand, multi-factorial natural, climatic, ecological, etc. processes and those concerning changes in the state of the environment, environmental pollution, changes in the state of ecosystems, biodiversity, changes in the state of soils in agricultural fields, changes in the state of moisture in forested areas, environmental monitoring, deforestation of areas, etc. caused by civilisation factors. On the other hand, complex, multifaceted macroprocesses whose research processes are improved by the application of new technologies include economic, social, financial, etc. processes in the context of the functioning of entire economies, economic regions, continents or in global terms.
Year on year, due to technological advances in ICT, including the use of new generations of microprocessors characterised by ever-increasing computing power, the possibilities for increasingly efficient, multi-criteria processing of large collections of data and information are growing. Artificial intelligence can be particularly useful for the selective and precise retrieval of specific, defined types of information and data extracted from many selected types of websites and the real-time transfer and processing of this data in database systems organised in cloud computing on Big Data Analytics platforms, which would be accessed by a system managing a built and updated model of a specific macro-process using digital twin technology. In addition, the use of supercomputers, including quantum computers characterised by particularly large computational capacities for processing very large data sets, can significantly increase the scale of data and information processed within the framework of multi-criteria analyses of natural, climatic, geological, social, economic, etc. macroprocesses taking place and the creation of simulation models concerning them.
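As a rough, purely illustrative sketch of the kind of multi-criteria predictive modelling described above (with synthetic data in place of real indicators drawn from Big Data Analytics platforms, and hypothetical variable names), a simple Python example might look like this:

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical monthly macro indicators; in practice these would be fed from integrated analytical platforms.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "energy_prices": rng.normal(100, 10, 120).cumsum() / 100,
    "industrial_output": rng.normal(0, 1, 120).cumsum(),
    "web_sentiment": rng.normal(0, 1, 120),
})
target = df["industrial_output"].shift(-1)          # value one month ahead

# Current and lagged indicators form the multi-criteria feature set.
features = pd.concat([df, df.shift(1).add_suffix("_lag1")], axis=1)
mask = target.notna() & features.notna().all(axis=1)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(features[mask], target[mask])

latest = features.dropna().tail(1)
print("next-month forecast:", model.predict(latest)[0])

Quantum computers do not change this modelling logic; their hoped-for contribution is in processing far larger data sets and search spaces than classical hardware allows, which remains an area of ongoing research rather than routine practice.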
In view of the above, I address the following question to the esteemed community of scientists and researchers:
Is it possible, by combining quantum computers, Big Data Analytics applied to data and information extracted from, inter alia, a large number of websites and social media portals, cloud computing, satellite analytics and artificial intelligence in joint applications for the construction of integrated analytical platforms, to create systems for the multi-criteria analysis of large quantities of quantitative and qualitative data and thereby significantly improve predictive analyses of various multi-faceted macro-processes concerning local, regional and global climate change, the state of the biosphere and natural, social, health, economic and financial processes?
By combining the technologies of quantum computers, Big Data Analytics, artificial intelligence and other Industry 4.0 technologies, is it possible to significantly improve the predictive analyses of various multi-faceted macroprocesses?
By combining the technologies of quantum computers, Big Data Analytics, artificial intelligence, is it possible to improve the analysis of macroprocesses?
What do you think about this topic?
What is your opinion on this subject?
Please respond,
I invite you all to discuss,
Thank you very much,
Warm regards,
The above text is entirely my own work written by me on the basis of my research.
I have not used other sources or automatic text generation systems such as ChatGPT in writing this text.
Copyright by Dariusz Prokopowicz
Dariusz Prokopowicz

Hello everyone,
I am Danillo Souza, and I am currently a Post-Doc Researcher at the Basque Center for Applied Mathematics (BCAM), working in the Mathematical, Computational and Experimental Neuroscience Group (MCEN). One of the challenges of my work is to derive optimal tools to extract topological and/or geometrical information from big data.
I am trying to submit a work to arXiv and unfortunately, an endorsement in Physics - Data Analysis and Statistics is required. I was wondering if some researcher could be my endorser in this area.
Beforehand, I appreciate your efforts in trying to help me.
With kind regards,
Danillo
Email: dbarros@bcamath.org
Danillo Barros De Souza requests your endorsement to submit an article
to the physics.data-an section of arXiv. To tell us that you would (or
would not) like to endorse this person, please visit the following URL:
https://arxiv.org/auth/endorse?x=UOKIX3
If that URL does not work for you, please visit
http://arxiv.org/auth/endorse.php
and enter the following six-digit alphanumeric string:
Endorsement Code: UOKIX3
Dear Scholars, Researchers, and Academics,
We are pleased to announce a Call for Papers for the upcoming Special Issue to be hosted by the Mesopotamian Academic Press. This prestigious event is dedicated to fostering intellectual exchange and advancing scholarship in the field of Computer Science.
The Mesopotamian Academic Press takes pride in its commitment to nurturing the academic community by providing a platform for thought-provoking discussions and interdisciplinary collaborations. We invite contributions from scholars, researchers, and academics working in various disciplines, such as Big Data, Cybersecurity, Information Technology, and beyond, to submit their original research papers and engage in lively discussions that delve into the field's multifaceted dimensions.
Sincerely,
Mesopotamian Academic Press
https://mesopotamian.press/journals/index.php/index/index
It is a laborious task to search an extensive library of documents for useful information. With the advancement of big data and smart technologies, could it be feasible to create a smart robot to help scientists read literature? How can this be achieved?
Robot Capabilities: Search and Summarisation. We ask the smart robot a question, and it searches the library of written works and gives us a brief answer.
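One possible building block of such a "smart robot" is simple retrieval over a document collection. The following Python sketch (assuming scikit-learn and a hypothetical three-document "library") ranks documents against a question using TF-IDF similarity; a summarisation model would then condense the best match into a brief answer:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical "library": in practice these would be abstracts or full texts of papers.
documents = [
    "Deep learning models classify satellite images of deforestation.",
    "Quantum computing may accelerate the processing of very large data sets.",
    "Big Data Analytics platforms support multi-criteria economic forecasting.",
]
question = "Which papers discuss quantum computers and large data sets?"

vectorizer = TfidfVectorizer(stop_words="english").fit(documents + [question])
doc_vectors = vectorizer.transform(documents)
query_vector = vectorizer.transform([question])

# The highest cosine similarity points to the document most relevant to the question.
scores = cosine_similarity(query_vector, doc_vectors)[0]
best = scores.argmax()
print(f"best match (score {scores[best]:.2f}): {documents[best]}")

Modern literature assistants replace TF-IDF with neural embeddings and add a generative summarisation step, but the retrieve-then-summarise structure is essentially the same.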
Which new ICT information technologies are most helpful in protecting the biodiversity of the planet's natural ecosystems?
What are examples of new technologies typical of the current fourth technological revolution that help protect the biodiversity of the planet's natural ecosystems?
Which new technologies, including ICT information technologies, technologies categorized as Industry 4.0 or Industry 5.0 are helping to protect the biodiversity of the planet's natural ecosystems?
How do new Big Data Analytics and Artificial Intelligence technologies, including deep learning based on artificial neural networks, help protect the biodiversity of the planet's natural ecosystems?
New technologies, including ICT information technologies and technologies categorized as Industry 4.0 or Industry 5.0, are finding new applications. These technologies are currently developing rapidly and are an important factor in the current fourth technological revolution. On the other hand, due to the still high emissions of greenhouse gases generating the process of global warming, progressive climate change, increasingly frequent weather anomalies and climatic disasters, growing environmental pollution, the still rapidly shrinking area of forests and predatory forest management, the level of biodiversity of the planet's natural ecosystems is rapidly decreasing. Therefore, it is necessary to engage new technologies, including ICT information technologies and technologies categorized as Industry 4.0/Industry 5.0, including new Big Data Analytics and artificial intelligence technologies, in order to improve and scale up the protection of the biodiversity of the planet's natural ecosystems.
In view of the above, I address the following question to the esteemed community of scientists and researchers:
How do the new technologies of Big Data Analytics and artificial intelligence, including deep learning based on artificial neural networks, help to protect the biodiversity of the planet's natural ecosystems?
Which new technologies, including ICT information technologies, technologies categorized as Industry 4.0 or Industry 5.0 are helping to protect the biodiversity of the planet's natural ecosystems?
What are examples of new technologies that help protect the biodiversity of the planet's natural ecosystems?
How do new technologies help protect the biodiversity of the planet's natural ecosystems?
And what is your opinion on this topic?
What do you think about this topic?
Please answer,
I invite everyone to join the discussion,
Thank you very much,
Warm regards,
Dariusz Prokopowicz
The above text is entirely my own work written by me on the basis of my research.
In writing this text I did not use other sources or automatic text generation systems.
Copyright by Dariusz Prokopowicz
