Article

Ajax: A New Approach to Web Applications

... First of all, the technology that has had the most impact on the evolution of Web 2.0 is AJAX (Asynchronous JavaScript And XML) [35]. As its name indicates, it enables Web browsers to send and receive data from a server asynchronously, i.e. without blocking the Web page. ...
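The asynchronous exchange this snippet describes can be sketched in a few lines of JavaScript (a minimal illustration, not code from the cited work; the `/api/user` endpoint and field names are hypothetical):

```javascript
// Small pure helper: build a query string for the request.
function buildQuery(params) {
  return new URLSearchParams(params).toString();
}

// Fetch data without blocking or reloading the page: the browser keeps
// responding to the user while the request is in flight.
async function loadUser(userId) {
  const response = await fetch('/api/user?' + buildQuery({ id: userId }));
  if (!response.ok) {
    throw new Error('Request failed: ' + response.status);
  }
  return response.json(); // parsed data, delivered asynchronously
}
```

In 2005 the same exchange would have gone through `XMLHttpRequest`; `fetch` is the modern equivalent of the technique.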
... They also offer different paradigms and thus the practices are diverging over time. For example, in recent years, there have even been communities emphasizing that the CSS should be written within the JavaScript code (35: https://medium.com/dailyjs/what-is-actually-css-in-js-f2f529a2757). ...
... Developers now design architecture as depicted on the right part of Figure 1. In addition, since the introduction of Docker in March 2013, developers started to package their micro-services into lightweight containers. These containers are black-boxes from the in- ...
Thesis
The World Wide Web is mainly composed of two types of components: applications and services. Applications, whether mobile or Web applications (i.e. intended to be used from a browser), have in common that they are a kind of fill-in-the-blanks template and communicate with services to customize the application for each user. It is therefore the service that owns and manages the data. To make this communication possible, the services offer APIs following the REST architecture. The management of the life cycle of a REST API is thus a central element of the development of systems on the Web. The first step in this life cycle is the definition of the requirements of an API (functionality and software properties). Then, the technologies that will allow it to be designed, implemented and documented are chosen. It is then implemented, documented and put online. From then on, applications can use it. There follows a phase of maintenance and evolution of the API, in which bugs are fixed and functionalities evolve to adapt to changes in its users' expectations. In this thesis, we review the methods and technologies that accompany the developer during this life cycle. We identify two open challenges. First, there are many technologies for creating and documenting an API, and choosing the most relevant ones for a project is a difficult task. As a first contribution of this thesis, we establish criteria to compare these technologies. Then, we use these criteria to compare existing technologies and propose three comparison matrices. Finally, to simplify this selection, we have developed an open-source wizard, available on the Web, which guides the developer in this choice. The second challenge we have identified relates to the maintenance and evolution of REST APIs. The existing literature does not allow a REST API to evolve freely without the risk of breaking the applications that use it (its clients).
The second contribution of this work is a new approach to the co-evolution of REST APIs and their clients. We have identified that, by following 7 rules governing the documentation of an API and the data it returns to its clients, it is possible to create Web user interfaces capable of adapting to the majority of evolutions of REST APIs without producing bugs or breaking, and without even requiring the modification of their source code.
... The name was coined by Jesse James Garrett, who was among the first to realize the significance of the new technology (Garrett, 2005). Ajax decoupled changes in what the user sees from communication with the remote server. ...
... The stand-alone platform cannot cut its price enough to ... 41 Calore (2014). 42 Garrett (2005). 43 https://www.quora.com/Do-programmers-still-use-AJAX (viewed 1/16/20). ...
Preprint
Full-text available
The purpose of this chapter is to use value structure analysis to better understand how sponsors of digital exchange platforms capture value and maintain strategic bottlenecks. I begin by describing the three core technological processes that lie at the heart of all digital exchange platforms: (1) search and ad placement; (2) dynamic pricing; and (3) data analysis and prediction. These processes are carried out automatically and at high speed by computerized algorithms. The algorithms consist of steps, all of which are essential to the successful completion of a transaction or transmission of a message. However, bottlenecks are inevitable in any synchronized multi-step flow process. The platform sponsor's first responsibility is thus to resolve bottlenecks in its core processes when and where they arise. One of an exchange platform's critical core processes is to generate predictions as to what the user would like to see and/or purchase. Better predictions speed up the flow and reduce the unproductive time users spend on the platform. Thus, to be competitive today, platform sponsors must have the ability to gather data, generate predictions and test hypotheses within time intervals measured in milliseconds. Predictions can be improved by tracking individuals' behavior online, but only at the cost of reducing their privacy. The inevitable tension between performance and privacy has yet to be resolved. The chapter goes on to discuss how platform sponsors can "lock in" members of an ecosystem through a combination of visible instructions, data storage, and specific inertia. It concludes by describing successful and unsuccessful attempts to disintermediate exchange platforms.
... With these goals in mind, a set of screen layouts and a storyboard [6] of the whole process were designed. For the design of these elements we considered guidelines for the design of the user interface [8] [9] and specifically the web user interface [4]. For example, we took some elements from the LMS and the IM system and implemented them in Mensajero, with the objective of reusing the user experience and knowledge of similar systems and making it easier for users to adopt it. ...
... A fundamental element for the search and notification module is AJAX [4]. AJAX (Asynchronous JavaScript and XML) is a combination of web technologies. ...
Conference Paper
Full-text available
The participation of the user is fundamental for the development of successful user interfaces. The end user's practice and knowledge can improve new or existing user interfaces of information systems. In this paper we present our experience and results of incorporating the end user into a developer team. The goal was to improve the user interface for an Instant Messaging (IM) component. This component was developed to improve the communication and collaboration between the users of an open source Learning Management System (LMS). The IM component integrates in a transparent way into the main LMS used by the Autonomous University of Yucatan. We show the process and results of the end user participation to improve the user interface of the IM component.
... Instead of loading a new webpage, an AJAX engine written in JavaScript is loaded by the browser. This engine is responsible for communicating with the server, allowing user interaction with the application to happen asynchronously and independent of the communication with the server (Garrett, 2005). ...
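The engine pattern described here can be sketched with the classic `XMLHttpRequest` API (an illustrative sketch only; the URL and callbacks are hypothetical, not from the cited work):

```javascript
// The "Ajax engine": a thin layer that owns all server communication,
// so user interaction never blocks on the network or reloads the page.
function ajaxEngine(url, onSuccess, onError) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url, true); // third argument `true` = asynchronous
  xhr.onload = function () {
    if (xhr.status >= 200 && xhr.status < 300) {
      onSuccess(JSON.parse(xhr.responseText)); // update the page in place
    } else {
      onError(new Error('HTTP ' + xhr.status));
    }
  };
  xhr.onerror = function () { onError(new Error('network error')); };
  xhr.send();
}
```

The UI code calls `ajaxEngine(...)` and immediately returns control to the user; the callbacks patch the page when the response arrives.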
Thesis
Collaborative work has become an essential part of education, especially in higher education. Educational institutions are making increasing use of web-based technologies to support communication and collaboration among students. In this thesis, I address the problems of complexity and inflexibility in collaborative tools provided by educational institutions, such as the Course Management System (Blackboard) and the different collaborative features it provides. Research studies have indicated an unsatisfactory adoption rate of collaborative features in CMS by students and teachers due to the complexity and inflexibility of these tools (Guidry and BrckaLorenz, 2010; Papastergiou, 2006; Rosato et al., 2007). An examination of the key features and success factors in existing collaborative tools can provide insight into the design of a collaborative tool that better addresses students' needs. First, I conducted a survey study with students at an urban university in the Midwestern U.S., through which I identified the key features and success factors in collaborative tools from students' perspectives. Second, through the integration of Web 2.0 technologies, I designed and developed U-Connect, a dynamic and interactive collaborative tool that allows students to easily and quickly create or join groups and share their ideas, thoughts and resources with each other while working in groups. Third, I conducted a pilot study to evaluate U-Connect and proposed a set of additional features and design recommendations for collaborative tools to better support academic group work.
... [1] The situation generated demand for programs running on the client side; therefore JavaScript support was added to web browsers. JavaScript support and the AJAX [2] technology were a huge step forward. The code running on the client side could initiate HTTP requests and communicate with the server without reloading the page after every user interaction. ...
Conference Paper
Full-text available
The leading platforms in modern software development for applications with a user interface are the web and mobile platforms. In the earlier days, it was common that the client side of these applications only displayed data and processed interactions, while the backend ran all business logic in the background. Recently a new trend can be identified: client apps include complex business logic as well. Modern web frameworks and mobile technologies rely on the client device's computing capacity for maximal performance and therefore provide built-in services to cache data and run steps of the business logic on the client device. This paradigm is illustrated by the success of PWA technology and by offline applications. The user interface of modern applications is designed by UX designers, who focus on ease of use and user experience rather than safety and soundness, which often leads to disrupting the well-defined business process by introducing new sub-steps and alternative ways. These tendencies altogether result in the client side of modern applications also implementing a lot of business logic that is already implemented by the backend in a more strict, straightforward approach. In a more abstract view, the client and the server work with the same type of documents by performing two different business processes that exist at the same time. The two layers of the application are often developed by different teams, where frontend and backend specialists work simultaneously. Altogether this results in new risks that should be mitigated by continuously evaluating the consistency of the two business processes during the whole development life cycle. This evaluation should identify whether the two processes are consistent; special tools have to be introduced that ensure that, after performing a particular iteration in the development roadmap, the consistency
... The frontend, on the other hand, is constructed using HTML, CSS, and JavaScript, which are relatively familiar to developers, enabling users to quickly change it. Asynchronous JavaScript and XML (AJAX) (Garrett, 2007) is used to manage communication between the frontend and backend, allowing our system to support asynchronous messaging. Our annotation tool also supports the Common Gateway Interface and the Fast Common Gateway Interface; using the latter may greatly cut response time and enhance the user experience for annotators. ...
Preprint
Full-text available
Weak labeling is a popular weak supervision strategy for Named Entity Recognition (NER) tasks, with the goal of reducing the necessity for hand-crafted annotations. Although there are numerous remarkable annotation tools for NER labeling, the subject of integrating weak labeling sources is still unexplored. We introduce a web-based tool for text annotation called SciAnnotate, which stands for scientific annotation tool. Compared to frequently used text annotation tools, our annotation tool allows for the development of weak labels in addition to providing a manual annotation experience. Our tool provides users with multiple user-friendly interfaces for creating weak labels. SciAnnotate additionally allows users to incorporate their own language models and visualize the output of their model for evaluation. In this study, we take multi-source weak label denoising as an example and utilize a Bertifying Conditional Hidden Markov Model to denoise the weak labels generated by our tool. We also evaluate our annotation tool against the dataset provided by Mysore, which contains 230 annotated materials synthesis procedures. The results show a 53.7% reduction in annotation time and a 1.6% increase in recall using weak label denoising. An online demo is available at https://sciannotate.azurewebsites.net/ (a demo account can be found in the README), but we don't host a model server with it; please check the README in the supplementary material for model server usage.
... The next step was to move from server-side document generation to client-side. And later AJAX (Asynchronous JavaScript and XML) came into play [10]. It introduced a new level of user experience and allowed developers to create dynamic user interfaces for complex systems rather than simple documents. ...
Article
Full-text available
This paper explores the concept of micro-frontends in web development. Micro-frontends are the logical evolution of architecture for the front-end side of web applications. The concept evolved in conditions of continuous development of the browser platform, which allowed the creation of rich applications that are easy to distribute and deliver to the user while providing an experience close to desktop applications. The article begins with an overview of how web development and the architectural approaches within it evolved, including the emergence of the micro-services concept in back-end development, and existing research on micro-frontends. Then, a case study on implementing the concept in an actual project is presented, and issues raised during development are discussed. Topics of routing, static asset serving, and repository organization in the projects are discussed based on the development experience. As a result of the case study, the micro-frontends architecture is found to better suit complex large-scale projects and big teams with experienced developers. The approach increases the complexity of project structure and development. At the moment of publication, the community does not have any ready-to-use solutions that can bootstrap the project and handle the additional complexity.
... What counts economically is the factor "sessions per server". Websites that make heavy use of AJAX [7], dynamic HTML and Flash in particular can pose a problem here: sometimes a great deal of CPU power is consumed, with the consequence that the individual sessions must be monitored for their resource consumption and, if necessary, intervened upon (e.g. by lowering the priority of a resource-hungry browser session). ...
... The term Ajax (shorthand for Asynchronous JavaScript and XML) is known to have been coined by Jesse James Garrett in the year 2005 (Langley, 2006; Garrett, 2005). The term very quickly became conventional to describe a phenomenon that had started developing about eight years earlier and that came to the limelight with the use of such technologies by Google (Bisson, 2005; Gomes, 2005; LaMonica, 2005a). ...
Article
Full-text available
As the Web enters its third decade of existence, I draw attention to the need to better understand the Web as a potential reference case for how an information system transforms through incremental innovations, with particular focus on the Web's advancement as a communication media platform. As a necessary research step in this quest, I critically examine whether one can use existing media capacity theories and media-related buzzwords (such as rich media, multimedia, hypermedia, social media) to characterize Web innovations as media. I examine and clarify these buzzwords' origins, meanings, and relationship with media capacity theories. I also elucidate discrepancies between them. Via inductive reasoning, I synthesize three media capacity dimensions (sensibility support, interactivity support and logistical support) as a potential framework for objective media characterization. Each dimension could metamorphose into an individual theory, or the three into one theory (e.g., sensibility, interactivity and logistical support theory (SILST)). I present these dimensions' indicators and demonstrate a three-dimensional typology of Web innovation milestones anchored on the three dimensions, a step forward in substantiating the framework's applicability to media capacity characterization. The final publication is available at https://aisel.aisnet.org/cais/vol48/iss1/36/
... JavaScript became a mainstream language when Asynchronous JavaScript and XML (AJAX) was developed in 2005 by Jesse James Garrett (Garrett 2005). AJAX is a combination of web technologies that can be used to build applications that can quickly update a user interface without reloading an entire web page or application (Huang 2019; MDN 2020; Roth 2017). ...
... The collected browser statistics are standard values like webdriver, webgl, header, language, device memory, etc., according to FingerprintJS [3]. To store the data online we used a local Apache server [76] with a MySQL database [56], to which the data was sent via Ajax [52] using JavaScript [55]. Figure 3 shows the normalized gaze and mouse movement data. Each row corresponds to a separate image. ...
Preprint
Full-text available
In this work we inspect different data sources for browser fingerprints. We show which disadvantages and limitations browser statistics have and how this can be avoided with other data sources. Since human visual behavior is a rich source of information and also contains person specific information, it is a valuable source for browser fingerprints. However, human gaze acquisition in the browser also has disadvantages, such as inaccuracies via webcam and the restriction that the user must first allow access to the camera. However, it is also known that the mouse movements and the human gaze correlate and therefore, the mouse movements can be used instead of the gaze signal. In our evaluation we show the influence of all possible combinations of the three information sources for user recognition and describe our simple approach in detail. The data and the Matlab code can be downloaded here https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=%2FThe%20Gaze%20and%20Mouse%20Signal%20as%20additional%20Source%20...&mode=list
... Further processing and analysis steps in the machine learning module, such as energy disaggregation, can be achieved with these energy consumption data. The processed digital data of the smart meter is sent to the master device's server via the Asynchronous JavaScript and XML (AJAX) technique [32] and is displayed in real time as a dynamic line chart on the dashboard of the web-based GUI. ...
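A polling loop of the kind such a dashboard might use can be sketched as follows (a hypothetical illustration; the `/meter/latest` endpoint, field names and buffer size are assumptions, not details of the cited system):

```javascript
// Rolling buffer size for the line chart (an assumed value).
const MAX_POINTS = 100;

// Pure update: append the latest reading, keep at most MAX_POINTS samples.
function appendReading(buffer, reading) {
  const next = buffer.concat([reading]);
  return next.length > MAX_POINTS ? next.slice(next.length - MAX_POINTS) : next;
}

// Periodically fetch the latest smart-meter reading via AJAX and
// hand the updated buffer to the chart-rendering callback.
function startPolling(updateChart, intervalMs = 1000) {
  let buffer = [];
  return setInterval(async () => {
    const res = await fetch('/meter/latest'); // hypothetical endpoint
    const { watts, timestamp } = await res.json();
    buffer = appendReading(buffer, { watts, timestamp });
    updateChart(buffer);
  }, intervalMs);
}
```

Keeping the buffer update pure makes the chart logic easy to test independently of the network.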
Article
Full-text available
Recent advances in machine learning and computer vision have brought to light technologies and algorithms that serve as new opportunities for creating intelligent and efficient manufacturing systems. In this study, a real-time monitoring system of manufacturing workflow for the Smart Connected Worker (SCW) is developed for small and medium-sized manufacturers (SMMs), integrating state-of-the-art machine learning techniques with the workplace scenarios of advanced manufacturing systems. Specifically, object detection and text recognition models are investigated and adopted to ameliorate the labor-intensive machine state monitoring process, while artificial neural networks are introduced to enable real-time energy disaggregation for further optimization. The developed system achieved efficient supervision and accurate information analysis in real time over prolonged working conditions, which could effectively reduce labor costs as well as provide an affordable solution for SMMs. The experimental results also demonstrated the feasibility and effectiveness of integrating machine learning technologies into the realm of advanced manufacturing systems.
... Finally, the best performing models were saved and rendered available for use via a GUI or web-interface, using PHP (PHP 7.1.28), CSS (Wium Lie, 1996), and AJAX (Garrett, 2005) open-source technologies. ...
Article
Full-text available
The purity of seeds is the most important factor in agriculture that determines crop yield, price, and quality. Rice is a major staple food consumed in different forms globally. The identification of high-yielding and good-quality paddy seeds is a challenging job, mainly dependent on expensive molecular techniques. The practical, day-to-day use of molecular-laboratory-based techniques is very costly and time-consuming, and involves several logistical issues too. Moreover, such techniques are not easily accessible to paddy farmers. Thus, there is an unmet need to develop alternative, easily accessible and rapid methods for the correct identification of paddy seed varieties, especially those of commercial importance. We have developed iRSVPred, a deep learning approach based on seed images, for the identification and differentiation of ten major varieties of basmati rice, namely Pusa basmati 1121 (1121), Pusa basmati 1509 (1509), Pusa basmati 1637 (1637), the salt-tolerant basmati rice variety CSR 30 (CSR-30), Dehradoon basmati Type-3 (DHBT-3), Pusa Basmati-1 (PB-1), Pusa Basmati-6 (PB-6), Basmati-370 (BAS-370), Pusa Basmati 1718 (1718) and Pusa Basmati 1728 (1728). The method has an overall accuracy of 100% and 97% on the training set (61,632 images) and internal validation set (15,408 images), respectively. Furthermore, accuracies of 80% or greater have been achieved for all ten varieties on the external validation dataset (642 images) used in the study. The iRSVPred web server is freely available at http://14.139.62.220/rice/.
... In particular, applications carried out with the help of the global positioning system (GPS), optical/radar satellite systems and geographic information systems (GIS) by trained expert personnel employed in various private and public institutions and organizations have made spatial data production and its cartographic presentation considerably easier. This process, carried out with desktop software until the early 2000s, saw online interfaces such as Google Maps and Bing Maps also come into use for map presentation as web technologies developed (Garrett, 2005). ...
Article
Full-text available
The production and sharing of geographic information by volunteers has been increasing in recent years. Examining volunteer-based activities through academic study will positively influence the development of the contributed data. Through OpenStreetMap, the most popular volunteered geographic information platform, volunteer contributors make their contributions via an online interface. By creating geographic data, producing the semantic tag information for these data, and updating existing data and tag information, contributors strive to make the large planet file (planet.osm) ever more comprehensive. While many researchers have carried out various studies on measures examining the quality of OpenStreetMap data, few studies have examined the behavior of the volunteer contributors who produce the data. In this study, various tag attributes of OpenStreetMap road data in residential areas (highway=residential) were compared, and the data-entry tendencies of volunteer contributors in residential areas were investigated. Using the statistical values of the volunteers' semantic contributions, various relationships were established and interpreted through comparative graphs. The results showed that OpenStreetMap volunteers contribute more to the names of residential roads. In addition, it was found that the aim of the project is understood by the volunteers and that they carry out activities in that direction. However, although all the contributions made are quite good in terms of semantic diversity, they are not sufficient when evaluated in terms of the completeness of the semantic information.
... To remedy the slowness of synchronous information transfer from server to client, in 2005 a technique called AJAX was developed (Garrett 2005). Although experimental implementations existed before then, it was only in 2004 and 2005 that its implementation became a best practice, thanks also to the development of the Google Maps application, which made extensive use of it (Kawamata 2007). ...
Book
The series Ca' Foscari Japanese Studies, strong in the tradition well established at the Dipartimento di Studi sull'Asia e sull'Africa Mediterranea, aims to be an international benchmark for studies on Japan. It collects publications of high scientific rigor aimed at documenting the most original and theoretically advanced developments in this field, ranging from classicism to modernity. Its highly multidisciplinary vocation results in a production structured in four thematic areas: Arts and Literature; History and Society; Religion and Thought; Linguistics and Language Education.
... The term "Ajax" (Garrett 2005) refers to a combination of techniques that allow continuous client-side updating with server-side information. AJAX stands for Asynchronous JavaScript And XML. ...
... The tool's interface also uses an event and object model operated with the aid of the jQuery library 11, which also enables asynchronous communication between client and server using Ajax (GARRETT et al., 2005). This feature runs natively on all modern browsers without the need to install additional software, which meets the R2 and R3 requirements. ...
Thesis
Full-text available
A system-of-systems (SoS) is an arrangement of independent systems that work in synergy to fulfill missions that none of these systems in isolation can accomplish. SoS can be observed in several domains such as urban mobility, healthcare, and smart cities, to mention a few. A significant concern of SoS engineers is the independence of the constituent systems, which have the autonomy to stop contributing to or abandon an SoS, making it difficult to guarantee the quality of the SoS at design time. This study aims to investigate good practices and recommendations that can be applied to the design of SoS to assure its proper operation. Therefore, we herein adopt the term "heuristics" to refer to such good practices and recommendations. An exploratory study was conducted to understand the concerns regarding an SoS operating in a Brazilian public organization. We further conducted a systematic mapping study (SMS) to identify which practices have been applied in SoS design. The results were discussed in a focus group to organize the first set of heuristics. Finally, we surveyed experts to evaluate which heuristics are appropriate to SoS design, resulting in a heuristics catalog. To facilitate understanding and implementation, the heuristics in this catalog have been organized into groups. Each heuristic was categorized according to its suitability for use based on the coordination level of the SoS. A tool was created that incorporates some of the heuristics from the catalog to verify how the heuristics can facilitate the SoS design process. A feasibility study was conducted with practitioners to evaluate the ease of use and usefulness of the tool. It is expected that the heuristics catalog and the tool will support researchers and professionals in the SoS design process and help identify critical issues during the design phase.
... Some general JavaScript libraries such as jQuery and Backgrid were also used. JSON [25] was used as the data transfer format between server and client. ...
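JSON's role as the transfer format between server and client can be illustrated in a few lines (field names are hypothetical examples, not taken from the cited database):

```javascript
// Server side: serialize a record to a JSON string for the HTTP response.
const record = { station: 'ADR-03', depth_m: 12.5, temp_c: 14.2 };
const wire = JSON.stringify(record); // what travels over the network

// Client side: parse the string back into an object the UI can render.
const parsed = JSON.parse(wire);
```

The same round trip underlies libraries like jQuery and Backgrid, which exchange JSON payloads with the server on every asynchronous request.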
Article
Full-text available
Environmental data visualization tools are used for data validation, maintenance, and assessment of status and trends of environmental variables. However, currently available visualization tools do not meet all the requirements needed for efficient data management. In this paper, we propose a new approach and present a web-based implementation of a visualization tool that focuses on efficient visualization and seamless integration of maps and graphs. Our approach emphasizes the spatio-temporal relationship of environmental data stored in a relational database. Several new advantages emerge with this approach. Through a case study of two oceanographic datasets - the HarmoNIA project database and the Croatian national monitoring database - we show how this approach enables intuitive status and trend assessments, simplifies data validation, enables updating of corrections, and is suitable for a web-based implementation that works efficiently even with large datasets.
... The view part of our web applications was constructed using JavaServer Pages (JSP) [40]. To improve the user experience, we used asynchronous JavaScript and XML (AJAX) [41], thus allowing the user's interaction with the application to happen asynchronously. ...
Article
Full-text available
Anonymous service delivery has attracted the interest of research and industry for many decades. To obtain effective solutions, anonymity should be guaranteed against the service provider itself. However, if full anonymity of users is implemented, no accountability mechanism can be provided. This represents a problem, especially in scenarios in which a user, protected by anonymity, may act unlawfully when leveraging the anonymous service. In this paper, we propose a blockchain-based solution to the trade-off between anonymity and accountability. In particular, our solution relies on three independent parties (one of which is the service provider itself) such that only the collaboration of all three actors allows for the disclosure of the real identity of the user. In all other cases, anonymity is guaranteed. To show the feasibility of the proposal, we developed a prototype with user-friendly interfaces that minimize the client-side operations. Our solution is thus also effective from the point of view of usability.
Thesis
Full-text available
There are a great number of media in which visual narrative can develop; the digital realm enriches the possibility of mixing and varying different media and using them to build messages with great narrative power. This happened on web pages with multimedia resources, although these had a narrative structure very similar to that of video. When the parallax scrolling effect emerged in web page development, it set a design trend and also established a narrative style on the web, starting in 2011 and still in use in 2020. It has been used as a medium to tell different kinds of stories with a wide variety of visual effects. This opens a window onto the study of the characteristics that make PLS-Web a medium for developing visual narrative, as well as the opportunity to propose a model for its production and analysis.
Article
Full-text available
The number of web applications for both personal and business use will continue to increase. The popularity of web applications has grown, increasing the need to estimate Quality of Experience for web applications (Web QoE). Web QoE helps providers to understand how their end-users perceive quality and point towards areas to improve. Waiting time has been proven to have a significant influence on user satisfaction. Most studies in the field of Web QoE have focused on modelling Web QoE for the user’s first interaction with the application, e.g., the waiting time for the first page load to complete. This does not include a user’s subsequent interactions with the application. Users keep interacting with the application beyond the first page load resulting in an experience that consists of a series of waiting times. In this study, we have chosen web maps as a use case to investigate how to measure waiting time for a user’s interactions across a web browsing session, and to measure the correlation between waiting time and user-reported perceived quality. We provide a short survey of existing Web QoE estimation metrics and models. We then propose two new measures: interactive Load Time (iLT) and Total Completed interactive Load (TCiL) to establish the waiting time associated with a web application user’s interactions. A subjective study confirms a logarithmic relationship for interactive web application sessions between iLT and perceived quality. We compare the correlation between QoE for iLT and the state of the art, non-interactive equivalent, Page Load Time (PLT)/Waiting Time. We demonstrate how the iLT/QoE fitting curve deviates from PLT/QoE. The number of clicks in completing tasks and TCiL are explored to explain the connections between user’s interactions behaviour and the perceived quality.
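The logarithmic iLT-to-QoE relationship the study reports can be sketched as a simple model. The coefficients and the mean-iLT session summary below are illustrative assumptions, not the paper's fitted values:

```javascript
// Illustrative logarithmic mapping from waiting time to QoE, of the kind
// the study fits for interactive Load Time (iLT). The coefficients a and b
// are made up for illustration; the paper fits its own from subjective
// ratings. The result is clamped to the usual 1..5 MOS scale.
function qoeFromWaitingTime(iLTSeconds, a = 4.5, b = 1.2) {
  const mos = a - b * Math.log(iLTSeconds);
  return Math.min(5, Math.max(1, mos));
}

// A session is a series of waiting times; one simple summary is the mean iLT.
function meanILT(waitingTimes) {
  return waitingTimes.reduce((s, t) => s + t, 0) / waitingTimes.length;
}
```

The clamp reflects that subjective ratings saturate: very short waits cannot push the score above 5, and very long waits bottom out at 1.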
Article
How a sidekick scripting language for Java, created at Netscape in a ten-day hack, ships first as a de facto Web standard and eventually becomes the world's most widely used programming language. This paper tells the story of the creation, design, evolution, and standardization of the JavaScript language over the period of 1995--2015. But the story is not only about the technical details of the language. It is also the story of how people and organizations competed and collaborated to shape the JavaScript language which dominates the Web of 2020.
Article
Full-text available
I here present a tool, called "AJICSS", intended to make two things easier: first, developers' transition from CSS to Javascript; and second, implementation of hypertext capabilities that involve real-time modifications to documents. In short, AJICSS: (1) lets you embed Javascript directly inside CSS rules (multiple reasons for this are discussed) and trigger it on demand; (2) provides functions to facilitate useful things such as dynamic layout and content changes; and (3) facilitates implementing several hypertext features, including structural transclusion, stretchtext, user-controllable content changes and filtering, and spreadsheet-like tables. AJICSS is written in Javascript, but is much easier to use than Javascript alone, and in particular makes it fairly easy (even for those with little programming background) to accomplish several long-neglected hypertext capabilities.
Article
Full-text available
This research covers the design, implementation, and testing of a computerized system for managing economic indicators. It describes how information on the level of business development of the Provincial Department of Economy and Planning of Ciego de Ávila is obtained, and uses that information as the basis for the analysis and decision-making that motivated the creation of the system. The system improves service and gives the organization control over its information in a way that is easier, safer, and faster. A variety of monthly and yearly reports and views is provided for each indicator. Together, these proposals facilitate the work of staff by turning the whole process into a semi-automatic application, contributing to the development of technology in the country's labor sector.
Chapter
One major direction in research on technology-enabled learning systems revolves around the notion of generating optimal learning pathways. Two examples of application areas that can be framed as search and optimization problems in the context of Artificial Intelligence are: (1) adaptive selection and sequencing of learning objects based on the learning profiles, preferences and abilities of individual learners; (2) automatic composition of assessment or examination papers based on instructors' specifications. In this chapter, we present a critical discussion of the research concerned with the application of the paradigm of "swarm intelligence" in these two areas. The main aim of this survey is to highlight the new trends and key research achievements that have been realised in the last few years. We also outline a range of relevant research issues and challenges that have been generated by this body of work.
Chapter
Web systems were initially supported by a client-server architecture and three standards (URL, HTTP, and HTML), and have evolved considerably over the last two decades. Usability, scalability, maintenance, portability, robustness, security, and integration with other systems are the main challenges of this software category. This tutorial presents the history and evolution of Web-based software architectures. We discuss current software architectural styles, patterns, and development platforms based on client-side and server-side technologies. In addition, we also discuss Web 3.0 requirements such as communication protocols, microservices, MV* browser-based frameworks, boilerplate client-side code, asynchronous programming, and integration with cloud computing infrastructures.
Chapter
The Fog of Things (FoT) proposes a paradigm which uses the Fog Computing concept to deploy Internet of Things (IoT) applications. The FoT exploits the processing, storage, and network capacity of local resources, allowing for the integration of different devices in a seamless IoT architecture, and it defines the components that compose the FoT paradigm, describing their characteristics. This chapter presents the FoT paradigm and relates it to IoT architecture, describing the main characteristics and concepts from sensor and actuator communication to gateways and local and cloud servers. Lastly, this chapter presents the SOFT-IoT platform as a concrete implementation of FoT, which uses a microservice infrastructure distributed across devices in the IoT system.
Article
Full-text available
The primary method for facilitating asynchronous communication over the Web is asynchronous JavaScript and XML (AJAX). Although AJAX provides the Web with much-needed real-time functionality, it requires unconventional programming techniques at the cost of extensive resource use. WebSockets, an emerging protocol for implementing asynchronous communication over the Web, has the potential to address many of these challenges. There has been no independent study analyzing AJAX and WebSockets quantitatively. This paper therefore makes the following contributions to integrating Web technologies in real-time systems. First, it quantitatively compares AJAX and WebSocket performance in terms of connection time by varying the number of requests over the Web. Second, it compares the data bytes transferred by varying the number of requests. Third, it quantitatively compares network bandwidth consumption between AJAX and WebSocket.
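The kind of byte-count comparison the paper performs can be illustrated with back-of-envelope arithmetic. The header and framing sizes below are assumed typical values, not measurements from the study:

```javascript
// Assumed sizes, for illustration only: each AJAX poll carries full HTTP
// request+response headers, while a WebSocket pays an HTTP upgrade
// handshake once and then a few bytes of framing per message.
const HTTP_HEADERS_PER_REQUEST = 700; // assumption: combined req+resp headers
const WS_HANDSHAKE = 500;             // assumption: one-time upgrade cost
const WS_FRAME_OVERHEAD = 4;          // small per-message frame header

// Total bytes for n messages of a given payload size over AJAX polling.
function ajaxBytes(nMessages, payloadBytes) {
  return nMessages * (HTTP_HEADERS_PER_REQUEST + payloadBytes);
}

// Total bytes for the same traffic over one WebSocket connection.
function webSocketBytes(nMessages, payloadBytes) {
  return WS_HANDSHAKE + nMessages * (WS_FRAME_OVERHEAD + payloadBytes);
}
```

Under these assumptions the per-message header cost dominates AJAX as the message count grows, which is the intuition behind the study's measured comparison.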
Chapter
Major Internet services are required to process a tremendous amount of data in real time. Putting these services under the magnifying glass, it can be seen that distributed object storage systems play an important role at the back end in achieving this success. In this chapter, an overview is given of the current state-of-the-art storage systems used for reliable, high-performance, and scalable storage needs in data centers and the cloud. Then, an experimental distributed object storage system (CADOS) is introduced for efficiently retrieving large data, such as hundreds of megabytes, through HTML5-enabled web browsers over big data (terabytes of data) in a cloud infrastructure. The objective of the system is to minimize latency and propose a scalable storage system on the cloud using a thin RESTful web service and modern HTML5 capabilities.
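One common technique for retrieving large objects efficiently from a thin RESTful service in an HTML5 browser is chunked download via HTTP Range requests. Whether CADOS uses exactly this mechanism is not stated, so the sketch below is illustrative, with a hypothetical endpoint:

```javascript
// Plan the HTTP Range header values needed to fetch an object of
// `totalBytes` in chunks of `chunkBytes`. Ranges are inclusive, per the
// HTTP Range header syntax.
function planRanges(totalBytes, chunkBytes) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += chunkBytes) {
    const end = Math.min(start + chunkBytes, totalBytes) - 1;
    ranges.push(`bytes=${start}-${end}`);
  }
  return ranges;
}

// In a browser, each planned range could then be fetched and reassembled:
//   const res = await fetch("/objects/movie.bin", { headers: { Range: r } });
```

Planning the ranges as a pure function keeps the chunking logic testable separately from the network layer.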
Conference Paper
Full-text available
Push communication is an integral requirement in modern Rich Web-based Applications, used to implement features such as push notifications and real-time updates. Push-communication concepts and the technologies that implement them (in particular their roots and the rationale behind their advancements) are not collectively discussed in any available forum. We conducted an intensive literature survey to identify the very roots of push communication and its evolution, towards understanding the abstract architectural formalism of push communication in Rich Web-based Applications. We collected and documented the literature on the evolution of push communication, both for archiving and for reviewing and comparing the reasoning behind its improvements over time. We also tried to capture the knowledge needed to answer some important questions, such as: is push communication important, and how difficult is it to integrate into Rich Web-based Applications? We expect to study the artefacts identified through the survey to identify the abstract characteristics of push communication, in order to realize its integration into Rich Web-based Applications in the form of Delta-Communication.
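Server-Sent Events is one of the standard push-communication technologies such surveys cover. A minimal sketch of the wire format an SSE server emits (the event name and payload are arbitrary examples, not from the survey):

```javascript
// Format one Server-Sent Events message. Per the SSE wire format, each
// field is `name: value` on its own line, and a blank line ends the event.
function sseEvent(eventName, data) {
  return `event: ${eventName}\ndata: ${data}\n\n`;
}

// A browser client would consume such a stream with the EventSource API:
//   const source = new EventSource("/updates");
//   source.addEventListener("update", (e) => console.log(e.data));
```

Because the format is plain text over a long-lived HTTP response, SSE is often the simplest of the push options when only server-to-client delivery is needed.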
Chapter
Product-related information can be found in various data sources and formats across the product lifecycle. Effectively exploiting this information requires the federation of these sources, the extraction of implicit information, and the efficient access to this comprehensive knowledge base. Existing solutions for product information management (PIM) are usually restricted to structured information, but most of the business-critical information resides in unstructured documents. We present a generic architecture for federating heterogeneous information from various sources, including the Internet of Things, and argue how this process benefits from using semantic representations. A reference implementation tailor-made to business users is explained and evaluated. We also discuss several issues we experienced that we believe to be valuable for researchers and implementers of semantic information systems, as well as the information retrieval community.
Article
The explosion of information has resulted in incremental search becoming an essential tool for many websites. This technology provides real-time suggestions by sending the current query to the server. Despite encryption, search requests can be leveraged by passive attackers to infer the query typed by the user. In this paper, we show that at least nine of Alexa’s top 50 websites have serious side-channel leaks. More importantly, we use information theory to quantify the leakage and report the upper bound of recognition accuracy that an attacker can achieve. We further develop a generic attack attempting to infer users’ queries by monitoring web search traffic. Experimentally, the attack performance is close to the theoretical bounds. The most vulnerable website allows up to 53% of English queries and 76% of Chinese queries to be identified from 825k and 140k queries, respectively. Overall, our work highlights the prevalence of such side-channel leaks on the Internet and provides insights for developers to help mitigate the threat.
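The information-theoretic quantification described above can be illustrated with Shannon entropy: the entropy of the distribution of observable features (such as packet sizes) bounds how many bits an eavesdropper can learn per observation. The counts below are made up; in the study they would come from captured traffic:

```javascript
// Shannon entropy (in bits) of a distribution given as raw counts.
// Higher entropy of the observable side channel means more information
// is available to a passive attacker per observation.
function entropyBits(counts) {
  const total = counts.reduce((s, c) => s + c, 0);
  return counts
    .filter((c) => c > 0) // 0 * log(0) is taken as 0
    .reduce((h, c) => {
      const p = c / total;
      return h - p * Math.log2(p);
    }, 0);
}
```

A constant packet size (a single outcome) has zero entropy and leaks nothing; the more distinguishable the sizes, the closer the attacker's recognition accuracy can get to its theoretical upper bound.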
Preprint
Full-text available
Earlier there was no mode of online communication between users; in both big and small organizations, communication between users posed a challenge, and there was a requirement to record these communications and store the data for further evaluation. The idea is to automate the existing Simple Chat Room system so that users' valuable information is stored digitally and can be retrieved for further management purposes. Many different interfaces are available in the market, but this method of using Windows sockets to communicate between nodes is fast and reliable. The main objective of our Simple Chat Room project is to create a chat application that helps different users communicate with each other through a connected server. It is a simple chat program with one server and many clients; the server needs to be started first, and clients can connect later. Simple Chat Room provides bidirectional communication between client and server, enabling users to communicate seamlessly. If the user at the other end is active, a chat session can be started, and the chat is recorded in the application.
Article
Due to the COVID-19 emergency, Ca’ Foscari University of Venice has reorganized a large part of its courses for online delivery through the LMS Moodle. Unfortunately, this platform, besides having a particularly complex interface, does not have modules dedicated to the teaching of Japanese. Addressing this issue, this paper proposes the design of a module for kanji teaching integrated with the Moodle workflow, to produce complete and printable kanji sheets with all the necessary information, thus reducing teacher workload. The methodology used derives from an Instructional System Design development model (SAM, Successive Approximation Model) that allows continuous feedback from the teacher, so as to adapt the developed prototype to his/her teaching needs.
Chapter
Due to the high maintenance cost of traditional online auction systems, their poor load-balancing ability, and the downtime problems that occur at peak system access, this chapter proposes a solution for transitioning from a traditional online auction system to cloud computing, to achieve higher work efficiency, lower expenditure, and less energy cost. The GAE platform is used to deploy the overall framework of the online auction system in cloud computing, including the development environment, the online auction system components, and the software architecture. Online auction negotiation algorithms in cloud computing are also proposed. Based on these key technologies, the business processes of the online auction system in cloud computing are designed, including user login, starting an auction, bidding, online auction data storage, and logout. An online auction system built on the GAE platform with cloud computing resources and storage capacity can reduce the pressure on terminal equipment and is more robust than a traditional online auction system in the face of users' changing needs during online auctions.
Chapter
Geographic information is created by manipulating geographic (or spatial) data (generally known by the abbreviation geodata) in a computerized system. Geospatial information and geomatics are central topics of modern business and research. It is essential to provide their different definitions and roles in order to get an overall picture of the issue. This article discusses the problem of definitions, as well as the technologies and challenges of spatial data fusion.
Article
Full-text available
Technology, like other areas, is in constant evolution, searching for the "best" tool for our needs. But are our needs always the same? Are we evolving? For example, we are not satisfied unless we have a computer with a wireless network that plays mp3 files, etc. But what about software? We use products that are very useful, but they are neither the best nor within everyone's reach in terms of cost and benefit, and we deny ourselves better software options with lower hardware and licensing costs. This work focuses on the use of a JeOS ("Just enough Operating System"), built with SUSE Studio, which allows custom versions of SUSE Linux to be created, personalized with free applications oriented to the courses of the ICT program at the Universidad Tecnológica del Usumacinta, mitigating the large amount of viruses and pirated software.
Chapter
This chapter covers the fundamentals of the e-shop as a company's virtual salesroom. It begins by defining the e-shop and highlighting the core drivers of electronic selling. It then presents the requirements that electronic sales processes impose on e-shop systems, and explains the maintenance, outward design, and basic functions of an e-shop system. The system requirements are based on the general quality attributes of internet-based software: usability, accessibility, scalability, extensibility, internationalizability, and security.
Chapter
Forms are a common part of web applications. They provide a method for the user to interact with the web application. However, forms in traditional applications require entire web pages to be refreshed every time they are submitted. This model is inefficient and should be replaced with Ajax-enabled forms. Ajax is a set of web development technologies that enables web applications to behave more like desktop applications, thus allowing a richer, more interactive and more efficient model for interactions between the user and the web application. This paper presents a refactoring system called Form Transformation Tool (FTT) to assist web programmers refactor traditional forms into Ajax-enabled forms while ensuring that functionality before and after refactoring is preserved.
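The refactoring FTT automates can be sketched as replacing a full-page form POST with an asynchronous submission. This is a simplified illustration, not FTT's actual transformation, and the /submit endpoint is hypothetical:

```javascript
// Serialize form fields into an application/x-www-form-urlencoded body.
// Kept as a pure function so the encoding logic is easy to verify.
function serializeForm(fields) {
  return Object.entries(fields)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join("&");
}

// In the refactored page, the submit handler prevents the default
// full-page reload and posts the serialized fields asynchronously:
//   form.addEventListener("submit", async (e) => {
//     e.preventDefault(); // keep the page in place
//     await fetch("/submit", {
//       method: "POST",
//       headers: { "Content-Type": "application/x-www-form-urlencoded" },
//       body: serializeForm({ name: nameInput.value }),
//     });
//   });
```

The key behavioral requirement of such a refactoring, as the paper notes, is that the server receives exactly the same request body before and after, so only the transport changes.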
Thesis
Full-text available
The stock market is one of the key markers of modern economic activity. The efficient-market hypothesis holds that whenever relevant information appears, it spreads quickly and is reflected in stock prices; that is, there is no way for investors to earn excess returns without taking on additional risk. However, as research in related fields has progressed, some exploitable regularities arising from irrational investment choices driven by investor psychology have been found, including overreaction and market cycles. These regularities may exist among stocks traded on the Shanghai Stock Exchange. This thesis proposes and implements an overreaction-based algorithm that uses the magnitude of price movement as a reversal benchmark to predict the future direction of the market. By automatically running simulated trades on historical market data, it determines which price-movement threshold yields high and stable returns when used as the reversal threshold. In addition, a multi-step buying algorithm is implemented to reduce investment risk, and its effect on returns is determined experimentally. Finally, by automatically computing how well the above algorithm applies to different stocks, a list of stocks suitable for investment with the proposed algorithm is generated. The results show that trades made on the recommended stocks at the recommended times, following the proposed algorithm, can achieve returns higher than the average return of equity funds in the same year. The Julia-based software system built in this thesis can provide users with stock recommendations and buy/sell timing recommendations.
Chapter
Online experimentation allows students from anywhere to operate remote instruments at any time. Current techniques constrain users to bind to products from one company and to install client-side software. We use Web services and Service-Oriented Architecture to improve the interoperability and usability of the remote instruments. Under a service-oriented architecture for an online experiment system, a generic methodology to wrap commercial instruments using the IVI and VISA standards as Web services is developed. We enhance the instrument Web services into stateful services so that they can manage user booking and persist experiment results. We also benchmark the performance of this system when SOAP is used as the wire format for communication and propose solutions to optimize performance. In order to avoid any installation at the client side, the authors develop Web 2.0-based techniques to display the virtual instrument panel and real-time signals with just a standard Web browser. The technique developed in this article can be widely used for different real laboratories, such as microelectronics, chemical engineering, polymer crystallization, structural engineering, and signal processing.
Chapter
The programming capabilities of the Web can be viewed as an afterthought, designed originally by non-programmers for relatively simple scripting tasks. This has resulted in a cornucopia of partially overlapping options for building applications. Depending on one's viewpoint, a generic standards-compatible web browser supports three, four or five built-in application rendering and programming models. In this paper, we give an overview and comparison of these built-in client-side web application architectures in light of established software engineering principles. We also reflect on our earlier work in this area and provide an expanded discussion of the current situation. In conclusion, while the dominance of the base HTML/CSS/JS technologies cannot be ignored, we expect Web Components and WebGL to gain more popularity as the world moves towards increasingly complex web applications, including systems supporting virtual and augmented reality.