Article

As We May Think (The Atlantic Monthly, July 1945)

... Researchers in the field of hypermedia would argue that the Web is a hypermedia system which has not fully benefited from research carried out in that mature field, which predates the Web by several decades (Bieber et al., 1997). Hypermedia evolved from hypertext research, which some have argued began as early as 1945, when Vannevar Bush proposed the first automated system for managing information about documents and their relationships through the use of trails (Bush, 1945). It was not until the 1960s, however, that serious research in the areas of hypertext and hypermedia started. ...
... MEMOIR adopted an agent-based architecture (Pikrakis et al., 1998). Users of the system were provided with the facility of grouping sets of documents they perceived as interesting into trails that were stored in a shared organisational memory. The notion of trails goes back to the pioneering work of Bush (Bush, 1945), where a trail is defined by the set of documents a user employs for the accomplishment of some task. In MEMOIR, a trail is defined as a collection of URLs that a user creates during a browsing session through manual selection from the entire set of URLs the user has visited. ...
... The utilisation of user 'trails' for information finding is a relatively old idea first proposed by Bush (Bush, 1945). In the context of Web navigation, a user trail is basically the list of URIs a user follows towards the achievement of a given task. ...
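The trail notion described in these excerpts, an ordered, manually curated subset of visited URLs grouped around a task, can be sketched as a minimal data structure. This is an illustrative sketch only; the class and method names are hypothetical and do not reflect MEMOIR's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Trail:
    """A browsing trail in the spirit of Bush (1945) and MEMOIR:
    an ordered subset of visited URLs, grouped around a task.
    All names here are illustrative, not MEMOIR's actual model."""
    task: str
    urls: List[str] = field(default_factory=list)

    def add(self, url: str) -> None:
        # Manual selection: the user promotes a visited URL into the trail.
        # Order of first selection is preserved; duplicates are ignored.
        if url not in self.urls:
            self.urls.append(url)

# A user collects pages relevant to a literature-search task.
trail = Trail(task="hypertext history")
trail.add("https://example.org/memex")
trail.add("https://example.org/xanadu")
trail.add("https://example.org/memex")  # already in the trail; ignored
```

The point of the sketch is that a trail is distinct from a raw browsing history: it is a deliberate, task-scoped selection from that history.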
Thesis
The main objective of this work is to address the problem of information overload within small groups driven by similar goals, in a way that would enable the delivery of personalised and non-intrusive browsing recommendations and hints, as well as aid the users in their information finding activities. The basic idea upon which this work builds is that information gained and created by a user navigating the information space can be used to assist other users in their navigation and information finding activities. The presented model utilises, extends, and combines ideas from open hypermedia with those from Web assistants and recommender systems to achieve its goals. The result of this combination is manifested in the idea of 'linking in context', which this work presents as a novel way of offering Web users recommendations for concepts related to what they are browsing. The integration of the various concepts is facilitated by the use of a multi-agent framework. Creating a flexible and open architecture that can accommodate these goals, as well as identifying information finding and recommendation building blocks, is one important dimension of this work. Developing a linking model to embrace context on the user and document level is another.
... Circular causality, applied to learning and mental models, refers to the capacity for systems (individuals, machines) to analogically affect one another's thinking through collaboration. A byproduct of this cybernetic theory is the internet, emerging from Vannevar Bush's (1945) post-World War II essay, "As We May Think," which discussed potentials for an online web of trails to augment knowledge within a participatory network of users and information. ...
... The third part describes Pask's learning technologies, and suggests their capabilities align with original conceptions of the internet envisioned by Vannevar Bush (1945) and the creation of nascent communities like the Whole Earth 'Lectronic Link (WELL; Rheingold, 1994). Today's internet algorithms have drifted from this ethos, providing steady information streams using a mixture of trace histories and corporate interests to turn the destiny of social influence into contingencies (Tilak & Glassman, 2020). ...
... They may take information from the social environment and use it to inform new behavior and thought. This falls in line with Bush's (1945) web of trails, which outlines how humans may interact with technologies, gain information, share it with others, and make analogies to their own experiences to decide further actions and thoughts. Bush's conception of distributed systems powered the internet's creation and foreshadowed personal computing (Tilak et al., 2022). ...
Article
Full-text available
This three-part paper reinforces crosscurrents between cybernetician Gordon Pask’s work towards creating responsive machines applied to theater and education, and Vygotsky’s theory, to advance sociohistorical approaches into the Internet age. We first outline Pask’s discovery of possibilities of a neoclassical cybernetic framework for human-human, human-machine, and machine-machine conversations. The second part outlines conversation theory as an elaboration of the reconstruction of mental models/concepts by observers through reliance on sociocultural psychological approaches, and applies concepts like the zone of proximal development and perezhivanie to Paskian aesthetic technologies. The third part interprets Pask’s teaching/learning devices as zones of proximal development, and outlines how Paskian algorithms in digital devices like THOUGHTSTICKER have been generalized on today’s Internet, supplemented by corporate interests. We conclude that Paskian theory may offer understandings of the roles of Internet technologies in transforming human thinking, and suggest (re)designing tools that incorporate algorithms which advance conceptual understanding in context, deviating from current indexing approaches.
... Shneiderman's work [66] further stated that "The remarkably rapid dissemination of HCI research has brought profound changes that enrich people's lives", while also providing a tire-tracks diagram showing how HCI research on subjects such as hypertext and direct manipulation turned into product innovations by industry. Similarly, product innovations over the years mirror the early ideas of canon HCI visions [11,74]. Other research detailed successful cases of tech transfer, such as the translation of the multi-touch interface from research into the Apple iPhone and Microsoft Surface, while highlighting a long time lag between initial research and commercialization, which can be 20 years or more [12,32,66]. ...
... The abstract information of papers and their authors' academic influence (e.g., number of published papers, citation count) are missing or hard to process in the original Microsoft Academic Graph metadata. To further expand data about authors, papers, citations, and venues, we utilize the Semantic Scholar Academic Graph API, which fills in this data. ...
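The metadata-filling step mentioned above can be illustrated with a small helper that builds a request URL for the Semantic Scholar Academic Graph API. The endpoint shape follows that API's public documentation, but the paper ID and field selection below are placeholder examples, and actual HTTP fetching and error handling are omitted.

```python
from urllib.parse import urlencode

# Base endpoint of the Semantic Scholar Academic Graph API (per its public docs).
S2_BASE = "https://api.semanticscholar.org/graph/v1/paper/"

def paper_metadata_url(paper_id: str,
                       fields=("title", "citationCount", "authors")) -> str:
    """Build the request URL for one paper's metadata.

    The API accepts a comma-separated `fields` query parameter selecting
    which attributes (e.g. citation counts) to return.
    """
    return S2_BASE + paper_id + "?" + urlencode({"fields": ",".join(fields)})

# Placeholder ID for illustration; real Semantic Scholar paper IDs are
# 40-character hex strings or DOI/arXiv-prefixed identifiers.
url = paper_metadata_url("EXAMPLE_PAPER_ID")
```

Fetching `url` with any HTTP client would then return the missing metadata as JSON, which can be merged back into the Microsoft Academic Graph records.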
Preprint
Full-text available
What is the impact of human-computer interaction research on industry? While it is impossible to track all research impact pathways, the growing literature on translational research impact measurement offers patent citations as one measure of how industry recognizes and draws on research in its inventions. In this paper, we perform a large-scale measurement study primarily of 70,000 patent citations to premier HCI research venues, tracing how HCI research is cited in United States patents over the last 30 years. We observe that 20.1% of papers from these venues, including 60--80% of papers at UIST and 13% of papers in a broader dataset of SIGCHI-sponsored venues overall, are cited by patents -- far greater than premier venues in science overall (9.7%) and NLP (11%). However, the time lag between a patent and its paper citations is long (10.5 years) and getting longer, suggesting that HCI research and practice may not be efficiently connected.
... "He is a precursor of Bush (1945), Engelbart (1963), Nelson (1983, 1987), and others who have set the hypertext/hypermedia agenda in recent years and [he] anticipated many of the features of Bush's memex, Nelson's Xanadu, and hypertext." Bush (1890-1974) delivers an indictment of traditional documentary indexing: "Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. ...
Thesis
This thesis examines the relationship between intellectual work and its tooling, through a study of the epistemological legacy of Paul Otlet (1868-1944), the first theorist of documentation. It addresses the problem of organising and managing scholarly knowledge from the angle of personal documentation. It is both a theoretical work in documentology and knowledge organisation, and a reflexive work based on the design and use of a document-graph visualisation tool named Cosma. Through an analysis of Otlet's writings, diagrams and documentary achievements, we establish that the network logic he proposes is not only institutional in nature but also applies to the documents themselves. We take up a hypothesis put forward by W. Boyd Rayward, never really put to the test, which draws a parallel between the fundamental components of Otlet's work (the monographic principle, the Universal Decimal Classification) and those of hypertext systems (nodes, links). We verify this parallel empirically, drawing on the outputs of the ANR HyperOtlet programme, and generalise it to the notion of a graph in order to propose the elements of a relational theory of knowledge organisation. We characterise the epistemology of personal hypertextual documentation as reflexive and heuristic: the graph representation highlights the reticular nature of certain writing processes and, thereby, of thought; it serves as a memory aid, with a logic of informational emergence. Building on this work, we propose the notion of cosmography as the ordering of an intellectual universe through writing, between idiotext (writing as a singular prosthetic memory), hypertext (reticular writing) and architext (the writing of writing).
... Endowing machines with relational learning and reasoning skills over diverse inputs is a longstanding goal in artificial intelligence [Bush, 1945, Koller et al., 2007, Davis and Marcus, 2015, Lake et al., 2017, Battaglia et al., 2018]. Within this broad goal, the sub-discipline of graph learning focuses on relational prediction tasks over data that can be organized naturally into interconnected networks of nodes and edges. ...
... [it] is awe-inspiring beyond all else in nature" [Bush, 1945]. Our goal is to expand on the idea of associating individuals' personal information items-what Bush called books, records, and communications, and what today might be files, emails, and messages-according to their higher-level purposes and usages. ...
Thesis
Many important problems in machine learning and data mining, such as knowledge base reasoning, personalized entity recommendation, and scientific hypothesis generation, may be framed as learning and inference over a graph data structure. Such problems represent exciting opportunities for advancing graph learning, but also entail significant challenges. Because graphs are typically sparse and defined by a schema, they often do not fully capture the underlying complex relationships in the data. Models that combine graphs with rich auxiliary textual modalities have higher potential for expressiveness, but jointly processing such disparate modalities--that is, sparse structured relations and dense unstructured text--is not straightforward. In this thesis, we consider the important problem of improving graph learning by combining structure and text. The first part of the thesis considers relational knowledge representation and reasoning tasks, demonstrating the great potential of pretrained contextual language models to add renewed depth and richness to graph-structured knowledge bases. The second part of the thesis goes beyond knowledge bases, toward improving graph learning tasks that arise in information retrieval and recommender systems by jointly modeling document interactions and content. Our proposed methodologies consistently improve accuracy over both single-modality and cross-modality baselines, suggesting that, with appropriately chosen inductive biases and careful model design, we can exploit the unique complementary aspects of structure and text to great effect.
... Interactivity is a communication loop, fundamentally bidirectional, which, unlike interaction, is generally mediated by an electronic artefact. This electronic artefact is what has brought about the enormous terminological confusion of attributing interactivity to the human-machine relationship, when in fact interactivity refers to mediated human-to-human interaction (Bush, 1945; Ponce-Díaz, 2018a; Wiener, 1965). ...
... "or is it all at once?" (Alayón-Gómez, 2009b). The most complete answer to these questions would be an affirmation of the last one: it is all at once. And it is this literature, digital in its production, manipulation and storage, that we call hypertextual (Ponce-Díaz, 2018a; Aarseth, 1997). In 1945 Vannevar Bush conceived the original idea of hypertext and described it in an article on the MEMEX device (Bush, 1945), but it was Ted Nelson who coined the term and created Xanadu (Nelson (1965) conceived of computers as "media" machines and sought a new generation of media). It was not until 1967 that, together with Andries van Dam, he designed the Hypertext Editing System [HES] (Barnet, 2010). When Ted Nelson used the term "hypertext" in 1965, his intention was to generate new ways of organising a text so that the reader could approach it in whatever sequence they wished, as opposed to merely following the sequence assigned by the author. ...
Book
Full-text available
Cartografías para navegar Narrativas Ergódicas: Notas [hacia] el análisis del discurso de los videojuegos desde los Estudios Visuales is part of an ongoing research project intended as an invitation to develop methodologies and tools for the discourse analysis of video games. In the following sections we offer various conceptual attempts that contribute to an approach to video game discourse analysis, understanding video games as manifestations of contemporary visual culture. With this essay we aspire to generate materials that are accessible and beneficial to students interested in the analysis of video games and their narratives, contributing to the generation of knowledge through a critical approach to these types of text from the transdiscipline of Visual Studies and discourse analysis. We intend this essay to be one of many steps towards the generation of more and better theories, instruments and cartographies for the discourse analysis of video games. Romano Ponce Díaz, Iván Ávila González (co-author). ISBN 978-607-99587-2-5
... Interactivity is a communication loop, fundamentally bidirectional, which, unlike interaction, is generally mediated by an electronic artefact. This electronic artefact is what has brought about the enormous terminological confusion of attributing interactivity to the human-machine relationship, when in fact interactivity refers to mediated human-to-human interaction (Bush, 1945; Ponce-Díaz, 2018b; Wiener, 1965). ...
... In 1945 Vannevar Bush conceived the original idea of hypertext and described it in an article on the MEMEX device (Bush, 1945), but it was Ted Nelson who coined the term and created Xanadu; it was not until 1967 that, together with Andries van Dam, he designed the Hypertext Editing System [HES] (Barnet, 2010). When Ted Nelson used the term "hypertext" in 1965, his intention was to generate new ways of organising a text so that the reader could approach it in whatever sequence they wished, as opposed to merely following the sequence assigned by the author. ...
Chapter
Full-text available
The expansion and mass adoption of digital information technologies has brought about the emergence of particular lexical constructions, especially in communities of video game enthusiasts and fans. Events such as the COVID-19 pandemic have accelerated the transcoding of these lexical constructions and their adoption by broader communities. This transcoding has meant that, when studying video games, terms such as interactivity, interaction and feedback are used as synonyms. Using the theoretical framework of Visual Studies, we revisit the studies of Elena Fernández de Molina Ortés, Espen J. Aarseth and Jerónimo Alayón Gómez, in order to examine the concept of interactivity against that of cybernetics and to contribute to a more informed lexicon in studies of video game discourse, ergodic literature, cyberliterature and hypertext.
... Interactivity is a communication loop, fundamentally bidirectional, which, unlike interaction, is generally mediated by an electronic artefact. This electronic artefact is what has brought about the enormous terminological confusion of attributing interactivity to the human-machine relationship, when in fact interactivity refers to mediated human-to-human interaction (Bush, 1945; Ponce-Díaz, 2018b; Wiener, 1965). ...
... In 1945 Vannevar Bush conceived the original idea of hypertext and described it in an article on the MEMEX device (Bush, 1945), but it was Ted Nelson who coined the term and created Xanadu; it was not until 1967 that, together with Andries van Dam, he designed the Hypertext Editing System [HES] (Barnet, 2010). When Ted Nelson used the term "hypertext" in 1965, his intention was to generate new ways of organising a text so that the reader could approach it in whatever sequence they wished, as opposed to merely following the sequence assigned by the author. ...
Book
Full-text available
In the 1990s there was, at least at the theoretical level, an important shift in the conception of the lexical component in second-language teaching. Bogaards, among others, broke with the traditional distinction between teaching grammar and teaching vocabulary by pointing out that the two components are interdependent and that language should be viewed as a set of lexical items that require syntactic structures. That is, the focus was placed on vocabulary as the core of teaching and learning in second languages. This conception, in my opinion, should also serve as a starting point in the teaching of the mother tongue, though, of course, adapted to its own peculiarities and to its differences from foreign-language teaching. In both cases, students have been left to learn vocabulary on their own, while we teachers devote ourselves to teaching them grammatical rules and other aspects of the various linguistic levels. I do not mean to claim that vocabulary has been absent from textbooks and classrooms, but it has not been present in a systematised, organised way. And when lack of time forces a selection of exercises or contents, it is the lexical level that gets postponed. As I understand vocabulary teaching, two questions are key. On the one hand, what vocabulary should be taught? This question leads us inevitably to the concept of lexical selection: selection by educational stage, by subject. It also leads to another question: what vocabulary should our students know at the end of a given educational stage? Answering these questions is essential for teachers to become involved in their students' learning and to be able to plan their teaching accordingly. The answers are never simple, and this case is no exception, but it is worth trying.
The other big question we must ask ourselves is: how should vocabulary be taught? More progress has been made in answering this question. For some time now, books devoted to teaching the Spanish language have included vocabulary exercises but, as I noted above, these exercises are not organised around a final objective. Moreover, a fundamental issue is often forgotten: lexical competence is not only knowing words but also, and mainly, knowing how to use them. Vocabulary teaching must also involve all teachers in a coordinated way; it is not the exclusive task of language teachers. Increasingly missing are practice, exercises in which students must use the words they have learned, and the writing of texts. It is also forgotten that, for a word to pass into permanent memory, it needs, depending on the case, between 6 and 10 exposures on average. That is, it is not enough for a student to read a word and note down its meaning. The declining use of dictionaries in classrooms is also a concern, even though, with the technological means we have today, their use is ever easier. Dictionaries, good dictionaries, are an inexhaustible source of information about words. They not only tell us their meaning but also their morphological characteristics and combinatory possibilities, as well as providing examples of use. We must teach students to use dictionaries, and not only in their decoding function (to understand a text) but also in their encoding function (to construct texts). Vocabulary must occupy a central place in language teaching and, for that reason, books like the one this presentation introduces deserve to be properly valued.
Divided into four main sections, Léxico y disponibilidad léxica, Léxico y estudios del discurso, Léxico, comunicación y redes sociales, and Léxico y didáctica de la lengua y la literatura, it addresses different topics from different approaches, which makes it all the more valuable. José Antonio Bartol Hernández, UNIVERSIDAD DE SALAMANCA, ESPAÑA
... Recommender systems are embedded in Information Retrieval (IR) systems, whose main goal is "to provide, to its user, access to information". In 1945, Vannevar Bush, the Director of the US Office of Scientific Research and Development during World War II, published the famous article "As We May Think" (Bush, 1945). The text shed light on the problem of what he called a "growing mountain of research" (Bush, 1945, p. 112). ...
... To solve the issue of informational explosion, Bush (1945) recommended using the (then incipient) information technologies. ...
Thesis
Full-text available
Scholarly communication is increasingly being mediated by Academic Social Media (ASM) platforms, which combine the functions of a scientific repository with social media features such as personal profiles, followers and comments. In ASM, algorithmic mediation is responsible for filtering the content and distributing it in personalised individual feeds and recommendations according to inferred relevance to users. However, if communication among researchers is intertwined with these platforms, in what ways may the recommendation algorithms in ASM shape scholarly communication? Scientific literature has been investigating how content is mediated in data-driven environments ranging from social media platforms to specific apps, whereas algorithmic mediation in scientific environments remains neglected. This thesis starts from the premise that ASM platforms are sociocultural artefacts embedded in a mutually shaping relationship with research practices and economic, political and social arrangements. Therefore, implications of algorithmic mediation can be studied through the artefact itself, people's practices and the social/political/economic arrangements that affect and are affected by such interactions. Most studies on ASM focus on one of these elements at a time, either examining design elements or the users' behaviour on and perceptions about such platforms. In this thesis, a multifaceted approach is taken to analyse the artefact as well as the practices and arrangements traversed by algorithmic mediation. Chapter 1 reviews the literature about ASM platforms, and explains the history of algorithmic recommendations, starting from the first Information Retrieval systems to current Recommender Systems, highlighting the use of different data sources and techniques. The chapter also presents the mediation framework and how it applies to ASM platforms, before outlining the thesis. The rest of the thesis is divided into two parts.
Part I focuses on how recommender systems in ASM shape what users can see and how users interact with and through the platform. Part II investigates how, in turn, researchers make sense of their online interactions within ASM. The end of Chapter 1 shows the methodological choices for each following chapter. Part I presents a case study of one of the most popular ASM platforms, in which a walkthrough method was conducted in four steps (interface analysis, web code inspection, patent analysis and company inquiry using the General Data Protection Regulation (GDPR)). In Chapter 2 it is shown that almost all the content in ASM platforms is algorithmically mediated through mechanisms of profiling, information selection and commodification. It is also discussed how the company avoids explaining the workings of recommender systems and the mutually shaping characteristic of ASM platforms. Chapter 3 explores the distortions and biases that ASM platforms can uphold. Results show how profiling, datafication and prioritization have the potential to foster homogeneity bias, discrimination, the Matthew effect of cumulative advantage in science and other distortions. Part II consists of two empirical studies involving participants from different countries in interviews (n=11) and a research game (n=13). Chapter 4 presents the interviews combined with the show and tell technique. The results show the participants' perceptions of ASM affordances, which revolve around six main themes: (1) getting access to relevant content; (2) reaching out to other scholars; (3) algorithmic impact on exposure to content; (4) to see and to be seen; (5) blurred boundaries of potential ethical or legal infringements, and (6) the more I give, the more I get. We argue that algorithmic mediation not only constructs a narration of the self, but also a narration of the relevant other in ASM platforms, configuring an image of the relevant other that is both participatory and productive.
Chapter 5 presents the design process of a research game and the results of the empirical sessions, where participants were observed while playing the game. There are two outcomes for the study. First, the human values researchers relate to algorithmic features in ASM, the most prominent being stimulation, universalism and self-direction. Second, the role of the researcher's approach (collaborative, competitive or ambivalent) in academic tasks, showing the consequential choices people make regarding algorithmic features and the motivations behind those choices. The results led to four archetypal profiles: (1) the collaborative reader; (2) the competitive writer; (3) the collaborative disseminator; and (4) the ambivalent evaluator. The final chapter summarises the ways in which ASM platforms forge people's perceptions and the strategies people employ to use the systems in benefit of their careers, answering each research question. Chapter 6 discusses the implications of algorithmic mediation for scholarly communication and science in general. The dissertation ends with reflections on human agency in data-driven environments, the role of algorithmic inferences in science and the challenge of reconciling individual users' needs with broader goals of the scientific community. By doing so, the contribution of this thesis is twofold: (1) providing in-depth knowledge about the ASM artefact, and (2) unfolding different aspects of the human perspective in dealing with algorithmic mediation in ASM. Both perspectives are discussed in light of social arrangements that are mutually shaped by artefact and practices.
... For Saracevic (1996, p. 42), Bush (1945), as an MIT scientist in the midst of World War II, not only identified the problem of the "informational explosion" but also its possible solution through the use of "information technologies", setting the stage for the emergence of Information Science (IS) in the 1950s. Mooers (1951) pointed out one of the paths of IS, which he would name Information Retrieval (IR), through his Zatocoding prototype. ...
Article
Full-text available
The general objective of this research was to analyse whether there is a characteristic temporal variation in the distribution of relevance values of terms over the production time of texts that could serve as a criterion for their automatic indexing. The doctoral theses of the graduate programmes (PPGs) in the Humanities at UFMG were analysed, covering 7 distinct PPGs, each treated as a corpus, with a total of 929 theses defended over a 12-year period, from 2007 to 2018. The terms considered were all noun phrases contained in the texts of the theses themselves. Each noun phrase received a value associated with its relevance as a descriptor, according to the frequency of the term in the thesis itself (TF, Term Frequency) and the inverse of the term's frequency of occurrence across all theses of each PPG (IDF, Inverse Document Frequency). The theses of each PPG were divided into 12 groups to compute the mean defence date and the mean consolidated relevance score of the relevant terms. As a result, the characteristic behaviour of each PPG was identified through a scatter plot of the mean relevance score over time. For each of the 7 PPGs' plots, a trend line was added, considering its respective R², and a specific analysis was carried out. All temporal distribution behaviours were characterised in equations and can be applied as a criterion for automatic indexing.
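The TF-IDF scoring described in this abstract can be sketched in a few lines of Python. This uses the common raw-count TF times log-IDF variant for illustration; the thesis's exact weighting, noun-phrase extraction, and grouping procedure may differ.

```python
import math
from collections import Counter

def tf_idf(term: str, doc_terms: list, corpus: list) -> float:
    """Score a term for one document: raw term frequency in the document
    times the log inverse document frequency across the corpus."""
    tf = Counter(doc_terms)[term]
    df = sum(1 for doc in corpus if term in doc)      # document frequency
    idf = math.log(len(corpus) / df) if df else 0.0   # inverse document frequency
    return tf * idf

# Toy corpus standing in for noun phrases extracted from theses.
corpus = [
    ["automatic", "indexing", "thesis"],
    ["indexing", "noun", "phrases"],
    ["temporal", "trend", "thesis"],
]
score = tf_idf("indexing", corpus[0], corpus)  # tf=1, idf=log(3/2)
```

Terms occurring in every document get an IDF of zero, which is what makes TF-IDF favour terms that discriminate one thesis from the rest of its PPG's corpus.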
... As Bush envisioned it, memex, a portmanteau of memory and expansion, is a 'future device…in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory' (Bush, 1945). ...
Article
This article explores collaborative conversation as a method to surface multiple perspectives on community engagement and forms of knowledge creation in the Congruence Engine project. Our exchanges naturally converged around four main areas: the multiple meanings of the term ‘community’ and the nature of these relationships; the modes and spaces for engagement; the different nature of knowledge emerging from these interactions; and, finally, a series of practical issues and challenges that can act as potential barriers. The article also reflects on the opportunities of dialogic writing to enable participatory, inclusive and polyvocal approaches in the development of a national collection.
... Progressive arrangements such as these in Michigan (and those reported by Chelin (2015) in the UK) offer the promise of creating levels of understanding between practitioners that allow for discovery and invention of new and better ways to bring readers to information. Bush (1945) quite rightly observed that the human mind does not work as a serial problem-solution stop-go mechanism. Rather, 'it operates by association. ...
Chapter
Full-text available
PLEASE ONLY CITE FROM THE ORIGINAL Introduction This chapter focuses upon a problem with public libraries that only an academic library can fix. This problem originates in a view of public library collection development as simply a response to user demand. Where this becomes the prevailing wisdom, as it has in the United Kingdom and Australia, the result is often that quality materials, such as publications by university presses, are shunned by selectors. For readers who would ordinarily choose this material the remedies to such a state of affairs are few and far between. One remedy is for the disenfranchised reader of quality materials to use the services of an academic library. While this is certainly an option for readers who live close to an academic library, and in much of the developed world it seems that academic libraries do open their doors to these community borrowers, questions arise as to why the academic library should take on the responsibility for catering to these community borrowers. What has occurred in our conception of the public library that quality materials are, largely, no longer acquired? In this chapter, a sociology of knowledge perspective is used to look for explanations of this phenomenon, but more so to provide a way forward based on the understanding that in many cases 'the horse has bolted' with regard to public library collection quality and what remains is to model a new social future for public-academic library partnership that can bring fluidity to citizens' conceptions of what 'my library' means in order to ensure that important works are available to all, and not only the select few. The explanation that seems to best fit the problem is that public librarians take this course in support of an ideological commitment to reading. Within their professional context, reading any material is preferable to a potential user not reading.
In terms of the social practice, what seems clear here is how public librarians have appropriated 'routines and interpretations' relating to democratic access to resources and have reinterpreted and reinvented routines as a form of knowledge that they 'feed back into the field of action': discourses on reading and civic participation, epistemology and the like (Reichertz 2013, p. 2 4). It is important to not mistake the argument made here as one linked to traditional literary questions of quality or even of scientific truth. What is sought is an understanding of how public librarians can hold, and hold deeply, the dichotomous view that they are serving users best when they refrain from guiding their reading at all. Advocated here is a means by which unhelpful polemics that locate materials selection in class-based narratives might be transcended (but not forgotten). I take the view that strong advocates for public libraries as ineluctably focused on quality materials (for example, Usherwood 2007) need to be better understood by practitioner communities. Critiques of Usherwood (and his ilk), such as that made by Pateman and Williment (2013) in their 'community-led model', which also, laudably, focuses on delivering quality materials for all users, have too little to say, I believe, on deeper questions of truth and associated epistemic concerns relating to collections. Shared concerns for quality materials can, it is hoped, provide an opportunity for us to question just how deep this sense of incommensurability in (public library) selection really does run. I believe a further synergy is due for development that can help to reconcile these views and that it cannot be achieved without considering the social role of the academic library in communities as part of broader strategies for the sector.
In order to better understand how it is that public libraries have established a less than fulsome relationship with the knowledge concept it is helpful to approach the problem from outside the limited perspective that librarianship can offer. Reichertz's (2013) explication of a hermeneutic sociology of knowledge is a helpful starting point to explain the problem of knowledge that librarians seem to so often miss in managing information. Through an extended focus on engaging with this problem as a sociological one, and through engaging with practitioners and the historical narrative of librarianship, it becomes possible to purposively reflect on what responsibilities academic libraries might have for citizen readers left without resources by a circulation-focused public library model. In this chapter a range of factors that underpin why communities should consider new library models that are better able to contribute to the development of social capital are discussed. These include how greater equity can be fostered in delivering quality reading materials and working with cultures of reading that are substantially more complex than is often assumed. I then look to how integrated models of the public and academic library can offer the opportunity to deliver to communities whole-of-life solutions to their information needs and how this helps to maximise the growth of social and cultural capital. Finally, I seek to articulate how the development of a shared future for communities-and all of their libraries-can help users not only gain, but maintain, access to knowledge.
... Early efforts at digital documentation, at creating links among documents, and at digital archiving (in short, at conceptualizing digital memory) date back to 1945, when Bush (1945) introduced the concept of the Memex (Memory Index). He asked readers to: ...
Article
Digital technologies have transformed the conventions of preserving, recalling, and forgetting the past as they provide new digital tools and platforms to remember, to forget and to collect data for individuals, societies, and corporations. With the convergence of new media, memory gains a global aspect along with its personal and local characteristics and turns into the digitally mediated memory. These technologies enable digital memory to be indexed, archived, circulated, and processed infinitely in cyberspace. Therefore, the advancements in the Web and cloud computing technologies yield new dimensions for memory studies to be discussed from a political economy perspective since digitally mediated memory has some economic, political, societal, and cultural impacts on societies. This study conceptually scrutinizes the commodification processes of digital memory and analyzes its material and immaterial bases from a political economy perspective, and claims that they are fundamentally interwoven. The rare earths which are used to produce technological devices are considered as the material basis. Additionally, major technology corporations using these rare earths, and their data centers are taken as the extensions of its materiality. Digitally archived, managed, and retrieved memory is considered as data, which represent immaterial basis of digital memory. The materiality and immateriality of digital memory are not regarded as independent from the inherent power relations and ideologies of the current data economy. Thus, this study aims to discuss digital memory from a political economy perspective to reveal the flow between its materiality and immateriality and the inherent power relations in the data economy. It also poses the potential challenges, risks, and outcomes we may encounter in such an economic system.
... In the 1840s, Ada Lovelace envisioned artificial systems based on Babbage's machines assisting humans in musical composition [2] [3]. In the 1940s, Vannevar Bush envisioned the Memex and discussed how employing associative linking could enhance a human's ability to store and retrieve information [4]. The Memex made the human more efficient but did not actually do any of the thinking on its own. ...
Preprint
We will soon be surrounded by artificial systems capable of cognitive performance rivaling or exceeding a human expert in specific domains of discourse. However, these cogs need not be capable of full general artificial intelligence nor able to function in a stand-alone manner. Instead, cogs and humans will work together in collaboration each compensating for the weaknesses of the other and together achieve synthetic expertise as an ensemble. This paper reviews the nature of expertise, the Expertise Level to describe the skills required of an expert, and knowledge stores required by an expert. By collaboration, cogs augment human cognitive ability in a human/cog ensemble. This paper introduces six Levels of Cognitive Augmentation to describe the balance of cognitive processing in the human/cog ensemble. Because these cogs will be available to the mass market via common devices and inexpensive applications, they will lead to the Democratization of Expertise and a new cognitive systems era promising to change how we live, work, and play. The future will belong to those best able to communicate, coordinate, and collaborate with cognitive systems.
... However, ideas like this were a century before their time. In the 1940s, Vannevar Bush envisioned a system called the Memex and discussed how employing associative linking could enhance a human's ability to store and retrieve information [4]. Similar to the abovementioned calculating devices, the Memex made the human more efficient but did not actually do any of the thinking on its own. ...
Preprint
We are entering an era in which humans will increasingly work in partnership and collaboration with artificially intelligent entities. For millennia, tools have augmented human physical and mental performance but in the coming era of cognitive systems, human cognitive performance will be augmented. We are only just now beginning to define the fundamental concepts and metrics to describe, characterize, and measure augmented and collaborative cognition. In this paper, the results of a cognitive augmentation experiment are discussed and we calculate the increase in cognitive accuracy and cognitive precision. In the case study, cognitively augmented problem solvers show an increase of 74% in cognitive accuracy (the ability to synthesize desired answers) and a 27% increase in cognitive precision (the ability to synthesize only desired answers). We offer a formal treatment of the case study results and propose cognitive accuracy and cognitive precision as standard metrics to describe and measure human cognitive augmentation.
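Read as measures over sets of answers, the two proposed metrics can be sketched as follows. Treating cognitive accuracy as recall (desired answers synthesized) and cognitive precision as precision (synthesized answers that were desired) is my interpretation of the definitions quoted above, not the paper's formal treatment.

```python
def cognitive_accuracy(produced, desired):
    """Fraction of the desired answers the problem solver synthesized
    (the ability to synthesize desired answers; analogous to recall)."""
    produced, desired = set(produced), set(desired)
    return len(produced & desired) / len(desired) if desired else 0.0

def cognitive_precision(produced, desired):
    """Fraction of synthesized answers that were desired
    (the ability to synthesize only desired answers; analogous to precision)."""
    produced, desired = set(produced), set(desired)
    return len(produced & desired) / len(produced) if produced else 0.0
```

Under this reading, an augmented solver who produces more of the desired answers while emitting fewer spurious ones improves on both metrics at once.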
... Depending on how one wants to define the field, the earliest publication of information science is As We May Think by Bush (1945). When comparing entrepreneurship research and information systems science to physics (see Table 4-1), the research methodologies used for a type of science such as physics may not be the best for a study at the crossroads of entrepreneurship and information systems, as is the case for this research. ...
Thesis
Start-ups have gained media attention since Google, Facebook and Amazon were launched in the 1990s. The book Lean Start-up, published in 2011, was another important milestone for digital start-up literature. As unicorn companies emerge around the world, topics highlighted in the news include the vast amount of capital that digital start-ups are raising, the ways in which these digital ventures are disrupting industries, and their global impact on the digital economy. However, digital start-ups, digital venture ideas, and their venture creation process lack a unified venture creation model, as there is a gap in the research on entrepreneurial processes in a digital context. This research is an explorative study of the venture creation process of innovative digital start-ups that examines what is missing from entrepreneurial process models in a digital technology context and investigates how early stage digital start-ups conduct the venture creation process, starting with the pre-phase of antecedents and ending with the launch and scaling of the venture. The research proposes a novel process model of innovative digital start-up venture creation and describes the nature and patterns of the process. A conceptual model was developed based on the entrepreneurship, information systems, and digital innovation literature and empirically assessed with a multi-method qualitative research design. The data collected from semi-structured interviews, internet sources, and observation field notes covered 34 innovative digital start-ups and their founders. Interviews were conducted internationally in high-ranking start-up ecosystems, and the data were analysed with thematic analysis and fact-checked by triangulating internet data sources.
The contribution to entrepreneurship theory is a new illustrative model of the venture creation process of innovative digital start-ups, including the emergent outcome of the process having a digital artefact at its core (e.g., mobile apps, web-based solutions, digital platforms, software solutions, and digital ecosystems). Digital platforms and their multiple roles in the process are presented, as well as the role of critical events as moderators of the process which trigger new development cycles. During the venture creation process, the recombining of digital technologies, modules, and components enabled by digital infrastructures, platforms, and ecosystem partners represents digital technology affordances. This recombination provides opportunities for asset-free development of digital venture ideas.
... Partly envisioned by Vannevar Bush in his prescient 1945 Atlantic article, "As We May Think," components of a digital infrastructure for scholarship slowly emerged in the decades following World War II [37]. Development accelerated with the founding of the World Wide Web in the early 1990s and received a major boost in the USA in 1994 with the national Digital Libraries Initiative, jointly funded by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency, and the National Aeronautics and Space Administration. ...
Article
Full-text available
This article advances the thesis that three decades of investments by national and international funders, combined with those of scholars, technologists, librarians, archivists, and their institutions, have resulted in a digital infrastructure in the humanities that is now capable of supporting end-to-end research workflows. The article refers to key developments in the epigraphy and paleography of the premodern period. It draws primarily on work in classical studies but also highlights related work in the adjacent disciplines of Egyptology, ancient Near East studies, and medieval studies. The argument makes a case that much has been achieved but it does not declare “mission accomplished.” The capabilities of the infrastructure remain unevenly distributed within and across disciplines, institutions, and regions. Moreover, the components, including the links between steps in the workflow, are generally far from user-friendly and seamless in operation. Because further refinements and additional capacities are still much needed, the article concludes with a discussion of key priorities for future work.
... The technological era we are living in probably had its beginnings marked by the ideas and events of the last century. Vannevar Bush (1945) already foresaw what the online environment would make possible in his conception of a device called the Memex, short for Memory Extension, which would serve to aid memory and store knowledge. Bush imagined and described, in detail, a machine capable of storing mountains of information, easily and quickly accessible. ...
... The emergence of electronic texts harks back to the final period of the Second World War and its aftermath. In an article entitled "As we may think", Vannevar Bush (1945), an American engineer and politician, observed that communication had become faster in the post-war period, increasing the number of scientific works and allowing knowledge to expand, a phenomenon similar to that which followed the emergence of the movable-type press. However, the question arose of a viable way to provide systematic access to such data. ...
... One of the great riddles of online education, and online behaviour in general, is the development of vibrant knowledge-building collectives. The development of joint agency through electronic connections was always one of the great promises of the Internet (Bush, 1945), from early communities like the Whole Earth Lectronic Link (WELL) and early open-source coding communities, such as Linux and Apache. There have been multiple attempts to study emergent online communities in educational contexts through frameworks including classroom community (Rovai, 2002), communities of inquiry (Garrison et al., 2010), connectivism (Clarà & Barberà, 2013; Siemens, 2005), and knowledge forums (Lei & Chan, 2018; Scardamalia, 2004). ...
Article
Full-text available
This paper investigates how psychological needs spurring self-determined motivation relate to collective efficacy for flourishing in online learning communities. Self-determination theory posits individuals experience intrinsic motivation to flourish at educational tasks because of targeted satisfaction of the three psychological needs: autonomy, relatedness, and competence. However, studies conducted to investigate collective, technology-assisted learning processes suggest competence and relatedness may play a pivotal role in online community engagement and knowledge-sharing. Moreover, informal gaming experiences may mirror the collaborative skills needed in online educational/professional communities. These insights suggest confidence in one’s abilities to contribute to a community, the perception of a strong, supportive social culture in the online classroom, and informal online experiences may lead to self-determined motivation enabling agents in distributed, technology-assisted classrooms to collectively flourish. Little work has been done to examine effects of need satisfaction on collective efficacy in using online technologies. To fill this research gap, we used structural equation modelling to investigate perceptions of 636 undergraduate students enrolled in classes within an education department at a midwestern university employing weekly asynchronous blogging. Our results suggest students’ experience with multiplayer gaming, and need satisfaction towards competence and relatedness correlate with higher collective efficacy in technology-assisted classrooms employing discussion forums.
... The idea of using devices to record our environment, or everything in our daily lives, goes back as far as the Memex in 1945: "a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility" (Bush, 1945; Gemmell et al., 2002). What is changing is the portability of these systems and the computer memory available. ...
Thesis
Full-text available
Abstract: When we forget things, we feel anxious, which can negatively impact our day. Some individuals believe they are forgetful so emphatically that it disrupts their day. There has been little discussion of perceived forgetfulness in design and HCI, and few smart objects have been studied as memory aids. However, research on embedded systems, radio frequency identification (RFID) and HCI provides inspiration towards creating a solution. The challenges of creating a day-to-day smart object that can enhance a user's lifestyle are explored, and recommended design guidelines for creating a smart object in a specific domain are the focus of this thesis. Using an experience-centred approach, 'Message Bag' and 'Tag Along' are two purpose-built object-based memory aids that emerged from investigating the design processes for smart objects. The work examines smart objects in the context of forgetting what items to pack in a bag. The solution presented is a device consisting of an RFID system involving (a) pre-tagging essential items; (b) scanning those tagged items; and (c) viewing a corresponding light illuminate, to communicate to the user. Although the conceptual model is simple, success depends on a combination of technical design, usability and aesthetics. These scanning interactions result in a person feeling more confident, as suggested through autoethnographic reporting and real-world, third-person engagements: single-user walkouts, conference demos, professional critiques, and residential weekends with potential users (focus group studies). My work involved extensive autobiographical research and design-led enquiries. Testing was undertaken with investigative prototypes, followed by field testing of high-fidelity prototypes. This involved an in-the-wild comparative study involving six users over several months.
Results show that people feel more confident and respondents claim no longer needing to continually check items are packed, thus ‘gaining time’, and feeling less forgetful. Although the application of RFID is not new to ubiquitous computing, this implementation, styling and system immediacy is novel. This thesis presents the development of ten prototypes as well as design guidelines. The research provides a solid base for further exploration, and includes discovery of the importance of a user’s style universe and extreme ease-of-use. I conclude with the presentation of early positive results including; (i) the unique form factor becomes a reminder itself and; (ii) usability coupled with the intuitive nature of the system is shown to be essential. We found that when you are creating a smart object, usability and an intuitive nature is even more important than in a standard system. When dealing within the domain of forgetfulness, this is paramount.
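The three-step conceptual model described above (pre-tag essential items, scan the bag, light up per item) amounts to a membership check of scanned tags against a required list. A minimal sketch, with hypothetical function name and tag identifiers:

```python
def packing_status(required_tags, scanned_tags):
    """Map each required RFID tag ID to True (scanned, i.e. packed)
    or False (missing), mirroring the per-item indicator lights.

    Tag IDs and the dict-based status output are illustrative; a real
    device would read tag IDs from an RFID reader in range of the bag.
    """
    scanned = set(scanned_tags)
    return {tag: tag in scanned for tag in required_tags}
```

A missing item (False) would leave its light off, prompting the user before leaving, which is the confidence-building interaction the thesis reports.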
... In this instance, the relationship reflects a physical and psychological interaction between the human body and the NIV machine to augment the user's residual function [54]. While symbiosis usually refers to living organisms, the concept of symbiosis between human and machine has been evident for over half a century [55] and is usually an obligate relationship, that is, one in which humans depend on the machine for survival. To date, only a few studies have described the relationship developed between people with a chronic illness and the machines that keep them alive. ...
Article
Purpose: Neuromuscular disorders (NMD) encompass a wide range of conditions, with respiratory weakness a common feature. Respiratory care can involve non-invasive ventilation (NIV), resulting in fewer hospital admissions, a lower mortality rate and improved quality of life. The aim of this study was to explore the 'lived experience' of NIV by people with NMD. Methods: Interpretive Phenomenological Analysis (IPA) with semi-structured, face-to-face interviews with 11 people with NMD who had used bi-level positive airway pressure for NIV for more than 12 months. Results: Three themes were interpreted: (i) Alive, with a life; (ii) Me and 'that' machine; and (iii) Precariousness of this life. NIV enabled hope, independence and the opportunity to explore previously unattainable life experiences. Yet participants felt dependent on the machine. Furthermore, practical considerations and fear of NIV failure created a sense of precariousness to life and a reframing of personal identity. Conclusion: The findings highlight the broad-ranging positive and negative effects that may occur for people with NMD when using this important therapy. Ongoing non-judgemental support and empathy are required from health professionals, as the use of NIV challenged concepts such as 'living life well' for people with NMD. Implications for rehabilitation: Neuromuscular disorders may result in respiratory weakness requiring non-invasive ventilation (NIV). When prescribed early, NIV can result in fewer hospital admissions, a lower mortality rate and improved quality of life. The relationship of people with NMD with their NIV machine is complex; it impacts on, and requires adjustment to, their identity. NIV users acknowledged that NIV provided hope but simultaneously recognised the precariousness it brings to their life. In order to better support people with NMD, healthcare professionals need to better understand how the physical, psychological and social implications of NIV affect an individual's life.
... While the idea of leveraging stigmergic markers for collective sense-making has a long history [2], most contemporary open-source and decentralization efforts have focused on sematectonic (content-creating) stigmergy, such as code, social media, financial ledgers and executable contracts [4]. Closest to our proposal is the Solid ecosystem [18], which similarly targets "re-decentralizing the web" [20] and empowers individuals to control their data. ...
Preprint
Full-text available
The web has become a dominant epistemic environment, influencing people's beliefs at a global scale. However, online epistemic environments are increasingly polluted, impairing societies' ability to coordinate effectively in the face of global crises. We argue that centralized platforms are a main source of epistemic pollution, and that healthier environments require redesigning how we collectively govern attention. Inspired by decentralization and open source software movements, we propose Open Source Attention, a socio-technical framework for "freeing" human attention from control by platforms, through a decentralized eco-system for creating, storing and querying stigmergic markers: the digital traces of human attention.
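As a rough illustration only, a store for such stigmergic markers might expose a create/query interface along these lines. The (URI, agent, weight, timestamp) record and the sum-of-weights aggregation are my assumptions, not the framework's actual design, and a real system would be decentralized rather than a single in-memory store.

```python
import time
from collections import defaultdict

class MarkerStore:
    """Minimal in-memory sketch of a store for stigmergic markers:
    digital traces of attention that agents leave on web resources."""

    def __init__(self):
        # Maps a resource URI to the list of markers left on it.
        self._markers = defaultdict(list)

    def leave_marker(self, uri, agent, weight=1.0, timestamp=None):
        """Record that `agent` attended to `uri` with a given intensity."""
        self._markers[uri].append({
            "agent": agent,
            "weight": weight,
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def attention(self, uri):
        """Aggregate attention left on a resource (sum of marker weights)."""
        return sum(m["weight"] for m in self._markers[uri])
```

Querying aggregated markers, rather than a platform's opaque ranking, is what would let communities steer attention collectively.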
... Later, in 1945, the work of Vannevar Bush represented an evolutionary milestone for IM. Bush (1945) published an article entitled "As we may think", in which he conceived a machine capable of storing and organizing all of humanity's information (BARBOSA, 2008, p. 6). ...
Article
Full-text available
Information Management (IM) is a necessity in several organizational contexts, and this requirement extends to processes and their respective information flows. However, the literature consulted shows no theoretical or practical consensus on what IM is and what its characteristics are, which justifies the present study. This article aims to survey the theoretical and conceptual aspects of IM, proposing a current and consensual definition for the term. Methodologically, a bibliographic survey of the principles and foundations of IM was first carried out. Then, based on Dahlberg's Theory of Concept and on the UNISIST indexing principles, a current and consensual definition for IM was proposed. The following question was answered: considering the theoretical and conceptual panorama, what are the essential characteristics for proposing a definition of the term IM? As a result, it was found that IM is related to information life-cycle management, information flows, organizational management, and information and communication technology. It is hoped that the advances provided by this study shed light and are used by professionals and the academic community, whether in refining theory or in organizational practice related to IM. Keywords: Information Management; Information Resource Management; Concept Theory; Indexing Principles
Chapter
Full-text available
This chapter delineates the scope and core ideas of the book. The Digital Turn had promised to bring neutrality, fairness and accuracy to research; it created the illusion that it was possible to incorporate technology in knowledge creation practices whilst still operating within the traditional model of disciplines separation. But by adding further complexity to reality and by worsening existing inequalities in the world, the digital transformation has increasingly exposed how illusory such promises were. More recently, the dramatic increase in technology adoption brought about by the COVID-19 pandemic has conclusively established the inadequacy of our current model of knowledge. In this chapter, I examine how the rigid division into compartmentalised, competing disciplines has contributed to exalting computational methods as neutral whilst stigmatising consciousness and criticality as carriers of injustice. Taking the humanities as a focal point, I retrace schisms between the humanities, the digital humanities and critical digital humanities; these are embedded, I argue, within the old dichotomy of science versus humanities. In moving beyond the current static framework, I argue for a new model of knowledge creation.
Article
Full-text available
In an increasingly connected society, the production of digital content on social networks, blogs and other platforms has shown exponential growth in informational output. This study presents how Search Engine Optimization (SEO) techniques can contribute to the ranking process in search engines, and aims to identify which SEO techniques should be applied to obtain better relevance in search engines. Given the lack of dedicated research connecting these topics, a literature review was used as the methodology, with the purpose of contextualizing the current research landscape across both themes. It was observed that all the selected studies applied SEO techniques in digital information environments. It is concluded that SEO techniques, when applied to digital information environments, can bring several benefits to them, such as improving visibility and search-engine ranking and making these environments increasingly relevant in search engines, thus facilitating and contributing to information retrieval on the Web.
Thesis
Full-text available
This thesis is based on the inquiry of how we could quantify the information behavior patterns through the logs collected while people search and seek for the information. At the same time, it addresses the socio-cognitive characteristics of the user and explores how those characteristics could be predicted through the computer analysis of the logs collected while search tasks are done. It is based on the premise that if we could quantify information behavior patterns by computer, the output of such a process could be used as feedback for the information systems that are adaptive to the user's needs. Log analysis as a method in information science and particularly in the information behavior domain is rarely used. Most of the existing literature is based on qualitative research or quantitative methods such as surveys. And as we live in the times of ultralarge systems that already have some of those techniques implemented in them but closed in a black box, we must understand those techniques. The main challenge of this thesis approach is how to quantify behavior based on the logs collected. We could understand that logs collected are one of the rare data which could be collected and then processed by the computer to describe behavior. In particular, it is important as some theories told us that most human behavior is not based on rationality but is influenced by the environment. To simulate the environment of the user's day-to-day information searching process the laboratory experiment was chosen. Users were first given the survey that was used to describe their socio-cognitive characteristics from the perspective of information need and information behavior. After filling the survey, participants had to fulfill four information searching tasks in the domain of music, and logs from those searches were collected. 
The biggest challenge was that, when the data were analyzed by a human, there was a clear difference between a user's socio-cognitive characteristics and the logs collected in the process of information search. But if adaptive information systems are to be designed, a computer should perform this process and use its output as feedback to adapt to the user. To do so, an algorithm was developed that transforms absolute numbers into relative ones. Only then could the computer process the data and evaluate the prediction in terms of recognizing users' characteristics. As a result of this thesis, some user characteristics could be recognized through the logs, and the algorithm produced measurable results. As such, this thesis provides one step towards a better understanding of how users use information systems and how those systems could be designed to be more adaptive and human-friendly.
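The abstract above mentions an algorithm that transforms absolute log numbers into relative ones so that users with different activity volumes become comparable; the thesis's exact method is not given, but a minimal normalization sketch (function and action names are hypothetical, not taken from the thesis) might look like this:

```python
def to_relative(counts):
    """Convert absolute per-action log counts into relative frequencies.

    Each user's action counts are divided by that user's total, so a
    heavy user and a light user with the same behaviour profile map to
    the same relative representation.
    """
    total = sum(counts.values())
    if total == 0:
        # No activity logged: return zeros rather than dividing by zero.
        return {action: 0.0 for action in counts}
    return {action: n / total for action, n in counts.items()}

# Two users with different absolute volumes but the same behaviour profile:
user_a = {"query": 10, "click": 30, "scroll": 60}
user_b = {"query": 1, "click": 3, "scroll": 6}
```

After normalization, `to_relative(user_a)` and `to_relative(user_b)` are identical, which is the property that makes log-derived behaviour profiles comparable across users.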
Article
Recommender Systems are omnipresent in our digital life. Most notably, various media platforms guide us in selecting videos, but recommender systems are also used for more serious goals, such as news selection, political orientation and work decisions. As argued in this survey and position article, the paradigm of recommendation-based feeds has changed user behaviour from active decision making to rather passively following recommendations and accepting possibly suboptimal choices that are deemed “good enough”. We provide a historic overview of media selection, discuss assumptions and goals of recommender systems and identify their shortcomings, based on existing literature. Then, the perspective changes to hypertext as a paradigm for structuring information and active decision making. To illustrate the relevance and importance of active decision making, we present a use case in the field of TV and media selection and, as a proof of concept, carry it over to another application domain: maintenance in industry. In the discussion section, we focus on categorising these actions on a spectrum of “system-1” (fast and automated) tasks and “system-2” (critical thinking) tasks. Further, we argue how users can profit from tools that combine active (spatial) structuring and categorising with automatic recommendations, for professional tasks as well as private, leisure activities.
Article
Full-text available
A possible rapprochement between Information Science and the concept of Citizen Science is discussed. Open Science and Open Access are conceptualized as movements that discuss the transparency of scientific practice and guarantee access to the information produced, and the role of Citizen Science in this context is analyzed. The methodology employed is content analysis. Little production on Citizen Science is identified within Information Science. Despite this, Librarianship presents successful accounts of partnerships with Citizen Science. It is concluded that the relationship between the areas tends to intensify in the future, with the need to bring experiences undertaken by libraries to be analyzed in the interdisciplinary field of Information Science.
Chapter
The discussion of postmodernism can be identified in a wide variety of fields, ranging from literature, architecture, and epistemology to cultural theory. At the same time, the term postmodern has a semantic range that leads Eco (2003) to understand it as a ‘passe-partout’ (Eco 2003, p. 77). In the following, the term postmodern is understood less as the designation of an era than as an epistemological approach. From this perspective, postmodernism represents a critical-subversive approach to the claims to totality and normative tendencies that unfold in modernity. ‘Postmodernism’ consequently results from the relational structure of modernism/postmodernism. In the sense of a differential interplay of demarcation, both terms define each other reciprocally: “Although there are significant differences between the Postmodernist theories can all focus on the criticisms of modernism” (Yaakoby 2012, p. 10).
Book
This conceptually oriented publication sets out to revisit concepts of epistemology and information science, from conventional epistemology or theory of knowledge in modern science to the epistemologies of the information sciences. Among other critical reviews of the literature, this involves attending to a reclamation of epistemology (project, proposals and process of transformation) over recent decades (Nunes, 2008). This more recent proposal of “epistemological pragmatism” advocates an alternative programme of alternatives, opposing to all forms of epistemic sovereignty the notion of an “ecology of knowledges”.
Article
The article examines the problem of the emergence of a contactless world: the transformation of all types of interpersonal interaction towards interaction with an interface, with various kinds of service applications, and the accompanying change in how skills and knowledge are worked with. The genealogical origins and conceptual boundaries of the phenomenon of gamification are analyzed, along with the principles and scale of its applicability. It is diagnosed that gamification, although called upon to remove the alienation inherent in the contactless world and to restore emotional connections (including in distance-learning formats), in reality turns into a system of ratings and indicators that subordinates the person to an agonal logic. Such gamification turns out to be machine-oriented, since its demands bring the human closer to the machine. Instead, the machine should be brought closer to the human: a human-oriented gamification should be created. The principles of such gamification with a human face, achieved through the segmentation of psychological types, an understanding of motivational-behavioral aspects, and the critical function of play, are set out in the article. It is concluded that in the conditions of a contactless world, gamification must obey not only behavioral logic and psychology but also a creative-critical principle: its task is to switch attention from the result to the process, from response speed to immersion in the problem, from choosing a solution to a new creative formulation of the question, from ready-made options to invention; otherwise gamification will remain a form of digital exploitation of human capital.
Research
Full-text available
Since the 1980s, two different paradigms have reshaped industrial societies: the neoliberal paradigm and a research and innovation paradigm. Both have been conceptualized and translated into strong policies with massive economic and social consequences. They provide divergent responses to the environmental transition. The neoliberal paradigm is based on economic models and geopolitical solutions. The research and innovation paradigm's goal is to manage knowledge differently in order to reorient the evolution of society. Since the mid-1990s, a version of the research and innovation paradigm has led to the design of large-scale research and innovation policies. This book examines how these policies have evolved and how they can be extended and reformed to respond to present and future environmental constraints. It studies the transformation of the conception, organization, and role of science and technology in the evolution of industrial societies and explores the future of these developments. The book offers three unique lines of inquiry. The first is to focus not specifically on economics, sociology, political science, or history, but on knowledge creation from an institutional and reflexive point of view. The second is to establish a convergence between the British school of science and technology studies and the research trends opened by the work of Michel Foucault. Both introduced trans-disciplinary and policy-oriented research associating case studies, long-term perspectives, and theory. The third is to consider climate change as the overwhelming challenge of our time. The book is an insightful guide for students, scholars, and researchers across the humanities and social sciences, including philosophy, political science, law, economics, business, and media.
Chapter
The history of artificial intelligence in medicine (AIM) is intimately tied to the history of AI itself, since some of the earliest work in applied AI dealt with biomedicine. This chapter first provides a brief overview of the early history of AI, but then focuses on AI in medicine (and in human biology) and provides a summary of how the field has evolved since the earliest recognition of the potential role of computers in the modeling of medical reasoning and in the support of clinical decision making. The growth of medical AI has been influenced not only by the evolution of AI itself, but also by the remarkable changes in computing and communication technologies. Accordingly, this chapter anticipates many of the topics that are covered in subsequent chapters, providing a concise overview that lays out the concepts and progression that are reflected in the rest of this volume. Keywords: Artificial intelligence history; AIM history; AI winter; Roles of knowledge and data in AIM; Modeling expertise; Expert systems; Data science; Machine learning; AIM and clinical decision support
Article
Full-text available
Knowledge is a part of life: it enables us to secure food and shelter, so we need to gather knowledge either from knowledgeable people or from books. Today the availability of e-resources in college libraries is abundant, and retrieving resources from the internet or from a digital library is very common; the setup of digital libraries at colleges is magnificent. Yet their proper and maximum use remains a matter for discussion. We live in a digital era, and the availability of electronic resources in academic libraries is now commonplace, because the internet, in particular the World Wide Web, is rapidly displacing external storage media such as floppy disks and CD-ROMs. Owing to the information revolution, digital libraries are growing worldwide, and users' needs for electronic resources increase day by day according to their academic requirements. This overview is weighted toward resources available on the internet, and is limited to resources that provide information not available in traditional media and that are associated with established scholars or institutions. This paper discusses the types, advantages and benefits of electronic resources and how to make maximum use of them, and gives a few suggestions for improving e-resource services to meet future needs. As an instance, the paper examines the various e-resource databases in the Guru Gobind Singh Indraprastha University Library; like this university, many universities and colleges have adopted digital platforms for study, which help many readers as well as scholars. The study also highlights the preferences and importance of online resources among teachers and research scholars. Electronic resources (or e-resources) are materials in digital format accessible electronically.
Examples of e-resources are electronic journals (e-journals), electronic books (e-books) and online databases in varied digital formats such as Adobe Acrobat documents (PDF) and Web pages (HTML). E-resources can also include articles from newspapers, dictionaries or encyclopaedias, as well as images and many other items. The library purchases and subscribes to many items in electronic format so that they can be accessed free of charge wherever there is an internet connection. E-books can be found in the catalogue directly from the LRC's homepage, or in several database holdings which can be accessed at www.aamu.edu/library; retrieving data from such databases is a digital form of information gathering through e-resources.
Article
Full-text available
Kaleidoscopic reading online. The hypothesis Angela Pop examines is that the Internet brings with it new textual-discursive practices, including a new type of reading: kaleidoscopic reading. This emerges from information published in the digital frames created by the world of the Internet. It is characterized by a non-linear reception path, in which the reader follows milestones created by hyperlinks on the page displayed by the computer. It is a quick reading, which moves from text to hypertext and aims to familiarize readers with the content they are reading. One issue in an online environment concerns the role the internet user might play. Pop distinguishes three such online roles. First, the internet reader [from the French lecteurnaute, i.e. lecteur + internaute (= internet user)] is the passive reader who only enjoys Internet content; he only reads texts written by others, and for him the Internet is a source of information. Second, the virtual author is one who publishes various papers and content, which makes us consider him an author in the real world. Third, the internet scriptor (in French: scriptornaute) is a person who comments online on texts published by virtual authors. A transition of the Internet user from one role to another can take place in the process of kaleidoscopic reading. Pop identifies three essential types of kaleidoscopic reading: circular, spiral and open. “Ecrilecture” (from the French écrire = to write + lecture = to read) coexists with online kaleidoscopic reading. Keywords: Internet, communication, “kaleidoscopic” reading, internet reader, author, internet scriptor, “écrilecture”
Thesis
Full-text available
The objects/knowledge things/artifacts/research subjects of the humanities each possess a specific affordance and formative power, which must be determined against the affordance of the computer. Here the computer can take on the mere role of a high-performance library shelf or card index, or it can, through its manifold medial properties, produce new humanities experiences, spaces of exploration, and innovative approaches to the objects of study. To conduct this discussion, media-archaeological and media-ecological aspects, among others, should be taken into consideration. The master's thesis “Digital Humanities und Digitalisierungspraktiken: eine mediale Analyse geisteswissenschaftlicher Digitalisate” aims at a result that demonstrates the added value of the interplay between disciplines such as media and cultural studies, digital humanities, and game studies. This intersection offers a fund of tools that can shed light on the current, historical, and prospective handling of digital media in the humanities.
Thesis
Full-text available
The level of quality of the information that travels through digital channels corresponds to the precision employed in representing reality for the purposes of storage and retrieval. Such an effort must encompass exhaustive treatment of a heterogeneous myriad of information, to enable its capture, comprehension, organization, precise representation and, consequently, efficient retrieval. As long as such practices are not known and applied, obstacles remain that make the semantic retrieval of information unfeasible, even with technological evolution. This research comprises a theoretical and empirical study, with methodological implications, of interoperability and ontologies in their multiple specificities. The research is justified by the lack of understanding of the scope of the two themes and of the contributions that applied ontology offers to interoperability demands; by the characteristics that make ontology a preferred resource for designing interoperability architectures and data-integration systems; by the aspects that favor efficient and economical data management in relation to the technologies adopted in organizations; by the abundance of approaches, methods, techniques and information technologies that must be articulated in interoperability solutions; by the gap in high-level knowledge representation in the electricity sector; and by Information Science's expertise in knowledge representation and information retrieval, providing solutions for other scientific fields, including within the scope of semantic interoperability.
Article
Full-text available
The lives of Sunday School children during the Covid-19 pandemic faced two conflicting conditions. The first is that the ministry to Sunday School children must continue and be sustainable. The second is that the pandemic imposes strict restrictions, making physical contact and social interaction impossible. These are challenges, and at once opportunities, for the church, and especially for Sunday School teachers, in responding to them. The Indonesian Anglican Church Ichthus Congregation of Medan's response of establishing a Zoom platform to meet the spiritual needs of Sunday School children is worthy of appreciation. Preliminary findings from questionnaires given to 32 respondents indicate that online virtual meetings can become a teaching and learning instrument in these difficult times of pandemic. However, their use must be accompanied by teachers' creativity and efforts to increase the absorption of teaching materials. A quantitative method using a Likert-scale questionnaire to measure responses to a set of questions provides a benchmark for the level of absorption of the material, which is estimated to be above the minimum quantity. This is undoubtedly a great hope for the church and, in this case, the Sunday School, to improve and become qualified in the use of media technology and in the mastery of teaching materials, with a touch of creativity that can attract Sunday School children to grow in their spirituality.
Article
Full-text available
The purpose of the study is to demonstrate that real intellectual processes involve evolutionarily and functionally interacting patterns of linear and nonlinear discourse. The author analyzes the evolution of the document and its systemic forms of organization as an interaction between linear and non-linear forms of organization. The position that the nonlinear form of discourse organization is merely a postmodern novelty is shown to be erroneous. The study reveals the objective prerequisites and subjective strategies that led to a revolution in documentary science during the era in which the scientific picture of the world was formed and the ideology of the Enlightenment emerged. The scientific novelty also consists in demonstrating that the creation of human-machine intelligent technologies brought about an epistemological and semiotic revolution, associated with the transition from a two-dimensional to a three-dimensional organization of knowledge, the core of which is electronic hypertext.
Chapter
The title forms part of the project “Patrimônio Musical na Bahia”, which understands music archival science as a field of knowledge that brings together concepts and techniques to meet the specific needs of organizing music-related collections, comprising manuscripts, printed materials, records, and traditional documents such as letters and missives.
Article
This paper analyzes three episodes that, in our view, were decisive for the present shape of the Digital Age. We analyze the circumstances under which they took place, finding in every case that the solutions reached had more common and more expectable alternatives than those finally adopted. These are set against one another to conclude that the Digital Age advanced ahead of what was reasonably expectable at the time, adopting forms and configurations that unexpectedly favored its reach and penetration.
Thesis
Online and offline community are both studied, but not as an intersection. There is a gap in the literature on the nature of community that blends online with offline and is geographically situated. SPENCE, a model of online/offline community with measurement principles (capabilities), was formulated. It aims to provide an integrated view of residential online/offline community that offers a lens of synthesis. It is based on the definition: social exchange using channels of digital multi-media and physical expression, leading to permanent social ties connected across social graphs, from proximity informed by a diversity of values, interests and needs, bounded in settlement combining physical and cyber place, curated by an entrepreneur. SPENCE has six facets: settlement, proximity, exchange, net/latticework, channels and entrepreneur; and four capabilities: trust, influence, information and intelligence. Two case studies, based on online/offline communities in London, deployed the methods of interview, survey and online social network study to discover the nature of online/offline community, how to investigate it, and what policy initiatives could be implemented to develop it. The survey and Twitter study methods were merged into a twofold instrument. The contributions of the thesis are: the model SPENCE; novel concepts derived from the model, i.e. decile fabric, net/latticework, VINs ratio, diverse cohesion, specific cohesion, and capabilities, which offer updates on established concepts. The affordances of online/offline community include situated cognition and blended relations between people, with cohesions in the social fabric predicated on a greater exchange of informal/formal assets. It is recommended that national digital infrastructure be developed to extend online/offline community, either as independent instances or as an integrated national platform.
A twofold investigation method, measuring the national total of decile fabric, would offer a pragmatic automated approach to assist a national development programme.