Article

Recognition of Paper-Based Conceptual Models Captured Under Uncontrolled Conditions


Abstract

Today, modeling and the resulting digital model representations are essential constituents of collaborative endeavors on organizational change. Once created, models need to be digitized for shared stakeholder understanding and further processing. Whenever paper serves as the carrier medium, further processing, elicitation, and modeling are likely to be disrupted. While digital environments support transformation processes in collaborative modeling from the very beginning, the necessary technical infrastructure still might hamper situated capturing of models. Hence, this contribution aims to reduce the need for sophisticated technical components by enabling stakeholders to capture their paper-based models in a situation-sensitive way. We present a system that enables end users to capture paper-based models with mundane technical means under uncontrolled conditions. We describe the components developed for recognition of these models and embed them in a mixed-modality workflow supported by a tabletop interfacing a web platform for further processing. As our empirical evidence demonstrates, this approach enables both situated and error-tolerant capturing of hand-drawn conceptual models by individual users. Moreover, it can be integrated with more sophisticated IT-based modeling tools for further digital processing.


... Comprehand Cards (Oppl 2017;Oppl et al. 2017) enable creating models of work without the need for any dedicated technical infrastructure. The aim of this component is to allow for collaborative modeling whenever the need arises, especially of that which is directly situated in the actual work context. ...
... The recognition engine is designed to be used with pictures taken by smartphone cameras without any strict constraints on image angles and lighting conditions (Oppl et al. 2017). Pictures of a model are uploaded to an online platform that acts as a front-end for model extraction. ...
Chapter
Full-text available
This chapter instantiates the proposed framework for developing consensually shared workflow designs. It shows how the framework can be put to operation using instruments that have been successfully deployed in practice. These instruments enable articulation and alignment of work process knowledge, allow its representation and transfer within organizations, and facilitate acting on these representations for validation and implementation in diverse organizational settings. We here adopt an organizational learning perspective and situate the presented socio-technical instruments along a multi-perspective learning chain informed by the components of the framework. It thus offers instruments supporting the articulation and alignment of work process knowledge, its multi-faceted representation and flexible manipulation, as well as support for processing the resulting models for validation and refinement. Using the framework as a coherent lens on the requirements on support and the respective features of the instruments makes it possible to offer an integrated view, which demonstrates digital work design support in organizational practice.
... The modeling cards bear visual markers that can be recognized and uniquely identified in the picture. The optical marker recognition engine (Oppl et al. 2017) used for this purpose is based upon the ReacTIVision system (Kaltenbrunner and Bencina 2007). Based upon the coordinates of each marker, the cards contained in the image can be identified and extracted. ...
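The extraction step sketched in the excerpt above — from detected marker coordinates to per-card image regions — can be illustrated in a few lines. This is a hypothetical sketch, not the actual engine: all names and the fixed card size are assumptions, and the real pipeline, built on ReacTIVision fiducials, would also use each marker's orientation to rectify the card.

```python
def card_regions(markers, card_w=100, card_h=60):
    """Map each detected marker (id -> centre coordinates in pixels) to an
    axis-aligned crop rectangle centred on that marker. Illustrative only:
    a real pipeline would additionally use the marker's rotation to rectify
    the card via a perspective transform."""
    regions = {}
    for marker_id, (cx, cy) in markers.items():
        # (left, top, right, bottom) of the card around its marker
        regions[marker_id] = (cx - card_w / 2, cy - card_h / 2,
                              cx + card_w / 2, cy + card_h / 2)
    return regions

# One marker with id 7 detected at pixel (120, 80)
regions = card_regions({7: (120.0, 80.0)})
```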
Chapter
Full-text available
This chapter introduces methodological support for transitioning from existing to envisioned work processes via direct actor involvement. It suggests direct actor involvement in the alignment and validation of novel work practices, in particular when digital workflows or instruments are involved that fundamentally impact the modes of individual operation and collaboration. Alignment is required for consolidating various inputs for further processing. In particular, actively involving process participants in process modeling creates a challenge for consolidated digital work design. Process participants are not expected to have modeling skills and usually, they prefer to externalize their knowledge through diagrams that are as simple as possible in terms of both syntax and semantics (‘natural modeling’). Alignment leads to accommodation of novel perspectives on a work process according to the participants’ individual mental models, allowing their implementation in organizational practice after validation.
... It, however, cannot be assumed to be fully developed for all modeling participants. Appropriate forms of representation and scaffolds can thus help to mitigate deficiencies in this area and allow including people without prior modeling experience in work articulation and design activities (Oppl et al. 2017). In this light, we here introduce a method based on structure elaboration techniques (Groeben and Scheele 2000) that scaffolds the articulation process and still leads to models that represent both the functional and interactional aspects of work processes. ...
Chapter
Full-text available
In this chapter, business enablers and resources are at the center of interest, as they are required to generate valuable assets for the market. Looking at those elements beyond a requirements engineering perspective to deliver products and services, human-centric value chains and their analysis can help in apprehending how an enterprise creates valuable elements through a set of core and support activities. Both are assumed to contribute to the sustainable existence of the producing organization in competitive and continuously changing environments, based on products or services for which customers are creating revenues. Representational carriers of work knowledge are business processes, as functional activities of the workforce transform goods and information. Value creation resides in the context-rich design and execution of work processes rather than the processed or created assets. Although value created in this way has a tangible component, a second component, the intangible part, is of equal importance. Both need to be externalized and represented for (re-)design. In this chapter, various methodologically grounded instruments are introduced, ranging from individual to collective elicitation, and tackling tangible and intangible transactions among concerned stakeholders.
... Such sources of permeations have not been considered in the present study, but could easily be included in the structure elaboration technique. Furthermore, handling the complexity of the evolving elaboration structures could be made easier by technically supporting the articulation and elicitation process itself, e.g., by capturing intermediate results (as demonstrated in [21]) or by using an interactive surface guiding the articulation process (as demonstrated in [22]). Finally, our future research will also focus on putting the profiles to practical use by creating technical instruments that actively support boundary management activities as, e.g., outlined by Schneider et al. [23]. ...
Conference Paper
Full-text available
Permanent reachability via mobile communications technologies has become a ubiquitous phenomenon. The traditional boundaries between peoples' different contexts in their lives become blurry and begin to dissolve if they are not actively maintained. Such boundary management activities allow individually determining which communication requests are considered acceptable in a particular context. Existing research in this field has used a fixed set of pre-specified contexts to examine boundary management activities and identify different boundary profiles. Based on results from context-aware computing and mental model research, we argue for an open-ended, individual set of contexts to be considered for boundary management. Consequently, we develop an open structure elaboration technique to allow for individual specification of contexts and the information necessary to create a boundary profile, as identified in related work. The method is validated in an exploratory study, which was designed to verify the hypothesis that boundary management should be based on individually specified contexts, and to show the feasibility of the proposed method. The results indicatively confirm our assumptions and show that the method can be used to elicit the required information.
... From a user perspective it is not so much a usability but rather a semantic problem beyond the application domain, since critical situations in operation occur at the interface of instances and model representations. Empirical findings from the field, in terms of usability, methodology, and functionality, as e.g. provided by Oppl et al. [26], could guide future research in that respect. ...
Conference Paper
Full-text available
Since Subject-oriented Business Process Management (S-BPM) models can be executed after validation without further transformation, tools have been developed to support model execution. As these tools target not only non-disruptive execution of process models but also intuitive ease of use, stakeholders could expect effective and efficient implementation support of business processes. In the presented study we have challenged 3 stakeholder-centered tools refining, validating, and executing a complex process model. We wanted to know how much effort needs to be spent when a prototypical application is generated from a process specification provided by the involved stakeholders. The tools were a commercially available suite, a tool currently approaching the market, and a research tool. Our assessment study reveals substantial effort that needs to be spent on refining and validating process models before being able to generate an interactive process experience. Hence, (S-)BPM tool developers are encouraged to support stakeholders according to the identified needs.
... Future studies will examine the effects of the tool on the main target group, namely organizational stakeholders without any background in business process management and modeling. Third, from a technical point of view, the tool needs to be integrated into the chain of instruments we have proposed for work process articulation and elicitation [25,27]. This would allow deploying the concept of virtual enactment in larger-scale use cases, which would allow examining its effects on organizational learning and development processes. ...
Conference Paper
Full-text available
Validation of business process models under involvement of stakeholders is usually performed by moderated model walkthroughs. We explore the potential of combining these established practices with interactive enactment of executable prototypes, as widely used in UI prototyping and model-based interactive system design. Based on these fields, we develop the concept of process elaboration through scaffolded virtual enactment. The proposed concepts are instantiated in a web-based tool that enables exploring the potential of the approach. An initial exploratory study confirmed that the approach is considered supportive by end users and can be used to elaborate existing process models or to develop models of work processes from the ground up.
Article
The papers in this special section focus on handwriting and drawing processes for user-centered systems. The papers provide a wide and updated overview of the frontier of research in the field of human-centered systems based on drawing and handwriting processing. Through the papers, some of the most relevant directions of further research are highlighted with specific attention to components related to human–machine interaction. The Guest Editors hope that this issue brings forth the importance of automated systems related to automatic processing of drawing and handwriting.
Article
Full-text available
The longest phase in a facility's lifecycle is its maintenance period, during which operators perform activities to provide a comfortable living and working environment as well as to upkeep equipment to prevent functional failures. In current practice, operators need a considerable amount of time to manually process dispersed and unformatted facility information to perform an actual task. Existing research approaches rely on expensive hardware infrastructure or use artificial and thus unaesthetic Augmented Reality (AR) markers. In this paper we present a natural-marker-based AR framework that can digitally support facility maintenance (FM) operators when navigating to the FM item of interest and when actually performing the maintenance and repair actions. Marker detection performance experiments and case studies on our university campus indicate the feasibility and potential of natural markers for AR-based maintenance support.
Conference Paper
Full-text available
Face recognition in presence of either occlusions, illumination changes or large expression variations is still an open problem. This paper addresses this issue presenting a new local-based face recognition system that combines weak classifiers yielding a strong one. The method relies on sparse approximation using dictionaries built on a pool of local features extracted from automatically cropped images. Experiments on the AR database show the effectiveness of our method, which outperforms current state-of-the-art techniques.
Article
Full-text available
This article presents an experimental study aimed at investigating the learning effectiveness of concept mapping for computer-supported collaborative problem solving. The main assumption underlying this research is that shared cognition is substantial for cognitive construction and reconstruction and that concept mapping is an effective tool for mediating computer-supported collaboration. Three scenarios for “mediated group interaction” by concept mapping have been designed—distributed, moderated, and shared. They are based on the assumption that the form in which knowledge is shared strongly influences the process of shaping, and shared cognition subsequently influences the effectiveness of collaborative learning. These three scenarios demonstrated differential effects towards various aspects of learning effectiveness both at the group and at the individual level. It is concluded that both the mode of sharing and the representation of knowledge as expressed by students are more important than the access to the distributed resources itself. The sharing scenarios showed to be most appropriate for establishing a supportive learning environment.
Conference Paper
Full-text available
This demo paper presents an interactive tabletop interface with tangible building blocks to engage business domain experts in process modelling. This interface, called Metasonic Touch, is a commercial product based on results of the European research project IANES (Interactive Acquisition, Negotiation and Enactment of Subject-Oriented Business Process Knowledge). BPM conference attendees will be able to use Metasonic Touch and experience the ease and playfulness with which it allows collaboratively modelling, understanding and discussing a process. The target audience includes BPM researchers and practitioners interested in agile and stakeholder-oriented approaches to process modelling.
Conference Paper
Full-text available
Business process elicitation needs to capture, document, and share the expertise and experiences developed over time by the involved workers. This paper presents an approach for elicitation of subject-oriented business process models from operatively involved people who are not expert modelers. The approach facilitates individual articulation and collaborative consolidation of work knowledge. An instrument based on physical structure elaboration techniques is introduced to represent procedural knowledge in conceptual models of collaborative work. These physical models are captured digitally and transformed into syntactically correct S-BPM models. Capturing is supported by interactive tools, which allow correcting syntactic errors. Semantic completeness of the models is achieved by interactive refinement during simulated enactment using a process validation engine. The paper shows the feasibility of the approach in two case studies and identifies requirements for interactive guidance during model capturing and interpretation.
Article
Full-text available
In this paper, we present the IMISketch method, a generic method for interactive interpretation of handwritten sketches. The analysis of complex documents requires the management of uncertainty. While similar methods in practice often induce a large combinatorics, IMISketch offers optimization strategies to reduce it. The goal of these optimizations is an analysis time compatible with user expectations. The decision process is able to solicit the user in cases of strong ambiguity, i.e., when it is not sure of making the right decision. The user explicitly validates the right decision, avoiding a tedious a posteriori verification phase due to error propagation. In this paper, we present the impact of user interaction on the recognition of off-line handwritten structured documents. This interaction requires solving two major problems: how interpretation results will be presented to the user, and how the user will interact with the analysis process. We propose to study the effects of those two aspects. The experiments demonstrate that (i) a progressive presentation of the analysis results, (ii) user interventions during it, and (iii) user solicitation by the analysis process are an efficient strategy for the recognition of complex off-line documents. To validate this interactive analysis method, several experiments are reported on off-line handwritten 2D architectural floor plans.
Article
Full-text available
The least cognitively demanding way to create a diagram is to draw it with a pen. Yet there is also a need for more formal visualizations, that is, diagrams created using both traditional keyboard and mouse interaction. Our objective is to allow the creation of diagrams using traditional and stylus-based input. Having two diagram creation interfaces requires that changes to a diagram should be automatically rendered in the other visualization. Because sketches are imprecise, there is always the possibility that conversion between visualizations results in a lack of syntactic consistency between the two visualizations. We propose methods for converting diagrams between forms, checking them for equivalence, and rectifying inconsistencies. As a result of our theoretical contributions, we present an intelligent software system allowing users to create and edit diagrams in sketch or formal mode. Our proof-of-concept tool supports diagrams with connected and spatial syntactic elements. Two user studies show that this approach is viable and participants found the software easy to use. We conclude that supporting such diagram creation is now possible in practice.
Article
Full-text available
Purpose – This paper aims to achieve fully intertwined knowledge and business processing in change processes. It proposes streamlining situated articulation work, value network analyses (VNA) and subject-oriented business process modelling (S-BPM) and execution to provide non-disruptive single and double learning processes driven by concerned stakeholders. When implementing knowledge life cycles, such as Firestone and McElroy’s knowledge life cycle, the agility of organizations is significantly constrained, in particular, when surviving knowledge claims should be implemented in the business processing environment in a seamless way. Design/methodology/approach – The contribution is based on a conceptual analysis of knowledge life cycle implementations, learning loop developments and an exploratory case study in health care to demonstrate the effectiveness of the proposed approach. The solution towards non-disruptive knowledge and business processing allows stakeholders to actively participate in single- and double-loop learning processes. Findings – The introduced approach supports problem and knowledge claim formulation, knowledge claim evaluation and non-disruptive knowledge integration into a business process environment. Based on stakeholder articulation, the steps to follow are: holomapping, exchange analysis, impact analysis, value creation analysis, subject-oriented modelling, business process validation and execution. Seamless support of stakeholders is enabled through the direct mapping of stakeholder and activity descriptions from value network representations to behaviour specifications (process models) on the individual and organizational layer. Research limitations/implications – Current knowledge life cycle developments and implementations can now be analyzed in a structured way. Elements of the proposed approach could be integrated in disruptive implementations to overcome current limitations of knowledge life cycles. 
However, further case studies need to be performed to identify hindrances or barriers of combining VNA and S-BPM, both on the technological and methodological layer. What works for expert service industries might need to be adapted for production industries, and tools or tool chains might need to be configured accordingly. Finally, the socio-economic impact of the approach needs to be explored. Practical implications – The presented case study from health care reveals the potential of such a methodological combination, as cycle times can be reduced, in particular, due to the execution of role-specific process models in the respective business processing environment. It can be considered as a fundamental shift for existing change management procedures, as they require rework of the entire functional process models when addressing business processing. Now, stakeholder- or role-specific behaviour can be handled isolated and in parallel, without affecting the entire organization in case of modifications. Originality/value – The proposed methodological integration has not been done before. It enables stakeholders to perform single- and double-loop change processes in a seamless way. available at: http://www.emeraldinsight.com/doi/pdfplus/10.1108/JKM-10-2013-0377
Article
Full-text available
Face recognition has made significant advances in the last decade, but robust commercial applications are still lacking. Current authentication/identification applications are limited to controlled settings, e.g., limited pose and illumination changes, with the user usually aware of being screened and collaborating in the process. To address challenges from looser restrictions, this paper proposes a novel framework for real-world face recognition in uncontrolled settings named Face Analysis for Commercial Entities (FACE). Its robustness comes from normalization (“correction”) strategies to address pose and illumination variations. In addition, two separate image quality indices quantitatively assess pose and illumination changes for each biometric query, before submitting it to the classifier. Samples with poor quality are possibly discarded or undergo a manual classification or, when possible, trigger a new capture. After such filter, template similarity for matching purposes is measured using a localized version of the image correlation index. Finally, FACE adopts reliability indices, which estimate the “acceptability” of the final identification decision made by the classifier. Experimental results show that the accuracy of FACE (in terms of recognition rate) compares favorably, and in some cases by significant margins, against popular face recognition methods. In particular, FACE is compared against SVM, incremental SVM, principal component analysis, incremental LDA, ICA, and hierarchical multiscale local binary pattern. Testing exploits data from different data sets: CelebrityDB, Labeled Faces in the Wild, SCface, and FERET. The face images used present variations in pose, expression, illumination, image quality, and resolution.
Our experiments show the benefits of using image quality and reliability indices to enhance overall accuracy, on one side, and to provide for individualized processing of biometric probes for better decision-making purposes, on the other side. Both kinds of indices, owing to the way they are defined, can be easily integrated within different frameworks and off-the-shelf biometric applications for the following: 1) data fusion; 2) online identity management; and 3) interoperability. The results obtained by FACE witness a significant increase in accuracy when compared with the results produced by the other algorithms considered.
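The "localized image correlation index" used for template matching is not spelled out in this abstract; a plain normalized (Pearson) cross-correlation between two equally-sized patches, which such localized indices are typically built from, conveys the idea. A minimal pure-Python sketch, with all names assumed:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equally-sized patches given
    as flat lists of pixel intensities. Returns a value in [-1, 1];
    1 means the patches differ only by brightness/contrast."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

patch = [10, 20, 30, 40]
# Same pattern under a different gain and offset still correlates fully
score = ncc(patch, [x * 2 + 5 for x in patch])
```

A localized variant applies this per local window and aggregates the scores, making the match robust to region-wise illumination changes.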
Article
Full-text available
As work is an inherently cooperative phenomenon, it requires a common understanding of the nature of collaboration for all involved parties. In this way, explicit articulation work becomes an integral and essential part of collaboration. Implicit aspects of collaboration have impact on the quality of work results, mainly through social norms and observations of working together. Eliciting those aspects interactively helps in avoiding (mutual) misrepresentations and lack of understanding. Tangible articulation support systems allow aligning mental models of how work should be carried out. Stakeholders can develop a common understanding of collaboration in a semantically open and non-intrusive way. They are not burdened by explication features and diagrammatic notations. We have utilised experiences with model-centred learning theory to support explicit articulation work. According to our field studies, the resulting models can be fed back to current work practices and help in preventing problematic work situations.
Article
Full-text available
In the current digital age, the adoption of natural interfaces between humans and machines is increasingly important. This trend is particularly significant in the education sector, where interactive tools and applications can ease the presentation and comprehension of complex concepts, stimulate collaborative work, and improve teaching practices. As an important step towards this vision, interactive whiteboards are gaining widespread adoption at various levels of education. Nevertheless, these solutions are usually expensive, making their acceptance slow, especially in countries with more fragile economies. In this context, we present the low-cost interactive whiteboard (LoCoBoard) project, an open-source interactive whiteboard with low-cost hardware requirements, usually accessible in our daily lives, for an easy installation: a webcam-equipped computer, a video projector, and an infrared pointing device. The detection software framework offers five different pointer-location algorithms with support for the Tangible User Interface Object protocol and also adapts to support multiple operating systems. We discuss the detailed physical and logical structure of LoCoBoard and compare its performance with that of similar systems. We believe that the proposed solution may represent a valuable contribution to ease the access to interactive whiteboards and increase widespread use with obvious benefits.
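One of the simplest pointer-location strategies of the kind described above is a thresholded centroid over the grayscale camera frame. This toy sketch (function name and threshold are assumptions, not LoCoBoard's actual code) shows the idea:

```python
def locate_pointer(frame, threshold=200):
    """Return the centroid (x, y) of all pixels at or above `threshold`
    in a grayscale frame (list of rows of intensities), or None when no
    bright IR blob is present."""
    pts = [(x, y)
           for y, row in enumerate(frame)
           for x, v in enumerate(row) if v >= threshold]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# A 4x3 frame with a two-pixel bright blob in the middle row
frame = [[0,   0,   0, 0],
         [0, 250, 255, 0],
         [0,   0,   0, 0]]
pos = locate_pointer(frame)
```

Centroid-based localization gives sub-pixel coordinates, which matters when mapping the camera image onto the projected screen area.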
Article
Full-text available
A system which allows the computer to capture sketches made by a mechanical designer is described. The system not only recognizes basic design features as they are sketched, but it also builds a feature-based solid model of the artifact. The temporal nature of the capture, one feature at a time, serves to form a feature graph that allows for parametric redesign. The system is composed of three inference systems: a two-dimensional freehand primitive recognition system, a three-dimensional feature recognition system and a spatial reasoning system. Each is described in the paper.
Article
Full-text available
The distance transform (DT) is a general operator forming the basis of many methods in computer vision and geometry, with great potential for practical applications. However, all the optimal algorithms for the computation of the exact Euclidean DT (EDT) were proposed only since the 1990s. In this work, state-of-the-art sequential 2D EDT algorithms are reviewed and compared, in an effort to reach more solid conclusions regarding their differences in speed and their exactness. Six of the best algorithms were fully implemented and compared in practice.
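For reference, the exact EDT that the surveyed algorithms compute can be pinned down by a brute-force baseline: for each pixel, the Euclidean distance to the nearest feature pixel. The sketch below (O(n²) per pixel, purely definitional — the optimal algorithms reviewed compute the same values far faster) uses an assumed convention of 0 for feature pixels:

```python
import math

def exact_edt(grid):
    """Exact Euclidean distance transform of a binary image, brute force.
    grid: 2D list where 0 marks a feature pixel. Returns, per pixel, the
    distance to the nearest feature pixel."""
    feats = [(i, j) for i, row in enumerate(grid)
                    for j, v in enumerate(row) if v == 0]
    return [[min(math.hypot(i - fi, j - fj) for fi, fj in feats)
             for j in range(len(row))]
            for i, row in enumerate(grid)]

# Single feature pixel in the bottom-left corner
dt = exact_edt([[1, 1, 1],
                [1, 1, 1],
                [0, 1, 1]])
```

Any candidate fast algorithm can be checked for exactness against this baseline on small inputs, which is essentially how exactness comparisons in such surveys are grounded.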
Article
Full-text available
Purpose – Knowledge management (KM) as a field has been characterized by great confusion about its conceptual foundations and scope, much to the detriment of assessments of its impact and track record. The purpose of this paper is to contribute toward defining the scope of KM and ending the confusion, by presenting a conceptual framework and set of criteria for evaluating whether claimed KM interventions are bona fide instances of it or are interventions of another sort. Design/methodology/approach – Methods used include conceptual evaluation and critique of a variety of types of “KM interventions” and presentation of a detailed analysis of an unambiguous case (The Partners HealthCare case) where KM has been successful. Findings – The critical analysis indicates that the use of tools and methods associated with KM does not imply that interventions using them are KM interventions, and most “KM projects” are probably interventions of other types. The analysis also illustrates a pattern of intervention that can serve as the basis of a long‐term systematic strategy for implementing KM. Originality/value – This is the first detailed examination of whether KM is really being done by those who claim to be doing it. It should be of value to all those who think about the scope of organizational learning and KM, and who care about unbiased assessments of its performance.
Conference Paper
Full-text available
We have used a design science approach to study the collaborative creation of conceptual models. We have designed a collaborative modeling architecture based on business needs and applicable knowledge from theory and empirical findings from a modeling study using conventional modeling. A tool for this architecture was then developed and used as an instrument to confirm the practical relevance of the approach and the validity of the employed theory.
Conference Paper
Full-text available
Current sketch recognition systems treat sketches as images or a collection of strokes, rather than viewing sketching as an interactive and incremental process. We show how viewing sketching as an interactive process allows us to recognize sketches using Hidden Markov Models. We report results of a user study indicating that in certain domains people draw objects using consistent stroke orderings. We show how this consistency, when present, can be used to perform sketch recognition efficiently. This novel approach enables us to have polynomial time algorithms for sketch recognition and segmentation, unlike conventional methods with exponential complexity.
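The core of such an HMM-based recognizer is Viterbi decoding over the observed stroke sequence. A toy decoder with hypothetical stroke labels and made-up probabilities (not the authors' trained models) illustrates the polynomial-time nature of the approach:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence for an observation sequence.
    Runs in O(len(obs) * len(states)^2) time — polynomial, as noted in
    the abstract above."""
    # best[s] = (probability, path) of the best path ending in state s
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        best = {s: max(((p * trans_p[prev][s] * emit_p[s][o], path + [s])
                        for prev, (p, path) in best.items()),
                       key=lambda t: t[0])
                for s in states}
    return max(best.values(), key=lambda t: t[0])[1]

# Hypothetical object parts as hidden states, pen strokes as observations
path = viterbi(('circle', 'line', 'line'), ('head', 'body'),
               start_p={'head': 0.6, 'body': 0.4},
               trans_p={'head': {'head': 0.2, 'body': 0.8},
                        'body': {'head': 0.3, 'body': 0.7}},
               emit_p={'head': {'circle': 0.8, 'line': 0.2},
                       'body': {'circle': 0.1, 'line': 0.9}})
```

The consistent stroke orderings the study reports are what make the observation sequence informative: with stable orderings, the transition model sharply constrains which object part each stroke can belong to.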
Conference Paper
Full-text available
Process modeling is an important design practice in intra- as well as inter-organizational process improvement projects. Inter-organizational process modeling often requires collaboration support for distributed participants. We present the results of a preliminary exploratory study of process modeling on the basis of collaborative technology. We examine a group of process modelers who rely on a collaborative modeling editor to complete two process modeling tasks in distributed settings. We examine how the participants learn to appropriate the technology, the key phases and tasks of collaborative process modeling, and the breakdowns encountered and workarounds employed by the participants. With our study, we provide a first understanding of the IT-enabled process of process modeling, and detail a set of guidelines and implications for the research and design of collaborative process modeling.
Conference Paper
Full-text available
Augmented tabletops have recently attracted considerable attention in the literature. However, little has been known about the effects that these interfaces have on learning tasks. In this paper, we report on the results of an empirical study that explores the usage of tabletop systems in an expressive collaborative learning task. In particular, we focus on measuring the difference in learning outcomes at individual and group levels between students using two interfaces: traditional computer and augmented tabletop with tangible input. No significant effects of the interface on individual learning gain were found. However, groups using the traditional computer learned significantly more from their partners than those using the tabletop interface. Further analysis showed an interaction effect of the condition and the group heterogeneity on learning outcomes. We also present our qualitative findings in terms of how group interactions and strategy differ in the two conditions.
Conference Paper
Full-text available
Companies normally hire external consultants to carry out their business process re-engineering. While this can be straightforward in the short term, it does not produce the desired results in the mid and long term. A low level of worker involvement and a continuous dependency on external consultancy are the main drawbacks. We propose an alternative approach to BPR, specifically to workflow design, where company workers play an active role in re-designing the organization's processes in a cooperative style. The paper describes the essence of a BPR method based on participatory design and stepwise refinement which we believe will generate better results than the traditional approach. We describe in detail the CEPE tool: Cooperative Editor for Processes Elicitation, which is a cooperative graphic editor that supports the building of the knowledge about the current process and its associated problems. That is the second phase of the proposed method.
Conference Paper
Full-text available
We study collaborative modeling of business processes with respect to the impact of tool support on the modeling process. For this purpose we compared model quality and modeling costs in two cases. The first was carried out with the help of a collaborative modeling tool; in the second case we kept all other parameters as closely as possible to the first one but conducted the modeling session in the usual way without tool support. We observed a marked increase in modeling time in the second case and a reduction in model quality.
Conference Paper
Full-text available
This paper describes a new environment, COE, for capturing and formally representing expert knowledge for use in the Semantic Web. COE exploits the ease of use and rapid knowledge construction capabilities of the CmapTools concept mapping system and extends them to support the import and export of formal, machine-interpretable knowledge representations, such as OWL, across multiple ontologies. Pragati's ExpozT tool suite complements COE's ontology construction, browsing and navigation features by providing cluster-based search capabilities that expose existing reusable concepts relevant to the user's focus of attention.
Conference Paper
Full-text available
A long-standing challenge in pen-based computer interaction is the ability to make sense of informal sketches. A main difficulty lies in reliably extracting and recognizing the intended set of visual objects from a continuous stream of pen strokes. Existing pen-based systems either avoid these issues altogether, thus resulting in the equivalent of a drawing program, or rely on algorithms that place unnatural constraints on the way the user draws. As one step toward alleviating these difficulties, we present an integrated sketch parsing and recognition approach designed to enable natural, fluid, sketch-based computer interaction. The techniques presented in this paper are oriented toward the domain of network diagrams. In the first step of our approach, the stream of pen strokes is examined to identify the arrows in the sketch. The identified arrows then anchor a spatial analysis which groups the uninterpreted strokes into distinct clusters, each representing a single object. Finally, a trainable shape recognizer, which is informed by the spatial analysis, is used to find the best interpretations of the clusters. Based on these concepts, we have built SimuSketch, a sketch-based interface for Matlab's Simulink software package. An evaluation of SimuSketch has indicated that even novice users can effectively utilize our system to solve real engineering problems without having to know much about the underlying recognition techniques.
Article
Full-text available
The hype cycle points to widespread adoption of tabletop systems within the next decade.
Chapter
Color constancy is one of the most amazing features of the human visual system. When we look at objects under different illuminations, their colors stay relatively constant. This helps humans to identify objects conveniently. While the precise physiological mechanism is not fully known, it has been postulated that the eyes are responsible for capturing different wavelengths of the light reflected by an object, and the brain attempts to “discount” the contribution of the illumination so that the color perception matches more closely with the object reflectance, and therefore is mostly constant under different illuminations [1].
Conference Paper
Whiteboards and paper allow for any kind of notations and are easy to use. Requirements engineers love to use them in creative requirements elicitation and design sessions. However, the resulting diagram sketches cannot be interpreted by software modeling tools. We have developed FlexiSketch as an alternative to whiteboards in previous work. It is a mobile tool for model-based sketching of free-form diagrams that allows the definition and re-use of diagramming notations on the fly. The latest version of the tool, called FlexiSketch Team, supports collaboration with multiple tablets and an electronic whiteboard, such that several users can work simultaneously on the same model sketch. In this paper we present an exploratory study about how novice and experienced engineers sketch and define ad-hoc notations collaboratively in early requirements elicitation sessions when supported by our tool. Results show that participants incrementally build notations by defining language constructs the first time they use them. Participants considered the option to re-use defined constructs to be a big motivational factor for providing type definitions. They found our approach useful for longer sketching sessions and situations where sketches are re-used later on.
Article
The overall coverage of the chapter is about moving face recognition out of the comfort zone and dramatically improving the current performance of existing biometric tools by fusing the rich spatial, temporal, and contextual information available from the multiple views made available by video (rather than still images) in the wild and operational real-world problems. Instead of relying on a "single best frame approach," one must confront uncontrolled settings by exploiting all available imagery to allow the addition of new evidence, graceful degradation, and re-identification. Uncontrolled settings are all-encompassing and include Aging-Pose, Illumination, and Expression (A-PIE), denial and deception characteristic of incomplete and uncertain information, uncooperative users, and unconstrained data collection, scenarios, and sensors. The challenges are many: most important among them lack of persistence for biometric data, adversarial biometrics, open rather than closed set recognition, covariate shift, cross-dataset generalization, alignment and registration, interoperability, scalability, and last but not least, the deployment of full-fledged biometrics that include detection, authentication, informative sampling, and tracking. The overall recommendations are synergetic and should consider for implementation and processing purposes the regularization, statistical learning, and boosting triad complemented by sparsity and grouping (feature sharing) to deal with high-dimensional data and enhanced generalization. The recurring theme is that of a unified framework that involves multi-task and transfer learning using metric learning and side information.
Article
In today's rapidly changing business environment a company's business and related information systems undergo constant change. The field of evolutionary business information systems deals with applications that can be modified partially by stakeholders regarding content and behavior with the objective to align to new business requirements. A possibility to change the behavior of an application could be achieved by modification of the underlying business processes. Subject-oriented business process management (S-BPM) realizes an approach where process models can be interpreted by an appropriate workflow engine and directly executed by stakeholders using a generic application working on it. In that way the generic application can be seen as a primary system on which a secondary design can be performed by editing the process models. In this paper we compare evolutionary business information systems with subject-oriented business process management with the objective to infer software requirements for implementing an evolutionary business information system on the basis of S-BPM, where the system behavior is a result of the continuous evolution of the underlying business processes.
Article
Collaborative concept mapping engages two or more students in coordinated and sustained efforts in the creation of one or more concept maps in order to learn and construct knowledge. It is a potentially powerful instructional strategy in that it fosters meaningful learning and group knowledge construction and helps the building of common ground among learners. However, limited research studies in this area have generated mixed findings. This article attempts to find possible reasons for the mixed findings by reviewing some studies that specifically addressed the use of concept mapping in individual learning and/or group knowledge construction. Based on the findings, the article proposes the use of other instructional strategies along with collaborative concept mapping for better implementation of the technique in both face-to-face and online environments. The implications for further investigations in this area are also discussed.
Conference Paper
This paper presents a solution of how systematic design within facilitated walkthrough workshops is combined with phases of non-linear ideation for the purpose of collaborative process modeling. In the context of socio-technically supported co-located meetings, three design cycles were run which led to an evolutionary improvement. The result is a set of features as part of a socio-technical solution allowing to seamlessly intertwine creative phases with walkthrough-oriented inspection and improvement of models. The set of features includes the possibility of simultaneous brainstorming on several topics, variation of prompts per brainstorming topic etc. Additional features are described to support the facilitator.
Article
Simple experimental setups are designed for observing liquid surface waves, studying the dispersion relation and obtaining the liquid surface tension coefficient. In addition, the relationship between temperature and the liquid surface tension coefficient is verified. We take pictures of the surface wave patterns with a smartphone camera and measure the wavelength with software analysis based on image recognition. The experiment is performed not only with common devices and simple operations, but also with the student’s own smartphone. As a result, the experiment itself is easy and convenient to carry out, which stimulates undergraduates’ interest in this experiment.
Conference Paper
We describe Photo OCR, a system for text extraction from images. Our particular focus is reliable text extraction from smartphone imagery, with the goal of text recognition as a user input modality similar to speech recognition. Commercially available OCR performs poorly on this task. Recent progress in machine learning has substantially improved isolated character classification; we build on this progress by demonstrating a complete OCR system using these techniques. We also incorporate modern data center-scale distributed language modelling. Our approach is capable of recognizing text in a variety of challenging imaging conditions where traditional OCR systems fail, notably in the presence of substantial blur, low resolution, low contrast, high image noise and other distortions. It also operates with low latency; mean processing time is 600 ms per image. We evaluate our system on public benchmark datasets for text extraction and outperform all previously reported results, more than halving the error rate on multiple benchmarks. The system is currently in use in many applications at Google, and is available as a user input modality in Google Translate for Android.
Article
Whiteboards serve an important role in supporting informal design, providing a fluid and flexible medium for collaborative design. Interactive whiteboards offer the potential for enhanced support for manipulating content, managing sketches, and distributed work, but little is known about how this support affects the practice of informal design. To understand the opportunities and challenges, we first conducted a literature review, identifying 14 behaviors that occur during informal design. We then designed an interactive whiteboard system to support all of these behaviors and deployed the system to three groups of designers. Through usage logs and interviews, we examined the effects of interactivity on whiteboard use across a wide spectrum of design behaviors, identifying ways in which interactive whiteboards support the practices used in physical whiteboards and where they enable designers to work more effectively.
Article
Every product that exists, ranging from a toothbrush to a car, has first been conceived as a mental concept. Due to its efficacy in rapidly externalizing concepts, paper-based sketching is still extensively used by practising designers to gradually develop the three-dimensional (3D) geometric form of a concept. It is a common practice that form concepts are sketched on paper prior to generating 3D virtual models in commercial Computer-Aided Design (CAD) systems. However, the user-interface of such systems does not support automatic generation of 3D models from sketches. Furthermore, the inherent characteristics of form sketching (e.g. idiosyncrasy) pose a challenge to computer-based understanding of the form concept semantics expressed on paper. To address these issues, this paper is therefore concerned with the development of a visual language that is prescribed and to be used by product designers to annotate paper-based sketches such that the form geometry semantics can be formally represented; parsing the annotated sketch allows for the automatic generation of 3D virtual models in CAD. Inspired by re-usable 3D CAD modelling functions and the related environmental constraints and requirements, a prescribed sketching language, PSL, has been developed to annotate paper-based form sketches. The framework architecture which parses the annotated sketch and subsequently extracts the form concept semantics is described. Based on this framework, a prototype computer tool has been implemented and evaluated. Evaluation results provide a degree of evidence, first on the suitability of PSL in representing the semantics of a range of forms, and secondly on the designers' acceptance of taking up this annotated sketching approach in practice.
Article
Color constancy is one of the most amazing features of the human visual system. When we look at objects under different illuminations, their colors stay relatively constant. This helps humans to identify objects conveniently. While the precise physiological mechanism is not fully known, it has been postulated that the eyes are responsible for capturing different wavelengths of the light reflected by an object, and the brain attempts to "discount" the contribution of the illumination so that the color perception matches more closely with the object reflectance, and therefore is mostly constant under different illuminations [1]. A similar behavior is highly desirable in digital still and video cameras. This is achieved via white balancing which is an image processing step employed in a digital camera imag-ing pipeline (detailed description of the camera imaging pipeline can be found in Chapters 1 and 3) to adjust the coloration of images captured under different illuminations [2], [3].
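The white balancing step mentioned above can be illustrated with one classic heuristic, the gray-world assumption: the average scene color is gray, so each channel is scaled by the ratio of the overall mean to that channel's mean to "discount" the illuminant. This is a hedged sketch with illustrative data, not the chapter's actual pipeline.

```python
# Gray-world white balancing: scale each channel so the per-channel means
# become equal, approximately removing a color cast from the illuminant.

def gray_world(pixels):
    """pixels: list of (r, g, b) tuples; returns white-balanced pixels."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    overall = sum(means) / 3.0
    gains = [overall / m if m else 1.0 for m in means]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3)) for p in pixels]

# A scene under a reddish illuminant: the red channel is uniformly boosted.
scene = [(120, 60, 60), (200, 100, 100), (40, 20, 20)]
balanced = gray_world(scene)
```

After correction the three channel means coincide, so a neutral gray patch in the scene would again appear gray. More robust estimators (e.g. white-patch or learning-based methods) replace only the illuminant estimate; the per-channel gain step stays the same.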
Article
We study collaborative modelling of business processes with respect to the impact of tool support on the modelling process. For this purpose, we compared model quality and modelling costs of two sessions in five cases. The first was carried out in the usual way without tool support; in the second case we conducted the modelling session with the help of a collaborative modelling tool. We observed a marked decrease in modelling time in the second case and a rise in model quality.
Conference Paper
Traditionally, metamodeling is an upfront activity performed by experts for defining modeling languages. Modeling tools then typically restrict modelers to using only constructs defined in the metamodel. This is inappropriate when users want to sketch graphical models without any restrictions and only later assign meanings to the sketched elements. Upfront metamodeling also complicates the creation of domain-specific languages, as it requires experts with both domain and metamodeling expertise. In this paper we present a new approach that supports modelers in creating metamodels for diagrams they have sketched or are currently sketching. Metamodels are defined in a semi-automatic, interactive way by annotating diagram elements and automated model analysis. Our approach requires no metamodeling expertise and supports the co-evolution of models and meta-models.
Conference Paper
In the early phases, software engineers use whiteboards and flip charts to create and discuss their ideas and later they transform manually the hand drawn pictures into machine readable models. During this transformation important sketch information, like the history of origin or some elements, will be lost. To solve this problem, we present a new approach using digital whiteboards to elaborate in a creative and collaborative environment hand drawn pictures and transform them into domain specific models and vice versa. This poster outlines the process of the automatic transformation from sketch models to models based on well-defined notations and vice versa in the early creative phases of software development. Video: https://www.youtube.com/watch?v=0i3M9djPrRM
Data
Two paradigms characterize much of the research in the Information Systems discipline: behavioral science and design science. The behavioral-science paradigm seeks to develop and verify theories that explain or predict human or organizational behavior. The design-science paradigm seeks to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. Both paradigms are foundational to the IS discipline, positioned as it is at the confluence of people, organizations, and technology. Our objective is to describe the performance of design-science research in Information Systems via a concise conceptual framework and clear guidelines for understanding, executing, and evaluating the research. In the design-science paradigm, knowledge and understanding of a problem domain and its solution are achieved in the building and application of the designed artifact. Three recent exemplars in the research literature are used to demonstrate the application of these guidelines. We conclude with an analysis of the challenges of performing high-quality design-science research in the context of the broader IS community.
Article
In a shocking and almost silly interview with Max Jacobson, Christopher Alexander recounted the following story. "There was a conference which I was invited to a few months ago where computer graphics was being discussed as one item and I was arguing very strongly against computer graphics simply because of the frame of mind that you need to be in to create a good building. Are you at peace with yourself? Are you thinking about smell and touch, and what happens when people are walking about in a place? But particularly, are you at peace with yourself? All of that is completely disturbed by the pretentiousness, insistence and complicatedness of computer graphics and all the allied techniques. So my final objection to that and to other types of methodology is that they actually prevent you from being in the right state of mind to do the design, quite apart from the question of whether they help in a sort of technical sense, which, as I said, I don't think they do."
Article
Attempts to specify requirements adequately normally fail - sometimes catastrophically. One reason is the lack of a rigorously defined method which directly addresses the needs of requirement specification. CORE, the subject of this paper, is such a method. CORE is the result of several years of practical experiment by the author, using a number of published approaches to specification and design. It is supported by a diagrammatic notation whose key features are a composite of ideas drawn from several widely used notations for expression of requirement or design.
Conference Paper
'Interactive Acquisition, Negotiation and Enactment of Subject-oriented Business Process Knowledge' is a 4-year research effort to implement a knowledge life cycle using Subject-oriented Business Process Management and mutually align related Organizational Learning techniques in respective development processes. As different partners from academia and industry need to share their experiences, tools, techniques on a detailed level as well as the project's content management are of crucial importance. In this paper, we focus on the Nymphaea system as a means for effective spatially distributed knowledge sharing. We develop the requirements revisiting distributed transactive memory systems and describe the toolset and its support for different aspects of knowledge sharing within the project. We also report on initial findings when utilizing annotation features for individualization and mutually changing perspectives.
Chapter
This paper presents parts of a design framework for collaboratively used tangible interaction systems, focusing on the theme of Embodied Facilitation. Systems can be interpreted as spaces/structures to act and move in, facilitating some movements and hindering others. Thus they shape the ways we collaborate, induce collaboration or make us refrain from it. Tangible interaction systems provide virtual and physical structure - they truly embody facilitation. Three concepts further refine the theme: Embodied Constraints, Multiple Access Points and Tailored Representations. These are broken down into design guidelines and each illustrated with examples.
Article
Knowledge Management has become a prominent subject for organizations, but often the information that flows in a well-defined design work process is not characterized and treated in such a way as to promote its reuse. We argue that context is a fundamental information resource for improving how activities and interactions are understood and carried on. Our premise is that it is important for organizational learning that decisions, solutions, discussions and actions executed in work processes should be retrievable. We describe an environment that supports the cycle of creating and dealing with information about activities and interactions, focusing on their context. A formal ontology-based representation of context is presented to support the use of this environment. Two case studies are described and their results analyzed. The goal of this paper is to discuss and specify mechanisms that can be used to collect contextual information within such an environment.
Article
The freehand sketch has traditionally been seen as the primary conceptual tool in the early stages of the design process. But what is the impact of digital technology on conceptual tools and sketching in particular? A multiple case study compared how design students and design practitioners used conceptual tools in everyday design situations. The outcome showed that verbalisation, rather than freehand sketching was the major conceptual tool for getting started. Moreover, the computer emerged as an ideation tool across design domains.
Conference Paper
Key challenges in enterprise business process modeling are to capture complex inter-departmental and organizational processes, and to integrate different perspectives on the operation of the enterprise. Actors often convey different and only partly overlapping perceptions of their business processes, which hinder the construction of fairly accurate models in first modeling attempts. These different accounts of the business processes need to be integrated in a way to create a realistic and acceptable picture of the enterprise. To avoid this reoccurring pitfall and trial-and-error situation, and supporting the integration of different views on enterprise processes, collaborative modeling is emerging as a powerful approach. In this chapter, we report findings from a case study in which we used a collaborative approach to support enterprise business processes modeling with participation of analysts, process owners, and professionals. The deliverables of this chapter are based on a case study with participation of industry partners during a collaborative enterprise modeling session. We will reflect on the approaches used, lessons learned and the role of technology for supporting collaborative modeling.
Article
Concept maps are an important tool to organize, represent, and share knowledge. Building a concept map involves creating text-based concepts and specifying their relationships with line-based links. Current concept map tools usually impose specific task structures for text and link construction, and may increase cognitive burden to generate and interact with concept maps. While pen-based devices (e.g., tablet PCs) offer users more freedom in drawing concept maps with a pen or stylus more naturally, the support for hand-drawn concept map creation and manipulation is still limited, largely due to the lack of methods to recognize the components and structures of hand-drawn concept maps. This article proposes a method to understand hand-drawn concept maps. Our algorithm can extract node blocks, or concept blocks, and link blocks of a hand-drawn concept map by combining dynamic programming and graph partitioning, recognize the text content of each concept node, and build a concept-map structure by relating concepts and links. We also design an algorithm for concept map retrieval based on hand-drawn queries. With our algorithms, we introduce structure-based intelligent manipulation techniques and ink-based retrieval techniques to support the management and modification of hand-drawn concept maps. Results from our evaluation study show high structure recognition accuracy in real time of our method, and good usability of intelligent manipulation and retrieval techniques.
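The final assembly step the abstract describes, relating recognized concepts and links into a concept-map structure, can be sketched as attaching each link's stroke endpoints to the nearest concept node. The data structures and coordinates below are invented for illustration; the paper's actual method additionally uses dynamic programming and graph partitioning for block extraction.

```python
# Toy structure building: given recognized node centers and link endpoints,
# connect each link to the concept node nearest to each of its endpoints.

def nearest(point, nodes):
    """Return the label of the node whose center is closest to point."""
    return min(nodes, key=lambda n: (nodes[n][0] - point[0]) ** 2
                                    + (nodes[n][1] - point[1]) ** 2)

def build_map(nodes, links):
    """nodes: {label: (x, y)}; links: list of ((x1, y1), (x2, y2)) endpoints."""
    return [(nearest(a, nodes), nearest(b, nodes)) for a, b in links]

nodes = {"Model": (0, 0), "Paper": (10, 0), "Camera": (10, 10)}
links = [((1, 0), (9, 0)), ((10, 1), (10, 9))]
print(build_map(nodes, links))
```

The resulting edge list is the graph structure that manipulation and retrieval techniques can then operate on, independently of the original ink.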