Mobile Product Customization
Abstract
Many companies are using the web to enable customers
to individually customize their products that range from
automobiles and bicycles to CDs, cosmetics and shirts.
In this paper we present a mobile application for product
customization and production within a smart factory.
This allows the ad hoc configuration of products at the
point of sale (POS). We investigate human factors when
customizing products while interacting with them. We
focus on the concept of the mobile client that enables
this ad hoc modification, but also present the production
chain behind our product. We believe that this particular 3D interaction with a product and a mobile device helps to improve customer satisfaction, as it allows for customizing a product in an easy and intuitive way. From a CHI perspective, an important aspect is that our mobile augmented reality interface can help to match the customer's expectations with the final modified product and allows natural and intuitive interaction. As
a use case of the system, we present the modification of
a soap dispenser.
Keywords
Mobile interaction, product customization
ACM Classification Keywords
H.5.1 Multimedia Information Systems
Copyright is held by the author/owner(s).
CHI 2010, April 10–15, 2010, Atlanta, Georgia, USA.
ACM 978-1-60558-930-5/10/04.
Sven Gehring
German Research Center for
Artificial Intelligence
Saarbrücken, Germany
sven.gehring@dfki.de
Markus Löchtefeld
German Research Center for
Artificial Intelligence
Saarbrücken, Germany
markus.loechtefeld@dfki.de
Johannes Schöning
German Research Center for
Artificial Intelligence
Saarbrücken, Germany
schoening@dfki.de
Dominic Gorecky
Technical University of
Kaiserslautern, Germany
dominic.gorecky@mv.uni-kl.de
Peter Stephan
German Research Center for
Artificial Intelligence
Kaiserslautern, Germany
peter.stephan@dfki.de
Antonio Krüger
German Research Center for
Artificial Intelligence
Saarbrücken, Germany
antonio.krueger@dfki.de
Michael Rohs
Deutsche Telekom Laboratories
Technische Universität Berlin
Ernst-Reuter-Platz 7
10587 Berlin, Germany
michael.rohs@telekom.de
CHI 2010: Work-in-Progress (Spotlight on Posters Days 1 & 2)
April 12–13, 2010, Atlanta, GA, USA
3463
General Terms
Human Factors
Introduction
In today's retail environment, stores increasingly try to meet consumer expectations. They not only offer a vast range of products to supply consumers with the desired item, they also increasingly give them the opportunity to customize those products. In the last ten years, a change from mass manufacturing to mass customization of products has taken place [6]. Lately, many companies went one step further and enabled customers to personalize their products online, then manufactured the personalized items for them. For example, websites exist where customers can customize standard sports shoes in terms of color and texture for an additional fee. This shows that a huge market for such services exists and that customers are willing to pay an extra fee for the customization of their products. The drawback of all these web-based services is that the client only gets a virtual version of the product. Most of the services allow a 360° view of the final product, but this is in most cases neither easy nor natural. Besides this shortcoming, the products are not directly available; the purchaser has to wait until the product is assembled and delivered. In this paper we investigate human factors when allowing customers to customize products directly at the POS. In doing so, we believe that the adjustments to the products become a tangible experience, like deciding which product should be taken out of the shelf and put into the shopping cart. Our mobile prototype enables the user to get direct feedback about the look and feel of customized products. By utilizing an augmented reality overlay over the commodity, users can manipulate the product specification and interact with a real product at the same time. In addition, we highlight how this product can be produced on the fly.
So-called Smart Homes have been established all around the world; these instrumented environments test the interaction of new technology in the field of consumer products. For industrial applications, only few such smart instrumented factories exist today, e.g. the SmartFactoryKL of the DFKI (German Research Center for AI) in Kaiserslautern, Germany [9]. These automated factories allow altering the production chain on the fly and thus make products with changed properties available on demand. Combined with a mobile device that lets the user customize a product, such a factory could produce matching products for the user right at the POS. The prototype presented in this paper is able to communicate the specification of the customer-designed product directly to the SmartFactoryKL. It therefore makes the products available immediately and creates a seamless shopping and customization experience.
Related Work
Web-based clients for product customization are widespread: As already mentioned in the introduction, shoe manufacturers such as Nike, Reebok and Adidas allow web-based customization. The services NIKEiD1 and miAdidas2 are the most prominent examples. With Your Reebok3, Reebok developed an iPhone application that allows the user to customize a shoe to his needs and expectations on a mobile device. This makes it possible to examine the original model of the shoe in a store while designing a personal version of it on the mobile device. Other products in the catalogs of these companies are customizable as well (e.g. clothing, shirts) [8]. In general, there is a huge variety of customizable products, ranging from desktop computers, laptops, bicycles, postcards, toys, cereals, coffee and cars to jewelry and cosmetics [6].
1 http://nikeid.nike.com
2 http://www.adidas.com/en/miadidas
3 http://www.mobilemarketer.com/cms/news/commerce/3247.html
Customization in that sense means that the costumers
is able to select a combination of properties that lead to
a new design variation. Not only with the release of the
FLARToolkit (Flash toolkit to allow web-based
Augmented Reality (AR) applications) many augmented
reality advertisements appeared there are stand alone
applications as well trying to make products such as
customized sun glasses more tangible. But of course
users prefer wearing real sun glasses, in comparison, to
wear an augmented reality sun glass. With the Virtual
Mirror Bichelmeier et al. [4] presented an augmented
reality approach for visualization of customer adapted
shoes. As an extension of the miAdidas [4] web
application, the system records a video stream of the
user wearing the standard model and overlays it with
the customized design of the user. One disadvantage of
this system, besides the fact that the customization
process is done online and that the user can only
examine the final design, is that it needs an
instrumented mirror available at the store. In contrast
to related work we use a mobile magic lens [1]
approach, which allows customizing a product (see
figure 1 and 2). Our application combines the
customization process with the augmented reality
aspect by integrating a physical product instance
directly into the virtual modification process.
Interaction concepts for tool glasses or magic lenses with different kinds of objects are well studied [7]. In our approach we provide a video see-through interface on the product with a camera-equipped handheld device. The user's view is mediated by the device and combined with different graphical overlays on the display of the device. The user acts on two layers of information: the "transparent" device screen in the focus, and the product, in our case a soap dispenser, in the visual context. The camera display unit acts as a movable window into a computer-augmented view of the product.
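In code, this see-through metaphor boils down to compositing a virtual overlay into the live camera frame wherever the product is detected; the following Python/NumPy sketch is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

def magic_lens_composite(frame, overlay, mask, alpha=0.6):
    """Blend a virtual overlay into the live camera frame.

    frame, overlay: H x W x 3 float arrays in [0, 1].
    mask: H x W boolean array marking the augmented (product) region.
    alpha: opacity of the virtual layer.
    """
    out = frame.copy()
    out[mask] = (1 - alpha) * frame[mask] + alpha * overlay[mask]
    return out
```

Run once per camera frame, this yields the "movable window" effect: outside the mask the user sees the unmodified camera image, inside it the augmented product.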
Interaction Concept
In our system, the customers can use their cell phones to customize the desired products on the fly at the point of sale. As a use case of the system, we chose the customization of liquid soap, because different properties of the soap and its dispenser can be modified easily and quickly, e.g. the color, fragrance, pH-value and other ingredients. In addition, the soap dispenser can be easily produced within our living lab called SmartFactoryKL. The basic SmartFactoryKL production infrastructure consists of a continuous production process, in which raw soap is processed and colored, and a subsequent, discrete production process, in which the colored soap is bottled, sealed, labeled and commissioned. The complete process is designed strictly modularly according to the principle of "plug'n'work", which means that each module in the process chain consists of an independent mechanical structure and control unit with a clearly defined function [5]. This can support the integration of parts of the factory into grocery or drug stores. To customize the product, there are two possible ways of interaction.
Interaction Schema 1
The underlying idea is to enable easy and arbitrary customization of production processes and to support producers in coping with highly customized products. When producing soap, there are several properties which are highly customizable. These properties can be visual, like the color of the soap, but they can also be non-visual, like the fragrance, pH-value or concentration of the soap. The users can scan a product with their mobile phones to determine which properties of the product are customizable at all and what the current values of these properties are. Scanning means that the mobile device is used like a magic lens to determine several properties of the product. To determine visual properties like the color of the product, the users can put a real product in the sight of the mobile phone's camera. The system automatically analyzes the camera image and determines the visual properties. The system detects the color of the product in the image, and the customer can customize the color based on the detected value. He gets feedback on his mobile device, seeing the soap with a virtual color in the video stream. This can also be seen in figure 1. When the user is satisfied with it, he can order the right soap directly from the SmartFactoryKL. The limiting time factor is the transport from the factory to the store; the soap itself can be produced in minutes. Since the SmartFactoryKL is highly modular, one can think of integrating several modules directly into the stores (e.g. the coloring part).
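A rough sketch of this scan-and-preview step (detect the product's current color, then tint the product region towards the chosen color) could look as follows; the brightness-preserving tint and all function names are our assumptions, not the paper's implementation:

```python
import numpy as np

def detect_color(image, mask):
    """Mean RGB color of the masked product region (the 'scan' step)."""
    return image[mask].mean(axis=0)

def recolor_preview(image, mask, target_rgb):
    """Tint the product region towards target_rgb, keeping per-pixel
    brightness so highlights and shading survive the recoloring.

    image: H x W x 3 float array in [0, 1]; mask: H x W boolean array.
    """
    out = image.copy()
    region = image[mask]
    brightness = region.mean(axis=1, keepdims=True)  # crude luminance proxy
    out[mask] = np.clip(brightness * np.asarray(target_rgb), 0, 1)
    return out
```

The recolored frame would then be shown in the live video stream as the preview the user judges before ordering.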
Interaction Schema 2
In the second interaction schema, the product is not adapted to the customer's needs, but to the customer's environment. The customer takes, for example, a photo of her bathroom. The system then detects the average colors appearing in the bathroom and derives a color palette. The system now suggests a color for the soap, and the customer can pick the right color fitting the colors of her bathroom. This is illustrated in figure 2. Both ways of interaction have in common that after the customer finishes the customization of the product, he can see how the customized product looks on the mobile phone. The mobile phone sends the customization data directly to the SmartFactoryKL to integrate the product customization directly into the production chain. By the use of this concept, the customers can interact with a real product to customize it. They get direct visual feedback on their mobile phone. By the integration of the customization process into the production chain, the customers also immediately see how the customized product is produced and how it looks in reality. The user can immediately watch the factory producing the customized product.

Figure 1: A user is customizing the color of a liquid soap. The soap dispenser is held in front of the mobile device camera (left). The user can pick a color (middle) and a preview image is displayed. The product can then be ordered with a single click (interaction schema 1).
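The color suggestion of interaction schema 2 (condensing a bathroom photo into a small palette) might be sketched like this; the bin count and the histogram-based ranking are illustrative assumptions:

```python
import numpy as np

def color_palette(image, bins=4, top_k=3):
    """Derive a small palette from the normalized 3D color histogram.

    image: H x W x 3 float array in [0, 1]. Returns the bin-center
    colors of the top_k most frequent histogram bins, most frequent
    first, as a top_k x 3 array.
    """
    pixels = image.reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=bins, range=[(0, 1)] * 3)
    hist /= hist.sum()  # normalized color histogram
    flat = np.argsort(hist, axis=None)[::-1][:top_k]
    idx = np.unravel_index(flat, hist.shape)
    return (np.stack(idx, axis=1) + 0.5) / bins
```

The palette entries would then be offered to the customer as suggested soap colors matching her bathroom.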
Implementation
To test our system, we implemented a prototype for Apple's iPhone platform. With the prototype, the user can either scan a real product by taking a picture of it and use the product's color as the basis for the color customization, or he can take a picture of his bathroom and request a color that matches the colors in the bathroom. In order to change the color of the soap virtually in the image, the contours of the soap bottle are detected. To this end, the image is first converted into a grayscale image. In the next step, the active contour model of Chan and Vese [3] is applied to obtain the contours of the bottle and perform the color replacement. Other approaches (e.g. having a visual barcode on the dispenser in combination with a 3D model of it) are also feasible. To determine the colors appearing in the image of the bathroom, we observe the normalized color histogram of the image. Color histograms are invariant to translation and to rotation around the viewing axis, and change slowly with distance to the object and with partial occlusion. The proposals for matching colors are calculated on the basis of the color histogram.
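As a rough illustration of the segmentation step, the following regularization-free sketch captures the region-competition idea behind Chan and Vese's model [3]; it omits the curvature (contour length) term of the full model and is not the authors' implementation:

```python
import numpy as np

def region_competition(gray, iters=50):
    """Two-region segmentation in the spirit of Chan-Vese.

    Alternately estimates the mean intensities c1 (inside) and c2
    (outside) and reassigns every pixel to the closer mean, which is
    the data term of the Chan-Vese energy. The curvature/length
    penalty of the full model is omitted here.
    gray: H x W float array; returns a boolean foreground mask.
    """
    seg = gray > gray.mean()  # initial guess for the inside region
    for _ in range(iters):
        c1 = gray[seg].mean() if seg.any() else 0.0
        c2 = gray[~seg].mean() if (~seg).any() else 1.0
        new_seg = (gray - c1) ** 2 < (gray - c2) ** 2
        if np.array_equal(new_seg, seg):
            break
        seg = new_seg
    return seg
```

In the prototype's setting, the resulting mask would delimit the bottle region in which the color replacement is performed.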
The spectrum of colors that can be mixed by the SmartFactoryKL within the process is nearly unlimited. Similarly to an ink-jet printer, the subtractive color model is used to reproduce the desired color out of the three basic colors cyan, magenta and yellow. However, the flexible production structure basically provides many more parameters to configure, such as the type of the bottle, the filling quantity or the pH-value of the soap. The current job management in the SmartFactoryKL functions as follows: Once the individual product has been designed, the order is sent to a server-based production planning and control (PPC) tool. Since the production system consists of independent modules, the information on how the product has to be manufactured is not processed in a central control unit. In this way the production line is able to produce a large number of product variants in an arbitrary sequence, at about 1-2 dispensers per minute (see figure 3).
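The subtractive mixing and the order submission described above can be sketched as follows; all field names in the order message are hypothetical, since the paper does not specify the PPC interface:

```python
def rgb_to_cmy(rgb):
    """Subtractive model: CMY is the complement of RGB in [0, 1]."""
    r, g, b = rgb
    return (1.0 - r, 1.0 - g, 1.0 - b)

def build_order(rgb, bottle="standard", quantity_ml=250, ph=5.5):
    """Assemble a production order for the server-based PPC tool.

    Field names are invented for illustration; the configurable
    parameters (bottle type, filling quantity, pH-value) mirror
    those named in the text.
    """
    c, m, y = rgb_to_cmy(rgb)
    return {
        "color_cmy": {"cyan": c, "magenta": m, "yellow": y},
        "bottle": bottle,
        "quantity_ml": quantity_ml,
        "ph_value": ph,
    }
```

In practice the phone would serialize such a record (e.g. as JSON) and send it to the PPC server, which dispatches it to the independent production modules.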
User Feedback
We collected first preliminary feedback from 6 users in unstructured interviews. We asked them whether they would want to use such an application. They gave us promising comments such as: "Never have to put a blue soap into my beige bathroom – I just produce me the right one", or "it would be wonderful to allow customization of the fragrance. Often the best looking soaps are smelling terrible".
Conclusion and Future Work
With our prototype implementation, we presented an easy-to-use method for on-the-fly product customization where the user can interact with a real product. We want to make the approach more robust (e.g. against color variances caused by different lighting conditions) and allow more customization options. In the prototype, the user can only customize the color of the product as a visual property. As future work, we want to allow the system to determine the non-visual properties of the product. We want to integrate additional technologies, like an odor sensor and actuator system, into the mobile phone to allow the customers to change more properties of the product. In addition, we want to formally evaluate the advantages of mobile product customization against existing web-based services. We think that the close interlocking with the SmartFactoryKL is the key to the success of such services: the integrated customization and production chain will help to bring these mobile services to the markets. A standalone application cannot be successful in this domain.

Figure 2: A user is taking a picture of a bathroom and the system then suggests an adequate color matching the design of the bathroom (interaction schema 2).
References
[1] Bier, E.A., Stone, M.C., Pier, K., Buxton, W. and DeRose, T. Toolglass and magic lenses: the see-through interface. In Proc. of SIGGRAPH '93, ACM Press (1993), 73-80.
[2] Bichlmeier, C., Henning, S.M., Feuerstein, M. and Navab, N. The Virtual Mirror: A New Interaction Paradigm for Augmented Reality Environments. IEEE Trans. Med. Imag. 28, 9 (2009), 1498-1510.
[3] Chan, T.F. and Vese, L.A. Active contours without edges. IEEE Transactions on Image Processing 10, 2 (2001), 266-277.
[4] Eisert, P., Rurainsky, J. and Fechteler, P. Virtual Mirror: Real-Time Tracking of Shoes in Augmented Reality Environments. In Proc. of ICIP 2007, IEEE (2007), 557-560.
[5] Görlich, D., Stephan, P. and Quadflieg, J. Demonstrating remote operation of industrial devices using mobile phones. In Proc. of Mobility '07 (2007), 474-477.
[6] Hvam, L., Mortensen, N.H. and Riis, J. Product Customization. Springer (2008).
[7] Rohs, M., Schöning, J., Schleicher, R., Essl, G., Naumann, A. and Krüger, A. Impact of Item Density on Magic Lens Interactions. In Proc. of MobileHCI 2009, Bonn, Germany (2009).
[8] Waller, M.A., Dabholkar, P.A. and Gentry, J.J. Postponement, Product Customization, and Market-Oriented Supply Chain Management. Journal of Business Logistics (2000).
[9] Zuehlke, D. SmartFactory – from Vision to Reality in Factory Technologies. Plenary Paper, 17th IFAC World Congress, vol. 8 (2008), 82-89.
Figure 3: The production of personalized soap
dispensers in the SmartFactoryKL. Color mixing unit
(left) and personalized soap dispensers (right).
... An early study by Gehring et al. explored the usage of a mobile AR configurator app as a means of on-the-fly customization at the point of sale [15]. The app allowed the user to customize the color of a soap dispenser on their phone, either by picking a color directly or by calculating the most fitting color from the environment. ...
Conference Paper
Full-text available
Augmented Reality (AR) has recently found high attention in mobile shopping apps such as in domains like furniture or decoration. Here, the developers of the apps focus on the positioning of atomic 3D objects in the physical environment. With this focus, they neglect the configuration of multi-faceted 3D object composition according to the user needs and environmental constraints. To tackle these challenges, we present a model-based approach to support AR-assisted product configuration based on the concept of Dynamic Software Product Lines. Our approach splits products (e.g. table) into parts (eg. tabletop, table legs, funnier) with their 3D objects and additional information (e.g. name, price). The possible products, which can be configured out of these parts, are stored in a feature model. At runtime, this feature model can be used to configure 3D object compositions out of the product parts and adapt to user needs and environmental constraints. The benefits of this approach are demonstrated by a case study of configuring modular kitchens with the help of a prototypical mobile-based implementation.
... The challenge lies in deciding how to display and arrange the information [49], as the information must not be ambiguous and must obscure as few objects as possible. However, if the information is appropriately arranged, AR can enhance a customer's expectations of a sales product by enabling a natural and intuitive interaction [18]. A designer must decide the degree of virtualization on a contextual basis. ...
... Maintainers are among these important people as they are responsible to service the equipment in the cleanroom to guarantee a high quality production. To reach such high quality productions, research has already shown that by means of mobile interfaces (e.g., [3]) or situated interfaces (e.g., [6]) production processes in the factory could be improved. Regarding maintenance activities, the use of augmented interfaces in the factory context has shown promise and was researched thoroughly lately (e.g., [7]). ...
... Besides web-based approaches, an Augmented Reality (AR) approach to customize a product using a smartphone 3 http://www.adidas.com/en/miadidas 4 http://www.mobilemarketer.com/cms/news/commerce/3247.html was introduced by . They created a tangible experience by integrating a real product into the activity of customization. ...
Article
Full-text available
Today, customers have a high demand on personalized products. Manufactures try to address this demand in various ways: e.g., they produce the same product in different variations or adapt the product package to a special event (e.g., sport events). Furthermore, they offer Web-based platforms to allow customization by the users. For the majority of industrial companies, customizing products and services is among the most critical means to deliver true customer value and achieve superior competitive advantages in the future. This paper describes how smartphones can be used at the Point of Sale (POS) to customize products. The described interaction techniques utilize a physical representation of the product itself and they help to match the customer 's expectations with the final modified product and allow the most natural and intuitive interaction.
... These days a manufacturing plant can only be accessed directly on an internal local access point. On the other hand global web-interaction forms provide the same possibilities to order, but everywhere (Gehring et al., 2010). Pen-based interaction can be seen as a novel, third way of addressing orders to a producing facility, but in a natural user-friendly way. ...
Conference Paper
This paper proposes a novel digital system for ordering customized products in a convenient pen and paper setting. In particular we integrate pen-based interaction forms which automatically recognize natural handwriting. The integration of these forms in a factory environment describes a novel way of addressing orders to a producing facility beside usual ways like direct access or web-interaction forms.
... As an extension of the miAdidas web application [3], the system records a video stream of the user wearing the standard model and overlays it with the customized design of the user. An application scenario for augmented reality and magic lens techniques [4] is product customization with mobile phones [6, 5]. With ShelfTorchlight, another use case for the retail domain was proposed by Löchtefeld et al. [11, 12]. ...
Conference Paper
Full-text available
While shopping websites provide rich customer support through their adaptiveness, static paper-based leaflets are still one of the most important advertising mechanisms for retailers even in todays digital world. With their physical qualities, they create a higher emotional connection and with that more positive memories for brands and retailers. In this paper, we investigate two concepts for Augmented Reality advertising for such leaflets to bridge the digital divide. One of them is following a Guerrilla marketing approach, which allows users to easily compare products of different retailers. The second concept investigates different strategies for visualizing cross-selling recommendations inside the leaflets. We report on initial user feedback and discuss ideas for future work in the field of Augmented Reality advertising.
... These days a manufacturing plant can only be accessed directly on an internal local access point. On the other hand global web-interaction forms provide the same possibilities to order, but everywhere (Gehring et al., 2010). Pen-based interaction can be seen as a novel, third way of addressing orders to a producing facility, but in a natural user-friendly way. ...
Conference Paper
Full-text available
This paper proposes a novel digital system for ordering customized products in a convenient pen and paper setting. In particular we integrate pen-based interaction forms which automatically recognize natural handwriting. The integration of these forms in a factory environment describes a novel way of addressing orders to a producing facility beside usual ways like direct access or web-interaction forms.
Conference Paper
Mobile phone applications have received intensive attention by marketers due to the high engagement of users and its positive persuasive impact on brand. However, how can companies get on the right track of designing branded apps? Little research has been done on the identification of the elements which can be used to design branded apps strategy. Our research aims to offer a design framework of branded apps by identifying constructs from the perspective of company, user and technology respectively. By evaluating 84 mobile apps from top 11 FMCG (Fast Moving Consumer Goods) brands, we examine the usage of mobile interaction, social interaction and brand interaction in current branded apps design.
Article
Mobile platforms (e.g., Google Android, Apple iOS) and their closely integrated app stores transformed the mobile industry and opened the market for mobile application developers. Consequently, applications for smartphones quickly soared to phenomena levels. As mobile technology continues to evolve and shape human interaction with technology, human-centered design (HCD) methods adapt to the capabilities of technology and to the needs of mobile application development. This study presents a preliminary review of 79 research papers on the practice of HCD in mobile application development for the smartphone touch era. The aim of the study is to highlight emerging methods and their implications for mobile application development. The methods discovered by this study assist mobile application developers to better understand their target users. Further research is needed, particularly in exploring what user research and evaluation methods are the most effective in the context of mobile application development.
Article
Full-text available
This paper summarizes the five-year development project of Construction Director, a BIM (Building Information Modeling) tool. During the first year, the development team from National Taiwan University (NTU) completed the architectural design of the system after conducting intensive interviews and meetings with key CTCI Corporation (CTCI) stakeholders. The NTU team then implemented the BIM tool for 4D simulation in the second year. During the following year, the NTU team tested and modified the system to overcome any usability problems. In the fourth year, the NTU team worked closely with CTCI and created a process to integrate the BIM tool into the company workflow. The system was then officially deployed on a design-build project in the final year. This paper summarizes the Lessons Learnt during each year of the project, and concludes that a trusting environment was the key contributing factor to the success of the project. This paper is not expected to only be beneficial for large companies who are planning to develop their own customized solutions, but it is also meant to be useful for software developers who are involved in similar projects. The experiences may help reduce overall investment risks, simplify the development process and guarantee the eventual success of introducing a new solution to a company.
Article
Full-text available
Toolglass™ widgets are new user interface tools that can appear, as though on a transparent sheet of glass, between an application and a traditional cursor. They can be positioned with one hand while the other positions the cursor. The widgets provide a rich and concise vocabulary for operating on application objects. These widgets may incorporate visual filters, called Magic Lens™ filters, that modify the presentation of application objects to reveal hidden information, to enhance data of interest, or to suppress distracting information. Together, these tools form a see-through interface that offers many advantages over traditional controls. They provide a new style of interaction that better exploits the user's everyday skills. They can reduce steps, cursor motion, and errors. Many widgets can be provided in a user interface, by designers and by users, without requiring dedicated screen space. In addition, lenses provide rich context-dependent feedback and the ability to view details and context simultaneously. Our widgets and lenses can be combined to form operation and viewing macros, and can be used over multiple applications.
Conference Paper
Full-text available
In this paper, we present a system that enhances the visualization of customized sports shoes using augmented reality techniques. Instead of viewing yourself in a real mirror, sophisticated 3D image processing techniques are used to verify the appearance of new shoe models. A single camera captures the person and outputs the mirrored images onto a large display which replaces the real mirror. The 3D motion of both feet are tracked in real-time with a new motion tracking algorithm. Computer graphics models of the shoes are augmented into the video such that the person seems to wear the virtual shoes.
Conference Paper
Full-text available
We conducted a user study to investigate the effect of visual context in handheld augmented reality interfaces. A dynamic peephole interface (without visual context beyond the device display) was compared to a magic lens interface (with video see-through augmentation of external visual context). The task was to explore objects on a map and look for a specific attribute shown on the display. We tested different sizes of visual context as well as different numbers of items per area, i.e. different item densities. We found that visual context is most effective for sparse item distributions and the performance benefit decreases with increasing density. User performance in the magic lens case approaches the performance of the dynamic peephole case the more densely spaced the items are. In all conditions, subjective feedback indicates that participants generally prefer visual context over the lack thereof. The insights gained from this study are relevant for designers of mobile AR and dynamic peephole interfaces by suggesting when external visual context is most beneficial.
Conference Paper
In the SmartFactoryKL, the intelligent factory of the future, a consortium of companies and research facilities explores new, intelligent technologies. Being a development and demonstration center for industrial applications, the SmartFactoryKL is arbitrarily modifiable and expandable (flexible), connects components from multiple manufacturers (networked), enables its components to perform context-related tasks autonomously (self-organizing), and emphasizes user-friendliness (user-oriented). In this paper, we present a prototypical system that enables commercial mobile phones to monitor, diagnose, and remotely control plant components via Bluetooth.
Article
This article reports on two user studies investigating the effect of visual context in handheld augmented reality interfaces. A dynamic peephole interface (without visual context beyond the device display) was compared to a magic lens interface (with video see-through augmentation of external visual context). The task was to explore items on a map and look for a specific attribute. We tested different sizes of visual context as well as different numbers of items per area, i.e. different item densities. Hand motion patterns and eye movements were recorded. We found that visual context is most effective for sparsely distributed items and gets less helpful with increasing item density. User performance in the magic lens case is generally better than in the dynamic peephole case, but approaches the performance of the latter the more densely the items are spaced. In all conditions, subjective feedback indicates that participants generally prefer visual context over the lack thereof. The insights gained from this study are relevant for designers of mobile AR and dynamic peephole interfaces, involving spatially tracked personal displays or combined personal and public displays, by suggesting when to use visual context.
Conference Paper
In our daily life we are more and more dependent on the latest technologies in electronics and communication. Our mobile phones become powerful multimedia systems, our cars computer systems on wheels, and our homes will turn into smart living environments. All these advances must be turned into products for very cost-sensitive world markets in shorter cycles than ever before. The resulting requirements for design, setup, and operation of our factories become crucial for success. In the past, we often increased complexity in structures and control systems resulting in inflexible monolithic production systems. But the future must become “lean” – not only in organization, but also in planning and technology! We must develop technologies which allow us to speed up planning and setup, to adapt to rapid product changes during operation, and to reduce the planning effort. To meet these challenges we should also make use of the smart technologies of our daily life. The advances in wireless communication will allow us to avoid cables. Powerful mobile computers or smartphones will replace many of the traditional control panels and abstract services will replace bits and bytes in control. These advances will not only lead to mobility for machines and people but also to new challenges in system design. The SmartFactoryKL initiative was founded by many industrial and academic partners to create and operate a demonstration and research test bed for future factory technologies. Many projects develop, test, and evaluate new solutions. This presentation describes changes and challenges, and it summarizes the experience gained to date in the SmartFactoryKL approach.
Article
In this paper, we propose a new model for active contours to detect objects in a given image, based on techniques of curve evolution, the Mumford-Shah functional for segmentation, and level sets. Our model can detect objects whose boundaries are not necessarily defined by the gradient. We minimize an energy which can be seen as a particular case of the minimal partition problem. In the level set formulation, the problem becomes a "mean-curvature flow"-like evolution of the active contour, which will stop on the desired boundary. However, the stopping term does not depend on the gradient of the image, as in the classical active contour models, but is instead related to a particular segmentation of the image. We give a numerical algorithm using finite differences. Finally, we present various experimental results, and in particular some examples for which the classical snakes methods based on the gradient are not applicable. Also, the initial curve can be anywhere in the image, and interior contours are automatically detected.
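For orientation, the energy minimized in this abstract is commonly written in the following form (a standard statement of the Chan-Vese functional; the symbol names are the conventional ones, not quoted from this page: u_0 is the given image, C the evolving curve, c_1 and c_2 the average intensities inside and outside C, and mu, nu, lambda_1, lambda_2 are weighting parameters):

```latex
F(c_1, c_2, C) \;=\; \mu \cdot \operatorname{Length}(C)
  \;+\; \nu \cdot \operatorname{Area}\bigl(\mathrm{inside}(C)\bigr)
  \;+\; \lambda_1 \int_{\mathrm{inside}(C)} \lvert u_0(x, y) - c_1 \rvert^2 \, dx \, dy
  \;+\; \lambda_2 \int_{\mathrm{outside}(C)} \lvert u_0(x, y) - c_2 \rvert^2 \, dx \, dy
```

The last two terms penalize intensity variance inside and outside the curve, which is why the stopping criterion depends on region statistics rather than on the image gradient.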
Article
Medical augmented reality (AR) has been widely discussed within the medical imaging and computer-aided surgery communities. Different systems for exemplary medical applications have been proposed, and some of them produced promising results. One major issue still preventing AR technology from being used regularly in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the direct view of the surgeon with all desired views of volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer-aided interventions.