Figure 5 - uploaded by Paulo Ferreira
Source publication
This paper proposes a new approach to building a virtual enterprise (VE) software infrastructure that offers persistence, concurrent access, coherence and security on a distributed datastore based on the distributed shared-memory paradigm. The platform presented, persistent distributed store (PerDiS), is demonstrated with test applications that sho...
Contexts in source publication
Context 1
... is no concurrent access specification in the current definition of the SDAI. As shown in Figure 5, the SDAI data are divided into two categories: Dictionary data and Application data. Normally, the Dictionary data is read by all applications and is not often modified. The ...
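The split described in this context (read-mostly Dictionary data, frequently modified Application data) suggests the kind of concurrency discipline such a platform must supply where the SDAI itself specifies none. The following is a minimal Python sketch under that assumption; the `SdaiStore` class and its locking scheme are illustrative inventions, not part of the SDAI or PerDiS APIs:

```python
import threading

class SdaiStore:
    """Toy model of the SDAI split: read-mostly Dictionary data
    served without reader locks, frequently written Application
    data guarded by a conventional mutex."""

    def __init__(self, dictionary):
        self._dictionary = dict(dictionary)  # schema definitions: rarely modified
        self._application = {}               # instance data: modified often
        self._dict_lock = threading.Lock()   # serialises the rare schema writes
        self._app_lock = threading.Lock()

    def read_schema(self, name):
        # Dictionary data is read by all applications; no lock is needed
        # because writers replace the whole mapping atomically.
        return self._dictionary.get(name)

    def update_schema(self, name, definition):
        with self._dict_lock:                # the rare write path
            updated = dict(self._dictionary)
            updated[name] = definition
            self._dictionary = updated       # atomic swap for lock-free readers

    def put_instance(self, key, value):
        with self._app_lock:                 # Application data is written often
            self._application[key] = value

    def get_instance(self, key):
        with self._app_lock:
            return self._application.get(key)
```

Readers of Dictionary data never block: the rare schema update builds a fresh mapping and swaps it in atomically, while Application data, being written concurrently by many applications, takes an ordinary mutex on both paths.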
Similar publications
In this paper, we propose a new approach to building a virtual enterprise software infrastructure that offers persistence, concurrent access, coherence and security on a distributed shared data store based on the distributed shared-memory paradigm.
Citations
... Following the previously developed architectures (e.g. Camarinha-Matos, Afsarmanesh, and Osorio (2001); Sandakly et al. (2001)), Giret, Garcia, and Botti (2016) proposed an open architecture utilising agents that was amenable to e-manufacturing systems. Ghomi, Rahmani, and Qader (2019) reviewed concepts, architectures, and platforms of cloud manufacturing. ...
Open systems have been of interest to the research and industrial community for decades, e.g. in software development, telecommunication, and innovation. The presence of open manufacturing enterprises in a cloud calls for broadly interpretable models. Though there is no global standard for the representation of digital models of processes and systems in a cloud, the existing process modelling methodologies and languages are of interest to the manufacturing cloud. The models residing in the cloud need to be configured and reconfigured to meet different objectives, including complexity reduction and interpretability, which coincide with resilience requirements. Digitisation, greater openness, and the growing service orientation of manufacturing offer opportunities to address resilience at the design rather than the operations stage. An algorithm is presented for complexity reduction of digital models. The complexity reduction algorithm decomposes complex structures and enhances the interpretability and visibility of their components. The same algorithm and its variants could serve other known concepts supporting resilience, such as modularity of products and processes as well as delayed product differentiation. The ideas introduced in the paper and the complexity reduction algorithm are illustrated with examples. Properties of the graph and matrix representations produced by the algorithm are discussed.
... " [Luczak & Hauser, 2005]. Since one of the virtual organizations' main characteristics is continuous change, their efficiency is determined by the speed and accuracy with which information can be managed and exchanged among the business partners [Sandakly et al., 2001]. The cooperating nodes in virtual organizations are geographically distributed, while their coordination takes place through electronic communications. ...
Modern enterprises tend to form virtual but legally consolidated schemes for collaboration in order to function and survive on constantly shifting ground, where competition keeps growing and the emergence of new technologies keeps posing new challenges. They collaborate in virtual organizations, where several segments of work or parts of the collaboration can often be identified as recurring and be reused. To take advantage of this repetition, we consider the introduction of Collaboration Patterns in VOs. We propose a combined scheme for using Collaboration Patterns along with the initiatives of Event-driven technology, in order to capture the ad hoc nature of these collaborations and cope with the demanding and rapidly changing environments of virtual organizations.
... This feature can be provided through an open information space where the different actors can freely access or exchange the information they need. For this purpose, a CORBA-based architecture (Zhang et al. 2000) or dedicated distributed frameworks (Sandakly et al. 2001) can be used. A distributed collaborative architecture can couple formal and informal cooperation, provided that each node integrates formal exchanges (as in an EDI (Electronic Data Interchange) framework) and offers security and co-ordination features, as in the PRODNET architecture (Camarinha-Matos et al. 1998). ...
To fit economic constraints, enterprises are increasingly focused on their own business. This trend requires renewing enterprise organization in order to integrate inter-enterprise partnerships. These virtual enterprise organizations require modeling techniques able to take inter-enterprise cooperation into account, flexible and lean enough to be used on "short term" projects, and fitting legal constraints such as contractual exchanges between enterprises. From a mid-term point of view, these contracts may include the description of common Business Processes (BP) or collaboration rules. Each business process enactment may then also be related to more specific contracts (orders, invoices...) linked to each enterprise activity. To fit the Virtual Organization requirements, we propose a global architecture coupling BP models and contract frameworks. This architecture is based on adapting the enterprise BP to fit openness constraints. For this purpose, security requirements are added to the BP specification and the traditional BP organization is split into business transactions. Generic models describing contractual facilities and the Virtual Organization's collaborative organization are then set up so that BP enactment (including contract enactment) can be developed.
... At the European level, the ESPRIT project PRODNET II aims at designing and implementing a federated database architecture as the base support framework to effectively manage issues associated with the sharing and exchange of information in the VE environment, such as the physical distribution of data, enterprise autonomy and privacy enforcement, access rights to shared information, and data visibility levels, among others [17]. Sandakly et al. have proposed a new approach to building a virtual enterprise software infrastructure that offers persistence, concurrent access, coherence and security on a distributed datastore based on the distributed shared-memory paradigm [18]. This approach produced a platform of persistent, distributed and shared memory called PerDiS. ...
In the emerging agile manufacturing paradigm, there is a great need for a flexible and re-configurable IT platform to form virtual enterprises. In this paper, according to the functional requirements of virtual agile manufacturing, a pragmatic Web-based platform entitled "E-DREAM" has been developed to support virtual enterprising. First, this paper discusses the basic architecture, infrastructure, and global object model of E-DREAM. Next, based on information, information interaction, and role classification, the distributed information management and role management in E-DREAM are explained, illustrating that the information access visibility level depends on the role an agile partner plays in a VE (Virtual Enterprise). Using a CORBA-based method, the wrapping of software resources is implemented, which aims at making remote software resources interoperable. Finally, the E-DREAM prototype implementation is presented. Through the E-DREAM architecture development and prototype system implementation, we have arrived at a thorough approach for building agile virtual enterprises and for configuring and re-configuring working platforms for different agile partners.
... Coupled to exchange standards such as STEP for CAD data (Zhang et al. 2000), they provide an open framework that can support both indirect and direct co-operation. Nevertheless, such a "shared memory" infrastructure, as the one developed by Sandakly et al. (2001), relies on a rather detailed knowledge of the organization of the different information systems. Moreover, information systems protection (i.e. ...
Information and communication technologies can be seen as driving elements in virtual enterprise organization. As far as alliances of SMEs are concerned, the required infrastructure must be light, with a rather low cost. To choose an acceptable infrastructure organization, a multi-criteria analysis describing the technical requirements and the way the virtual enterprise organization is perceived and integrated by each partner is proposed; it can be used as a generic guideline to define adapted security policies in each enterprise.
... This feature can be provided through an open information space where the different actors can freely access or exchange the information they need. For this purpose, a CORBA-based architecture (Zhang et al. 2000) or dedicated distributed frameworks (Sandakly et al. 2001) can be used. ...
PROVE'02 conference, 3rd IFIP Working Conference on Infrastructures for Virtual Enterprises, Algarve, Portugal, 1-3 May 2002
New advancements in computers and information technologies have yielded novel ideas for creating more effective virtual collaboration platforms for multiple enterprises. Virtual enterprise (VE) is a collaboration model between multiple independent business partners in a value chain and is particularly suited to small and medium-sized enterprises (SMEs). The most challenging problem in implementing VE systems lies in inefficient and inflexible data storage and management techniques. In this research, an ontology-based multi-agent virtual enterprise (OMAVE) system is proposed to help SMEs shift from the classical trend of manufacturing part pieces to producing high-value-added, high-tech, innovative products. OMAVE targets improvement in the flexibility of VE business processes in order to enhance integration with available enterprise resource planning (ERP) systems. The architecture of OMAVE supports the requisite flexibility and enhances the reusability of the data and knowledge created in a VE system. In this article, a detailed description of the system's features is presented, along with the rule-based reasoning and decision support capabilities of the OMAVE system. To test and verify the functionality and operation of this system, a sample product was manufactured using OMAVE applications and tools with the contribution of three SMEs.
CNC manufacturing has evolved through the use of faster, more precise and more capable CNC controllers and machine tools. These enhancements in machine tools, however, have not been integrated under a common platform to support CAD/CAM/CNC software interoperability, and as a result a plethora of standards is being used for these systems. ISO 10303 (STEP) and ISO 14649 (STEP-NC) seek to eliminate the barriers to the exchange of information in the CNC manufacturing chain and enable interoperability throughout the manufacturing software domain. This paper introduces a novel software platform called the Integrated Platform for Process Planning and Control (IP3AC) to support the rapid development of STEP-NC compliant CNC manufacturing software.
The major goal of the ISTforCE project is to provide an open, human-centred, Web-based collaboration environment which supports concurrent engineering while working on multiple projects simultaneously and offers easy access to specialised engineering services distributed over the Web. Normally, engineering applications are bought and then installed and used locally, but in recent years there has been growing interest, especially from small, highly specialised vendors, in offering such applications on a rental or "pay per use" basis. The main innovation of ISTforCE is its human-centred approach, enabling the integration of multiple applications and services for multiple users on a multi-project basis. This should support the work of each user across projects and team boundaries and establish a common platform where providers of software and end users (engineers, architects, technicians, project managers) can meet. This paper focuses on three specific aspects of the ISTforCE project. The first addresses product data models, also referred to as Building Information Models (BIMs) given the specific scope of the application domain, and also covers the contribution of ISTforCE to standardization activities, whether within the frame of ISO / STEP or of the IAI and the development of the Industry Foundation Classes. The second addresses the part of the software architecture designed to support concurrent engineering activities through the development of the Product Data Server (PDS), and finally we cover AI-based applications, more specifically in the area of automated building conformance checking (CCS).
This document in no way commits CSTB and reflects only the views of its principal author.
The software mission, as defined, addresses a number of fundamental questions that all bear on CSTB's capacity to act as a designer, developer, producer, publisher and vendor of software.
Although many relations exist between these major questions, for clarity of presentation it has been useful to address them separately; the reader should keep in mind that they are often intimately linked. For example, there can be no technical policy without considering the organisational arrangements that can be made to guarantee its proper execution, nor a technical strategy that ignores the realities on the ground regarding users' hardware and software environments, or commercial considerations at product deployment time that make strong assumptions about customers' equipment, etc. The questions are thus interdependent, and acting on one without measuring the effects or consequences on the others is of little use. It is probably the complexity of each of the subjects we will address, coupled with their interrelations, that makes the software activity difficult.
Technical policy is an essential question. It covers themes such as measuring the relevance of a technical solution; the consequences in terms of productivity; coherence between developments undertaken in CSTB's various departments; interoperability and durability of software developments through the adoption of preferred languages and development platforms; the use of libraries of reusable software components; systematic documentation standardised according to company norms; corrective and adaptive maintenance; etc. The field is immense, and while CSTB can pride itself on possessing various trades, that of software as such is ultimately not really natural to it.
Questions of liability rightly concern CSTB's management, given the serious consequences that can follow from erroneous use of a piece of software, or simply from use outside its intended limits. This is particularly true given the capital committed in building or civil-engineering operations: the errors and defects generated as a consequence, and other problems, generally result in a large, even considerable, financial impact and in delays, which in turn generate penalties. Modelling assumptions and numerical solution techniques must be made explicit and understood by users, who must agree to release the designers from liability under terms to which we will return later. The legal dimension also covers the protection of the software, the rights attached to it, and the transfer of those rights to users, with all the limits that should be imposed for various reasons, if only those of liability just described.
Decision-making processes and the economic rationale must be well mastered, and the decision to distribute a piece of software must be taken with full knowledge of the implications for other activities. One might indeed think that making a sophisticated software tool available to end users could reduce the value of contracts previously won when those contracts relied on a proprietary tool sophisticated enough to justify providing services. We will return to these arguments later; without saying more here, we tend to believe that distributing the software, if it is done properly and users can rely on the publisher as needed, can on the contrary increase business volumes. This point of view will be argued later.
Distribution requires specific organisations and approaches, for development as much as for promotion and marketing, in a context where keeping abreast of the rapid evolution of computing technologies is a permanent investment and where the commercial target is often moving. It is not enough to fulfil a service identified at a given moment that matches a clear need of a user population; the target's conditions must also not have changed between the time the project is launched and the time marketing takes place. This kind of problem has unfortunately already been encountered, if only, to illustrate it, in the case where the user no longer wants, for example, a solution using local data, even data frequently updated on CDs, but wishes to shift that cost to the supplier by connecting to its servers over the internet. In that case, even though the need is functionally satisfied by the proposed solution, the technical context has changed enough, and quickly enough, for migration to become difficult, and it is then very hard to sell a product that misses its target. From this point of view, the investment stages must be monitored regularly, and decisions must be based on an expected-gain-versus-risk analysis, taking into account knowledge of the market, the competition, a business plan, etc.
Of course, and as is readily understood from what has just been briefly outlined, CSTB has every interest in setting up partnerships so as to draw on specialists in these various questions.
Finally, a mission such as this one, conducted over a very limited period of time, will only have allowed a survey of the full set of questions that arise, offering a necessarily somewhat rough panorama of many subjects that would deserve deeper study.
Let us take that as a good omen.