IEEE Latin America Transactions

Published by the Institute of Electrical and Electronics Engineers (IEEE)
This paper discusses the importance of water as a resource for hydropower, its undervaluation in the electricity market, and the role of the state as a regulatory entity in demanding its rational use in hydroelectric power stations.
 
Data warehouses integrate several operational sources to provide a multidimensional analysis of data, thus improving the decision-making process. Therefore, an in-depth analysis of these data sources is crucial for data warehouse development. Traditionally, this analysis has been based on a set of informal guidelines or heuristics to support the manual discovery of multidimensional elements in well-known documentation, so the task may become highly tedious and error-prone. In this paper, MDA (Model Driven Architecture) is used to design a reverse engineering process in which the following tasks are performed: (i) obtain a logical representation of the data sources, (ii) mark this logical representation with multidimensional concepts, and (iii) derive a conceptual multidimensional model from the marked model.
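As an illustration of the marking step, the toy sketch below (in Python) applies a simple heuristic to a logical relational model: tables with measures and several foreign keys become candidate facts, and the referenced tables become candidate dimensions. The class names and the heuristic are assumptions made for illustration only; they are not the paper's actual QVT marking rules.

    from dataclasses import dataclass, field

    @dataclass
    class Table:
        name: str
        numeric_cols: list
        foreign_keys: list = field(default_factory=list)  # names of referenced tables

    def mark_tables(tables):
        """Mark each table as a candidate Fact or Dimension.

        Heuristic (illustrative only): a table that references several other
        tables and holds numeric measures is marked as a fact; the tables it
        references are marked as dimensions.
        """
        marks = {}
        for t in tables:
            if len(t.foreign_keys) >= 2 and t.numeric_cols:
                marks[t.name] = "Fact"
                for ref in t.foreign_keys:
                    marks.setdefault(ref, "Dimension")
        return marks

    def derive_conceptual_model(tables, marks):
        """Derive a toy conceptual multidimensional model from the marked tables."""
        model = {"facts": [], "dimensions": []}
        for t in tables:
            kind = marks.get(t.name)
            if kind == "Fact":
                model["facts"].append({"name": t.name,
                                       "measures": t.numeric_cols,
                                       "dimensions": t.foreign_keys})
            elif kind == "Dimension":
                model["dimensions"].append({"name": t.name})
        return model

    if __name__ == "__main__":
        tables = [
            Table("Sales", ["amount", "quantity"], ["Product", "Store"]),
            Table("Product", []),
            Table("Store", []),
        ]
        marks = mark_tables(tables)
        print(derive_conceptual_model(tables, marks))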
 
This work presents the collaborative transport control protocol (CTCP), a new transport protocol for sensor networks. It aims at providing end-to-end reliability and adapts itself to different applications through a two-level reliability mechanism. CTCP achieves these properties using hop-by-hop acknowledgments and a storage control algorithm that operates at each node along a flow. It was observed that distributed fault recovery increases the average delivery rate and that duplication of storage responsibility minimizes message loss. Its congestion control differentiates communication losses from buffer overflow. CTCP is called collaborative because all nodes detect and act on congestion and because it includes distributed storage responsibility. It is scalable and independent of the underlying network layer. The protocol's energy consumption overhead was calculated and discussed for two reliability levels.
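The hop-by-hop acknowledgment idea can be sketched as follows; the buffer limit and the refuse-on-full policy are illustrative assumptions, and the real CTCP storage control algorithm is more elaborate.

    class Node:
        """Toy node illustrating hop-by-hop acknowledgments with local storage."""

        def __init__(self, name, buffer_limit=8):
            self.name = name
            self.buffer_limit = buffer_limit
            self.stored = {}          # seq -> packet kept until the next hop ACKs it
            self.next_hop = None

        def receive(self, seq, packet):
            # Refuse the packet when the local buffer is full (congestion signal).
            if len(self.stored) >= self.buffer_limit:
                return False          # no ACK: the previous hop keeps responsibility
            self.stored[seq] = packet
            return True               # hop-by-hop ACK

        def forward(self, seq):
            # Responsibility for the packet is released only after the next hop ACKs.
            packet = self.stored[seq]
            if self.next_hop and self.next_hop.receive(seq, packet):
                del self.stored[seq]
                return True
            return False              # keep the packet stored for later retransmission

    a, b = Node("A"), Node("B")
    a.next_hop = b
    a.receive(1, "reading#1")
    print(a.forward(1), list(b.stored))   # True [1]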
 
The increasing demand for multimedia applications places new requirements on routing protocols for Wireless Mesh Networks (WMN), which must support the minimum requirements for Quality of Service (QoS). In this work we propose an extension to the proactive Optimized Link State Routing (OLSR) protocol that differentiates between data traffic and multimedia traffic in order to provide quality of service and support applications that use TCP as their transport protocol. The performance of the proposal, called OLSR Dynamic Choice (OLSR-DC), is evaluated using a network simulator.
 
Example of a Petri net.
Partial representation of the TLR4 pathway in CPN Tools.
This paper shows how model-driven software development (MDSD) can be applied in the bioinformatics field, since biological data structures can be easily expressed by means of models. The existence of several heterogeneous data sources is usual in the bioinformatics context. In order to validate the information stored in these data sources, several formalisms and simulation tools have been adopted. The process of importing data from the source databases and introducing it into the simulation tools is usually done by hand. This work describes how to overcome this drawback by applying MDSD techniques (e.g., model transformations). Such techniques allow us to automate the data migration process between source databases and simulation tools, making the transformation process independent of the data persistence format, obtaining more modular tools and generating traceability information automatically.
 
Pareto front for the Glass data set and the approximation set generated by MOPSO-N.
Multiobjective Metaheuristics (MOMH) make it possible to conceive a completely novel approach to inducing classifiers. In the rule learning problem, MOMH allow the properties of the rules to be expressed as different objectives, and the algorithm then finds these rules in a single run by exploiting Pareto dominance concepts. This work describes a Multiobjective Particle Swarm Optimization (MOPSO) algorithm that handles numerical and discrete attributes. The algorithm is evaluated using the area under the ROC curve, and the approximation sets it produces are also analyzed following the multiobjective methodology.
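The Pareto dominance test that drives this kind of multiobjective search can be sketched in a few lines; the objective values below are hypothetical rule-quality measures, and maximization of both objectives is assumed.

    def dominates(a, b):
        """Return True if solution `a` Pareto-dominates `b`.

        `a` and `b` are tuples of objective values to be maximized,
        e.g. (rule sensitivity, rule specificity).
        """
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    def non_dominated(front):
        """Keep only the solutions not dominated by any other (the approximation set)."""
        return [p for p in front
                if not any(dominates(q, p) for q in front if q is not p)]

    rules = [(0.90, 0.60), (0.85, 0.70), (0.80, 0.65), (0.95, 0.55)]
    print(non_dominated(rules))   # [(0.9, 0.6), (0.85, 0.7), (0.95, 0.55)]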
 
Whenever there is more work to do than resources to do it, engineers usually tend to devise some kind of prioritization that ensures effort is applied to the most important tasks. This is what Barry Boehm called the "value-based" approach. We propose a value-based methodology that prioritizes software artifacts taking into account their frequency of use. The main idea is to provide maintainers with a "partial view" of the system containing only the code corresponding to the functionality that is really used, and thus to "ignore" the code that is not used. The objective of our proposal is to reduce the effort needed to maintain large systems by reducing their size and complexity. This process, which we call "software reduction", is explained in detail in this paper.
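The "partial view" idea reduces to a filter over usage data, as in the hypothetical sketch below; the function names and the usage threshold are made up, and the paper's actual prioritization criteria may differ.

    def partial_view(usage_counts, min_calls=1):
        """Split a system into the 'partial view' (code actually used) and the
        code that maintainers may ignore, based on observed usage frequency."""
        used = {f: c for f, c in usage_counts.items() if c >= min_calls}
        ignored = [f for f, c in usage_counts.items() if c < min_calls]
        # Rank the retained artifacts so effort goes to the most used ones first.
        ranking = sorted(used, key=used.get, reverse=True)
        return ranking, ignored

    counts = {"invoice.print": 1200, "report.legacy_export": 0, "user.login": 5300}
    view, ignored = partial_view(counts)
    print(view)     # ['user.login', 'invoice.print']
    print(ignored)  # ['report.legacy_export']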
 
Definition of the pelvic incidence angle (PI).
Definition of the sacral slope angle (SS).
Scatter plot of the pelvic incidence and pelvic radius attributes, with contour curves (ellipses) for the three classes after outlier removal.
This paper reports results from a comprehensive performance comparison among standalone machine learning algorithms (SVM, MLP and GRNN) and their combinations in ensembles of classifiers when applied to a medical diagnosis problem in the field of orthopedics. All the aforementioned learning strategies, which currently comprise the classification module of the SINPATCO platform, are evaluated according to their ability to discriminate patients as belonging to one of three categories: normal, disk hernia and spondylolisthesis. Confusion matrices of all learning algorithms are also reported, as well as a study of the effect of diversity on the design of the ensembles. The obtained results clearly indicate that the ensembles of classifiers have better generalization performance than standalone classifiers.
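The simplest way to combine such base classifiers is majority voting, sketched below with hypothetical outputs standing in for trained SVM, MLP and GRNN models.

    from collections import Counter

    def majority_vote(predictions):
        """Combine the class labels predicted by several base classifiers.

        `predictions` is a list of per-classifier outputs, e.g.
        ['normal', 'disk hernia', 'normal'] -> 'normal'.
        """
        return Counter(predictions).most_common(1)[0][0]

    # Hypothetical outputs of three base classifiers for one patient.
    svm_out, mlp_out, grnn_out = "spondylolisthesis", "spondylolisthesis", "normal"
    print(majority_vote([svm_out, mlp_out, grnn_out]))  # spondylolisthesis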
 
Performance of the improved k-NN algorithm.
Similarity search has a growing usefulness in a range of applications such as time series, image and multimedia databases, biological data, and so on. These searches are implemented by means of neighbour queries in the multidimensional spaces where the data reside. Faced with a large volume of data, the use of indexing methods with specific algorithms capable of reducing query response times becomes imperative. Balanced indexing structures based on kd-trees generate a partition of the multidimensional space into regions with holes, which complicates the underlying topology and hinders the calculation of distances between points and regions. In this paper, we introduce a nearest neighbour search algorithm that exploits this topology for efficiency. The experimental results included in this study support the usefulness of the proposed method.
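A generic branch-and-bound nearest-neighbour search over a space partition is sketched below; the rectangular lower-bound distance is the standard one and does not model the "holes" that the paper's algorithm exploits, so this is only a baseline illustration.

    import math

    def dist_point_to_rect(p, low, high):
        """Lower bound on the distance from point `p` to any point inside the
        axis-aligned region [low, high] (0 if `p` lies inside it)."""
        return math.sqrt(sum(max(l - x, 0, x - h) ** 2
                             for x, l, h in zip(p, low, high)))

    def nn_search(query, regions):
        """Branch-and-bound search: visit regions in order of their lower bound
        and prune those that cannot contain a closer point than the best so far."""
        best, best_d = None, float("inf")
        for low, high, points in sorted(
                regions, key=lambda r: dist_point_to_rect(query, r[0], r[1])):
            if dist_point_to_rect(query, low, high) >= best_d:
                break                      # remaining regions are all farther away
            for pt in points:
                d = math.dist(query, pt)
                if d < best_d:
                    best, best_d = pt, d
        return best, best_d

    regions = [((0, 0), (1, 1), [(0.2, 0.3), (0.9, 0.8)]),
               ((1, 0), (2, 1), [(1.5, 0.5)])]
    print(nn_search((1.1, 0.6), regions))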
 
MDA development sequence. Fig. 2 shows, in dark grey, the UML 2.0-AD extension that we have called BPSec (Business Process Security), the QVT rules for model transformation, the secure business process (SBP, Secure Business Process), and the BPSec-Tool that we have designed to specify an SBP and obtain the UML artifacts automatically. Within MDA, the extension makes it possible to produce computation-independent specifications (the SBP model) and to move, through model transformations, to platform-independent specifications (analysis classes and use cases). In the last column of this figure we have incorporated the workflows of the Unified Process. The aim is to show that both the SBP specification and the analysis classes and use cases can be used in a complementary way within a consolidated and successful software development process. In this way, the SBP model is used in the "Business Modeling" stage, while the analysis classes and use cases are used in the "Requirements" and "Analysis & Design" stages.
BPSec stereotypes
Overview of the BPSec profile. Fig. 6 shows the model of the stereotypes that make up BPSec (in grey).
Relationship between UML 2.0-AD and BPSec
Business processes (BP) are an important resource in enterprise performance and in maintaining competitiveness. In the last few years, the languages used for BP representation have been improved and new notations have appeared. The importance of security in BP is widely accepted. However, the business analyst's perspective on security has hardly been dealt with. In this paper, we present an extension of the UML 2.0 Activity Diagram which allows us to specify security requirements in BP. We have used UML profile extensibility mechanisms composed of stereotypes, constraints and tagged values, and we have used OCL to specify the constraints. We apply our proposal to a typical business process related to patient admission in a health-care institution.
 
Information systems supporting business processes can be improved by the use of Ubiquitous Computing (Ubicomp) technologies. However, the dynamism of business processes and the complexity of constructing Ubicomp systems require an adequate development method. This paper proposes a model-driven development method to obtain this kind of system in a systematic way, abstracting away technological details and automating its implementation. The method is presented in depth with a case study of the supply chain of a pharmaceutical company, where Ubicomp is used to improve the cold chain and ease inventory tasks.
 
General architecture of a cluster-based Web system.
Request routing in cluster-based systems with partial replicas
Architecture of the Web switch
Architecture of the server node
The most common solution for improving the performance of a Web server is to build a distributed architecture in which the Web service is offered by a set of multiple nodes. The most widespread distributed architecture is the cluster-based Web system, or Web switch architecture. The Web switch is responsible for deciding which node of the site must serve each request. In this article, a storage solution based on dynamic partial replication is presented, together with the architecture of this distribution mechanism, which is based on agents and emergent techniques.
 
New classes and associations
Secure multidimensional (MD) model
Generally, security and audit measures for Data Warehouses (DWs) are defined in the final implementation on top of commercial systems because there is no standard for the exchange and interoperability of metadata. The Common Warehouse Metamodel (CWM) proposal is broadly accepted as the standard for the interchange and interoperability of metadata. Nevertheless, it does not allow us to specify security measures. In this paper, we make use of the extension mechanisms provided by the CWM to extend the relational package to specify, at the logical level, the security and audit rules captured during the conceptual modeling phase of DW design. Moreover, in order to show the benefits of our extension, we apply it to a case study related to the management of a pharmacy consortium business.
 
Currently, in order to obtain high-quality software products it is necessary to carry out good software process management, in which measurement is a fundamental factor. Due to the great diversity of entities involved in software measurement, a consistent framework is required to integrate the different entities into the measurement process. In this paper the software measurement framework (SMF) is presented, which supports the measurement of any type of software entity through the metamodels that depict them. In this framework, any software entity in any domain can be measured with a common software measurement metamodel and by means of QVT transformations. This work explains the three fundamental elements of the software measurement framework (conceptual architecture, technological aspects and method). These elements have all been adapted to the MDE paradigm and to MDA technology, taking advantage of their benefits within the field of software measurement. Finally, an example which illustrates the framework's application to a concrete domain is shown.
 
The use of simulation tools for reasoning about human societies formalizes their analysis and reduces the related costs, but it demands a high level of expertise in the design and programming of complex systems. This paper presents the SCAT framework, aimed at bringing these tools closer to their end users, the researchers in social sciences. For this purpose, it adopts an approach based on domain-specific languages grounded in activity theory and the situation calculus. Activity theory is a paradigm from the social sciences for analyzing the behaviour of societies through activity systems, which are groups of tasks performed by actors in a social and historical context. SCAT uses the formalism of the situation calculus to define an operational semantics for these systems. On this basis, SCAT provides a modelling language to describe interconnected activity systems, and an environment for the simulation of these systems and the verification of their properties. An example on human relations guides this presentation of SCAT.
 
Physical frame format.
HomePlug network implemented for the tests.
Comparison of average throughput for each load and test scenario.
In the literature we can find several papers that analyze the characteristics of power lines for data transmission, the noise involved, and the techniques employed in power-line communication. Almost all the papers that study the HomePlug standard carry out protocol performance tests, and some of them present comparative tests between the HomePlug medium access control (MAC) sublayer and the IEEE 802.11 MAC. This work complements the evaluations found in the literature by studying HomePlug LANs with respect to performance as a function of electrical demand. The HomePlug network throughput was analyzed under several electrical consumption scenarios.
 
MMDS architecture. At the user premises, a cable modem is responsible for converting the signal to Ethernet. Data transport from the headend to the user (downstream) is carried out in one channel (6 MHz) within the MMDS TV signal spectrum, between 2500 and 2686 MHz, and is shared among all users. In the direction ...
Configuration of the system under test. On this platform, performance is analyzed at three levels. Local network level analysis evaluates the quality of the videoconference service through measurements of latency, net transmission/reception rate, average video frame rate, and audio and video jitter. Transceiver (IF) level analysis examines the parameters related to the transmitter and receiver; for a given transmission power, the state of the QAM constellation, the modulation error ratio (MER) and the bit error rate (BER) at the receiver are verified. Radio-frequency (RF) level analysis takes into account aspects related to signal propagation, such as multipath and impulsive noise. Because the videoconference service requires symmetric access, rate configurations of 1.5 Mbps for upstream and downstream were established at the CMTS, and 768 kbps for a second test, forcing symmetric access as shown in Figure 3.
Digital signal in the presence of multipath
This work evaluates Cable Modem technology under the DOCSIS 1.0 specification operating in an MMDS (Multichannel Multipoint Distribution System) access network for videoconference service, using a qualitative and quantitative approach based on observations at the videoconference terminals, the modems and the access network. The results obtained show the need to add quality-of-service tools for IP traffic over Cable Modem technology in order to achieve better performance in videoconference applications. The conclusions of this work can be extended to other VoIP services over this technology.
 
Architecture of Agile SPI. The result of the SIMEP-SW project is Agile SPI (Software Process Agile Improvement) [9], whose essential premise is that the models used should be lightweight and based on international standards, in keeping with the characteristics, idiosyncrasies and socio-economic circumstances of the emerging software industry in south-western Colombia. The preliminary architecture of Agile SPI is shown in Figure 1 and includes the following components:
• Agile SPI Process: an agile process that guides a process improvement program.
• Light SPI Evaluation Model: a lightweight model for evaluating the production process.
• Light SPI Metrics Quality Model: a lightweight metrics model for the production process.
• Framework PDS: a conceptual and technological framework to support processes.
• Light SPI Quality Model: a lightweight quality model.
This article presents the definition of a lightweight model for evaluating the quality of software development processes, called Light MECPDS, based on the ISO/IEC 12207:2002 [10] and ISO/IEC 15504:2003 [11] standards and applicable to SMEs easily and economically, with few resources and in little time. The model provides a lightweight framework for measuring process maturity and compliance, together with a reference process model. The article is structured in five sections in addition to this introduction. Section 2 gives an overview of related work. Section 3 ...
Views of the process assessment model. The process dimension is provided by an external reference process model, which defines a set of characteristic processes with statements of process purposes and outcomes. The process capability dimension consists of a measurement framework covering six process capability levels and their associated process attributes.
Structure of Light MECPDS
Improvement in software development processes gives companies guaranteed high levels of maturity in their processes and increases their competitiveness in international terms. There are improvement, assessment and quality models which enjoy worldwide recognition but which must be adapted to the particular characteristics of the specific countries where they are applied. These models cannot easily be applied in the majority of organizations in many Latin American countries due to the large investment in money, time and resources that they require, the complexity of the recommendations they give, and the fact that the return on the investment is a long-term prospect. This paper's main goal is to present MECPDS, a lightweight model for the assessment of the quality of software development processes. This model is based on the ISO/IEC 12207 and ISO/IEC 15504 standards and is applicable to micro organizations as well as small and medium ones. The model fulfils its function in a simple, economical way, using only a small amount of resources and in a short period of time.
 
Block diagram of the proposed compensator.
Power flows in a system compensated according to IEEE Std. 1459.
IEEE Standard 1459-2000 includes new definitions for the measurement of electric power quantities under sinusoidal, non-sinusoidal, balanced or unbalanced conditions. This standard is the new reference for the measurement of the power quantities related to energy flows in electric systems. These new definitions directly affect the design of the compensating systems used to improve power quality: only positive-sequence fundamental power must be present in efficient systems, and all the other power terms are non-efficient. The reference currents of active compensators have to be adapted to this new definition of efficient power in order to eliminate the non-efficient terms. This paper proposes the design of a new set of reference currents that provides IEEE Standard 1459 compliance for a four-wire, three-branch active compensator. Simulation results using a three-dimensional space vector pulse width modulation (3D-3B SVPWM) are included. The loads used are three single-phase rectifiers and three unbalanced resistive loads connected to a three-phase, four-wire electric system. The proposed output current control uses a current regulator and a space vector modulator (SVPWM). The three-phase inverter is modeled using three half-bridges and a split DC bus. The proposed three-dimensional SVPWM allows better utilization of the DC bus and operation at a constant switching frequency.
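The underlying idea can be illustrated numerically: the efficient current is the positive-sequence fundamental active current, and the compensator reference is whatever remains of the load current. The phasor values below are invented, and the sketch omits the inverter, the regulator and the SVPWM stage described in the paper.

    import cmath, math

    # Hypothetical fundamental phasors (rms) of a three-phase, four-wire load.
    a = cmath.exp(2j * math.pi / 3)                  # 120-degree rotation operator
    V = {"a": 230, "b": 230 * a**2, "c": 230 * a}    # balanced positive-sequence voltages
    I = {"a": 12 * cmath.exp(-0.4j), "b": 7 * a**2, "c": 10 * a * cmath.exp(0.3j)}

    # Symmetrical-component extraction of the positive-sequence components.
    V1 = (V["a"] + a * V["b"] + a**2 * V["c"]) / 3
    I1 = (I["a"] + a * I["b"] + a**2 * I["c"]) / 3

    # Positive-sequence fundamental active power (the only 'efficient' term).
    P1 = 3 * (V1 * I1.conjugate()).real

    # Efficient (reference) line currents: in phase with the positive-sequence voltage.
    G_e = P1 / (3 * abs(V1) ** 2)                    # equivalent conductance
    I_eff = {ph: G_e * v for ph, v in V.items()}

    # The active compensator must inject the difference between load and efficient currents.
    I_comp = {ph: I[ph] - I_eff[ph] for ph in I}
    for ph in "abc":
        print(ph, f"|I_load|={abs(I[ph]):.2f} A", f"|I_comp|={abs(I_comp[ph]):.2f} A")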
 
Diagram of the MEPLAMECAL methodology.
Nowadays, data play a paramount role in organizations. Because of this, data quality management is becoming one of the most important activities. As part of this management, and in order to obtain useful measurements, organizations need to develop data quality measurement plans. These plans must be developed taking into account the nature of the data and any other organizational factors that can affect their use. This paper presents a methodology, MEPLAMECAL, aimed at developing such plans. MEPLAMECAL is based on ISO/IEC 15939, which, despite being a software measurement standard, we consider applicable to data because of the similarities between the two domains. The proposed methodology is composed of two main activities: (1) establish and maintain the organization's commitment to the data quality measurement process, and (2) develop the plan itself.
 
In this article, two new criteria for handoff traffic acceptance, based on the received signal power and the buffer occupation percentage, are proposed. The proposed criteria are evaluated in conjunction with the Max C/I (Maximum Carrier-to-Interference ratio) and PF (Proportional Fair) data schedulers. The evaluation is carried out with HTTP source models, using QoS metrics such as the average packet delay, the throughput and the loss percentage as a function of the handoff traffic load. The results showed that, depending on the chosen criterion, it is possible to assure the QoS of the CDMA 1xEV-DO RA system while still accepting a good amount of handoff traffic.
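A toy admission decision combining the two proposed criteria is sketched below; the power and buffer thresholds are hypothetical values chosen only for illustration.

    def accept_handoff(rx_power_dbm, buffer_occupancy,
                       power_threshold_dbm=-95.0, buffer_threshold=0.8):
        """Accept a handoff request only if the received pilot power is strong
        enough and the target sector's buffer is not close to overflow."""
        if rx_power_dbm < power_threshold_dbm:
            return False, "pilot too weak"
        if buffer_occupancy > buffer_threshold:
            return False, "buffer nearly full"
        return True, "accepted"

    for request in [(-92.0, 0.55), (-99.5, 0.40), (-90.0, 0.93)]:
        print(request, "->", accept_handoff(*request))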
 
AdapTools: selection of a semantic action.
This paper presents the new version of the AdapTools tool. This application provides mechanisms for the implementation, execution and debugging of adaptive robots. We focus on the implementation of the new data structure and the non-deterministic execution model, the new coding model using the AdapMap compiler, and improvements in the insertion of semantic routines, which make it unnecessary to recompile the tool's code.
 
The present paper is an attempt to survey the evolution of adaptivity and its applications. In this context, adaptivity is a term that refers to the property exhibited by some system or device that is able to self-modify its own set of operation rules, driven exclusively by its history, without the help of any external active element. Adaptive technology refers to the application of adaptivity as a tool for solving practical problems. This paper gathers a significant collection of available publications on adaptivity and related topics. It organizes the collected works according to their subject, and the publications on each subject, according to the chronological sequence of their publication.
 
Two adaptive finite-state automata that accept the merging of two short strings. In (a), the adaptive automaton always operates deterministically, regardless of the input string. In (b), some non-determinism may eventually arise; for example, for the string aecbgcd the automaton operates deterministically, which is not the case for the string aebcgcd.
The investigation of determinism issues in formal adaptive devices is particularly important when designing formal adaptive devices that must exhibit high operation performance. In the special case of adaptive formalisms, there is a serious difficulty related to assuring determinism, since their inherent self-modifying behavior makes it very difficult to guarantee that determinism is kept throughout the device's operation. This work formally states the concept of determinism for adaptive finite-state automata, a class of adaptive devices that is particularly important for its relative simplicity. By studying this particular formalism, some generalizations to adaptive devices are proposed. In addition, a set of requirements that should be satisfied in order to obtain deterministic adaptive finite-state automata is stated, and a specific subclass of deterministic adaptive devices is defined.
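The basic determinism requirement on the underlying automaton (no state with two transitions on the same symbol, and no empty-symbol moves) can be checked as sketched below; the paper's requirements additionally constrain the adaptive actions, which this simple check does not cover.

    def is_deterministic(transitions):
        """`transitions` is a list of (state, symbol, next_state, adaptive_action).

        The underlying automaton is deterministic when each (state, symbol) pair
        appears at most once and no epsilon (None) transitions are used.
        """
        seen = set()
        for state, symbol, _next, _action in transitions:
            if symbol is None or (state, symbol) in seen:
                return False
            seen.add((state, symbol))
        return True

    t1 = [("q0", "a", "q1", None), ("q0", "b", "q2", "insert_transition")]
    t2 = [("q0", "a", "q1", None), ("q0", "a", "q2", None)]
    print(is_deterministic(t1), is_deterministic(t2))  # True False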
 
The ATN network at the top of the figure is mapped to an Adaptive Automaton using the structure of Figure 3 [15].
This paper describes some adaptive techniques applied to natural language processing. It starts with historical points and, after that, describes how adaptive technology can be used to address natural language processing. The paper finishes showing some of the techniques used by the Laboratory of Languages and Adaptive Technology research group.
 
Making decisions to solve problems or achieve certain objectives requires a reasoning process in which previously acquired and new information, when compared, can lead to further information and thus influence the process. Reaching a conclusion implies selecting one alternative, among the many found, following established criteria. Decision making is, therefore, a complex and dynamic reasoning process. Conventional decision tables are one of the support tools used in solving problems of this nature. The purpose of this article is to show how Adaptive Technology methods, and in particular an adaptive device based on decision tables, can be applied to decision-making processes.
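A minimal sketch of a decision table extended with an adaptive action that inserts a new rule at run time is shown below; the rule format and the learning trigger are illustrative assumptions, not the device defined in the paper.

    class AdaptiveDecisionTable:
        """Decision table whose rules can be modified while it is being used."""

        def __init__(self, rules):
            # Each rule maps a tuple of condition values to (action, adaptive_action).
            self.rules = dict(rules)

        def decide(self, conditions):
            action, adaptive = self.rules.get(conditions, (None, None))
            if adaptive:
                adaptive(self)          # self-modification step
            return action

    def learn_premium_rule(table):
        # Adaptive action: add a rule for a case not originally covered.
        table.rules[("high", "loyal")] = ("offer premium plan", None)

    table = AdaptiveDecisionTable({
        ("high", "new"): ("offer standard plan", learn_premium_rule),
        ("low", "new"):  ("offer basic plan", None),
    })
    print(table.decide(("high", "new")))    # offer standard plan (and learns a new rule)
    print(table.decide(("high", "loyal")))  # offer premium plan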
 
An adaptive device is made up of an underlying mechanism, for instance an automaton or a grammar, to which an adaptive mechanism is added, responsible for allowing dynamic modification of the structure of the underlying mechanism. Adaptive languages have the basic feature of allowing the expression of programs which self-modify through adaptive actions at run time. The conception of such languages calls for a new programming style, since the application of adaptive technology suggests a new way of thinking. The adaptive programming style may be a feasible alternative for obtaining consistent self-modifying code, suitable for use in modern applications that require self-modification.
 
Methodology for extending rule-driven adaptive devices. In the theoretical model definition stage, represented in Fig. 2(a), a specialist with good mathematical knowledge of a given non-adaptive device adds to the formal definition of the underlying device the functionality provided by adaptive mechanisms. In [5] and [7], extensions of underlying devices through the inclusion of adaptive device concepts are presented. In this phase, the formal concepts of both devices (non-adaptive and adaptive) are combined, which yields the new adaptive device. The theoretical model definition stage is the first step to be carried out when defining adaptive devices. It is based on the structure for the general representation of adaptive devices proposed in [9], which can be considered the most comprehensive way to define an adaptive device. Its structure rests on the existence of a non-adaptive device (for example, a finite-state automaton, a pushdown automaton, a grammar, a Petri net, ISDL, etc.), which is wrapped by an adaptive mechanism responsible for allowing the structure of the underlying mechanism to be dynamically modified. A finite-state automaton, for example, when wrapped by an adaptive mechanism, can have transitions removed from or inserted into its behaviour while it processes the input string, which increases its expressive power. A fundamental characteristic of using adaptive technology concepts is the possibility of reusing existing devices with increased representational power at the cost of a small addition to the ...
Environment for designing applications using Adaptive Technology. Using the Editing Tools, an application designer working with a specific adaptive device can write the specification of the application with the help of any text editor or a graphical editor. To allow the exchange of the specifications produced, the editors must generate objects in the Logical Model. If the specification is produced in a textual editor, it must be compiled in order to transform the code into the format defined for the logical model. Once the specification is complete, the application designer can use the Analysis Tools. These tools take the coded specification (in the logical model format) as their input and allow the behaviour of the adaptive application under development to be analyzed. Finally, after the representation of an application has been specified and analyzed, the designer can use the Physical Representation Production Tools to generate a representation of the application in a given physical representation language standard and obtain the desired application. VI. AN ENVIRONMENT GENERATOR FOR MODELING APPLICATIONS USING ADAPTIVE TECHNOLOGY. The environment generator consists of tools that allow a specific device, its graphical definitions and its mode of operation to be described. Based on these definitions, the environment generator is executed and produces as its result an environment for the specific device, which allows the functionality ...
General architecture of the environment generator. In the proposed architecture, the General Environment Framework is a set of programs (classes) that describes the functionality common to all designers of adaptive applications. This framework makes it possible to obtain a specific environment without having to re-implement all of its components. A programmer with good knowledge of the framework and of the language used for its physical representation can insert the code needed so that the functionality specific to the device in use becomes available in the environment. To ease the implementation of the environment and the work done on the framework, the Tool for Defining the Conceptual Elements of a Device is proposed; it allows a specialist to instantiate objects in the logical model and define the representation of the conceptual elements of the desired device. The graphical objects associated with the conceptual elements of a device must also be defined so that they can be used by the editing and analysis tools. For this purpose, the Tool for Defining the Graphical Elements of a Device is proposed, which enables a specialist to define the graphical objects (when such a representation exists) of the conceptual elements of the new device. The newly defined elements will later make up the toolbar and option menus of the editing tool of the specific modeling environment, which allow these objects to be inserted while the specification of an application is being edited and which are also used to display the application model during its simulation and verification. In addition to the elements described above, it is also ...
This work proposes an environment generator (meta-environment) that enables the automatic generation of environments for adaptive application projects. This generator is based on Adaptive Technology concepts and allows the definition of rule-driven adaptive devices. In developing the present study, we considered the general architecture of an environment for the design of adaptive applications and the architecture of an environment generator for modeling applications using a specific adaptive device. Based on these concepts, some tools were implemented to illustrate them, and some experiments were carried out to demonstrate the use of such tools and adaptive devices in application projects.
 
This article presents a line of research in adaptive automata based on results for context-free adaptive grammars with appearance checking. The grammatical results are described and exemplified, and a strategy is proposed in order to obtain analogous results for the case of adaptive automata. This strategy is then applied successfully to deduce a restricted version, for the case of adaptive automata, of the general result for context-free adaptive grammars with appearance checking.
 
JISBD 2008: This issue of the journal IEEE América Latina includes a selection of the best papers presented at the 13th Conference on Software Engineering and Databases, held in Gijón (Spain) on October 7-10, 2008. TELECOM I+D 2008: The present edition of IEEE Latin America Transactions contains a selection of the best papers presented at the 18th edition of the Telecom I+D Conference, held on October 29-31, 2008 at the Guggenheim Museum in Bilbao (Spain). For almost 20 years, the Telecom I+D (Telecommunications Research and Development) conferences have been the reference event in the Spanish telecommunications sector, in which public administrations, enterprises, prestigious universities and professional societies meet, exchange knowledge and discuss trends in research, development and innovation in the telecommunication area.
 
This special issue of the IEEE Latin America Transactions includes a selection of the best papers presented at the events I2TS 2009 and LatinCom 2009.
 
This special issue of the IEEE Latin America Transactions includes a selection of the best papers presented at the three events CITA 2009, JISBD 2009 and TELECOM I+D 2009, which took place in Spain in 2009.
 
This is the first special issue of the IEEE Latin America Transactions of the year 2011. Special issues are published outside the regular issue schedule, according to the availability of extended best papers from recognized high-level conferences that request a special issue. The current issue includes the I2TS 2010 and WTA 2009-2010 conferences.
 
Geometry of the dipole-dipole configuration
The objective of this work was to analyze the influence of lateral heterogeneities caused by geological imperfections on measurements of ground resistivity. The finite element method was used for 2D modeling, applying the dipole-dipole array, to establish a model of the probable real distribution of resistivity values in the subsurface. The results showed heterogeneities in the layers of the geoelectrical model, influenced by the geological imperfections within the studied area; this heterogeneity reflects the differences in ground resistivity. In this way, the study contributes to guiding the design of the grounding system that can later be used to build the grounding mesh.
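For context, the apparent resistivity measured with a dipole-dipole array follows from the injected current, the measured potential difference and a geometric factor, rho_a = pi * n * (n + 1) * (n + 2) * a * (dV / I); the field values in the sketch below are made up for illustration.

    import math

    def apparent_resistivity_dipole_dipole(a, n, delta_v, current):
        """Apparent resistivity (ohm*m) for a dipole-dipole array with electrode
        spacing `a` (m) and separation factor `n`, given the measured voltage
        `delta_v` (V) and injected current `current` (A)."""
        k = math.pi * n * (n + 1) * (n + 2) * a     # geometric factor
        return k * delta_v / current

    # Hypothetical reading: a = 10 m, n = 3, 25 mV measured for 0.5 A injected.
    print(apparent_resistivity_dipole_dipole(10, 3, 0.025, 0.5))  # ~94.2 ohm*m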
 
This work computes the influence of the ground on communication signals obtained at specific points of an urban microcell. This was performed by implementing a full-wave simulator based on the novel B-FDTD method and on parallel processing. The U-PML for conductive media was employed to perform the domain truncation. The obtained results are in good agreement with those available in the literature.
 
A new parallel computational model based on 3D ray tracing for radio propagation prediction is presented. This approach considers that the main tasks of a 3D ray-tracing technique can be evaluated in an independent and/or parallel way. The workload distribution among the participating nodes of the parallel architecture (a cluster of PCs) is performed through a random assignment of the initial rays and of the field points. Simulations are carried out in order to validate and evaluate the performance of the proposed model. The presented results show that the scalability of the model is obtained naturally, owing to the independence of the processes involved. The efficiency of the model is above the ideal value for cases with substantial ray processing. These characteristics favor increasing the prediction precision by increasing the density of launched rays, and make it possible to incorporate new propagation mechanisms.
 
Strong demand for public wireless broadband services will require more capacity than even advanced mobile cellular systems can supply. For this reason, there is currently a strong need for interworking mechanisms between WLANs and cellular data networks. One of the problems that the operators of this new architecture face is its management. Policy-based networking (PBN) is a novel technology that facilitates the management and operation of networks. In this article, a policy-based network management reference architecture for an integrated WLAN-3G environment is proposed.
 
At present, it is broadly recognized that there are three generations of mobile telephone systems. This work analyses the evolution of these systems in terms of the services provided in each generation, focusing on the entity known as CAMEL (Customized Applications for Mobile networks Enhanced Logic), an important platform influencing the evolution of 2G, 3G and beyond mobile telephony systems. The work starts with a general overview of mobile telephone systems, their architectures and service platforms, and then goes into the evolution of CAMEL, which occurs together with the evolution of the third-generation system UMTS (Universal Mobile Telecommunications System) and will play an important role when the "all-IP" network and multimedia applications become part of everyday use by the millions of subscribers that make up the mass market. The evolution of CAMEL is important for the evolution of 3G into 4G networks such as Ambient Networks, where CAMEL will be a fundamental element in easing the creation, control and establishment of advanced and personalized services for subscribers, wherever they are and whenever they require the services, with full mobility and service portability, independently of the radio access technology, the networks and the operators.
 
The current publication is the second regular issue of the IEEE Latin America Transactions of the year 2011. Regular issues are published in March, June, September and December. Each paper goes through a peer review by at least 3 program committee reviewers (who can accept, reject or request modifications), followed by an advocate's recommendation. After that, the Editor-in-Chief selects the papers to publish respecting the date of submission, i.e., the pid submission number. In the other months of the year, special issues are published containing the best papers of traditional conferences organized by centres of research excellence.
 
The present work describes the construction and characterization of a CO2 laser excited by a 450 kHz radio-frequency capacitive electric discharge. The system uses two external water-cooled electrodes, and the discharge is produced in a gas flow of 75 m/s with a CO2:N2:He gas mixture in a 1:10.2:11.5 proportion at a pressure of 45 mbar. The discharge tube has an internal diameter of 22 mm (1.5 mm wall), a maximum supply power of 1.3 kW with a discharge voltage of 3.4 kV and a current of 700 mA, and an optical output power of 22 W at a wavelength of 10.6 μm. Radiation is obtained in the E/N interval from 7x10^-15 to 1.2x10^-15 V·cm².
 
IEEE Gold Book diagram [1]
Bayesian network for computing the security of main busbar A against short circuits (IEEE 493)
Bayesian network for computing security against transients
The application of power electronics in industrial systems has increased the use of a wide variety of reliability analysis software for commercial and industrial electrical systems. However, there are power quality and security events that are not considered by these tools. This paper describes a methodology for the reliability evaluation of industrial electrical systems using Bayesian networks, which incorporates power quality and security characteristics under events such as short circuits, random outages and electrical transients. This methodology is applied to the standard network proposed by the IEEE 493 Gold Book in order to compare the system reliability indices with those obtained by other methodologies. This work offers a solid and practical tool for the design of industrial electrical systems.
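A toy Bayesian-network calculation for a busbar fed by two feeders is sketched below using exact enumeration; all probabilities and the conditional model are invented for illustration, whereas the paper applies full Bayesian networks to the IEEE 493 test system.

    from itertools import product

    # Invented prior probabilities of each parent being in the "failed" state.
    p_fail = {"feeder1": 0.02, "feeder2": 0.03, "transient": 0.01}

    def p_bus_down(states):
        """Conditional probability that the main busbar is unavailable given the
        states of its parents (True = failed). The bus is lost if both feeders
        fail, and a transient adds a 20% chance of tripping it (assumed values)."""
        if states["feeder1"] and states["feeder2"]:
            return 1.0
        return 0.2 if states["transient"] else 0.0

    # Exact inference by enumerating every combination of parent states.
    unavailability = 0.0
    for combo in product([True, False], repeat=len(p_fail)):
        states = dict(zip(p_fail, combo))
        prior = 1.0
        for name, failed in states.items():
            prior *= p_fail[name] if failed else 1.0 - p_fail[name]
        unavailability += prior * p_bus_down(states)

    print(f"Busbar unavailability: {unavailability:.5f}")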
 
Example of ACK transmission.
Timing diagram of a transmission using RTS/CTS.
Several fractal phenomena, such as self-similarity and long-range dependence, have been detected in network traffic, with important implications for network performance. This paper describes a detailed study based on NS-2 simulations and testbed experiments that assesses how the characteristics of traffic change when traversing IEEE 802.11 networks. The most significant finding is the mitigation of fractal characteristics in the output traffic.
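One standard way to quantify such fractal behaviour is the Hurst parameter. The aggregated-variance estimator sketched below (run here on synthetic short-range-dependent data) is a common choice, although the paper does not prescribe this particular estimator.

    import random
    import math

    def hurst_aggregated_variance(x, block_sizes=(1, 2, 4, 8, 16, 32)):
        """Estimate the Hurst parameter H from the slope of log(var) vs log(m):
        for self-similar traffic, var(X^(m)) ~ m^(2H-2)."""
        logs_m, logs_v = [], []
        for m in block_sizes:
            blocks = [sum(x[i:i + m]) / m for i in range(0, len(x) - m + 1, m)]
            mean = sum(blocks) / len(blocks)
            var = sum((b - mean) ** 2 for b in blocks) / len(blocks)
            logs_m.append(math.log(m))
            logs_v.append(math.log(var))
        # Least-squares slope of log(var) against log(m).
        n = len(block_sizes)
        mx, my = sum(logs_m) / n, sum(logs_v) / n
        slope = (sum((a - mx) * (b - my) for a, b in zip(logs_m, logs_v))
                 / sum((a - mx) ** 2 for a in logs_m))
        return 1 + slope / 2

    random.seed(1)
    trace = [random.expovariate(1.0) for _ in range(4096)]  # short-range dependent
    print(round(hurst_aggregated_variance(trace), 2))        # close to 0.5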
 
Schematic of the first stage of the LNA.
This paper presents the contamination levels obtained by applying the Equivalent Salt Deposit Density (ESDD) methodology in nine distribution circuits and five substations belonging to ELECTRICARIBE S.A. E.S.P. and located in the northern area of Barranquilla, the main Colombian Atlantic Ocean port. The paper describes the different stages of the study, such as the selection and configuration of the sampling sites, the ESDD measurement procedures, and the evaluation of the results using statistical techniques.
 
This paper describes a way to include the Arbitration Interframe Spacing (AIFS) effect in the Enhanced Distributed Channel Access (EDCA) throughput analysis. The effect of AIFS is incorporated using a new correlation measure in the calculation of the collision probabilities. The results are compared with previous work in the references and validated via OPNET simulation. The improvement is shown in the figures, and a close match with simulation results is achieved in most cases.
 
IEEE 802.11e EDCA MAC 
Since the advent of the first IEEE 802.11 standard, several papers have proposed means of providing QoS in IEEE 802.11 networks and evaluated various traffic-prioritization mechanisms. Nevertheless, studies on the assignment of the AIFS times defined in IEEE 802.11e reveal that the various priority levels work in a synchronized manner. The studies show that, under large loads of high-priority traffic, EDCA starves low-priority frames, which is undesirable. We argue that QoS traffic needs to be prioritized, but users sending best-effort frames should also obtain the expected service. High-priority traffic can also suffer performance degradation when using EDCA because of heavy loads of low-priority frames. Thus, we have proposed a mechanism based on desynchronizing the IEEE 802.11e working procedure. It prevents stations that belong to different priority classes from attempting simultaneous transmission, prioritizes independent collision groups and achieves better short-term and long-term channel access fairness. We have evaluated the proposal through extensive analytical and simulation results. It prevents the strangulation of low-priority traffic and, moreover, reduces the degradation of high-priority traffic in the presence of increased loads of low-priority frames.
 
Owing to the increasing demand for wireless connectivity and large coverage, Radio-over-Fiber (RoF) infrastructure has been suggested as a cost-effective solution for the provisioning of bandwidth in small cells. In this article, we present an analysis of the use of RoF technology in IEEE 802.16 networks. The propagation delay introduced by the fiber length affects the tuning of MAC and physical layer parameters. For an effective tuning, we present a comprehensive study of the performance degradation of WiMAX networks employing RoF infrastructure. Results were obtained using analytical models and simulations. They indicate the viability of RoF scenarios, with degradation bounded to 20% at the physical layer when using fiber links with a maximum length of 115 km, and degradation bounded to 20% at the application layer for fiber lengths up to 80 km.
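The core tuning issue can be seen with a quick delay calculation: the one-way propagation delay grows linearly with fiber length and must be absorbed by the MAC and PHY timing budgets. The refractive index below is an assumed typical value.

    C = 299_792_458.0          # speed of light in vacuum, m/s
    N_FIBER = 1.468            # assumed refractive index of the fiber core

    def one_way_fiber_delay(length_km):
        """One-way propagation delay (in microseconds) over a fiber of the given length."""
        return length_km * 1e3 * N_FIBER / C * 1e6

    for length in (20, 80, 115):
        delay = one_way_fiber_delay(length)
        print(f"{length:>3} km of fiber -> {delay:6.1f} us one way, "
              f"{2 * delay:6.1f} us round trip")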
 
– Primitives.
– Primitives. Applying the new primitives of Figure 5, the string to be recognized by the AA has the form c^n a^n f^n. Figure 6 presents the triangle-recognizing automaton, together with the posterior adaptive action ψ(1), which inserts a new transition and a new state when activated [1].
– Triangle-recognizing automaton. IV. ROBOTIC MAPPING AND NAVIGATION. A. The Information Model. The information model proposed as the structure for robotic mapping and navigation is presented in Figure 7.
Information structure.
This paper presents the research conducted using Adaptive Automata (AA) theory in robotic equipment, processes and systems. The generalization and behavior adaptation features of AA allow their application to process modeling, identification, classification and decision making, in the same way as traditional computational techniques such as Fuzzy Logic and Artificial Neural Networks. As samples of AA applications, the paper presents research on pattern recognition and on hybrid mapping for robot navigation.
 
Schematic overview of the methodology used.
Study area: surroundings of the Tucuruí Lake, Brazil.
Scatter plot of estimated versus actual data.
Estimated energy potential (in kWh) in a region surrounding the Tucuruí Lake, Brazil.
This paper proposes an integrated methodology for estimating aboveground forest biomass in the Amazon region. It is based on remote sensing, artificial neural networks and geographic information system technologies, and achieves reliable results at a lower cost than traditional forest inventory methods. The methodology was tested and validated in the Tucuruí Reservoir region, Brazil.
 
Top-cited authors
Jose de Jesus Rubio
  • Instituto Politécnico Nacional
Marcio Zamboti Fortes
  • Universidade Federal Fluminense
Lucas Batista Gabriel
  • University of São Paulo
Cintia Borges Margi
  • University of São Paulo
Hector Perez-Meana
  • Instituto Politécnico Nacional