Chapter

ANÁLISE DOS DESAFIOS PARA ESTABELECER E MANTER SISTEMA DE GESTÃO DE SEGURANÇA DA INFORMAÇÃO NO CENÁRIO BRASILEIRO

Article
Full-text available
The violation of principles such as confidentiality, integrity and availability of information, basic attributes of Information Security (IS), can affect business continuity and the productivity and development of organizations. Thus, information security should be a subject of great relevance for organizations of the Aeronautics Command (COMAER). In this sense, this study aimed to assess the compliance level of Information Security of the Second Center for Integrated Air Defense and Air Traffic Control (CINDACTA II) in relation to COMAER and Federal Public Administration publications, as well as the degree of CINDACTA II adherence to the NBR 27002:2013 recommendations. The research was characterized as applied and descriptive with respect to its objective. The hypothetical-deductive method was adopted, using documentary research and survey techniques. The approach to the problem was both qualitative and quantitative. The study was carried out with five managers responsible for information security and 57 IT users at CINDACTA II. We concluded that CINDACTA II's Information Security management is at a level that meets the requirements of NBR 27002:2013, as well as the publications of COMAER and the Federal Public Administration. We observed similarities in IS practices between military and civil organizations, allowing us to infer that cultural issues, values and beliefs of the organizational environment influence information security.
Conference Paper
Full-text available
The adoption of a model for information security management, along with the implementation of its policies and the required adjustments to some of its norms, is not a simple task. The application of a model for information security management therefore often runs into difficulties due to the complexity of the norms, which demonstrates the need for research into new ways to overcome such hurdles. To this end, this paper proposes a simplified information security policy model based on the principles set out in the ISO/IEC 27001 and 27002 standards. The proposal relies on a bibliographic review and on surveys regarding the current state of information security in industry and the main controls currently required. Validation and refinement are obtained through surveys sent to companies and experts in the area. The work already presents as a result the Simplified Model of ISO/IEC 27001 and 27002, reducing the set of controls from 114 to 31.
Article
Full-text available
The Internet offers citizens an immeasurable amount of information in every area of knowledge, made available without any kind of evaluation. In the health domain, such information can harm the citizen. Users need to understand what they find on the web and be able to trust what they read. Evaluating the quality of health information found on the Internet is a problem that many institutions and researchers have tried to solve. This article aims to present criteria for evaluating the quality of information found on health websites. To this end, a survey was carried out to identify the main national and international instruments created for this purpose; their application methods were analysed and the criteria they adopt were compared. As a result, eighty quality criteria for health websites were grouped into three dimensions: technical, content and design. The creation of a quality seal for health websites in Brazil is recommended.
Thesis
Full-text available
The access of people and patients to health-related information on the Internet has boosted the need to review the available information, especially when patients need to make decisions regarding their own treatment. A textual analysis consists of a readability and quality analysis of the content. Readability is the ease of understanding or comprehension of a text, and several formulas are used to measure it (Flesch Reading Ease, SMOG Index, etc.). Textual quality, on the other hand, can be evaluated through Text Mining, a process comprising several techniques to organize, discover and extract information from text databases quickly and automatically. The main purpose of this work is the development of a web tool capable of evaluating Portuguese texts using the Fernández-Huerta readability formula and classification techniques (J48, Bayes Net, Naïve Bayes, Support Vector Machines, K-Nearest Neighbors and Multilayer Perceptron). To reach the proposed goal, a database of health-related online texts was collected and split into training and test sets. A web tool was then developed to support the readability and quality analysis, and tests were performed with the software. The results were compared with text classifications from human specialists. The Naïve Bayes technique showed the best results on data classification (89% of instances correctly classified). In conclusion, the results are promising and show the viability of using machine learning techniques on health-related texts.
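The readability component named above has a simple closed form. The sketch below is a minimal Python illustration of the commonly cited Fernández-Huerta formula, L = 206.84 - 0.60*P - 1.02*F, where P is syllables per 100 words and F is the average number of words per sentence; the vowel-group syllable counter is a naive approximation and all helper names are illustrative, not the tool's actual implementation.

    import re

    VOWELS = "aeiouáéíóúâêôãõàü"

    def count_syllables(word: str) -> int:
        # Rough approximation: count groups of consecutive vowels.
        # A real Portuguese syllabifier would handle diphthongs and hiatus.
        groups = re.findall(f"[{VOWELS}]+", word.lower())
        return max(1, len(groups))

    def fernandez_huerta(text: str) -> float:
        # Commonly cited form of the Fernández-Huerta index (an adaptation of
        # Flesch Reading Ease): L = 206.84 - 0.60*P - 1.02*F,
        # with P = syllables per 100 words and F = average words per sentence.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[^\W\d_]+", text)
        if not sentences or not words:
            return 0.0
        syllables = sum(count_syllables(w) for w in words)
        p = 100.0 * syllables / len(words)
        f = len(words) / len(sentences)
        return 206.84 - 0.60 * p - 1.02 * f

    print(fernandez_huerta("A hipertensão arterial é uma doença crônica. Procure seu médico."))

Higher scores indicate easier texts, mirroring the Flesch scale.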
Article
Full-text available
It is usually difficult for companies to keep up with the development of new information technologies and to adapt to them in the face of the opportunities and threats their advances may represent. This is especially true for small and medium enterprises (SMEs) in emerging economies, where resources tend to be scarce and markets more volatile. This paper describes an action research study conducted in a small Brazilian software house that adopted an open-source Web Services development platform in order to improve its software development process. Data analysis revealed critical success factors (CSF) in the adoption process, as well as specific benefits and barriers likely to be faced by small software houses in their adoption efforts. In the process of overcoming such barriers, SMEs may acquire intellectual capital that represents an essential resource to ensure their competitiveness and survival in emerging economies.
Article
Full-text available
Introduction: As the number of clinical decision support systems (CDSSs) incorporated into electronic medical records (EMRs) increases, so does the need to evaluate their effectiveness. The use of medical record review and similar manual methods for evaluating decision rules is laborious and inefficient. The authors use machine learning and Natural Language Processing (NLP) algorithms to evaluate a clinical decision support rule through an EMR system, and they compare this against manual evaluation. Methods: Modeled after the EMR system EPIC at Maine Medical Center, we developed a dummy data set containing free-text physician notes for 3,621 artificial patient records undergoing a head computed tomography (CT) scan for mild traumatic brain injury after the incorporation of an electronic best practice approach. We validated the accuracy of the Best Practice Advisories (BPA) using three machine learning algorithms: C-Support Vector Classification (SVC), Decision Tree Classifier (DecisionTreeClassifier), and k-nearest neighbors classifier (KNeighborsClassifier), by comparing their accuracy in adjudicating the occurrence of a mild traumatic brain injury against manual review. We then used the best of the three algorithms to evaluate the effectiveness of the BPA, and we compared the algorithm's evaluation of the BPA to that of manual review. Results: The electronic best practice approach was found to have a sensitivity of 98.8 percent (96.83–100.0), specificity of 10.3 percent, PPV = 7.3 percent, and NPV = 99.2 percent when reviewed manually by abstractors. Though all the machine learning algorithms were observed to have a high level of prediction, the SVC displayed the highest, with a sensitivity of 93.33 percent (92.49–98.84), specificity of 97.62 percent (96.53–98.38), PPV = 50.00, and NPV = 99.83. The SVC algorithm was observed to have a sensitivity of 97.9 percent (94.7–99.86), specificity of 10.30 percent, PPV of 7.25 percent, and NPV of 99.2 percent for evaluating the best practice approach, after accounting for 17 cases (0.66 percent) where the patient records had to be reviewed manually due to the NLP system's inability to capture the proper diagnosis. Discussion: CDSSs incorporated into EMRs can be evaluated automatically by using NLP and machine learning techniques.
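As a rough illustration of how such a three-classifier comparison could be set up with scikit-learn (not the authors' actual pipeline), the sketch below trains SVC, a decision tree and k-nearest neighbors on TF-IDF features of free-text notes; the notes and adjudication labels are synthetic placeholders.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import classification_report

    # Hypothetical data: free-text physician notes and a manual adjudication
    # label (1 = mild traumatic brain injury documented, 0 = not documented).
    notes = [
        "Patient struck head, brief loss of consciousness, GCS 15.",
        "No head trauma reported; presenting with abdominal pain.",
        "Fall from ladder with headache and confusion after impact.",
        "Routine follow-up visit, no acute complaints.",
    ] * 25
    labels = [1, 0, 1, 0] * 25

    X_train, X_test, y_train, y_test = train_test_split(
        notes, labels, test_size=0.3, random_state=42, stratify=labels)

    classifiers = {
        "SVC": SVC(kernel="linear"),
        "DecisionTree": DecisionTreeClassifier(random_state=42),
        "KNN": KNeighborsClassifier(n_neighbors=5),
    }

    for name, clf in classifiers.items():
        model = make_pipeline(TfidfVectorizer(), clf)   # text -> TF-IDF -> classifier
        model.fit(X_train, y_train)
        print(name)
        print(classification_report(y_test, model.predict(X_test)))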
Article
Full-text available
The goal of software process improvement (SPI) is to improve software processes and produce high-quality software, but the results of SPI efforts in small- and medium-sized enterprises (SMEs) that develop software have been unsatisfactory. The objective of this study is to support the prolific and successful CMMI-based implementation of SPI in SMEs by presenting the facts related to the unofficial adoption of CMMI level 2 process area-specific practices by software SMEs. Two questionnaire surveys were performed, and 42 questionnaires were selected for data analysis. The questionnaires were filled out by experts from 42 non-CMMI-certified software SMEs based in Malaysia and Pakistan. In the case of each process area of CMMI level 2, the respondents were asked to choose from three categories, namely ‘below 50 %,’ ‘50–75 %,’ and ‘above 75 %’. The percentages indicated the extent to which process area-specific practices are routinely followed in the respondents’ respective organizations. To deal with differing standards for defining SMEs, the notion of the common range standard has been introduced. The results of the study show that a large segment of software development SMEs informally follows the specific practices of CMMI level 2 process areas and thus has true potential for rapid and effective CMMI-based SPI. The results further indicate that, in the case of four process areas of CMMI level 2, there are statistically significant differences between the readiness of small and medium software enterprises to adopt the specific practices of those process areas, and between trends on their part to do so unofficially. The findings, manifesting various degrees of unofficial readiness for CMMI-based SPI among SMEs, can be used to define criteria for the selection of SMEs that would be included in SPI initiatives funded by relevant authorities. In the interests of developing fruitful CMMI-based SPI and to enhance the success rate of CMMI-based SPI initiatives, the study suggests that ‘ready’ or ‘potential’ SMEs should be given priority for SPI initiatives.
Article
Full-text available
The objective of this report is to propose comprehensive guidelines for systematic literature reviews appropriate for software engineering researchers, including PhD students. A systematic literature review is a means of evaluating and interpreting all available research relevant to a particular research question, topic area, or phenomenon of interest. Systematic reviews aim to present a fair evaluation of a research topic by using a trustworthy, rigorous, and auditable methodology. The guidelines presented in this report were derived from three existing guidelines used by medical researchers, two books produced by researchers with social science backgrounds and discussions with researchers from other disciplines who are involved in evidence-based practice. The guidelines have been adapted to reflect the specific problems of software engineering research. The guidelines cover three phases of a systematic literature review: planning the review, conducting the review and reporting the review. They provide a relatively high level description. They do not consider the impact of the research questions on the review procedures, nor do they specify in detail the mechanisms needed to perform meta-analysis.
Article
Full-text available
This paper reports on a grounded theory study into software developers' use of software development processes in actual practice in the specific context of very small companies. The study was conducted in three very small software product companies located in Ecuador. Data collection was based on semi-structured qualitative interviews with software project managers and a focus group with software developers, supplemented by literature and document studies. We interviewed two types of participants (managers and developers) so as to elicit a holistic perspective of how they approach the software development process in actual practice. The goal was to study which practices are actually used and the participants' opinions of and attitudes toward the potential adoption of an international standard (ISO/IEC 29110) specifically designed for very small companies. With the collected data, we performed an analysis using grounded theory coding techniques, as this methodology promotes a focus on uncovering the real concerns of the participants. The study highlighted three areas of concern: the customer, the software product, and the coordination and tracking of development tasks. The findings give insight into the work products as they relate to software development process practices in very small companies and the important factors that must be considered to support project success.
Article
Full-text available
Internet of Things (IoT) has provided a promising opportunity to build powerful industrial systems and applications by leveraging the growing ubiquity of radio-frequency identification (RFID), and wireless, mobile, and sensor devices. A wide range of industrial IoT applications have been developed and deployed in recent years. In an effort to understand the development of IoT in industries, this paper reviews the current research of IoT, key enabling technologies, major IoT applications in industries, and identifies research trends and challenges. A main contribution of this review paper is that it summarizes the current state-of-the-art IoT in industries systematically.
Conference Paper
Full-text available
Context: There are numerous studies on effort estimation in Agile Software Development (ASD), and the state of the art in this area has recently been documented in a Systematic Literature Review (SLR). However, to date there are no studies on the state of the practice in this area focusing on issues similar to those investigated in the above-mentioned SLR. Objectives: The aim of this paper is to report on the state of the practice of effort estimation in ASD, focusing on a wide range of aspects such as the estimation techniques and effort predictors used, to name a few. Method: A survey was carried out using as instrument an on-line questionnaire answered by agile practitioners who have experience in effort estimation. Results: Data was collected from 60 agile practitioners from 16 different countries, and the main findings are: 1) Planning poker (63%), analogy (47%) and expert judgment (38%) are frequently practiced estimation techniques in ASD; 2) Story points is the most frequently employed size metric (62%), used alone or in combination with other metrics (e.g., function points); 3) The team's expertise level and prior experience are the most commonly used cost drivers; 4) 52% of the respondents believe that their effort estimates are on average under/over estimated by an error of 25% or more; 5) Most agile teams take into account implementation and testing activities during effort estimation; and 6) Estimation is mostly performed at the sprint and release planning levels in ASD. Conclusions: Estimation techniques that rely on experts' subjective assessment are the ones used the most in ASD, with effort underestimation being the dominant trend. Further, the use of multiple techniques in combination and of story points seems to present a positive association with estimation accuracy, and team-related cost drivers are the ones used by most agile teams. Finally, requirements- and management-related issues are perceived as the main reasons for inaccurate estimates.
Article
Full-text available
This paper explores and identifies success factors related to the implementation of information security in organizations, examining these factors from the expert's perspective. The experiences of the organizations' employees are qualitatively analysed and discussed. The purpose of this research was to identify the factors required to ensure successful implementation of information security, particularly in government organizations. The study revealed many experiences and insights with widespread applicability.
Conference Paper
Full-text available
As health care information proliferates on the web, the content quality is varied and difficult to assess, partially due to the large volume and the dynamicity. This paper reports an automated approach in which the quality of depression treatment web pages is assessed according to evidence-based depression treatment guidelines. A supervised machine learning technique, specifically Naive Bayes classification, is used to identify the sentences that are consistent with the guidelines. The quality score of a depression treatment web page is the number of unique evidence-based guidelines covered in this page. Significant Pearson correlation (p<.001) was found between the quality rating results by the machine learning approach and the results by human raters on 31 depression treatment web pages in this case study. The semantic-based, machine learning quality rating method is promising and it may lead to an efficient and effective quality assessment mechanism for health care information on the Web.
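The core idea (classify sentences against evidence-based guidelines with Naive Bayes, then score a page by the number of unique guidelines covered) could be sketched as follows; the guideline labels and training sentences are invented for illustration and are not the study's data.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Hypothetical training data: sentences labelled with the guideline they
    # reflect, or "none" if they match no evidence-based guideline.
    train_sentences = [
        "Antidepressant medication is recommended for moderate to severe depression.",
        "Psychotherapy such as CBT is an effective first-line treatment.",
        "Patients should be monitored regularly for suicidal ideation.",
        "Our clinic is open on weekends and offers free parking.",
    ]
    train_labels = ["medication", "psychotherapy", "monitoring", "none"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(train_sentences, train_labels)

    def page_quality_score(page_sentences):
        # Quality score = number of unique guidelines covered by the page.
        predicted = model.predict(page_sentences)
        return len(set(predicted) - {"none"})

    page = [
        "We usually start with cognitive behavioural therapy.",
        "Medication may be added if symptoms are severe.",
        "Call us to book an appointment.",
    ]
    print(page_quality_score(page))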
Article
Full-text available
Software process improvement (SPI) initiatives have been around for many years, yet many companies are still facing SPI implementation problems. The objective of this exploratory research is to gain an in-depth understanding of risks that can undermine SPI implementation from the perspective of software development practitioners. Interviews were conducted as the main approach to data collection from 34 SPI practitioners. Interviews were recorded and subsequently transcribed. The interview transcripts were systematically scrutinised to identify the major themes for SPI implementation. The identified themes were noted down and compared with the notes made during the interviews to ascertain that the former are indeed a true reflection of the discussion in the interviews. This two-step process also verified that the transcription process had not changed the original data generated in the interviews. Five SPI risks were identified from the interview data: organisational politics, lack of support, lack of a defined SPI implementation methodology, lack of awareness and lack of resources, which are generally considered critical by Australian practitioners. The results also reveal the similarities and differences in the risks identified by different groups of practitioners (i.e. developers, managers and senior managers), different types of organisations (i.e. small-medium and large) and organisations with mature and immature software development processes. Practitioners identify SPI risks based on previous SPI implementation experience.
Conference Paper
Full-text available
Due to the requirements to provision a proper Quality of Service level in enterprise WLANs supporting both voice and data services, typical deployment densities of access points (APs) may exceed 4000 APs per square kilometer. While such density is necessary under heavy traffic conditions, it is obviously superfluous during periods of lower load, and dramatically excessive at night, when traffic intensity is only marginal. We present a novel, aggressive approach for adjusting the AP density to the actual traffic conditions. In the limiting case of very low traffic, we postulate keeping operational only a skeleton deployment, sufficient just to recognize that a station is attempting an association; additional APs can then be powered up locally in that area in order to assure the requested connectivity. Using data from commercially available APs, we estimate the potential power savings of such an operation mode and relate it to the best approaches proposed so far.
Article
Full-text available
Wireless networks have evolved into an important technology for connecting users to the Internet. As the utility of wireless technology grows, wireless networks are being deployed in more widely varying conditions. The monitoring of wireless networks continues to reveal key implementation deficiencies that need to be corrected in order to improve protocol operation and end-to-end network performance. In wireless networks, where the medium is shared, unwanted traffic can pose significant overhead and lead to suboptimal network performance. Much of the previous analyses of unwanted traffic in wireless networks focus on malicious traffic. However, another major contributor of unwanted traffic is incorrect link layer behavior. Using data we collected from the 67th Internet Engineering Task Force (IETF) meeting held in November 2006, we show that a significant portion of link layer traffic stems from mechanisms that initiate, maintain, and change client-AP associations. We further show that under conditions of high medium utilization and packet loss rate, handoffs are initiated incorrectly. We analyze the traffic to understand when handoffs occur and whether the handoffs were beneficial or should have been avoided.
Article
Full-text available
The coinciding development of multiobjective evolutionary algorithms (MOEAs) and the emergence of complex problem formulation in the finance and economics areas has led to a mutual interest from both research communities. Since the 1990s, an increasing number of works have thus proposed the application of MOEAs to solve complex financial and economic problems, involving multiple objectives. This paper provides a survey on the state-of-the-art of research, reported in the specialized literature to date, related to this framework. The taxonomy chosen here makes a distinction between the (widely covered) portfolio optimization problem and the other applications in the field. In addition, potential paths for future research within this area are identified.
Article
Full-text available
In this work, student dropout (evasão) in Brazilian institutions of higher education is studied based on official data, including regional analyses of annual mean dropout rates and dropout rates by type of institution. A negative correlation is verified between dropout rates and demand for undergraduate courses. To allow comparisons, international data are presented, indicating that dropout rates in Brazil do not differ much from international averages.
Article
Full-text available
This paper describes an approach for software process modeling adapted to micro and small software companies (MPEs). The work proposes a collaborative model that, through workshops, promotes discussion among stakeholders for the modeling and evaluation of processes. For an initial evaluation, the approach was applied in 5 companies in Florianópolis/SC. 1. Introduction. The micro and small business sector is very important for the current Brazilian economy: in the software sector, for example, these companies represent approximately 70% of all companies and employ a large number of people (MCT 2005). Typically, this type of company suffers problems similar to those of any company, e.g., regarding the quality of its products. In general, however, micro and small enterprises face these problems to an extreme degree due to the informality of their processes and the lack of resources. These characteristics can harm MPEs with respect to quality, productivity and competitiveness, or even their survival in the market. In particular, MPEs generally have an informal software process that consequently depends mainly on the competence of the people involved (MCT 2005). In this context, the systematic establishment of processes can contribute significantly to their improvement and thus increase their competitiveness and chances of survival. To establish a process, it must be defined and deployed, and defining a process requires building a model that represents it. This representation supports the understanding and visualization of the process, facilitates its dissemination and communication, aids project management, and is important for the evaluation, evolution and continuous improvement of the process.
Article
Full-text available
The MPS model has been developed in the context of the MPS.BR Program in order to address the business needs of the Brazilian software industry. In this paper we present the current version of the MPS model, quantitative results of its adoption, and qualitative performance results obtained by software organizations that adopted the model, gathered through an experimental strategy based on surveys applied during two consecutive years. The quantitative results (174 organizations appraised until September 2009) point to an increasing adoption of the MPS model by Brazilian software organizations and to its capacity in promoting good software engineering practices. The qualitative results, on the other hand, show an increase of customer satisfaction and productivity, and capacity to deal with bigger projects for organizations that adopted the MPS model.
Conference Paper
Full-text available
In the last decades the complexity of software development projects has increased significantly. This complexity emerges from the higher degree of sophistication of the contexts they aim to serve and from the evolution of the functionalities implemented by the applications. However, many software corporations have a reduced dimension (micro, small or medium), which imposes a considerable constraint on the number of individuals that can be involved in each project. This limitation has obvious consequences for individual efficiency and effectiveness. In this paper we describe a Rational Unified Process (RUP) tailoring that simplifies the set of RUP roles. With this tailoring we obtain a set of RUP roles that, without neglecting any critical role of the software development process, may easily be adopted by a small or medium software development team. We present and justify a complete set of mapping rules between RUP roles and one possible configuration for small software development teams.
Article
Full-text available
This paper discusses the potential problems due to cultural differences which foreign companies may face in Brazil concerning information security. The top 3 investing countries in Brazil, namely the US, the Netherlands, and Japan, are examined. Potential problems concerning the management of people in information security are developed by using Geert Hofstede's framework and based upon the authors' experience in global business activities. To evaluate the magnitude of potential problems, a recently proposed measure called Level of Potential (LoP) is adopted. A survey was conducted in Brazil to evaluate the severity of the potential problems and the practicability of LoP. To examine the practicability of LoPs, the logical LoPs are compared with their surveyed severities. Our results show that LoP can predict problems to a certain extent in the Brazilian business environment. The results reveal that Japanese companies may face the fewest problems, while Dutch companies face the most difficulties. "Using a previous company's confidential information" has the highest severity among the potential problems, since "teaching others" is encouraged by employees' beliefs.
Article
Full-text available
Null data frames are a special but important type of frame in IEEE 802.11 WLANs. They are widely used for control purposes such as power management, channel scanning, and keeping associations alive. The wide application of null data frames comes from their salient features such as a lightweight frame format and implementation flexibility. However, such features can be taken advantage of by malicious attackers to launch a variety of attacks on 802.11 WLANs. In this paper, we identify potential security vulnerabilities in current null data frame applications in 802.11 WLANs. We then study in detail two types of attacks that take advantage of these vulnerabilities: a functionality-based Denial-of-Service attack and an implementation-based fingerprinting attack, and we evaluate their effectiveness through extensive experiments. Furthermore, we design and implement novel defense mechanisms against the attacks and evaluate their effectiveness, also through extensive experiments. Although our proposed defenses help alleviate the vulnerabilities, completely eliminating the vulnerabilities brought by null data frames remains an open issue. Finally, we point out that our work has broader impact, in that similar vulnerabilities exist in many other networks due to the adoption of simple and lightweight messages for control purposes.
Conference Paper
Patients with heart failure and without daily medical follow-up may have the physiological signals of the heart compromised, causing serious health problems. This recurrent scenario lowers the patient's quality of life and results in hospital readmissions, burdening the health system. The use of ubiquitous care, with sensors and wearables, is considered a way to improve this process by reducing the number of hospital readmissions. In this context, this work proposes the UbHeart model, which employs situation awareness to identify possible cardiac problems. As a scientific contribution, the model provides monitoring of the evolution of the degradation of the patient's vital heart signs, through the detection of possible situations of cardiac complication. Health parameters established by the Brazilian Society of Cardiology were used. The evaluation of UbHeart was carried out using two usage scenarios and showed satisfactory results during the analysis.
Article
Free and forced vibrations of elastically coupled thin annular plate and cylindrical shell structures under elastic boundary conditions are studied through a wave based method. The method involves dividing the coupled structure into shell segments and annular plates. Flügge shell theory and thin plate theory are utilized to describe the motion equations of the segments and plates, respectively. Regardless of boundary and continuity conditions, displacements of individual members are expressed as different forms of wave functions, rather than polynomials or trigonometric functions. With the aid of artificial springs, continuity conditions between segments and plates are readily obtained and the corresponding governing equation can be established by assembling these continuity conditions. To test the accuracy of the present method, vibration results for coupled structures subjected to different boundary and coupling conditions are first examined. As expected, the results of the present method are in excellent agreement with those in the literature and those calculated by the finite element method (FEM). Moreover, the effects of annular plates, elastic coupling and boundary conditions, excitation and damping are also studied. Results show that the normal displacement of the annular plate mainly affects free vibrations of the coupled structures, while tangential displacement has the greatest effect on forced vibrations when meridional or normal excitation is applied to the annular plate.
Article
The Shewhart and CUSUM control chart techniques have found wide application in the manufacturing industries. However, workpiece quality has also been greatly enhanced by rapid and precise individual item measurements and by improvements in automatic dynamic machine control. One consequence is a growing similarity in the control problems faced by the workpiece quality control engineer and his compatriot in the continuous process industries. The purpose of this paper is to exposit a control chart technique that may be of value to both manufacturing and continuous process quality control engineers: the exponentially weighted moving average (EWMA) control chart. The EWMA has its origins in the early work of econometricians, and although its use in quality control has been recognized, it remains a largely neglected tool. The EWMA chart is easy to plot, easy to interpret, and its control limits are easy to obtain. Further, the EWMA leads naturally to an empirical dynamic control equation.
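For reference, the EWMA statistic and its control limits follow the standard textbook form z_t = lambda*x_t + (1-lambda)*z_{t-1}, with limits mu0 +/- L*sigma*sqrt(lambda/(2-lambda)*(1-(1-lambda)^(2t))). The Python sketch below is a generic illustration, not taken from the paper; the smoothing constant, target mean and sigma are assumed values.

    import numpy as np

    def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
        # EWMA recursion: z_t = lam*x_t + (1-lam)*z_{t-1}, with z_0 = mu0.
        x = np.asarray(x, dtype=float)
        z = np.empty_like(x)
        prev = mu0
        for t, xt in enumerate(x):
            prev = lam * xt + (1.0 - lam) * prev
            z[t] = prev
        # Time-varying control limits around the in-control mean mu0.
        t = np.arange(1, len(x) + 1)
        width = L * sigma * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
        ucl, lcl = mu0 + width, mu0 - width
        return z, ucl, lcl, (z > ucl) | (z < lcl)

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(10.0, 1.0, 20), rng.normal(11.0, 1.0, 10)])
    z, ucl, lcl, flags = ewma_chart(data, mu0=10.0, sigma=1.0)
    print("first out-of-control sample:", int(np.argmax(flags)) if flags.any() else None)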
Article
Researchers have proposed that scarce resources are the main factor hindering product innovation in small companies. However, despite scarce resources, small companies do innovate, so the research question is: how do small companies manage resource scarcity in product innovation? To answer this question, a multiple case study of three small established companies and their product innovation was used, including interviews and observations over a period of five months. The small companies were found to use many different bootstrapping methods in combination within their product innovation. The methods can be classified into three functional categories: bootstrapping methods for increasing resources, methods for using existing resources more efficiently, and methods for securing a fast payback on resources put into product innovation. Due to their resource scarcity, the studied companies also favoured an innovation strategy involving only new products developed with known technology and targeting existing markets. This strategy seems to avoid unsuccessful innovation but at the same time excludes technologically radical innovation.
Conference Paper
Given the low level of automation, a barcode-only management mode cannot meet the demand for efficient management and leaves other traditional enterprise asset management problems unsolved. This research therefore adopts UHF radio frequency identification (RFID) based on IoT technology. In detail, we developed the software system using 902 MHz RFID tags with a UHF antenna system, combined with the Microsoft Visual Studio 2008/2010 and My SQL Server 2008 environment, using C# on the ASP.NET development platform. Via one-dimensional and two-dimensional codes and RFID technology, the system realizes intelligent management and monitoring of air materiel warehouse entry, export, shifting and inventory, while reducing the work intensity of inventory management personnel and improving the level of inventory management.
Article
RFID technology is known as one of the noteworthy converging technologies of the 20th century. The technology can be applied in many fields; however, this paper focuses on its application in the transportation industry. The application of RFID in Intelligent Transport Systems (ITS) is gaining popularity with its widespread use in toll management and in the management of the overall transport sector. There are many RFID applications available in the market, such as RFID contactless smart cards commonly used in buses and LRTs, Automatic Vehicle Identification (AVI), Electronic Toll Collection (ETC), Smart Parking, and congestion zone pricing. In Mashhad, the second largest city of Iran, the "My Card" is used not only in public transit but also in car parking, and soon in taxis and other public municipality services. Driven by such success stories, deployment of RFID technology in Mashhad is thus encouraged. This work has been carried out to demonstrate the benefits of RFID technology in developing countries and its application in the transport sector. The paper explores the existing technology and surveys a set of successful implementations of intelligent transportation applications in Mashhad. The integration considerations and challenges facing RFID deployment are also discussed.
In Japan, most university and advanced hospitals have implemented both electronic order entry systems and electronic charting. In addition, all medical records are subject to inspector audit for quality assurance. The record of informed consent (IC) is very important, as it provides evidence of consent between the patient or patient's family and the health care provider. Therefore, we developed an automatic audit system for a hospital information system (HIS) that is able to evaluate IC automatically using machine learning.
Book
Text mining tries to solve the crisis of information overload by combining techniques from data mining, machine learning, natural language processing, information retrieval, and knowledge management. In addition to providing an in-depth examination of core text mining and link detection algorithms and operations, this book examines advanced pre-processing techniques, knowledge representation considerations, and visualization approaches. Finally, it explores current real-world, mission-critical applications of text mining and link detection in such varied fields as M&A business intelligence, genomics research and counter-terrorism activities.
Article
of the ACM SIGCOMM Conference. The award “recognizes a paper published 10 to 12 years in the past … that is deemed to be an outstanding paper whose contents are still a vibrant and useful contribution today. ” In this review, we try to explain why we picked this paper for the award. (In that light, we should note there were a number of outstanding papers that were strong contenders for the award). A Time of Change in Measurement One of the reasons that the paper remains vital and vibrant today is that it marks a moment of change in network measurement. Network measurement is as old as networking itself. In 1969, when the ARPANET was being built, Len Kleinrock at UCLA was commissioned to put together a measurement center to analyze the performance of the network. Through the 1970s and 1980s, there was a tradition of network measurement, both by users and by the network providers. Equally important was a tradition of sharing the results. So if you were curious about, for instance, path stability, you could typically ask BBN (who ran the ARPANET) or MERIT (who ran NSFNET) and either get an answer or access to their raw measurement data. By the early 1990s, measurement inside the network was becoming increasingly hard. A combination of privacy concerns and the rise of competing Internet Service Providers (ISPs) who viewed measurements as proprietary, meant that data about the Internet’s (rapidly growing) core was increasingly hard to get. This change did not mean the end of Internet measurement: indeed, just the year before Paxson’s paper, Jeff Mogul had published a brilliant paper using HTTP measurements to show the benefits of persistent connections [1]. But it appeared that research was becoming restricted to measurements (like those in Mogul's study) that could be completed without access to data on how the middle of the network behaved. It was in this environment that Paxson’s paper appeared. Paxson showed that, using proper statistical techniques
Conference Paper
In recognizing the potential that software can help in the socio-economic development of the country, the government of Botswana has identified software industry as one of the strategic priority areas in ICT with research focus on areas such as software engineering practices and software process improvement. To realize this vision, the state of software development practice has to be formally assessed to determine the strengths and weaknesses thereby providing input for the development of appropriate software development strategy for the country. However, the current state of software process practice has not yet been properly investigated. In this paper, we discuss the results of our study conducted to determine process performance and process capability of major software companies. The process assessment study was conducted following the METvalCOMPETISOFT assessment methodology. Our findings highlight the organizations' current software engineering practices by identifying strengths, major weaknesses and key areas for software process improvement. We also discuss our lightweight prototype tool developed for the management of process assessment in small companies.
Article
The world is increasingly dependent on technology and computing systems. Software organizations are facing a highly competitive market, and thus seeking good practices and processes that help keep them competitive. The quality of their products becomes a differentiating factor and is directly associated with these processes. The software products they deliver play a major role in this competitive scenario, to which small organizations do not have easy access. Our study is directed to those small and micro-organizations that lack the necessary financial assets to hire people, adopt and implement expensive processes, or even implement good development practices. In this paper we present our approach to help those organizations find good practices to enhance their software development processes. The method consisted of obtaining a possible company profile based on technical attributes, given as input to a knowledge-based system that derived a list of possible practices to be adopted according to that profile. Then project managers can select those more suitable to the company's present demands, and implement them in smaller steps according to the organization maturity levels. The approach was currently tested in two organizations that are by now implementing the suggested practices. The proposed system is freely available through the internet.
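A knowledge-based system of this kind can be approximated by a simple rule base mapping profile attributes to suggested practices. The toy Python sketch below is only illustrative; the attribute names, rules and practices are hypothetical and not those of the system described in the paper.

    # Toy rule base: each rule pairs a condition on the company profile with
    # recommended practices. Attribute names and practices are illustrative.
    RULES = [
        (lambda p: p["team_size"] <= 5,           ["lightweight iteration planning"]),
        (lambda p: not p["uses_version_control"], ["adopt version control"]),
        (lambda p: not p["automated_tests"],      ["introduce unit testing"]),
        (lambda p: not p["customer_onsite"],      ["schedule regular customer reviews"]),
    ]

    def recommend_practices(profile):
        # Return the practices whose rule conditions match the given profile.
        recommended = []
        for condition, practices in RULES:
            if condition(profile):
                recommended.extend(practices)
        return recommended

    profile = {
        "team_size": 4,
        "uses_version_control": False,
        "automated_tests": False,
        "customer_onsite": True,
    }
    print(recommend_practices(profile))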
Conference Paper
In this work several empirical propagation models have been studied based on many signal strength measurements in an indoor environment (an ophthalmology specialized hospital), using the log-distance model for the LOS (Line of Sight) area and the MWM (Multi-Wall Model) for NLOS (Non Line of Sight) areas. The building is typical for Iraq (more than 85% of buildings have this structure) and has two types of walls: 10 cm and 20 cm cement block walls. Five Multi-Wall Models were studied and a modified model was generated from one of them. Comparison between on-site measured signal strength data and software-based readings shows the applicability of the model to this building structure. In the study, the measured signal strength was for a wireless access point working as a transmitter (Tx), with a laptop WLAN card working as a receiver (Rx). The Netstumbler software was used to detect the access point signal and measure the signal strength. The devices and software used are very cost-effective, and there was no need for expensive equipment to conduct the study, which is a positive aspect of this work.
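Both models named above have simple closed forms: the log-distance model adds 10*n*log10(d/d0) dB to a reference loss, and the Multi-Wall Model adds a fixed attenuation per traversed wall, here with separate factors for the two wall types. The Python sketch below is a generic illustration with assumed parameter values (the path-loss exponent and wall losses are placeholders, not the fitted values from the measurements).

    import math

    def log_distance_path_loss(d, pl_d0=40.0, d0=1.0, n=3.0):
        # Log-distance model: PL(d) = PL(d0) + 10*n*log10(d/d0)  [dB]
        return pl_d0 + 10.0 * n * math.log10(d / d0)

    def multi_wall_path_loss(d, walls_10cm=0, walls_20cm=0,
                             loss_10cm=3.5, loss_20cm=6.8, **kwargs):
        # Multi-Wall Model: distance-dependent loss plus a fixed loss per wall,
        # with different factors for 10 cm and 20 cm cement block walls.
        return (log_distance_path_loss(d, **kwargs)
                + walls_10cm * loss_10cm
                + walls_20cm * loss_20cm)

    # Predicted received signal strength for a 20 dBm access point at 15 m,
    # through one 10 cm and one 20 cm wall (all values illustrative).
    tx_power_dbm = 20.0
    rssi = tx_power_dbm - multi_wall_path_loss(15.0, walls_10cm=1, walls_20cm=1)
    print(f"predicted RSSI: {rssi:.1f} dBm")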
Article
A diagnostic system includes a vibration sensor mounted on a machine to measure vibrations. Vibration signals from the sensor are processed and analyzed by the system. From a known critical frequency of a vibration-generating component, the system measures the amplitude of the vibration signal at more than one harmonic frequency of the known critical frequency and compares these amplitudes to the amplitudes at adjacent harmonic frequencies. When a relatively large amplitude is found at a harmonic frequency, that is, a harmonic frequency near a resonant frequency of the physical path between the vibration sensor and the vibration-generating component, the system analyzes the shape and magnitude of the vibration signal around that harmonic frequency to evaluate the condition of the machine.
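One way to realize the harmonic-amplitude measurement described above is to take the FFT of the vibration signal and read off the spectral magnitude at integer multiples of the known critical frequency. The numpy sketch below is a generic illustration under assumed sampling parameters, not the patented system's implementation.

    import numpy as np

    def harmonic_amplitudes(signal, fs, critical_freq, n_harmonics=5):
        # Amplitude of the spectrum at the first n harmonics of a known
        # critical frequency (e.g. a bearing defect frequency).
        spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        amps = []
        for k in range(1, n_harmonics + 1):
            idx = np.argmin(np.abs(freqs - k * critical_freq))
            amps.append(spectrum[idx])
        return np.array(amps)

    # Synthetic example: 120 Hz critical frequency with an elevated 3rd harmonic,
    # as might occur near a structural resonance of the transmission path.
    fs = 5000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    sig = (np.sin(2 * np.pi * 120 * t)
           + 2.5 * np.sin(2 * np.pi * 360 * t)
           + 0.1 * np.random.default_rng(0).normal(size=t.size))
    amps = harmonic_amplitudes(sig, fs, critical_freq=120.0)
    print("relative harmonic amplitudes:", np.round(amps / amps.max(), 2))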
Article
Based on an introduction to supply chain management of fresh agricultural products and the Internet of Things (IoT), this paper proposes three specific applications of IoT in fresh agricultural products supply chain management: perfecting the monitoring of fresh agricultural products quality to strictly control food-safety sources, building a management information system for fresh agricultural products based on IoT to increase the level of supply chain integration, and reducing supply chain management costs to improve supply chain efficiency.
Article
A concept is advanced for using the motion of launchers of a free-flight launcher/rocket system which is caused by random imperfections of the rockets launched from it to reduce the total error caused by the imperfections. This concept is called 'passive launcher control' because no feedback is generated by an active energy source after an error is sensed; only the feedback inherent in the launcher/rocket interaction is used. Relatively simple launcher models with two degrees of freedom, pitch and yaw, were used in conjunction with a more detailed, variable-mass model in a digital simulation code to obtain rocket trajectories with and without thrust misalignment and dynamic imbalance. Angular deviations of rocket velocities and linear deviations of the positions of rocket centers of mass at burnout were computed for cases in which the launcher was allowed to move ('flexible' launcher) and was constrained so that it did not rotate ('rigid' launcher) and ratios of flexible to rigid deviations were determined. Curves of these error ratios versus launcher frequency are presented. These show that a launcher which has a transverse moment of inertia about its pivot point of the same magnitude as that of the centroidal transverse moments of inertia of the rockets launched from it can be tuned to passively reduce the errors caused by rocket imperfections.
Article
In this article we show how the results from software process appraisals of ten small- to medium-sized software development enterprises demonstrate that as companies become larger they naturally use an increasing number of practices within the Capability Maturity Model Integration (CMMI®). The companies involved in this study had no CMMI® experience prior to this work. We identify those practices that tend to be best followed by both small- and medium-sized enterprises and those that tend to be introduced as the company size increases from small to medium. We conclude that the CMMI® appears to have an inherent roadmap for process improvement, buried within its explicit structure of capability levels, that is driven by company expansion. The results of this study can be used by organisations to plan process improvement as they expand in size and staff complement. We further show how our results are consistent with the European Union's definition of a 'Small' and a 'Medium' sized enterprise.