Article

Computers, logic and data processing


... On the other hand, task scheduling on the processor can be implemented either in a preemptive or a non-preemptive way. Modern radar systems use a general-purpose processor (Billetter, 1987, 1989), and thus we assume such a processor schedules tasks preemptively. Specifically, we assume preemptive earliest deadline first (EDF) (Liu and Layland, 1973) scheduling of tasks on the processor resource. ...
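The preemptive EDF assumption cited above comes with the classic Liu and Layland (1973) schedulability condition: a set of independent periodic tasks is feasible on a single processor under EDF if and only if total utilization does not exceed 1. A minimal sketch of that test, with hypothetical task parameters (the radar task values below are illustrative, not from the cited work):

```python
def edf_schedulable(tasks):
    """Liu & Layland EDF test for periodic tasks on one processor.

    tasks: list of (execution_time, period) pairs in the same time unit.
    Returns True iff total utilization sum(C_i / T_i) <= 1.
    """
    utilization = sum(c / t for c, t in tasks)
    return utilization <= 1.0

# Three hypothetical radar dwell tasks: (execution time, period) in ms.
tasks = [(2.0, 10.0), (3.0, 15.0), (5.0, 25.0)]
print(edf_schedulable(tasks))  # utilization 0.2 + 0.2 + 0.2 = 0.6 -> True
```

Under EDF the set is schedulable up to full processor utilization, whereas fixed-priority schemes such as rate-monotonic have a lower bound; this is why the utilization sum is the only quantity the test needs.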
Article
Many application-level qualities are functions of available computation resources. Recent studies have treated computation resource allocation as a problem of maximizing overall application quality. However, such QoS problems are fundamentally multi-dimensional optimization problems that require extensive computation, so running optimization procedures online may significantly reduce the computation resource left for the applications themselves. This raises the question of how best to use optimization procedures for dynamic real-time task sets. In dynamic real-time systems, performance can be improved by re-allocating resources in response to changing situations. However, the overhead of changing task parameters (i.e., algorithms and frequencies) for resource re-allocation is non-negligible in many applications, so too-frequent changes of resource allocation may be undesirable. This paper proposes a method called service classes configuration to address the QoS problem under dynamic arrival and departure of tasks. The method avoids online use of optimization procedures by designing resource-allocation templates (called service classes) offline, which are then selected adaptively depending on the online situation. The service classes are designed by trading off the accuracy of dynamic adaptation against the overhead of resource re-allocation. A simplified radar application is used as an illustrative example.
... From the late 1960s through to the 1980s numerous textbooks were published that sought to explain data and information processing, all in strictly positivist terms (e.g. Arnold, 1978; Davis, 1969; Frates & Moldrup, 1983; Gore, 1979; Verzello, 1982). These books told us that data processing is about the transformation of data into information and that the key difference between the two is that information has a meaning or purpose. ...
Article
The Arts and Sciences have always been considered distinct, but have played differing roles with respect to significant ethical, humanitarian and political issues of the day. Computers were first introduced within a culture of data processing, which gave rise to a number of social issues. As computers developed, the concept of information became more important, yet it is very difficult to say what is meant by that term. Despite this, the concept of information has been instrumental in promoting free-market economics. However, the introduction of mixed media computing raises some serious questions about the adequacy of the concept of information and suggests that the current mainstream ideology of computing is under some pressure. People who worked in computing in the early decades frequently held libertarian views that were often described as counter-cultural. Many of these views were absorbed within the dominant culture of free-market liberalisation, but different counter-cultures have emerged (e.g., hackers and political activists). Some digital artists and designers can also be considered counter-cultural and it is from this direction that we might see an emerging positive ethical response to contemporary issues.
Article
The term ‘the artificial’ can only be given a precise meaning in the context of the evolution of computational technology and this in turn can only be fully understood within a cultural setting that includes an epistemological perspective. The argument is illustrated in two case studies from the history of computational machinery: the first calculating machines and the first programmable computers. In the early years of electronic computers, the dominant form of computing was data processing which was a reflection of the dominant philosophy of logical positivism. By contrast, artificial intelligence (AI) adopted an anti-positivist position which left it marginalised until the 1980s when two camps emerged: technical AI which reverted to positivism, and strong AI which reified intelligence. Strong AI's commitment to the computer as a symbol processing machine and its use of models links it to late-modernism. The more directly experiential Virtual Reality (VR) more closely reflects the contemporary cultural climate of postmodernism. It is VR, rather than AI, that is more likely to form the basis of a culture of the artificial.
Article
The rapid development of computer technology is reviewed, and its impact on employment and on the gross national product is noted. The process of designing an effective human-computer communication system is outlined, the role of job satisfaction is discussed, and methods that can be utilized to minimize (or optimize) occupational stress in the human-computer communication work environment are presented. The impact of utilizing high technology on the educational needs of ergonomists is also outlined.