
Richard J. Leblanc
- Ph.D. Computer Sciences
- Professor (Full) at Seattle University
About
129 Publications
27,560 Reads
2,324 Citations
Publications (129)
Clear description of algorithms and clean design of compiler components. Crafting a Compiler presents a practical approach to compiler construction with thorough coverage of the material and examples that clearly illustrate the concepts in the book. Unlike other texts on the market, Fischer/Cytron/LeBlanc uses object-oriented design patterns and in...
For over 40 years, the ACM and IEEE-Computer Society have sponsored international curricular guidelines for undergraduate programs in computing. The rapid evolution and expansion of the computing field and the growing number of topics in computer science have made regular revision of curricular recommendations necessary. Thus, the Computing Curricu...
This paper discusses how software engineering topics are included in the CS2013 Curriculum Guidelines and presents several challenges to be addressed in collaboration with the software engineering community before final publication of the CS2013 volume.
For over 40 years, the ACM and IEEE-Computer Society have sponsored the creation of international curricular guidelines for undergraduate programs in computing. These Computing Curricula volumes are updated approximately every 10-year cycle, with the aim of keeping curricula modern and relevant. The next volume in the series, Computer Science 2013...
This workshop will familiarize participants with the current draft of the forthcoming CS 2013 curriculum guidelines and provide feedback to the CS 2013 Steering Committee.
Work began in late 2010 on a project to revise the ACM/IEEE-Computer Society Computer Science volume of Computing Curricula 2001 and the interim review CS 2008. The new guidelines for computer science are scheduled for release in 2013. This interactive session will give the computing education community an opportunity to review current working docu...
Over the past six years, Seattle University's Master of Software Engineering program has adopted a common community-based software engineering project as the basis for class projects in a sequence of required and elective courses. These related projects offer a unifying experience for students in the program, allow in-depth treatment of course topi...
This special session will be our first formal curriculum committee report. A working group from IEEE Technical Committee on Parallel Processing (TCPP), National Science Foundation (NSF), and the sister communities, including ACM, has taken up proposing curriculum for computer science (CS) and computer engineering (CE) undergraduates on p...
The IT model curriculum represents an excellent starting point toward understanding more about IT as an academic discipline. The early 1990s saw the emergence of the Internet from the environs of the technical cognoscenti into the dot-com world with an interface for the masses. Additionally, the personal computer had reached the point that essentia...
Although the basic curriculum structure remains quite similar to recommendations initially offered in the 1980s, the options for what students study after completing the core courses are becoming steadily broader.
The impact of ubiquitous parallel computing hardware on computing curricula is examined in this paper, with a focus on identifying new outcomes to drive curriculum evolution.
This poster will describe ongoing work to modify the Computing Ontology to incorporate issues of parallelism and concurrency, motivated by recent developments in computer hardware design.
For decades the ACM, IEEE-CS, AIS and other professional and scientific computing societies have worked together to tailor curriculum recommendations for the varied computing communities. Currently five volumes exist or are in final stages of completion: computer engineering, computer science, information systems, information technology and softwar...
Working Group 3 at ITiCSE 2007 continued the ongoing work of the Ontology of Computing project. The working group brought several new people into the project and addressed areas of the ontology of particular interest to these participants. In particular, the group worked on the Ontology sections related to History of Computing, Computing Security a...
The software engineering curriculum guidelines developed by ACM and the IEEE Computer Society, known as SE2004, have been available to the software engineering education community for a little over two years. During this time, a number of software engineering degree programs have been established and others have been revised in ways influenced by S...
We argue that the software engineering (SE) community could have a significant impact on the future of the discipline by focusing its efforts on improving the education of software engineers. There are some bright spots such as the various projects to codify knowledge, and the development of undergraduate SE programs. However, there remain several...
The recommendations in Software Engineering 2004: Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering form a volume of the larger Computing Curriculum project of the IEEE-CS and ACM. SE2004 evolved from an analysis of desired student outcomes for a software engineering graduate as compared to those fo...
In 2001, the ACM and the IEEE-CS published Computing Curricula 2001 which contains curriculum recommendations for undergraduate programs in computer science. That report also called for additional discipline-specific volumes for each of computer engineering, information systems, and software engineering. In addition, it called for an Overview Volum...
This is the report of Working Group 4 of the ITiCSE Conference of 2005. The working group met to introduce some new participants into an ongoing project designed to explore the representation of all the computing and information related disciplines in a single, comprehensive, graphical and interactive structure. The goal of the work is to support t...
This paper is an overview of Software Engineering 2004, the software engineering volume of the computing curricula 2001 project. We briefly describe the contents of the volume, the process used in developing the volume's guidelines, and how we expect the volume to be used in practice.
This paper is a collection of reflections on some of the curricular decisions made in “Software Engineering 2004,” the Software Engineering volume of the Computing Curricula 2001 project. We briefly describe the contents of the Volume and the process used in developing the Volume’s guidelines. We then look in more detail at the rationale behind som...
This paper is an overview of Software Engineering 2004, the Software Engineering volume of the Computing Curricula 2001 project. We briefly describe the contents of the volume, the process used in developing the volume's guidelines, and how we expect the volume to be used in practice.
While computing technology has undoubtedly changed the world in which we live, the changes have been exaggerated. Talk of a hi-tech internet-driven revolution during the last decade is inaccurate from a historical perspective: (a) It belittles previous ...
There are efforts underway to define each of several flavors of computing disciplines, including computer science, computer engineering, information science, information technology, and software engineering. The purpose of this work is to show that we can accomplish more if we avoid some of the effects of fragmentation and bring together all the co...
Though software testing courses are commonly taught as part of Software Engineering curricula, software testing is still a challenging issue in Software Engineering education. Students frequently see testing only as something that happens at the end of the development process. Two challenges can be recognized: "How to make the students recognize th...
Summary form only given. In the fall of 1998, the Educational Activities Board of the IEEE Computer Society and the ACM Education Board appointed representatives to a joint task force whose mission was to perform a major review of curriculum guidelines for undergraduate programs in computing. This activity, named Computing Curricula, and their corr...
This paper introduces a new programming methodology for building real-time systems that allows the construction of concurrent programs without the explicit creation and synchronization of threads. The approach requires the program to have an acyclic invocation structure. This restriction allows an underlying CycleFree Kernel to implicitly schedule...
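The acyclic-invocation idea lends itself to a small illustration. The sketch below is not the paper's CycleFree Kernel; it is a hypothetical Python rendering of the general technique, simplified to a dataflow-style execution: components declare which other components they invoke, the "kernel" checks that the invocation graph is acyclic, and then runs the components in an order where every invocation target has already completed, so the programmer never creates or synchronizes threads explicitly.

```python
# Minimal sketch of scheduling an acyclic invocation structure.
# All names (Component, CycleFreeScheduler) are hypothetical; this is
# an illustration of the general idea, not the CycleFree Kernel API.
from graphlib import TopologicalSorter  # Python 3.9+

class Component:
    def __init__(self, name, invokes=(), action=None):
        self.name = name          # component identifier
        self.invokes = invokes    # names of components this one calls
        self.action = action or (lambda: None)

class CycleFreeScheduler:
    def __init__(self, components):
        self.components = {c.name: c for c in components}

    def run(self):
        # Build the invocation graph; TopologicalSorter raises
        # CycleError if the structure is not acyclic.
        graph = {c.name: set(c.invokes) for c in self.components.values()}
        for name in TopologicalSorter(graph).static_order():
            # Callees are ordered before callers, so every invocation
            # target has already completed when its caller runs.
            self.components[name].action()

# Example: sensor -> filter -> display, with no explicit threads.
parts = [
    Component("display", invokes=("filter",), action=lambda: print("display")),
    Component("filter", invokes=("sensor",), action=lambda: print("filter")),
    Component("sensor", action=lambda: print("sensor")),
]
CycleFreeScheduler(parts).run()
```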
This paper describes the implementation of MiniJava, a teaching-oriented programming language closely based on the Java language developed by Sun Microsystems [6]. The core of the MiniJava environment is a restricted subset of the standard Java release ...
In the fall of 1998, the ACM Education Board and the Educational Activities Board of the IEEE Computer Society appointed representatives to a joint task force to prepare Curriculum 2001, the next installment in a series of reports on the undergraduate Computer Science curriculum that began in 1968 and was then updated in 1978 and 1991. The purpose...
In a distributed environment, a client program bound to a server fails when the server changes (possibly due to the server being relocated, replicated or reconfigured). In this paper, we describe the design of an object-replacement scheme in a client-server environment. Our design addresses the problem of replacing a server and transparently updati...
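As an illustration of the rebinding problem described here, the following hypothetical Python sketch shows one common way to keep a client working across server replacement: the client holds a proxy that looks the binding up again and retries when a call fails because the bound server has gone away. It mirrors only the general idea of transparent replacement, not the design in the paper; Registry, RebindingProxy, and ServerGone are invented names.

```python
# Hypothetical sketch: a client-side proxy that transparently rebinds
# when the server object it was bound to has been replaced.
class ServerGone(Exception):
    """Raised when a call reaches a server that has been retired."""

class Registry:
    """Toy name service mapping a service name to the current server."""
    def __init__(self):
        self._servers = {}
    def register(self, name, server):
        self._servers[name] = server
    def lookup(self, name):
        return self._servers[name]

class RebindingProxy:
    """Client-side stub: on ServerGone, look the server up again and retry."""
    def __init__(self, registry, name):
        self._registry = registry
        self._name = name
        self._server = registry.lookup(name)
    def call(self, method, *args):
        try:
            return getattr(self._server, method)(*args)
        except ServerGone:
            self._server = self._registry.lookup(self._name)  # rebind
            return getattr(self._server, method)(*args)       # retry once

class Counter:
    def __init__(self):
        self.n, self.retired = 0, False
    def incr(self):
        if self.retired:
            raise ServerGone()
        self.n += 1
        return self.n

registry = Registry()
old = Counter()
registry.register("counter", old)
client = RebindingProxy(registry, "counter")
print(client.call("incr"))            # 1, served by the original server
old.retired = True                    # server replaced behind the client's back
registry.register("counter", Counter())
print(client.call("incr"))            # 1 again, transparently rebound to the new server
```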
In the last article we considered how to identify the context for a case study and how to define and validate a case study hypothesis. In this article, we continue the discussion of the eight steps involved in a quantitative case study by considering ...
When debating a campus-wide computer initiative, questions about what computers and software to use are among the most practical concerns. This article provides a more detailed view of the set of questions that are likely to arise and the factors that should be considered in formulating the answers for any campus.
Virtually all students in modern higher education programs should be exposed to informatics, not only in its applications but also in its core concepts. In this focus group paper we propose an introductory informatics course with six important themes, covering the full breadth of the discipline. The course can be implemented in various ways, depend...
In this paper, we describe a two course sequence that has been taught to majors in computer science and a variety of other disciplines. The first course is called “Introduction to Computing”; the second course is “Introduction to Programming”. The “Computing” course is a ten-week course that includes material on the following: the “computing perspe...
The traditional approach to introducing students to Computer Science has been through a course built around the development of programming skills. While many textbooks intended for such courses include the words "Problem Solving" in their titles, the primary focus of most such courses is the skill of programming in a particular programming language...
The traditional approach to introducing students to computer science has been through a course built around the development of programming skills, ignoring the practical reality of increasingly powerful application-oriented software packages. In this paper we describe a two course sequence which has been taught to majors in computer science and a var...
Software maintenance, reverse engineering, and software reuse rely on being able to recognize, comprehend, and manipulate design decisions in source code. But what is a design decision? This paper describes a characterization of design decisions based on the analysis of programming language constructs. The characterization underlies a framework for...
Traditional undergraduate Computer Science curricula have been increasingly challenged on a host of grounds: undergraduate computing education is attracting fewer majors, is not producing graduates who satisfy the needs of either graduate programs or business and industry, and is not effectively responding to the increasing needs for computing educ...
Distributed shared memory consistency protocols suffer from poor performance due to their lack of application-specific knowledge, which can be exploited in message passing systems. Explicit synchronization can be used in memory coherence activities to realize the benefits of application-specific information if the user is allowed to associate data w...
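The abstract's idea of letting the user associate data with explicit synchronization is close to what later DSM literature calls entry consistency. The hypothetical Python sketch below shows only the shape of that association: a lock names the shared variables it guards, and coherence work (pulling current values, pushing updates) happens at acquire and release, and only for those variables. SharedStore and GuardedLock are invented names; this is not the protocol described in the paper.

```python
# Hypothetical sketch: coherence actions tied to an explicit lock and
# restricted to the data associated with it (entry-consistency flavour).
class SharedStore:
    """Stand-in for the 'home' copy of shared data on another node."""
    def __init__(self):
        self.data = {}

class GuardedLock:
    def __init__(self, store, guarded_names):
        self.store = store
        self.guarded = set(guarded_names)  # variables this lock protects
        self.local = {}                    # this node's cached copies

    def acquire(self):
        # Fetch only the variables associated with this lock.
        for name in self.guarded:
            self.local[name] = self.store.data.get(name, 0)

    def release(self):
        # Propagate only the variables associated with this lock.
        for name in self.guarded:
            self.store.data[name] = self.local[name]

store = SharedStore()
lock = GuardedLock(store, ["balance"])
lock.acquire()
lock.local["balance"] += 100   # update the cached copy while holding the lock
lock.release()
print(store.data["balance"])   # 100: change made visible at release time
```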
Past research has concentrated on ordering events in a system where processes communicate through messages. The authors look at issues in ordering events in a distributed system based on shared objects that interact via remote procedure calls (RPCs). They derive clock conditions for ordering operations on an object and provide clock maintenance sch...
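The snippet above ends mid-sentence, but the general machinery it builds on, logical clocks maintained across calls, is standard. The hypothetical Python sketch below shows Lamport-style clock maintenance for a caller and an object servicing an invocation; the specific clock conditions the paper derives for RPC-based object operations are not reproduced here.

```python
# Hypothetical sketch of Lamport-style logical clocks applied to an
# RPC-like call on a shared object (not the paper's derived conditions).
class LamportClock:
    def __init__(self):
        self.time = 0
    def tick(self):
        # Local event: advance the clock.
        self.time += 1
        return self.time
    def merge(self, received):
        # On receiving a call or reply, move past the sender's timestamp.
        self.time = max(self.time, received) + 1
        return self.time

caller, obj = LamportClock(), LamportClock()

send_ts = caller.tick()           # caller issues the RPC
handle_ts = obj.merge(send_ts)    # object orders the operation after the send
reply_ts = obj.tick()             # object sends the reply
done_ts = caller.merge(reply_ts)  # caller orders completion after the reply

print(send_ts, handle_ts, reply_ts, done_ts)  # e.g. 1 2 3 4: a consistent ordering
```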
Discusses the design and the operating system support necessary for providing asynchronous event handling in distributed, passive object-based programming environments, where objects are potentially shared by disparate applications. We discuss the necessity of thread-based as well as object-based event notification and how a variety of hard-to-solv...
The tools used to generate implementations from component descriptions are called component generators. The protocol considered supports call processing by controlling the set-up and take-down of connections carrying telephone calls. The exercises discussed investigate the use of component generators in the construction of the protocol handler and...
The design and implementation of Distributed Eiffel, a language designed and implemented for distributed programming, on top of the Clouds operating system by extending the object-oriented language Eiffel are discussed. The language presents a programming paradigm based on objects of multiple granularity. While large-grained persistent objects serv...
The authors discuss a paradigm for structuring distributed operating systems, the potential and implications this paradigm has for users, and research directions for the future. They describe Clouds, a general-purpose operating system for distributed environments. It is based on an object-thread model adapted from object-oriented programming.
In prior developments of discrete event simulation of distributed architectures, difficulty has been encountered in capturing good concurrency abstractions. The above problem is addressed in the paper. Because of the nature and complexity of certain concurrency concepts, a new mode of representation was deemed necessary to express their details com...
The transformational approach is a formal method for program construction that allows refinement to be carried out using mechanical manipulations. The authors describe an alternative idea for supporting transformations that emphasizes the use of integrated tools instead of individual rules. These tools provide the mechanism for coordinating a gener...
A distributed system can support fault-tolerant applications by replicating data and computation at nodes that have independent failure modes. We present a scheme called parallel execution threads (PET) which can be used to implement fault-tolerant computations in an object-based distributed system. In a system that replicates objects, the PET sche...
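As a rough illustration of executing a computation along parallel threads over replicated objects, the hypothetical Python sketch below invokes the same operation on every replica concurrently and accepts the first reply that arrives, so a crashed replica does not stall the caller. It captures only the surface idea of replication-based fault tolerance, not the PET scheme itself; all names are invented.

```python
# Hypothetical sketch: run the same operation on all replicas in parallel
# and take the first successful result (not the PET protocol itself).
from concurrent.futures import ThreadPoolExecutor, as_completed

class ReplicaDown(Exception):
    pass

def make_replica(name, alive=True):
    def square(x):
        if not alive:
            raise ReplicaDown(name)
        return name, x * x
    return square

replicas = [make_replica("r1", alive=False),  # this node has failed
            make_replica("r2"),
            make_replica("r3")]

def fault_tolerant_call(arg):
    with ThreadPoolExecutor(max_workers=len(replicas)) as pool:
        futures = [pool.submit(r, arg) for r in replicas]
        for fut in as_completed(futures):
            try:
                return fut.result()   # first replica to answer wins
            except ReplicaDown:
                continue              # ignore failed replicas
    raise RuntimeError("all replicas failed")

print(fault_tolerant_call(7))  # e.g. ('r2', 49) despite r1 being down
```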
The authors present a characterization of design decisions that is based on the analysis of programming constructs. The characterization underlies a framework for documenting and manipulating design information to facilitate maintenance and reuse activities. They identify and describe the following categories of design decisions: composition and de...
This report summarizes the work performed over a two-year period by the CLOUDS project at Georgia Institute of Technology to address the methodologies for fault tolerant software design and implementation in an object-oriented distributed operating system. The major research results are contained in two companion guide book reports resulting from t...
Ra is a native, minimal kernel for the Clouds distributed operating system. Ra is a successor to the prototype Clouds kernel and reflects lessons learned from the earlier implementation effort. Ra supports the same object-thread model as the original Clouds kernel as a special case and introduces extensibility as a major goal. Ra provides three pri...
A novel system architecture, based on the object model, is the central structuring concept used in the Clouds distributed operating system. This architecture makes Clouds attractive over a wide class of machines and environments. Clouds is a native operating system, designed and implemented at Georgia Tech, and runs on a set of general-purpose co...
Clouds is a native operating system for a distributed environment. The Clouds operating system is built on top of a kernel called Ra. Ra is a second generation kernel derived from our experience with the first version of the Clouds operating system. Ra is a minimal, flexible kernel that provides a framework for implementing a variety of distributed...
A description is given of the results of a study of methods of achieving fault tolerance in the Clouds system and, in particular, of achieving increased availability of objects. The problems explored in this work, the model of distributed computation in which the problems posed by the research were examined (the Clouds system), the tools that were...
Issued as semi-annual status reports [nos. 1-3], status report, semi-annual technical report, semi-annual interim technical report, interim technical report, and final report
A study of the predictive value of a variety of syntax-based problem complexity measures is reported. Experimentation with variants of chunk-oriented measures showed that one should judiciously select measurable software attributes as proper indicators of what one wishes to predict, rather than hoping for a single, all-purpose complexity measure. T...
A description is given of Clouds, an operating system designed to run on a set of general-purpose computers that are connected via a medium-to-high-speed local area network. The structure of Clouds promotes transparency, support for advanced programming paradigms, and integration of resource management, as well as a fair degree of autonomy at each...
Clouds is an operating system in a novel class of distributed operating systems providing the integration, reliability, and structure that makes a distributed system usable. Clouds is designed to run on a set of general purpose computers that are connected via a medium-to-high-speed local area network. The system structuring paradigm chosen for the...
Work on the Clouds Project (Dasg85, LeBl85b, Dasg88) at Georgia Tech has included the design of a distributed debugger which includes an algorithm exploiting the semantics of object-action computations to allow interactive debugging of distributed programs. The debugger allows a user to debug a distributed program fro...
Packages in the Ada™ language provide a mechanism for extending the language through the development of additional data types. Such types can be better integrated into the language using operator overloading; however, key limitations prevent new types from being transparently integrated into the language. Allowing function names to overload private...
There are two basic approaches to the problem of storage reclamation, process- and processor-based, named for the viewpoint used to recognize when a particular piece of storage can be reclaimed. Examples of the processor approach include mark/sweep and copying algorithms and their variants, while reference counting schemes use a process view of the...
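To make the process/processor distinction concrete, the hypothetical Python sketch below shows the process-view side, reference counting, where reclamation happens as a side effect of the running program's own pointer operations; a mark/sweep collector would instead run as a separate tracing pass over the heap from the roots. This is a generic illustration, not the scheme proposed in the paper.

```python
# Hypothetical sketch of reference counting: the mutator's own pointer
# assignments drive reclamation (the "process" viewpoint), in contrast
# to a mark/sweep pass run separately over the whole heap.
class Cell:
    def __init__(self, value):
        self.value = value
        self.refcount = 0

class Heap:
    def __init__(self):
        self.cells = set()
    def allocate(self, value):
        cell = Cell(value)
        self.cells.add(cell)
        return cell
    def add_ref(self, cell):
        cell.refcount += 1
    def drop_ref(self, cell):
        cell.refcount -= 1
        if cell.refcount == 0:
            # Reclaimed immediately, as a side effect of the mutator's action.
            self.cells.discard(cell)

heap = Heap()
a = heap.allocate("payload")
heap.add_ref(a)          # one live pointer to the cell
heap.drop_ref(a)         # last pointer dropped -> storage reclaimed now
print(len(heap.cells))   # 0
```

Reference counting's well-known weakness, cyclic garbage, is one reason tracing collectors such as mark/sweep take the processor view instead.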
A practical approach to the development of a high-quality, re-usable code generator is described in this paper. This code generator produces code for the Prime 64V mode architecture, but the methodology used is generally applicable to the construction of compilers for most architectures. The code generator accepts a tree-structured intermediate for...
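The code generator described here consumes a tree-structured intermediate form; the hypothetical Python sketch below shows only the bare shape of that interface, a post-order walk over expression trees emitting instructions for an invented stack machine. It is an illustration of working from a tree IR, not the Prime 64V generator discussed in the paper.

```python
# Hypothetical sketch: post-order walk over a tree-structured intermediate
# form, emitting instructions for an invented stack machine.
from dataclasses import dataclass

@dataclass
class Const:
    value: int

@dataclass
class BinOp:
    op: str        # '+' or '*'
    left: object
    right: object

OPCODES = {"+": "ADD", "*": "MUL"}

def gen(node, out):
    """Emit code so the node's value ends up on top of the stack."""
    if isinstance(node, Const):
        out.append(f"PUSH {node.value}")
    elif isinstance(node, BinOp):
        gen(node.left, out)    # operands first (post-order)
        gen(node.right, out)
        out.append(OPCODES[node.op])
    else:
        raise TypeError(f"unknown IR node: {node!r}")

tree = BinOp("+", Const(2), BinOp("*", Const(3), Const(4)))  # 2 + 3 * 4
code = []
gen(tree, code)
print("\n".join(code))   # PUSH 2 / PUSH 3 / PUSH 4 / MUL / ADD
```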