The capability to design quality software and implement modern information systems is at the core of economic growth in the 21st century. Nevertheless, exploiting this potential is only possible when adequate human resources are available and when modern software engineering methods and tools are used.
Recent years have witnessed rapid evolution of software engineering methodologies, including the creation of new platforms and tools which aim to shorten the software design process, raise its quality and cut down its costs. This evolution is made possible through ever-increasing knowledge of software design strategies as well as through improvements in system design and code testing procedures. At the same time, the need for broad access to high-performance and high-throughput computing resources necessitates the creation of large-scale, interactive information systems, capable of processing millions of transactions per second. These systems, in turn, call for new, innovative distributed software design and implementation technologies.
The purpose of this book is to review and analyze emerging software engineering technologies, focusing on the evolution of design and implementation platforms as well as on novel computer systems related to the development of modern information services. The eight chapters address the following topics covering a wide spectrum of contemporary software engineering:
1. Software Engineering Processes – software process maturity, process measurement and evaluation, agile software development, workflow management in software production,
2. UML-based Software Modeling – UML 2.0 features, usability of UML modeling, exception modeling, business environment elaboration with UML,
4. Technologies for SOA – Grid systems and services, distributed component platforms, configuration management, system and application monitoring,
5. Requirements Engineering – gathering, analyzing and modeling requirements, analyzing and modeling business processes, requirements management,
6. Knowledge Base System and Prototyping – knowledge base system engineering, integrating ontologies, modular rule-based systems,
7. Software Modeling and Verification – modeling of rule-based systems, modeling and verification of reactive systems,
8. Selected Topics in Software Engineering – this part covers 8 selected topics related to various aspects of software engineering.
We believe that the presented topics are interesting for software engineers, project managers and computer scientists involved in the computer software development process. We would like to express our thanks to all authors, colleagues, and reviewers who have supported our efforts to prepare this book.
The success of free software and open source projects has increased interest in utilizing the open source model for mature software development. However, the ad hoc nature of open source development may result in poor quality software or failures for a number of volunteer projects. In this paper, projects from SourceForge are assessed to test the hypothesis that there is a relationship between process maturity and the success of free software and open source projects. This study addresses the question of whether the maturity of particular software processes differs in successful and unsuccessful projects. Processes are identified that are key factors in successful free software projects. The insights gained from this study can be applied to improve the software process used by free software projects.
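The kind of comparison the study performs can be sketched as follows. This is a purely illustrative example, not the paper's actual methodology: the indicator names and the scoring scheme are hypothetical, and a real study would apply a proper statistical test rather than comparing averages.

```python
# Hypothetical sketch: score process-maturity indicators per project and
# compare the average score of "successful" vs. "unsuccessful" groups.
# Indicator names and data are invented for illustration only.

def maturity_score(project):
    """Count how many key process practices a project exhibits."""
    indicators = ["has_bug_tracker", "uses_version_control",
                  "has_release_plan", "has_documented_process"]
    return sum(1 for key in indicators if project.get(key))

successful = [
    {"has_bug_tracker": True, "uses_version_control": True,
     "has_release_plan": True, "has_documented_process": False},
    {"has_bug_tracker": True, "uses_version_control": True,
     "has_release_plan": False, "has_documented_process": True},
]
unsuccessful = [
    {"has_bug_tracker": False, "uses_version_control": True,
     "has_release_plan": False, "has_documented_process": False},
]

def average(values):
    return sum(values) / len(values)

print(average([maturity_score(p) for p in successful]))    # 3.0
print(average([maturity_score(p) for p in unsuccessful]))  # 1.0
```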
The UID (User Involved Development) approach is suggested as a way to improve the software development process. Its aim is to treat users as its subject. The UID approach represents an attempt to combine features of hard and soft methodologies. The user focus leads to users' involvement in the process; an initial questionnaire survey of all potential users is recommended. Specific forms of developer-user cooperation are suggested to provide continual feedback between both sides. Business process modeling, software product assessment by all its users, competency-based selection for various roles, and construction of questionnaires are examples of the methods in use.
Marek Bukowy, Larry Wilder, Susan Finch, David Nunn
27 - 38
A transition to an agile software development methodology in a heavily waterfall-oriented organization is discussed. A collection of agile software development practices implemented at Sabre Holdings is described. A summary of the desired set of practices is given, along with data summarizing the degree of pervasiveness of these practices in real life. Quantitative results on the use of the methodology in day-to-day development are presented. Lessons learned and mitigation ideas are also discussed. The gathered information shows that the habits of the old model can strongly encumber the implementation of an agile methodology, but even implementation of a few practices yields very tangible results.
Maintenance is the longest and most expensive period of the software product life cycle, so increasing efficiency and reducing costs in this area is an important issue for any software company. By treating maintenance, as well as product development, as business processes, we can use workflow software to optimize and automate their execution. Tytan Workflow is an advanced business process modeling system which is used in software engineering related activities. This article introduces the features of the system together with examples of process models.
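The core idea of treating maintenance as a business process can be sketched as a workflow with explicit allowed state transitions. This is an illustrative minimal model, not Tytan Workflow's actual design; the state names are invented.

```python
# Illustrative sketch: a maintenance request modeled as a workflow whose
# allowed state transitions are declared explicitly, as workflow engines do.

ALLOWED = {
    "reported":    {"analyzed", "rejected"},
    "analyzed":    {"in_progress"},
    "in_progress": {"testing"},
    "testing":     {"in_progress", "closed"},
}

class MaintenanceRequest:
    def __init__(self):
        self.state = "reported"
        self.history = ["reported"]

    def advance(self, new_state):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

req = MaintenanceRequest()
for step in ["analyzed", "in_progress", "testing", "closed"]:
    req.advance(step)
print(req.history)  # ['reported', 'analyzed', 'in_progress', 'testing', 'closed']
```

Declaring the transition table as data, rather than hard-coding it, is what lets a workflow engine model and optimize the process without changing application code.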
In this paper we present a Parallel Spatial Data Warehouse (PSDW) system that we use for aggregation and analysis of huge amounts of spatial data. The data is generated by utility meters communicating via radio. The PSDW system is based on a data model called the cascaded star model. In order to provide satisfactory interactivity for the PSDW system, we used parallel computing supported by a special indexing structure called an aggregation tree. Balancing the PSDW system workload is essential to minimize the response time of submitted tasks. We have implemented two data partitioning schemes which use Hilbert and Peano curves for space ordering. The presented balancing algorithm iteratively calculates the optimal size of the partitions loaded into each node by executing a series of aggregations on a test data set. We provide a collection of system test results, together with their analysis, which confirm that the balancing algorithm can be realized in the proposed way. During the ETL (Extraction, Transformation and Loading) process, large amounts of data are transformed and loaded into the PSDW. ETL processes are sometimes interrupted by failures; in such a case, one of the extraction resumption algorithms is usually used. In this paper we analyze the influence of the data balancing used in the PSDW on the efficiency of the extraction and resumption processes.
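The space-ordering step behind such partitioning schemes can be illustrated with the standard Hilbert curve coordinate-to-index conversion for a 2^k x 2^k grid. This is the textbook algorithm, not the PSDW implementation itself; cells with nearby Hilbert indices are spatially close, which is why consecutive index ranges make good partitions.

```python
# Standard Hilbert curve: map grid cell (x, y) in an n x n grid (n a power
# of two) to its position d along the curve. Assigning contiguous d-ranges
# to nodes yields spatially compact partitions.

def xy2d(n, x, y):
    rx, ry, d = 0, 0, 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # rotate the quadrant so the pattern recurses correctly
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Order the four cells of a 2x2 grid along the curve:
order = sorted(((x, y) for x in range(2) for y in range(2)),
               key=lambda p: xy2d(2, *p))
print(order)  # [(0, 0), (0, 1), (1, 1), (1, 0)]
```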
Bogumiła Hnatkowska, Zbigniew Huzar, Lech Tuzinkiewicz
63 - 74
Database design goes through a three-stage development process: conceptual modeling, logical modeling and physical modeling. The paper deals with the transformation of conceptual models into logical ones. Assuming that conceptual models are expressed in UML 2.0, the paper presents two groups of transformation rules. The first group contains commonly used rules dealing with the transformation of conceptual models. The second group gathers specific rules, proposed by the authors, that can be applied to the new features of UML 2.0.
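One commonly used rule of this kind, mapping a one-to-many UML association onto a foreign key in the "many" side table, can be sketched as follows. The class and attribute names are illustrative, and the paper's actual rule notation is not reproduced here.

```python
# Illustrative conceptual-to-logical transformation rule: a 1..* association
# between two classes becomes a foreign key column in the "many" side table.

def map_one_to_many(one_class, many_class):
    """Return relational table definitions for a 1..* association."""
    one_table = {"name": one_class["name"].lower(),
                 "columns": ["id"] + one_class["attributes"],
                 "primary_key": "id"}
    many_table = {"name": many_class["name"].lower(),
                  "columns": ["id"] + many_class["attributes"]
                             + [one_table["name"] + "_id"],  # foreign key
                  "primary_key": "id"}
    return one_table, many_table

customer = {"name": "Customer", "attributes": ["name", "email"]}
order = {"name": "Order", "attributes": ["date", "total"]}
t1, t2 = map_one_to_many(customer, order)
print(t2["columns"])  # ['id', 'date', 'total', 'customer_id']
```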
Usability aspects of UML modeling tools have an impact on the efficiency of work and the satisfaction of software developers. This paper discusses the applicability of usability techniques to the evaluation of UML modeling tools. It describes an empirical study of performance testing of six UML modeling tools and inquiries regarding usability problems with these tools. Then, the Goals, Operators, Methods, Selection rules (GOMS) method is applied to investigate the effort necessary to create diagrams with these tools.
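The flavor of a GOMS-style effort estimate can be shown with the keystroke-level variant, using the classic operator times from Card, Moran and Newell (K = keystroke, P = point with mouse, H = home hands, M = mental preparation, in seconds). The operator sequence below for drawing a class box is an invented illustration, not one of the paper's measured tasks.

```python
# Keystroke-level (GOMS/KLM) sketch: sum per-operator times to estimate
# how long one tool interaction takes. Times are the classic KLM values.

KLM_TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

def klm_estimate(ops):
    return sum(KLM_TIMES[op] for op in ops)

# Hypothetical sequence for adding one class to a diagram: think, point at
# the palette, click, point at the canvas, click, home to keyboard, then
# type a 6-letter class name.
add_class = ["M", "P", "K", "P", "K", "H"] + ["K"] * 6
print(round(klm_estimate(add_class), 2))  # 5.55
```

Comparing such sums across tools for the same logical task is what makes the method useful for tool evaluation: a tool needing fewer pointing and homing operators scores lower total effort.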
During the last couple of years UML has become more popular and has rapidly been accepted as the standard modelling language for specifying software and system architectures. Despite this popularity, using UML 1.x caused some trouble, and developers faced limitations due to the fact that the software industry has evolved considerably during the last seven years and software systems have become more and more complex. One of the most important issues in modern software systems is exception handling, a mechanism that makes software systems robust and reliable. However, the mechanism provided even by version 2.0 of UML seems, in the authors' opinion, insufficient and far from well-known implementation practice. Since UML does not put proper emphasis on exception handling, the aim of the paper is to present the possibilities of modelling exceptions in UML using statechart diagrams. This approach seems to be the most natural way to model the mechanism. Alternative methods, which should be taken into consideration in future research, are discussed at the end of the paper. The UML dialect used is the one proposed by Telelogic, as is the tool, Tau G2.
The paper deals with software development for supporting information security management, particularly the modeling of the business environment of the organization implementing an Information Security Management System (ISMS). The ISMS, based on the PDCA (Plan-Do-Check-Act) model, was defined in the BS7799-2:2002 standard. The paper focuses on the identification of the ISMS business environment that provides appropriate positioning of the ISMS within the organization. A general model of the ISMS business environment based on high-level risk analysis is presented. The UML approach makes it possible to implement the ISMS within organizations in a more consistent and efficient way and to create supporting tools improving information security management.
Test-driven development (TDD) and pair programming (PP) are the key practices of the eXtreme Programming methodology that have caught the attention of software engineers and researchers worldwide. One of the aims of the large experiment performed at Wroclaw University of Technology was to investigate the difference between test-driven development and the traditional, test-last development, as well as between pair programming and solo programming, with respect to external code quality. It appeared that external code quality was lower when test-driven development was used instead of the classic, test-last software development approach, in the case of both solo programmers (p=0.028) and pairs (p=0.013). There was no difference in external code quality when pair programming was used instead of solo programming.
Software quality is strictly connected with source code quality. Coding standards may be very helpful in assuring the high quality of the code, especially as they are supported by many tools. The paper presents Codespector, a tool for checking Java source files against a given coding standard. The rules of the standard are written in a dedicated language, CRL, developed by one of the co-authors.
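The general idea of rule-based source checking can be sketched as below. This is only an illustration of the concept: the CRL rule language itself is not reproduced, and the two rules shown are invented examples.

```python
# Illustrative sketch of coding-standard checking in the spirit of such
# tools: each rule pairs a pattern with a diagnostic message, and every
# source line is checked against every rule.
import re

RULES = [
    ("class-naming", re.compile(r"\bclass\s+[a-z]"),
     "class names should start with an upper-case letter"),
    ("no-tabs", re.compile(r"\t"),
     "use spaces instead of tabs"),
]

def check_source(lines):
    violations = []
    for lineno, line in enumerate(lines, start=1):
        for name, pattern, message in RULES:
            if pattern.search(line):
                violations.append((lineno, name, message))
    return violations

java_src = ["class account {", "\tint balance;", "}"]
for violation in check_source(java_src):
    print(violation)
```

A real checker would parse the source rather than scan lines with regular expressions, but the rule-as-data structure is the same: adding a rule means adding an entry, not changing the checker.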
The paper investigates the use of two of the most popular software complexity measurement theories, Halstead's Software Science metrics and McCabe's cyclomatic complexity, to analyze basic characteristics of multitasking systems implemented in the programming language C. Extensions of both metric systems are proposed to capture more information about the typical characteristics of multitasking systems.
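The two base metric families can be computed directly from counts. Halstead's measures derive from distinct and total operator/operand counts, and McCabe's cyclomatic complexity from the control-flow graph as V(G) = E - N + 2P. The counts below are invented sample values; the paper's proposed multitasking extensions are not shown.

```python
# Baseline complexity metrics: Halstead's Software Science measures and
# McCabe's cyclomatic complexity, computed from simple counts.
import math

def halstead(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    vocabulary = n1 + n2                       # n = n1 + n2
    length = N1 + N2                           # N = N1 + N2
    volume = length * math.log2(vocabulary)    # V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)          # D = (n1/2) * (N2/n2)
    effort = difficulty * volume               # E = D * V
    return {"vocabulary": vocabulary, "length": length,
            "volume": volume, "difficulty": difficulty, "effort": effort}

def cyclomatic(edges, nodes, components=1):
    """McCabe's V(G) = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components

m = halstead(n1=10, n2=7, N1=25, N2=18)       # sample counts
print(round(m["volume"], 1))
print(cyclomatic(edges=9, nodes=8))            # 3
```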
Łukasz Skitał, Renata Słota, Darin Nikolov, Jacek Kitowski
149 - 159
The emergence of web services technology gives Service Oriented Architecture a new practical meaning. The architecture allows complex, distributed systems to be developed in a multi-platform environment. Recent Grid projects take advantage of this architecture, as it is well suited to systems developed by a number of different institutes, computing centers and companies. One of the important aspects of Grid projects is data management, which is the subject of our study.
In this paper we present our experience from designing and implementing a Virtual Storage System in a Service Oriented Architecture.
In this paper a new software engineering laboratory, introduced at the Institute of Computer Science, Warsaw University of Technology in the fall of 2004, is presented. The Advanced Software Engineering 2 (SE-2) laboratory consists of seven exercises, dedicated to requirements engineering, system design with UML, reuse, precise modelling with OCL (Object Constraint Language), code coverage testing, memory leak detection and improving application efficiency. Six out of the ten SWEBOK knowledge areas are practiced. For each laboratory exercise a set of training materials and instructions was developed. These materials are stored on a department server and are available to all students and lecturers of the SE-2 course. Rational Suite tools are used in the laboratory.
Marcin Jarzab, Jacek Kosinski, Krzysztof Zielinski
172 - 183
The number of computational elements in massively scalable systems can easily reach several hundred processors. The exploitation of such systems creates new challenges. The paper presents contemporary technology for configuration management of massively scalable computer systems. This problem has been put in the context of modern hardware resource virtualization techniques. The paper refers to solutions provided in this area by Sun Microsystems to show real, existing solutions which are used and have been practically tested by the authors of this paper.
Monitoring tools are commonly used in modern software engineering. They provide feedback for both the development and production phases. The use of monitoring tools is especially important in distributed and grid systems, where different aspects of the environment have to be analyzed or manipulated by different types of tools to support the process of program development. However, in order to avoid improper influence of one tool on others, these tools must cooperate, which is called interoperability. In this paper we present the interoperability model designed and used in JINEXT, an extension of the OMIS specification intended to provide interoperability for OMIS-compliant tools. We also present several practical experiments performed with JINTOP, the reference implementation of JINEXT.
A new software tool for hard real-time system timing constraint modelling and validation is presented. The tool consists of a set of predefined timed coloured Petri net (TCPN) structures. The structures include built-in mechanisms, which detect missing timing constraints and make it possible to validate the timing correctness of the modelled system. The final model of the system is a hierarchical TCPN following the execution and simulation rules of CPN/Design software. The paper focuses on presenting the construction of the software tool and the guidelines for applying it in real-time system development.
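A much-simplified version of the timing-validation idea can be sketched in a few lines: tokens carry timestamps, firing a transition adds its delay, and a deadline on the final place is checked. This is only a toy model of the mechanism, not the CPN/Design-based tool described in the paper; the place, transition and deadline values are invented.

```python
# Toy timed-Petri-net sketch: a marking maps places to lists of token
# timestamps; a transition consumes one token from each input place and
# produces output tokens delayed by its firing time.

def fire(marking, transition):
    """Fire a transition if enabled; return whether it fired."""
    inputs, outputs, delay = transition
    if not all(marking.get(place) for place in inputs):
        return False  # not enabled: some input place is empty
    start = max(marking[place].pop(0) for place in inputs)
    for place in outputs:
        marking.setdefault(place, []).append(start + delay)
    return True

marking = {"sensor_ready": [0.0], "cpu_free": [0.0]}
read    = (["sensor_ready", "cpu_free"], ["raw_data"], 2.0)
filter_ = (["raw_data"], ["result"], 3.0)

fire(marking, read)
fire(marking, filter_)

deadline = 10.0
finish = marking["result"][0]
print(finish, finish <= deadline)  # 5.0 True
```

The validation step in the paper's tool corresponds to the final check: a timing constraint is violated exactly when a token reaches the constrained place later than its deadline.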
This paper is about software requirements: the main problems that may arise from incorrectly defined requirements, bad practices, and common mistakes made while creating requirements. It briefly describes methodologies for managing software requirements. The article presents the Requirement Management System Theater (RMST), a system used to work with software requirements that keeps their history and manages them, concentrating on versioned software products. The Rational RequisitePro and CaliberRM systems are described for comparison.
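The history-keeping aspect of requirements management can be sketched as follows. This is an invented minimal illustration, not RMST's actual data model.

```python
# Illustrative sketch: a requirement that keeps every revision of its text,
# so changes stay traceable across product versions.

class Requirement:
    def __init__(self, req_id, text):
        self.req_id = req_id
        self.revisions = [text]  # index 0 holds the original wording

    def update(self, new_text):
        self.revisions.append(new_text)

    @property
    def current(self):
        return self.revisions[-1]

req = Requirement("REQ-17", "The system shall log every login attempt.")
req.update("The system shall log every login attempt with a timestamp.")
print(req.current)
print(len(req.revisions))  # 2
```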
Stanisław Szejko, Maciej Brochocki, Hubert Lyskawa, Wojciech E. Kozlowski
221 - 232
The paper describes an actual software improvement process that was carried out in a small company specializing in providing software and hardware design services in the field of real-time systems. The implemented RDQC method addresses a software project by controlling its quality on the basis of the project requirements. This is done by specifying the requirements, mapping them to quality characteristics and controlling the development process towards the demanded quality. A short presentation of the company and the pilot projects is followed by a description of the method implementation process and its results. Encouraging remarks on the usefulness and profitability of the method conclude the paper.
IOS Press, Inc.