
Ebook: New Trends in Software Methodologies, Tools and Techniques

Software is the essential enabler for the new economy and science. It creates new markets and new directions for a more reliable, flexible, and robust society. It empowers the exploration of our world in ever more depth.
However, software often falls short of our expectations. Current software methodologies, tools, and techniques remain expensive and not yet reliable enough for a highly changeable and rapidly evolving market. Many approaches have proven effective only on a case-by-case basis.
This book presents a number of new trends and theories in the direction in which we believe software science and engineering may develop to transform the role of software and science in tomorrow's information society.
This book is an attempt to capture the essence of the new state of the art in software science and its supporting technology, and to identify the challenges such a technology has to master. It contains papers accepted at the fourth International Conference on New Trends in Software Methodologies, Tools and Techniques (SoMeT_05), held in Tokyo, Japan, from 28 to 30 September 2005 (http://www.somet.soft.iwate-pu.ac.jp/somet_05). The conference brought together researchers and practitioners to share their original research results and practical development experiences in software science and its related new and challenging technology.
One example addressed at this conference is the Lyee methodology, a newly emerged Japanese software methodology that has been patented in several countries in Europe, Asia, and America, but which is still at an early stage of establishing itself as a new software style. This conference and the series it continues will also contribute to the elaboration of such new trends and of the related academic research and development.
A major goal of this international conference was to gather scholars from the international research community to discuss and share research experiences on new software methodologies and formal techniques. The conference also investigated other comparable theories and practices in software science, including emerging technologies, from their computational foundations in terms of models, methodologies, and tools. These are essential for developing a variety of information-systems research projects and for assessing their practical impact on real-world software problems.
SoMeT_02 was held on October 3–5, 2002, at the Sorbonne in Paris, France; SoMeT_03 in Stockholm, Sweden; SoMeT_04 in Leipzig, Germany; and the conference that these proceedings cover, SoMeT_05, in Tokyo, Japan. These events continue in a series whose next instalment, the fifth conference, SoMeT_W06, will be organized in Quebec, Canada, in September 2006 (http://www.somet.soft.iwate-pu.ac.jp/somet_06/).
This book is also in part a means of presenting a few selected results of the Lyee International research project (http://www.lyee-project.soft.iwate-pu.ac.jp), which aims at the exploration and development of novel software engineering methods and software generation tools based on the Lyee framework. The project was sponsored by Catena and other major Japanese enterprises in the area of software methodologies and technologies.
This book thus provides an opportunity for exchanging ideas and experiences in the field of software technology, opening up new avenues for software development, methodologies, tools, and techniques.
The Lyee framework, for example, captures the essence of the innovations, controversies, challenges, and possible solutions of the software industry. This worldwide-patented software approach was born of and enriched by practical experience, and SoMeT_05 is once again an occasion to let it stimulate academic research on software engineering, attempting to close the gap that has so far existed between theory and practice. We believe that this book creates an opportunity for the software science community to think about where we are today and where we are going.
The book is a collection of 26 papers, carefully reviewed and selected by the reviewing committee.
The areas covered in the book are:
– Requirement engineering and requirement elicitation, and its tools;
– Software methodologies and tools for robust, reliable, non-fragile software design;
– Lyee-oriented software techniques and its legacy systems;
– Automatic software generation versus reuse, legacy systems, and source code analysis and manipulation;
– Software quality and process assessment;
– Intelligent software systems design and software evolution techniques;
– Software optimization and formal methods;
– Static and dynamic analysis of software performance models, and software maintenance;
– End-user programming environments, and user-centered, adoption-centric reengineering techniques;
– Ontology, cognitive models, and philosophical aspects of software design;
– Software design through interaction, and precognitive software techniques for interactive software entertainment applications;
– Business-oriented software application models;
– Software engineering models, and formal techniques for software representation, software testing, and validation;
– Aspect-oriented programming;
– Other software engineering disciplines.
All papers published in this book were carefully reviewed and selected by the SoMeT international program committee. Each paper was reviewed by three or four reviewers and revised on the basis of the review reports. Papers were judged on technical soundness, relevance, originality, significance, and clarity. The acceptance rate for this year's SoMeT was 48%.
This book was made possible by the collective efforts of all the Lyee International project collaborators and other supporters. We gratefully thank Iwate Prefectural University, Laval University, Catena Co., ISD, Ltd., SANGIKYO Co., and others for their overwhelming support. We are especially thankful to the program committee members and the others who participated in the review of all submitted papers, and for the lively discussions at the program committee meetings at which the papers in this book were selected.
This book is another milestone in mastering new challenges in software and its promising new technology, within the SoMeT framework and beyond. It also gives the reader new insights, inspiration, and concrete material with which to elaborate on and study this new technology.
We would also like to thank and acknowledge the support of the Telematik and e-Business group of the University of Leipzig for allowing us to use the Paperdyne system as a conference-supporting tool during all phases of this event.
The Editors
Whereas software variability deals with customisation and adaptability of software, we consider here the issue of modelling variability for Information System artefacts. We view variability in the larger perspective of the system meeting the purpose of many organisations and customer groups. We propose to represent this multi-faceted nature of a purpose through the notion of intentions and strategies organised as a map. The map is a directed, labelled, non-deterministic graph with intentions as nodes and strategies to achieve intentions as edges. Its nature allows the capture of different forms of variability through multi-edges between a pair of nodes, thereby enabling many traversals of the graph from beginning to end. Moreover, using the refinement mechanism of the map, it is possible to represent variability at different levels of detail. We show the power of a map to represent variability and, as an illustration, model the variations of the SAP Materials module as a map. Additionally, we show that variations in process models can also be captured in the map formalism, and we apply a multi-faceted process model to customise the SAP Materials module.
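To make the graph structure concrete, the following is a minimal sketch of the map formalism as a data structure, assuming nothing beyond what the abstract states; the intention and strategy names are invented for illustration, and this is not the authors' implementation.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Sketch of the "map" formalism: a directed, labelled, non-deterministic
// graph. Intentions are nodes; strategies are edges. Multi-edges between
// the same pair of intentions capture variability.
struct Map {
    // (source intention, target intention) -> strategy label (multi-edge)
    std::multimap<std::pair<std::string, std::string>, std::string> edges;

    void addStrategy(const std::string& from, const std::string& to,
                     const std::string& strategy) {
        edges.insert({{from, to}, strategy});
    }

    // All strategies available for achieving 'to' starting from 'from';
    // several results model the non-deterministic choice between variants.
    std::vector<std::string> strategies(const std::string& from,
                                        const std::string& to) const {
        std::vector<std::string> result;
        auto range = edges.equal_range({from, to});
        for (auto it = range.first; it != range.second; ++it)
            result.push_back(it->second);
        return result;
    }
};

int main() {
    Map m;  // illustrative intentions, loosely inspired by a materials module
    m.addStrategy("Start", "Purchase Material", "by planning");
    m.addStrategy("Start", "Purchase Material", "by manual request");
    for (const auto& s : m.strategies("Start", "Purchase Material"))
        std::cout << s << '\n';  // two variants: one per graph traversal
    return 0;
}
```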
Optimally, software design should be robust enough to handle future additions and code changes; however, it is not always possible to predict the direction a software project will take. In research projects and in-house projects, where the user and the creator are often the same person, software design may be overlooked entirely. If software is complex, or has not been designed for its current modifications, it may merit reengineering; the future use and needs of the program must then be evaluated. Many metrics are used to inform this decision, but they only act as a guide: the decision to redesign a software system is largely subjective and is often made as development of the program becomes increasingly difficult. In this paper, the redesign of a complex piece of software is examined, and the redesign process is evaluated to determine whether the work put into it was worth the benefits accrued by the new design.
Often declared dead or at least dying, C/C++ is still the lingua franca of many application domains. Aspect-Oriented Programming (AOP) is a programming paradigm that supports the modular implementation of crosscutting concerns and thereby improves the maintainability, reusability, and configurability of software in general. Although already popular in the Java domain, AOP is still not commonly used in conjunction with C/C++. For a broad adoption of AOP by the software industry, it is crucial to provide solid language and tool support. However, research and tool development for C++ is known to be an extremely hard and tedious task, as the language is laden with interacting features and hard to analyze. Getting AOP into the C++ domain is not just a technical challenge; it is also a question of integrating AOP concepts with the philosophy of the C++ language, which is very different from that of Java. This paper describes the design and development of the AspectC++ language and weaver, which bring fully-fledged AOP support into the C++ domain.
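As a flavour of the kind of code AspectC++ enables (an illustrative fragment, not taken from the paper; the class Queue is assumed to exist and the source must be woven with the ac++ compiler), a minimal tracing aspect modularizes a crosscutting logging concern:

```cpp
#include <iostream>

// Minimal AspectC++ tracing aspect (illustrative). The pointcut matches
// the execution of any member function of the assumed class Queue; the
// advice body runs before each matched join point.
aspect Tracing {
  advice execution("% Queue::%(...)") : before() {
    // JoinPoint::signature() is part of the AspectC++ join-point API
    std::cout << "entering " << JoinPoint::signature() << std::endl;
  }
};
```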
The importance of low power consumption is widely acknowledged owing to the increasing use of portable devices, which require minimizing the consumption of energy. Energy dissipation is heavily dependent on the software used in the system. In this paper we analyze the energy consumption of software computing the Collatz problem and draw some useful conclusions.
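For reference (the paper's benchmark code and energy measurements are not reproduced here), the computation under study is tiny; a typical formulation counts the iterations until the sequence reaches 1:

```cpp
#include <cstdint>
#include <iostream>

// Collatz step count: repeatedly halve even numbers and map odd n to 3n+1
// until reaching 1. Whether this terminates for all n is the open conjecture.
unsigned collatzSteps(std::uint64_t n) {
    unsigned steps = 0;
    while (n != 1) {
        n = (n % 2 == 0) ? n / 2 : 3 * n + 1;
        ++steps;
    }
    return steps;
}

int main() {
    std::cout << collatzSteps(27) << std::endl;  // prints 111
    return 0;
}
```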
This study regards our thinking as comprising a set of entities resident in our mind, and considers that these entities can be defined by certain rules. What are these rules? In this study, they are what can be neither negated nor affirmed. The rules are introduced at the limits of our thinking by taking Whole, Part, Static, Dynamic, Synchronous, Asynchronous, Memory, Assimilation, Awareness, and Mind as axioms. These concepts are defined, but their rationality cannot be verified by humans. This study proposes a hypothesis of a world comprised of such rules and observes theoretically how, under the rules of this world, software is introduced by human intelligence.
“Lyee” denotes this theoretical, axiomatic observation, through which we expect a new awareness of software to be established.
In consequence, for example, we can obtain a universal algorithm independent of particular programming languages and programs; with this algorithm, we can realize automatic programming-language conversion and program diagnosis.
In this paper, a theoretical observation of Lyee is proposed. For further rules and concrete cases not discussed herein, the reader is referred to other papers.
The Lyee methodology allows the development of software simply by defining its requirements. More precisely, a developer has only to provide words, calculation formulae, calculation conditions, and the layout of screens and printouts, and then leaves all the subsequent troublesome programming, i.e. the control-logic aspects, in the hands of the computer. The formalization of the Lyee methodology led to the definition of the Lyee-Calculus, a formal process algebra that easily and naturally supports the basic concepts of the Lyee methodology. Moreover, we provided an implementation of the constructs of the Lyee-Calculus in the Java language in order to concretely show the efficiency of this calculus and its suitability for the Lyee methodology. In other words, this Java implementation of the Lyee-Calculus provides a means of bridging the gap between Lyee requirement specifications and their implementations.
In this paper, we present a new software development environment, LyeeBuilder, that makes it possible to automatically generate applications from specifications through a GUI. This software aims to give programmers an environment in which applications are generated automatically from screen and word definitions.
The paper presents an overview of the three most frequently used methods for the qualitative evaluation of entities, namely the Overall Integral Index, the Fuzzy Analytic Hierarchy Process, and the Consensus Relation, and discusses the application of these approaches to the measurement of software quality attributes.
The focus of this paper is a pilot study of IT practitioners regarding risk management practices and tools used in Australian software development projects. Our previous work [1] explained the method used for investigating whether there were differences in the practices and procedures relating to projects with 1) internal customers, 2) external customers, and 3) both internal and external customers. To give a comprehensive view of the method used for data analysis in this pilot study, 1) we explain the approach we have undertaken, 2) we discuss data collection for the survey, and 3) we describe our analysis of the survey data. Our respondents were from software development organizations in Australia and all had previously been involved with at least one software development project. Overall, we found that 1) risk management practices were used more frequently for projects involving external customers, 2) risk management is taken more seriously when external customers are involved, 3) the people responsible for risk management practices held senior positions within the organization, 4) there was no difference in the type of customer for projects where simulation and predictive tools were used, and 5) external customers were more satisfied with simulation and predictive tools than internal customers.
Do we always use the same name for the same concept? Usually not. While misunderstandings are always troublesome, they pose particularly critical problems in software projects. Requirements engineering deals intensively with reducing the number and scope of misunderstandings between software engineers and customers. Software maintenance is another important task where proper understanding of the application domain is vital. In both cases it is necessary to gain (or regain) domain knowledge from existing documents that are usually inconsistent and imprecise.
This paper proposes to reduce the risk of misunderstandings by unifying the terminology of the different stakeholders with the help of an ontology. The ontology is constructed by extracting terms and relations from existing documents. Applying text mining for ontology extraction has an unbeatable advantage over manual ontology construction: text mining detects terminology inconsistencies before they are absorbed into the ontology. In addition, the approach presented in this paper introduces an explicit validation of the ontology gained by text mining.
As a first step towards addressing the tasks of an end-user development environment, this paper proposes a co-development model for the participation of users in web application development. First, the procedure for a simple web application is organized from the viewpoints of “screen” and “screen transition” so that an end user can easily take part in the development. The definition items can be classified into two parts: those an end user can define and those an end user cannot. Using the definition items in these two parts, an operational model for co-development is proposed and the procedures for co-development are shown. A collaboration model under which a user and a system engineer can develop together is proposed; then, based on the proposed model, a tool supporting co-development is built.
The need to improve software productivity has promoted research on software metrics, including metrics for identifying the quality of reusable components. If applied in the design phase, or even in the coding phase, these metrics can help reduce rework by improving the quality of component reuse and hence improve productivity through a probable increase in the reuse level. A suite of metrics can be used to assess the reusability of modules. This reusability can be obtained with the help of a neuro-fuzzy approach, in which a neural network that can learn new relationships from new input data is used to refine fuzzy rules, creating an adaptive fuzzy system. An algorithm is proposed in which the inputs to the neuro-fuzzy system are Cyclomatic Complexity, Volume, Regularity, Reuse-Frequency, and Coupling, and the output is obtained in terms of reusability.
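As a drastically simplified sketch of the fuzzy half of such a system (our illustration, not the paper's algorithm; only two of the five inputs are shown and the neural tuning of the rules is omitted), a zero-order Sugeno inference maps fuzzified metrics to a crisp reusability score:

```cpp
#include <algorithm>
#include <iostream>

// Triangular membership function with peak at b on the interval [a, c].
double tri(double x, double a, double b, double c) {
    if (x <= a || x >= c) return 0.0;
    return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
}

// Fuzzy sets over metrics normalized to [0, 1].
double low(double x)  { return tri(x, -0.5, 0.0, 0.5); }
double high(double x) { return tri(x,  0.5, 1.0, 1.5); }

// Zero-order Sugeno inference: each rule fires with strength = min of its
// antecedent memberships and proposes a crisp reusability consequent; the
// output is the firing-strength-weighted average of the consequents.
double reusability(double complexity, double coupling) {
    struct Rule { double strength, consequent; };
    Rule rules[] = {
        { std::min(low(complexity),  low(coupling)),  0.9 },  // both low: highly reusable
        { std::min(high(complexity), low(coupling)),  0.5 },
        { std::min(low(complexity),  high(coupling)), 0.4 },
        { std::min(high(complexity), high(coupling)), 0.1 },  // both high: hardly reusable
    };
    double num = 0.0, den = 0.0;
    for (const Rule& r : rules) { num += r.strength * r.consequent; den += r.strength; }
    return den > 0.0 ? num / den : 0.0;
}

int main() {
    std::cout << reusability(0.2, 0.3) << std::endl;  // only rule 1 fires -> 0.9
    return 0;
}
```

In the full neuro-fuzzy scheme the abstract describes, the rule consequents and membership parameters would be the quantities a neural network adjusts from training data rather than the fixed constants used here.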
Verification is an important instrument in the analysis of systems: roughly, requirements and designs are analyzed formally to determine their relationships. Various candidates for formalizing system development and integration have been proposed. A major obstacle, however, is that these candidates introduce non-standard objects and formalisms, leading to severe confusion, because the models are often unnecessarily complicated, with disadvantages regarding both semantics and complexity. While avoiding the mathematical details as far as possible, we present some basic verification ideas using a simple language, predicate logic, and demonstrate how it can be used for defining and analyzing static and dynamic requirement fulfillment by designs, as well as for detecting conflicts. The formalities can be found in the appendix.
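Read schematically (our gloss in standard notation, not the paper's appendix), the two analyses the abstract mentions can be phrased directly in predicate logic:

```latex
% Treat a design D and a requirement R as predicate-logic theories
% over a common vocabulary (schematic gloss, not the paper's definitions).
\begin{align*}
  \text{$D$ fulfils $R$}        &\iff D \vDash R
      && \text{(every model of the design satisfies the requirement)}\\
  \text{$D$ conflicts with $R$} &\iff D \cup R \vDash \bot
      && \text{(design and requirement admit no common model)}
\end{align*}
```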
The paper discusses the limitations and possibilities of automating the construction of programs from informal specifications.
Intelligent information processing systems, including search, question answering, and summarization systems, must be able to handle pronouns, i.e. elements that do not have independent reference. We thus focus on pronominal anaphora resolution (Mitkov [20], Kennedy and Boguraev [18], Lappin and Leass [19]) and identify the main features of an intelligent software design that includes the identification of minimal domains of interpretation based on asymmetric relations. We show that the identification of domains of interpretation based on asymmetric agreement (Di Sciullo [9]) opens new avenues in intelligent systems design.
This paper describes our method of semantic meta-parsing, implemented as a Prolog program. For comparison, we briefly present Johnson and Kay's method of semantic abstraction [9]. The two methods are compared by applying them to a sample problem from Hans Kamp's Discourse Representation Theory (DRT) [10].
Enterprise models should be able to describe business processes consistently across organisational and technical system boundaries; this helps system designers understand why technical system components are useful and how they fit into the overall organisational system. The implementation bias of many information-system methodologies is a major obstacle to consistency and integrity control: the same implementation-oriented foundations are often used in the system analysis and design phases without rethinking these concepts fundamentally. The common repository of most CASE tools does not guarantee the consistency of enterprise architectures, because the interplay among static and dynamic dependencies is not available. Enterprise modelling and integration should adhere to the basic conceptualisation principle, which prescribes the analysis of only conceptually relevant aspects, uninfluenced by any implementation details. Consistency problems are best detected and traced at the conceptual layer. In this study of semantic dependencies, we demonstrate how various fundamental concepts from different classes of models can be interlinked and analysed together. An important result of this study is a set of inference rules. Inference capability is an intrinsic feature of logical approaches, but conventional methods of system analysis have not yet dealt with inference principles in sufficient detail.
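The paper's rule set is not reproduced here; purely to fix notation, an inference rule over semantic dependencies has the usual shape, e.g. a hypothetical transitivity rule:

```latex
% Hypothetical example of the shape of a dependency inference rule,
% not one of the paper's results: dependency composes transitively.
\[
  \frac{A \xrightarrow{\mathrm{dep}} B \qquad B \xrightarrow{\mathrm{dep}} C}
       {A \xrightarrow{\mathrm{dep}} C}
\]
```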
Conceptual modelling in software engineering is dominated by the implicit semantics of UML, some other OMG products, and a host of related modelling tools. Although alternative approaches to modelling are seen in academic circles, industry is dominated by this single approach; with no room to improve and no freedom to experiment, stagnation is guaranteed. The paper claims that a technology-free, purely theoretical analysis of models and modelling is necessary in order to find out what real models are about, what the real modelling mechanisms behind software engineering are, and what kind of modelling infrastructure should serve as a foundation for future modelling technologies.
The emergence of Web services represents a shift from the component-based architectures that have proved successful in the context of enterprise computing to service-oriented architectures that are better suited to highly distributed, Internet-based applications. This trend towards service-oriented computing necessitates a re-evaluation of the software development methodologies used in the construction of distributed applications. With the growing acceptance of service-oriented computing and an increasing number of large-scale Web services projects, there is some evidence that practitioners implementing these solutions pay only limited attention to how such applications should be designed. Frequently, the design of Web services applications is driven by performance and scalability considerations rather than by sound software engineering principles. A comprehensive methodological framework is required to guide designers and developers of service-oriented applications through the various phases of the software development life cycle, with specific emphasis on producing stable, reusable, and extendable services.
In this paper we discuss the design of service-oriented applications from a software engineering perspective and propose a software development framework for Web services based on identifying elementary business functions through business-function decomposition and mapping these functions to service operations. We apply interface design principles adapted from object-oriented design as guidelines for the design of services.
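As a hedged illustration of that mapping (the order-management domain and all names are our invention, not the paper's case study), each elementary business function obtained by decomposition becomes one operation of a service interface:

```cpp
#include <string>

// Illustrative sketch: business-function decomposition of "manage orders"
// yields elementary functions, each mapped to one service operation.
// A stable, reusable service groups cohesive operations behind one interface.
class OrderService {
public:
    virtual ~OrderService() = default;
    // elementary business function: record a customer order
    virtual std::string createOrder(const std::string& customerId) = 0;
    // elementary business function: query the status of an order
    virtual std::string orderStatus(const std::string& orderId) = 0;
};
```

Grouping only cohesive operations behind one interface is what the object-oriented interface design principles mentioned above contribute: a service that mixes unrelated business functions is harder to keep stable and reusable as requirements evolve.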