Ebook: New Trends in Software Methodologies, Tools and Techniques
Software is the essential enabler for science and the new economy. However, current tools and methodologies remain expensive and not yet sufficiently reliable for a changing and evolving market, while many approaches which seemed promising have proved to be little more than case-oriented solutions. This book contains 30 extensively reviewed papers from SoMeT_10, the international conference on new trends in software methodology, tools and techniques held in Yokohama, Japan, and is the latest volume in the SoMeT series. The annual SoMeT conference gives researchers and practitioners worldwide a chance to meet and share research results and practical developments in software science and related new technologies. This year, the emphasis has been on human-centric software methodologies, end-user development techniques and emotional reasoning. Subjects covered range from all aspects of software security, optimization and assessment to intelligent software systems, cognitive models and the philosophical aspects of software design. Exploring issues from research practices, techniques and methodologies, and proposing and reporting the solutions needed for global business, this book offers the software science community an opportunity to reflect on where it stands today and how it can work towards an optimal harmony between design tools and end-users.
Software is the essential enabler for science and the new economy. It creates new markets and new directions for a more reliable, flexible and robust society. It empowers the exploration of our world in ever greater depth. However, software often falls short of our expectations. Current software methodologies, tools, and techniques remain expensive and are not yet sufficiently reliable for a constantly changing and evolving market, and many promising approaches have proved to be no more than case-specific methods.
This book explores new trends and theories which illuminate the direction of developments in this field, developments which we believe will lead to a transformation of the role of software and science integration in tomorrow's global information society. By discussing issues ranging from research practices, techniques and methodologies to the solutions needed for global business, it offers the software science community an opportunity to think about where we are today and where we are going.
The book aims to capture the essence of a new state of the art in software science and its supporting technology, and to identify the challenges that such a technology will have to master. It contains extensively reviewed papers presented at the Ninth International Conference on New Trends in Software Methodologies, Tools and Techniques (SoMeT_10), held in Yokohama, Japan, in collaboration with SANGIKYO Co., from September 29th to October 1st, 2010 (http://www.somet.somet.iwate-pu.ac.jp/somet_10/).
This conference brought together researchers and practitioners to share their original research results and practical development experience in software science and related new technologies.
This volume is part of the conference and of the SoMeT series.
Previous related events that contributed to this publication are: SoMeT_02 (the Sorbonne, Paris, 2002); SoMeT_03 (Stockholm, Sweden, 2003); SoMeT_04 (Leipzig, Germany, 2004); SoMeT_05 (Tokyo, Japan, 2005); SoMeT_06 (Quebec, Canada, 2006); SoMeT_07 (Rome, Italy, 2007); SoMeT_08 (Sharjah, UAE, 2008); SoMeT_09 (Prague, Czech Republic, 2009) and SoMeT_10 (Yokohama, Japan, 2010).
This book, and the series of which it forms part, will continue to contribute to and elaborate on new trends and related academic research studies and developments at SoMeT_11 in Germany.
A major goal of this book was to assemble the contributions of scholars from the international research community to discuss and share research experiences of new software methodologies and techniques. One of the important issues addressed is the handling of cognitive issues in software development, to adapt it to the user's mental state. Tools and techniques related to this aspect form part of the contribution to this book. Another subject raised at the conference was intelligent software design in software security and program conversions. The book also investigates other comparable theories and practices in software science, including emerging technologies, from their computational foundations in terms of models, methodologies, and tools. This is essential for a comprehensive overview of information systems and research projects, and for assessing their practical impact on real-world software problems. This represents another milestone in mastering the new challenges of software and its promising technology, addressed by the SoMeT conferences, and provides the reader with new insights, inspiration and concrete material to further the study of this new technology.
The book is a collection of 30 carefully refereed papers selected by the reviewing committee and covering:
• Software engineering aspects of software security programs, diagnosis and maintenance
• Static and dynamic analysis of software performance models
• Software security aspects and networking
• Agile software and lean methods
• Practical artefacts of software security, software validation and diagnosis
• Software optimization and formal methods
• Requirement engineering and requirement elicitation
• Software methodologies and related techniques
• Automatic software generation, re-coding and legacy systems
• Software quality and process assessment
• Intelligent software systems design and evolution
• Artificial intelligence techniques in software engineering and requirement engineering
• End-user requirement engineering and programming environments for Web applications
• Ontology, cognitive models and philosophical aspects of software design
• Business oriented software application models
• Model-driven development (MDD), from code-centric to model-centric software engineering
• Cognitive Software and human behavioural analysis in software design.
All papers published in this book have been carefully reviewed, on the basis of technical soundness, relevance, originality, significance, and clarity, by up to four reviewers. They were then revised on the basis of the review reports before being selected by the SoMeT_10 international reviewing committee.
This book is the result of a collective effort by many industrial partners and colleagues throughout the world. I would like to acknowledge my gratitude to the JSPS (Japan Society for the Promotion of Science) and SANGIKYO Co. for their sponsorship and support. My thanks also go to Iwate Prefectural University, ARISES, and all the others who have contributed their invaluable support to this work. Most especially, I thank the reviewing committee and all those who participated in the rigorous reviewing process and the lively discussion and evaluation meetings which led to the papers selected for this book. Last but not least, I would also like to thank the Microsoft Conference Management Tool team for their expert guidance on the use of the Microsoft CMT System as a conference-support tool during all the phases of SoMeT_10.
The editor
This paper discusses the importance of requirements documents and the reasons that the requirements documentation methods commonly applied in industrial software development are inadequate. The use of functional methods and tabular expressions for producing precise requirements documentation is explained and illustrated. This includes:
• An explanation of the two-variable model, common in mechanical and electrical engineering, showing how it can be used to document computer system requirements.
• An explanation of why the two-variable model, though theoretically useful for software systems, is not practical for them.
• A description of a four-variable model that is practical for software requirements documentation (sketched after this list).
• A way of structuring the requirements document to make the requirements easier to specify and review.
• A discussion of the concept of mode and an explanation of how modes and mode classes can be used in requirements documentation.
Finally, we explain how a well-structured requirements document can be used to organize the software so that it will be easy to maintain and there will be clear traceability between code and requirements.
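As a rough sketch of the models mentioned above (our own rendering of the standard formulation, with relation names that are not necessarily the paper's), the two-variable model treats the requirements as a relation between environmental quantities, while the four-variable model adds the input and output variables actually visible to the software:

    % Two-variable model: relations over monitored (M) and controlled (C) quantities
    \mathit{NAT} \subseteq M \times C   % constraints imposed by nature
    \mathit{REQ} \subseteq M \times C   % behaviour the system must enforce
    % Four-variable model: input (I) and output (O) variables mediate between
    % the environment and the software
    \mathit{IN}  \subseteq M \times I   % sensors: monitored values to inputs
    \mathit{OUT} \subseteq O \times C   % actuators: outputs to controlled values
    \mathit{SOF} \subseteq I \times O   % the behaviour of the software itself
    % Acceptability: composing sensors, software and actuators must yield
    % behaviour permitted by the requirements
    \mathit{OUT} \circ \mathit{SOF} \circ \mathit{IN} \subseteq \mathit{REQ}

The practical point is that SOF is stated only in terms of I and O, which the software can actually observe and set, while REQ remains stated in terms of the environmental quantities M and C.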
Managing and developing a set of software products jointly using a software product line approach has achieved significant productivity and quality gains in the last decade. Increasingly, product lines are themselves becoming entities that are sold and bought in the software supply chain: customers build more specialized product lines on top of them, or derive the concrete products themselves. As customers have different requirements, whole product lines may vary depending on customer needs, i.e., they need to be customized. Current approaches going beyond the scope of one product line do not provide appropriate means for customization: they are either tailored to specific implementation techniques, regard customization on only a few levels (e.g., only the source code level), or imply a lot of manual effort for performing the customization.
The PLiC Approach tackles this challenge by providing a generic, reusable reference architecture and methodology for implementing such customizable product lines. In the reference architecture, a product line consists of so-called product line components (PLiCs), which are flexibly recombinable slices of a formerly monolithic product line, thereby maintaining strict separation of concerns. The approach furthermore comprises a tool-supported methodology for recombination of PLiCs based on customer needs and thus minimizes manual intervention when customizing. We implemented the PLiC Approach for a complex model-driven product line, where it facilitates comprehensive customization on various levels in the models, the model transformation chain, and in the source code with reasonable effort. This gives evidence that our approach can be applied in various other contexts where the same or fewer customization levels need to be considered.
This paper considers and analyzes the idea of a "magic wand" as an approach to representing new technologies, their implementability, application and use in different object domains. The magic wand approach, in its currently implementable form, is suggested as the standard for the representation of any new technology. How to describe a new technology, and how to accumulate and control the active knowledge base, constitute the main subjects of the paper.
A multilevel approach to realize a self-explanatory representation of symbols, variables, language constructs and software components is considered. Some aspects related to the implementation of the approach within the concept of Filmification of methods are provided. In particular, super-symbols and constructs of Language of Integrated view and Language of Algorithmic interFaces of a cyber-Film programming environment are analyzed and examples of embedded-clarity support are presented.
In this paper we tackle the question of how the early phases of software projects can be improved by applying methodologies provided by psychological science. We present an approach for developing business information systems, called No-Frills Software Engineering, which benefits from systemic approaches. These approaches make it possible to resolve communication and interaction issues that still persist in the early phases of software projects and are normally not addressed by conventional software process models.
Impact analysis is the activity of assessing the effect of making a set of changes to a software system. Many approaches have been developed, including performing impact analysis on a high-level model that is reflected in low-level analysis using class interaction prediction. However, analysis based on the model yields false results, because not all interactions between classes have an impact on one another. In this paper we introduce a new impact analysis approach that is able to filter out some of these false results using a set of impact prediction filters. The contributions of the paper are: (1) a new impact analysis approach; (2) a new set of impact prediction filters; and (3) evaluation results that show the new impact analysis approach improves the accuracy of the prediction results.
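As a purely illustrative sketch of how such prediction filters might be applied (the filter criteria and names below are our assumptions, not the filters actually defined in the paper), a filter pipeline could discard predicted impacts that no evidence supports:

    # Illustrative sketch only: the filter criteria are hypothetical examples,
    # not the impact prediction filters defined in the paper.

    def calls_filter(changed, candidate, ctx):
        # Keep the candidate only if it calls (or is called by) the changed class.
        return candidate in ctx["call_graph"].get(changed, set())

    def state_filter(changed, candidate, ctx):
        # Keep the candidate only if it accesses state of the changed class.
        return changed in ctx["field_access"].get(candidate, set())

    FILTERS = [calls_filter, state_filter]

    def filter_impact_set(changed, candidates, ctx):
        # Discard predicted impacts (false results) that no filter can justify.
        return [c for c in candidates if any(f(changed, c, ctx) for f in FILTERS)]

    # Example: class "Order" changed; "Invoice" interacts with it, "Logger" does not.
    ctx = {"call_graph": {"Order": {"Invoice"}}, "field_access": {}}
    print(filter_impact_set("Order", ["Invoice", "Logger"], ctx))  # ['Invoice']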
A large set of partial data models is used in designing a large information system. These partial data models provide several complementary views of the system to be developed. This, however, creates a need for composition mechanisms able to produce a single integrated model. Such data models are often described by a class diagram of the Unified Modeling Language, because it is a very popular modeling language for describing a static view of a system. In this paper, we present the syntax and semantics of a class diagram describing a data model. We propose a family of well-formed class diagrams as the domain of a class diagram algebra, with composition operations such as merge and difference. We then show that algebraic properties such as associativity, commutativity and involutivity are desirable for model management when developing a large information system.
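To illustrate the kind of algebraic laws in question (our own notation, writing \oplus for merge and \ominus for difference; the paper's exact definitions may differ), the desired properties take roughly this form:

    % Merge should be associative and commutative, so the order in which
    % partial class diagrams are integrated does not matter:
    (D_1 \oplus D_2) \oplus D_3 = D_1 \oplus (D_2 \oplus D_3)
    D_1 \oplus D_2 = D_2 \oplus D_1
    % One plausible reading of the difference law: subtracting a
    % merged-in diagram recovers the original partial model:
    (D_1 \oplus D_2) \ominus D_2 = D_1

Laws of this shape are what make it safe to integrate many partial views into a single model, and to undo an integration step, without depending on the order in which the views were contributed.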
Fitting information systems to business needs is considered equally important by both the Requirements Engineering and MIS communities. Even though alignment/fit clearly appears desirable, a number of issues remain unsolved. This paper gives fitness a central position and introduces the notion of a fitness relationship and its measurement. It highlights two sets of issues, one involved in understanding this relationship and the other in engineering it. It also points out broad directions and trends in resolving these issues.
In this paper, an application software system called the “sales funnel” is proposed to improve sales force performance measurement and training for B2B business. With this software, weekly updates of prospect information at the source by sales representatives allow real-time sharing of information throughout a corporation and visualization of the entire sales process. This reduces data preparation time before corporate-level strategic discussions. Enforcing weekly updates through this system shortens PDCA (Plan, Do, Check, Act) cycles, thereby accelerating sales activities. Moreover, repeated PDCA cycles with enforced logical thinking give essential on-the-job training to the sales representatives, nurturing them into better performers.
In order to improve the information system development process, many companies have adopted system development methodologies (SDMs). Various evaluation methods and frameworks exist in the literature to assist companies during the adoption phase. However, a need was identified to evaluate the efficiency of a system development methodology after adoption. In an era in which the usefulness of system development methodologies is being reappraised, it is essential to have a valid and reliable method of evaluation. The aim of this paper is to propose an evaluation model to measure the post-implementation efficiency of a software development methodology. Data Envelopment Analysis, a linear programming method, is investigated as a means to evaluate the efficiency of an SDM after adoption. This paper focuses on the Scrum software development methodology, as it is a popular methodology in use today. With Data Envelopment Analysis it is possible to classify different companies' use of Scrum as efficient or inefficient. The results also make it possible to identify specific areas in a company that need improvement. These individual recommendations can be applied to increase the post-implementation efficiency of a company's system development methodology.
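For reference, the classical ratio form of Data Envelopment Analysis (the standard CCR model; textbook material, not specific to this paper) evaluates a unit's efficiency as the best achievable ratio of weighted outputs to weighted inputs:

    \max_{u,v}\; \theta_0 = \frac{\sum_r u_r\, y_{r0}}{\sum_i v_i\, x_{i0}}
    \quad \text{s.t.} \quad
    \frac{\sum_r u_r\, y_{rj}}{\sum_i v_i\, x_{ij}} \le 1
    \;\; \text{for every unit } j, \qquad u_r, v_i \ge 0

Here x_{ij} and y_{rj} are the inputs and outputs of unit j; in a study like this the unit would be a company's use of Scrum, with inputs and outputs such as team effort and delivered functionality (our illustrative choices, not necessarily the paper's). A unit is classified as efficient when \theta_0 = 1 even under its most favourable weights.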
Use cases were introduced into the Unified Modeling Language to capture functional requirements in object-oriented systems development. This work reports the results of comparing two controlled experiments conducted on use case models with different subjects, in which the effect of use case format on users' understanding of system requirements is assessed. Replication with subjects having different levels of knowledge of the use case technique allowed us to investigate whether subjects' experience plays any role in the comprehension of use case models. The results of the controlled experiments showed that, for comprehension tasks which required only a surface understanding of the use case model, providing diagrams along with the textual use case descriptions significantly improved the comprehension performance of both novice and high-knowledge users. However, diagrams had no effect on users' performance in the deep understanding tasks. Moreover, there was no evidence that prior experience with use cases influenced subjects' performance in surface and deep understanding tasks in familiar and unfamiliar application domains.
A new data management method called ISSEI was proposed in a previous report [1] to avoid the slowdown of data retrieval in databases whose data volume inherently keeps growing. ISSEI's data handling performance, namely adding, retrieving, updating and deleting data, was evaluated. Through the evaluation, it was confirmed that ISSEI's data handling performance does not depend on the database size, while conventional DBMSs such as MS SQL Server require more time for data handling as the database grows [6]. This paper deals with evaluations of ISSEI's performance limitations in system operation and maintenance, with a view to applying ISSEI to a corporate knowledge database with growing data volume. Through the evaluations, interesting results were obtained. One is that ISSEI's data-volume-independent performance is maintained up to the upper capacity of the hard disk. Another is that ISSEI's data recovery performance is drastically degraded by the length of data records. From these two results, a conclusion is drawn as to whether ISSEI can be applied to databases with growing data volume.
The software lifecycle is the process by which software is conceived, developed, maintained, and decommissioned. For the development team, initiating effective application lifecycle management (ALM) is challenging for three reasons: (1) defining ALM is hard, since lifecycle activities are interdependent and complex in nature, involving product, project, people, process, tools and technology; (2) ALM activities require the support of correctly tailored tools; (3) effective execution of ALM activities requires discipline. To take on these three challenges, we present a new approach called Rapid Application Lifecycle Management (RALM). RALM provides a reference model with a number of templates for ALM activity definition. Once customized, the templates are converted into platform-specific process definition files with tool support. Observations from a field application of RALM are presented and discussed.
Displaying a target object (window) in full screen is a popular function in computer use. With this function, however, much of the content of the last active object is hidden by another object when the user switches the active object. As a result, cognitive resources are consumed in remembering the hidden information. In this paper, we propose three ideas for keeping the content of the active object displayed. The first is to keep the desired object at the front of the screen. The second is to list files while displaying the active object. The third is to automatically set the size and position of an object to the size and position most frequently used for each file. In addition, this paper describes how to apply these ideas to the control-icon, previously proposed by the authors, which can easily handle multiple objects.
Firewalls are crucial elements in enforcing network security policies. They have been widely deployed for securing private networks, but their configuration remains complex and error-prone. In recent years, many techniques and tools have been proposed to configure firewalls correctly. However, most existing works are informal and do not take into account the global performance of the network or the quality of its services (QoS). In this paper we introduce a formal approach that makes it possible to configure a network formally and optimally, so that a given security policy is respected while QoS is taken into account.
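As a small, informal illustration of the kind of misconfiguration that formal approaches detect (the toy rule model below, with port ranges only, is our simplification and not the paper's formalism), consider checking whether an earlier rule shadows a later one so that the later rule can never fire:

    # Illustrative sketch: detecting shadowed firewall rules.
    # A rule is shadowed if an earlier rule matches all of its traffic
    # but takes a different action. Port-range-only rules are a toy
    # model, not the formalism used in the paper.
    from dataclasses import dataclass

    @dataclass
    class Rule:
        src_ports: range
        dst_ports: range
        action: str  # "accept" or "deny"

    def covers(a, b):
        # True if port range a contains all of port range b.
        return a.start <= b.start and b.stop <= a.stop

    def shadowed(rules):
        # Return indices of rules fully masked by an earlier, conflicting rule.
        result = []
        for j, later in enumerate(rules):
            for earlier in rules[:j]:
                if (covers(earlier.src_ports, later.src_ports)
                        and covers(earlier.dst_ports, later.dst_ports)
                        and earlier.action != later.action):
                    result.append(j)
                    break
        return result

    rules = [Rule(range(0, 65536), range(80, 81), "deny"),
             Rule(range(1024, 2048), range(80, 81), "accept")]  # never fires
    print(shadowed(rules))  # [1]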
Although we now enjoy exchanging information electronically, paper is still a useful medium for communication. From this viewpoint, a printer is a fascinating device for connecting cyberspace to the real world. Printer devices, however, remain large and heavy because they require paper and ink, and it is still difficult to carry them around. In this paper, we propose an advanced peer-to-peer remote printing system for ubiquitous computing, PrinterSurf, which provides a flexible printing service anytime, anywhere via computer networks, using PCs and closely located printer devices. This paper introduces our research on PrinterSurf as well as some issues to be dealt with in the future.
This paper focuses on service composition based on the security properties of services, from an end-user perspective. End users are usually not experts in computer security, but they are expert users of computer software. They typically either own or work for small and medium enterprises (SMEs). The proposed framework attempts to demonstrate that end users in small enterprises can compose a service-based application on the basis of the security profiles of software services. The paper argues that the security concerns of the various stakeholders of services should be specified differently. The paper envisions a framework with which end users can select services consistent with the preferred security features suitable for their businesses. By the same token, consumers of such applications can easily understand the security profiles of services in order to make a B2B transaction. This will give end users more power to push service developers to offer more security-aware services. The main contribution of this paper is a framework on which further work can be initiated.
This paper deals with networked applications in the emerging field of online virtual worlds. Example applications include Massively Multiplayer Online Games (MMOG), networked e-learning, training and simulations. Highly interactive virtual worlds bring a new security challenge: the multiple participants of such applications do not always show cooperative and intended behavior, but may act in an illegal way (cheating) that is harmful to other participants, intentionally or accidentally procuring illicit advantages for themselves. The paper studies the new challenge of cheating in virtual-world applications in three areas: a) system and application programming, b) economics, and c) law. The main contributions of our work are as follows: 1) we present a systematic classification of cheating threats in virtual worlds and describe software solutions that help prevent them in future Internet-based applications; 2) we enhance the classical economic analysis of crime and punishment in order to apply it to virtual worlds; 3) we describe our development approach for networked virtual worlds and its implementation as the Real-Time Framework (RTF), designed at the University of Muenster; 4) finally, we explore the legal aspects of cheating in virtual applications in the context of the legal system in Germany. Considering the informatics aspects together with the corresponding problems of economics and law allows us to tackle virtual-world security in a holistic, systematic manner.
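For context, the classical economic model of crime and punishment that such an analysis builds on (Becker's expected-utility formulation, standard material rendered in our notation; the paper's enhancement for virtual worlds goes beyond it) weighs the gain from an offence against the chance and severity of punishment:

    % Expected utility of committing an offence:
    EU = p \, U(Y - f) + (1 - p) \, U(Y)

Here Y is the gain from the offence (e.g., an illicit in-game advantage), p the probability of detection and punishment, f the punishment, and U the offender's utility function; cheating is deterred when EU falls below the utility of honest play, which is why both detection mechanisms (raising p) and sanctions (raising f) matter for virtual-world security.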