Ebook: New Trends in Software Methodologies, Tools and Techniques
Software is the essential enabling means for science and the new economy. It helps us to create a more reliable, flexible and robust society. But software often falls short of our expectations. Current methodologies, tools, and techniques remain expensive and are not yet sufficiently reliable, while many promising approaches have proved to be no more than case-by-case oriented methods. This book contains extensively reviewed papers from the tenth International Conference on New Trends in Software Methodologies, Tools and Techniques (SoMeT_11), held in Saint Petersburg, Russia, in September 2011. The conference provides an opportunity for scholars from the international research community to discuss and share research experiences of new software methodologies and techniques, and the contributions presented here address issues ranging from research practices, techniques and methodologies to proposed and reported solutions for global world business. The emphasis has been on human-centric software methodologies, end-user development techniques and emotional reasoning, for an optimally harmonized performance between the design tool and the user. Topics covered include the handling of cognitive issues in software development, so as to adapt it to the user's mental state, and intelligent software design utilizing new aspects of conceptual ontology and semantics reflected in knowledge base system models. This book provides an opportunity for the software science community to show where we are today and where the future may take us.
Software is the essential enabler for science and the new economy. It creates new markets and new directions for a more reliable, flexible and robust society. It empowers the exploration of our world in ever more depth. However, software often falls short of our expectations. Current software methodologies, tools, and techniques remain expensive and are not yet sufficiently reliable for a constantly changing and evolving market, and many promising approaches have proved to be no more than case-by-case oriented methods.
This book explores new trends and theories which illuminate the direction of developments in this field, developments which we believe will lead to a transformation of the role of software and the integration of science into tomorrow's global information society. By discussing issues ranging from research practices, techniques and methodologies to proposing and reporting the solutions demanded by global world business, it offers an opportunity for the software science community to think about where we are today and where we are going.
The book aims to capture the essence of a new state of the art in software science and its supporting technology, and to identify the challenges that such a technology will have to master. It contains extensively reviewed papers presented at the tenth International Conference on New Trends in Software Methodologies, Tools and Techniques (SoMeT_11), held in Saint Petersburg, Russia, in collaboration with Saint Petersburg University, from September 28–30, 2011 (http://www.somet.soft.iwate-pu.ac.jp/somet_11/). This round of SoMeT celebrates the 10th anniversary of the conference series.
Previous related events that contributed to this publication are: SoMeT_02 (the Sorbonne, Paris, 2002); SoMeT_03 (Stockholm, Sweden, 2003); SoMeT_04 (Leipzig, Germany, 2004); SoMeT_05 (Tokyo, Japan, 2005); SoMeT_06 (Quebec, Canada, 2006); SoMeT_07 (Rome, Italy, 2007); SoMeT_08 (Sharjah, UAE, 2008); SoMeT_09 (Prague, Czech Republic, 2009) and SoMeT_10 (Yokohama, Japan, 2010).
This conference brought together researchers and practitioners to share their original research results and practical development experience in software science and related new technologies.
This volume forms part of the proceedings of that conference and of the SoMeT series as a whole.
A major goal of this work was to assemble the work of scholars from the international research community to discuss and share research experiences of new software methodologies and techniques. One of the important issues addressed is the handling of cognitive issues in software development to adapt it to the user's mental state. Tools and techniques related to this aspect form part of the contribution to this book. Another subject raised at the conference was intelligent software design in software security and programme conversions. The book also investigates other comparable theories and practices in software science, including emerging technologies, from their computational foundations in terms of models, methodologies, and tools. This is essential for a comprehensive overview of information systems and research projects, and to assess their practical impact on real-world software problems. This represents another milestone in mastering the new challenges of software and its promising technology, addressed by the SoMeT conferences, and provides the reader with new insights, inspiration and concrete material to further the study of this new technology.
The book is a collection of carefully selected papers refereed by the reviewing committee and covering:
• Software engineering aspects of software security programmes, diagnosis and maintenance
• Static and dynamic analysis of software performance models
• Software security aspects and networking
• Agile software and lean methods
• Practical artefacts of software security, software validation and diagnosis
• Software optimization and formal methods
• Requirement engineering and requirement elicitation
• Software methodologies and related techniques
• Automatic software generation, re-coding and legacy systems
• Software quality and process assessment
• Intelligent software systems design and evolution
• Artificial intelligence techniques in software engineering and requirement engineering
• End-user requirement engineering, programming environment for Web applications
• Ontology, cognitive models and philosophical aspects on software design
• Business oriented software application models
• Model Driven Development (MDD), code centric to model centric software engineering
• Cognitive Software and human behavioural analysis in software design.
All papers published in this book have been carefully reviewed, on the basis of technical soundness, relevance, originality, significance, and clarity, by up to four reviewers. They were then revised on the basis of the review reports before being selected by the SoMeT_11 international reviewing committee.
This book is the result of a collective effort from many industrial partners and colleagues throughout the world. I would like to acknowledge my gratitude to JSPS (Japanese Society for the Promotion of Science) and SANGIKYO Co., for their sponsorship and support. Also, my thanks go to Iwate Prefectural University, Saint Petersburg University, and all those who have contributed their invaluable support to this work. Most especially, I want to take this opportunity to thank the reviewing committee and all those who participated in the rigorous reviewing process and the lively discussion and evaluation meetings which resulted in the selected papers which appear in this book. Last but not least, I would also like to thank the Microsoft Conference Management Tool team for their expert guidance on the use of the Microsoft CMT System as a conference-support tool during all the phases of SoMeT_11.
The editors
Software development is a dynamic process where demands for change have been recognized to be inevitable. All forms of modifications to the user requirements cause software to change. The sources of changes could vary considerably and come from the users, business, technology, and organizations. Changes to requirements give rise to an intrinsic volatility, which is claimed to impact on many aspects of software development. Requirements Volatility (RV) is claimed to be a major source of risk to the management of software projects. Investigating the sources of, reasons for, and impacts of requirements changes is an important prerequisite for understanding the characteristics of requirements volatility. In this talk, I will discuss the nature of requirements changes and describe the results of our longitudinal study of requirements volatility. The results of this study have improved our understanding of this complex and multifaceted phenomenon and have provided valuable empirical evidence for the impacts of RV resulting in important insights for more effective management of requirements.
Components have been introduced in order to support software reuse. Components are reusable building blocks for larger systems, consisting of units (code pieces) and their construction plan. Technologies like COM+, .Net, CORBA, JavaBeans and EJBs (or web services, which may be considered as web-based components) have been developed to support the component concept. However, components still suffer from unavailable or unreliable documentation of their properties. Moreover, a lack of validation support concerning the behavior of components or composed systems may be observed. In other words, there is a lot of knowledge about how to combine components technically, but less experience in validating the interactions between the components.
In this paper we introduce an approach to model components and their compositions as a basis for automatic validation techniques to check the component composition and interactions. The validation aims at deriving the component models from the implementation, which allows the specifications to be compared with the real component code.
This paper proposes a new verification method for individual web services specified using OWL-S, the Ontology Web Language for Semantic Web Services, where probabilities are introduced to model uncertain choices. The approach makes use of a probabilistic model checker named PRISM. The web service description is received as input in the form of OWL-S, which is analyzed and automatically transformed into a Discrete Time Markov Chain (DTMC) or a Markov Decision Process (MDP). The obtained DTMC/MDP is then processed and coded into the PRISM language. This code is parsed and verified by the PRISM model checker to determine which required properties are satisfied by the web service. In addition to atomic processes, different composite processes are considered, including sequence, choice, if-then-else, repeat-while, repeat-until, split, and split-join. We also prove that the transformation algorithm is sound and complete, and introduce a software tool implementing the approach.
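The pipeline this abstract describes rests on standard probabilistic model checking over a DTMC. As a purely illustrative sketch (not the paper's tool, and independent of PRISM), the probability of eventually reaching a target state in a small DTMC can be computed by iterating the usual linear fixed-point equations; all state names and probabilities below are invented:

```python
# Illustrative DTMC reachability: probability of eventually reaching 'done'
# from each state, computed by iterating x_s = sum_t P(s,t) * x_t with
# x_done fixed at 1. States and probabilities are invented for illustration.

def reach_probability(transitions, target, iters=1000):
    """transitions: {state: {successor: prob}}; returns {state: P(reach target)}."""
    states = set(transitions) | {t for succ in transitions.values() for t in succ}
    x = {s: (1.0 if s == target else 0.0) for s in states}
    for _ in range(iters):
        for s in states:
            if s != target and s in transitions:
                x[s] = sum(p * x[t] for t, p in transitions[s].items())
    return x

# A toy "choice" process: from 'start' the service succeeds with 0.8
# or retries with 0.2; a retry succeeds with 0.5 or fails with 0.5.
dtmc = {
    "start": {"done": 0.8, "retry": 0.2},
    "retry": {"done": 0.5, "fail": 0.5},
}
probs = reach_probability(dtmc, "done")
print(round(probs["start"], 3))  # 0.8 + 0.2*0.5 = 0.9
```

A model checker like PRISM verifies such reachability properties (and much richer ones) symbolically rather than by naive iteration, but the underlying semantics is the same.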
This paper discusses the principles of documents for the systems design of a software system without a sequential nature. It reports firstly on the design aspect arising from "human intentional activity", and secondly on the documentary aspect for human perception. Emphasis is put on diagrams, charts and tables, in essentially the same situation as hardware.
The development of Web applications should be supported by business professionals themselves since Web applications must be modified frequently based on their needs. This paper describes the end-user-initiative development of Web applications based on two approaches. The first approach uses domain-specific frameworks which are effective for UI-driven systems. The second approach uses visual modeling tools which are useful for model-driven systems such as workflow systems. Both approaches suppose the three-tier architecture of user interface, business logic and database. After these approaches were applied to the development of some applications, the composite approach was considered for a typical application which should be easily developed and frequently modified by end-users.
Despite numerous advances in software engineering, the software development community as a whole still suffers from delayed and canceled software projects. The basic idea of this paper is that this is, besides other potential reasons, also due to too little focus on how people participating in software processes interact. These interactions seem crucial to the success of software processes, and in fact they are very often only poorly supported.
In this paper we explain why we believe that interactions require more attention in software process modeling and support, which kinds of typical interactions we have identified, how they can be mapped to findings from the field of psychology and which kind of instruments might be suited to support them.
We consider a challenging class of emerging, highly interactive virtual environments which we call Real-Time Online Interactive Applications (ROIA). Popular examples include multi-player online computer games, e-learning and training applications based on real-time simulations, etc. ROIA combine tough demands on the level of scalability and highly intensive user interactivity with real-time QoS requirements on distributed performance. A major problem in this context is the efficient and economic utilization of server resources for distributed service provision, which is very difficult to achieve due to the variable numbers of users. In this paper, we propose a novel combination of two platforms that address this challenge by utilizing Cloud Computing for ROIA provision: we extend the Real-Time Framework (RTF) by the novel Cloud resource management system RTF-RMS. RTF is a high-level development platform that provides suitable management and scalability mechanisms for ROIA. RTF-RMS is a resource management system on top of RTF that implements workload analysis and distribution techniques. We illustrate how RTF-RMS interacts with RTF and describe how both platforms are used together to scale application servers up and down during runtime to conform to a changing number of users.
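Scaling servers up and down with a changing user population, as the abstract describes, is commonly driven by load thresholds. The following is a generic sketch of such a policy, not RTF-RMS itself; the thresholds, capacity, and function names are invented for illustration:

```python
# Generic sketch of threshold-based server scaling for a variable user
# population: add a server when the average load per server exceeds an
# upper threshold, release one when it falls below a lower threshold.
# Thresholds and capacities are invented; RTF-RMS's actual policy differs.

def scale(servers, users, capacity=100, upper=0.8, lower=0.3):
    """Return the new server count for the given number of users."""
    load = users / (servers * capacity)
    if load > upper:
        servers += 1          # scale up before the servers saturate
    elif load < lower and servers > 1:
        servers -= 1          # scale down to save resources
    return servers

print(scale(servers=2, users=170))  # 3 (load 0.85 > 0.8)
print(scale(servers=3, users=80))   # 2 (load ~0.27 < 0.3)
print(scale(servers=2, users=100))  # 2 (load 0.5, steady)
```

The gap between the two thresholds provides hysteresis, so the system does not oscillate between adding and removing a server as the user count fluctuates near a single boundary.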
This paper deals with management issues concerning employees who are working apart from the employer. Since management activities are highly dependent on the supervisor's individual capabilities and performance, IT-based management is proposed for improving the productivity of employees working remotely. The Timed-PDCA concept, which has been developed for target achievement, is then applied to the IT-based management system. Some unique ideas for at-a-glance visualization of workers' activities and for report-writing navigation that encourages workers to seriously apply their minds are presented.
Due to the advancement of the Internet, more applications are being developed over heterogeneous technologies and platforms. Reusability and interoperability are therefore important aspects of developing and exposing computational services, which is why Enterprise Application Integration is important in order to leverage high reusability and interoperability among disparate applications and services. Pattern-based design is an effective way to avoid the expensive process of reinventing, rediscovering and revalidating agnostic software artifacts. In this paper, we propose a modification of the traditional Model-View-Controller (MVC) design pattern and explore other composite patterns for efficient integration of applications and services. The key point is that the demarcation of the Functional (View) and Implementation (Model) tasks can be achieved deliberately by introducing an Integrator (Controller). The Controller's capabilities can be significantly enriched by encapsulating certain non-functional activities such as security, reliability, scalability, and request routing. This enhancement enables the separation of the Integration Logic from the Functional Logic (Client Application) and the Implementation Logic (Service). In our approach, the Controller can be considered as a compound design pattern of the Enterprise Service Bus (ESB). To demonstrate how the modified MVC pattern can be used for developing a heterogeneous software environment, we are developing an e-Learning Computational Cloud (eLC2) software environment. This paper discusses the peculiarities of the modified MVC design pattern, the main components of the system architecture, and how integration of the software components is realized by using the Dependency Injection pattern to achieve higher reusability and interoperability of service-oriented applications.
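The core idea of the abstract, a Controller mediating between View and an injected Model while absorbing non-functional concerns, can be sketched in a few lines. This is a minimal illustration of the general pattern, not the eLC2 implementation; every class, method, and user name below is invented:

```python
# Minimal sketch of the modified MVC idea: the Controller mediates between
# a client-facing View and an injected Model (service), adding a
# non-functional concern (a toy authorization check) and routing.
# All class and method names are invented for illustration.

class CourseModel:
    """Implementation logic: a stand-in for a remote service."""
    def lookup(self, course_id):
        return {"id": course_id, "title": f"Course {course_id}"}

class Controller:
    """Integration logic: routing plus non-functional concerns."""
    def __init__(self, model, allowed_users):
        self.model = model            # dependency injection: the model is supplied, not constructed
        self.allowed = set(allowed_users)
    def handle(self, user, course_id):
        if user not in self.allowed:  # non-functional concern: security
            return {"error": "forbidden"}
        return self.model.lookup(course_id)  # routing to the implementation

class View:
    """Functional logic: presentation only; knows nothing of the model."""
    def __init__(self, controller):
        self.controller = controller
    def render(self, user, course_id):
        result = self.controller.handle(user, course_id)
        return result.get("title", result.get("error"))

view = View(Controller(CourseModel(), allowed_users=["alice"]))
print(view.render("alice", 42))  # Course 42
print(view.render("eve", 42))    # forbidden
```

Because the model is injected rather than instantiated inside the Controller, a different service implementation (a remote web service, a mock for testing) can be substituted without touching the View or the integration logic.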
One of the reasons why a project fails is that the project manager cannot correctly determine the skill level of the people in charge of tasks, and as a result, the tasks are not correctly assigned. In this research, we studied an inference system that determines the validity of task allocation based on a user model of the person, in which system experience and skills acquired in the past are accumulated. The inference system determines from the degree of system similarity whether a skill that the person acquired in the past can be used for the current task. In this paper, we propose a method for calculating the degree of similarity between the current system and systems experienced in the past, based on a set of system attributes for each skill. In addition, we developed a prototype system based on this model, evaluated it by using data from an actual development project, and confirmed its effectiveness.
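A degree of similarity between a current system and past systems, computed from attribute sets, can be illustrated with a simple set-overlap (Jaccard) measure. This is only a sketch of the general idea, not the paper's actual formula, and the attribute and project names are invented:

```python
# Sketch of attribute-set similarity between a current system and past
# systems, using the Jaccard coefficient |A ∩ B| / |A ∪ B|. The paper's
# exact measure may differ; all attribute names here are invented.

def jaccard(a, b):
    """Similarity of two attribute sets, in [0, 1]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

current = {"web", "java", "rdbms", "agile"}
past_projects = {
    "billing-system": {"web", "java", "rdbms", "waterfall"},
    "sensor-firmware": {"c", "embedded", "rtos"},
}
for name, attrs in past_projects.items():
    print(name, round(jaccard(current, attrs), 2))
```

A high score suggests that skills acquired on the past system are likely to transfer to the current task, while a low score flags the assignment for the manager's attention.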
URDAD, the Use-Case, Responsibility Driven Analysis and Design is a service-oriented methodology used by requirements engineers to produce the Computation Independent Models (CIMs) of the Model Driven Architecture (MDA) with sufficient detail and precision that they can be used directly as Platform Independent Models (PIMs). The analysis and design process is supported by a metamodel specifying the modeling semantics and a concrete grammar used to capture URDAD models. In this paper we identify quality criteria for the resultant requirements specification and for the process itself. For each quality criterion we identify a set of quality drivers and show how quality drivers are embedded within the URDAD methodology.
A knowledge management system (KMS) should efficiently handle a large amount of information that is in most cases related to user experiences and situations which can affect the performance of the KMS. Users are usually keen to look for the knowledge that best fits their situation and domain of work, which changes constantly as situations are handled. The outline presented in this paper facilitates the detection of knowledge that fits the current problem solving, by estimating the user situation model along with the user profile as a coherent model that is reflected in the knowledge base in a transparent manner. This contributes to determining the best-fitting workflow, representing a semantic alignment between user profile and knowledge as a pair-wise representation of an optimized workflow, which in turn yields better performance through the semantic integration of the user model and related knowledge. This is a status report paper.
We propose a framework for argumentation-based negotiation where agents' constraints, such as budget and time, play a key role in determining the set of offers agents can make. Each offer is supported by arguments, and each agent tries to achieve an agreement using arguments to persuade the opponent to make concessions. An agreement is achieved when the last offer is accepted. Two concession and two acceptance strategies are identified, and their different combinations result in four types of negotiating agents. The paper discusses completeness, Pareto optimality, and Nash equilibrium results with respect to these types.
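The way constraints bound the offers agents can make, as the abstract describes, can be seen in a toy monotone-concession exchange. This sketch omits the argumentation component entirely and uses invented numbers and step sizes; it only illustrates how a budget and a reserve price delimit the space of possible agreements:

```python
# Toy sketch of constraint-bounded concession: a buyer raises its offer and
# a seller lowers its asking price by fixed steps; each is bounded by its
# constraint (budget / reserve price). Agreement is reached when the last
# offer is acceptable to the opponent. Numbers and step sizes are invented.

def negotiate(buyer_offer, seller_ask, budget, reserve, step=10):
    rounds = 0
    while rounds < 100:
        if seller_ask <= buyer_offer:                     # seller's ask acceptable to buyer
            return seller_ask
        buyer_offer = min(buyer_offer + step, budget)     # buyer concedes, bounded by budget
        if buyer_offer >= seller_ask:                     # buyer's offer acceptable to seller
            return buyer_offer
        seller_ask = max(seller_ask - step, reserve)      # seller concedes, bounded by reserve
        rounds += 1
    return None  # constraints prevent agreement

print(negotiate(buyer_offer=50, seller_ask=120, budget=100, reserve=80))   # 90
print(negotiate(buyer_offer=10, seller_ask=200, budget=40, reserve=150))   # None
```

In the second call the buyer's budget (40) never reaches the seller's reserve (150), so no agreement is possible, which is exactly the kind of outcome the paper's completeness results characterize for its four agent types.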
Programming in pictures is an approach where pictures and moving pictures are used as an algorithmic alphabet to represent algorithms. Super-characters of this alphabet are used to represent algorithmic steps (called Algorithmic CyberFrames), which are assembled into special series to represent algorithmic features. A number of these series are assembled into an Algorithmic CyberFilm. The filmification of methods has been applied to a large variety of algorithms to test the expressive features of the pictures for representing computation. In addition, cognitive aspects of programming in pictures, and embedded clarity annotations supporting the approach and visual inspections by other people, have also been analyzed. In this paper we focus on features of the algorithmic picture language and the filmification modeling environment which can be used for automatic and/or interactive checking of application model correctness. An overview of different sources of information about the same features of the application model is considered, and concrete examples of automatic checking are provided.
Embodied Ubiquitous Learning Games (UELG) represent a new genre of technology-enhanced education in which learners interact with an augmented physical environment to learn. The embodiment in such games comes from instrumented clothes based on e-Textiles incorporating sensors, actuators and low-power wireless modules. Designing and building such ubiquitous systems requires a complex interplay of conventional and embedded hardware and software co-design that includes novel elements like e-Textiles. In order to reduce the cost and shorten the development life-cycle, this paper proposes the use of Software Product Lines (SPL). Towards this end, the paper borrows ideas from architecture to present and apply a framework for constructing feature models for a Software Product Line for UELG. The framework explicitly uses different types of features related to pedagogy, technology and the domain of learning. A family of UELG based on a combination of problem posing and problem solving is used as an example to explain the various components of the framework.
The talk presents the author's view on the main trends in modern knowledge engineering (KE) and the teaching of KE. The teaching framework is targeted at the development of skills that will facilitate the process of knowledge elicitation and structuring for intelligent systems design and development. The role of ontologies is underlined.
The paper presents a practical approach for the development of educational ontologies. The process of knowledge structuring and ontology design is described, with special stress laid on visual design: visualization enhances the cognitive aspect of ontology usage. An example of a hierarchical structure of ontologies for teaching IT project management is considered.
Business architecture has become a well-known tool for business transformations. According to a recent study by Forrester, 50 percent of the companies polled claimed to have an active business architecture initiative, whereas 20 percent were planning to engage in business architecture work in the near future. However, despite the high interest in business architecture, there is not yet a common understanding of its main concepts. There is a lack of a business architecture framework which provides a complete metamodel, suggests a methodology for business architecture development, and enables tool support for it. The ORG-Master framework is designed to solve this problem, using an ontology as the core of the metamodel. This paper describes the ORG-Master framework, its implementation and dissemination.