Atsunori Minamikawa, Hamido Fujita, Jun Hakura, Masaki Kurematsu
327 - 335
In this paper, we propose a personality estimation application for a smartphone Twitter client. The application enables users to learn the personality of other Twitter accounts. Personality estimation is performed by applying a text classification method to tweet data. In a demonstration experiment with 44 subjects, the proposed application proved its effectiveness, especially for users in their 20s and for experienced Twitter users in entertainment use. Both subjective and objective evaluations show that personality information in social media contributes to building new connections and to stimulating social media use.
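As a rough illustration of the tweet-based classification step described above, a minimal sketch follows; the abstract does not specify the exact features or classifier, so the naive Bayes model, the labels and the toy tweets below are all illustrative assumptions:

```python
import math
from collections import Counter, defaultdict

# Toy training data: (tweet text, personality label). Labels and texts
# are invented for illustration; the paper's actual scheme is not given here.
train = [
    ("love meeting new people at parties", "extravert"),
    ("had so much fun with friends today", "extravert"),
    ("quiet evening reading alone again", "introvert"),
    ("prefer staying home with a good book", "introvert"),
]

def tokenize(text):
    return text.lower().split()

# Word frequencies per class and class priors.
word_counts = defaultdict(Counter)
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(tokenize(text))

vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    """Multinomial naive Bayes with add-one smoothing."""
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / sum(class_counts.values()))
        for w in tokenize(text):
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("fun party with people"))
```

A real system would of course need far richer features and training data; the sketch only shows the shape of a text-classification pipeline over tweets.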
Two types of ontology have been presented to formalize a patient's state: a mental ontology reflecting the patient's mental behavior due to a certain disorder, and a physical ontology reflecting the observed physical behavior exhibited through the disorder. Medical knowledge is represented by fuzzy attributes. Aggregation functions related to physical attributes are represented as intuitionistic interval fuzzy weighted ordered average operators, while aggregation functions for the mental ontology are represented as Bonferroni intuitionistic interval fuzzy weighted operators. The fuzzy representation based on these two types of ontology is aligned with medical knowledge for decision making related to medical diagnosis, based on a pairing function that computes similarity among medical cases. We have constructed an integrated computerized model which reflects a human diagnostician, and through it an integrated interaction between that model and the real human user (patient) is utilized for first-stage diagnosis purposes.
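The intuitionistic interval fuzzy operators named above extend the classic ordered weighted average (OWA); the sketch below shows only that plain OWA backbone, with invented membership values and weights, not the paper's actual operators:

```python
def owa(values, weights):
    """Ordered weighted average: values are sorted by magnitude before
    weighting, so weights attach to rank positions, not to sources."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical fuzzy membership degrees for three physical symptoms.
memberships = [0.2, 0.9, 0.6]
weights = [0.5, 0.3, 0.2]  # emphasis on the strongest evidence

result = owa(memberships, weights)  # 0.9*0.5 + 0.6*0.3 + 0.2*0.2 ≈ 0.67
print(result)
```

The intuitionistic interval variants replace each scalar membership with a pair of intervals (membership and non-membership) and aggregate those, but the rank-then-weight idea is the same.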
In this paper, we propose an improvement of a decision tree learning algorithm using cluster analysis. We classify the data set based on two relations: the relation between each class and each attribute, and the relation between attributes. The first relation is used in a traditional decision tree algorithm, while the second relation is used in cluster analysis; exploiting the second relation is the key point of our approach. In order to evaluate the approach, we carried out an experiment using data sets from machine learning repositories. Experimental results indicate that our approach can be better than a traditional decision tree learning algorithm.
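A minimal sketch of the two relations mentioned above, on an invented toy data set: information gain captures the class-attribute relation used by traditional decision trees, while mutual information between attributes is one possible measure (an assumption here, not necessarily the paper's) for the cluster-analysis step:

```python
import math
from collections import Counter
from itertools import combinations

# Toy data set: rows of categorical attributes plus a class label.
rows = [
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "sunny", "windy": "no", "play": "no"},
    {"outlook": "rain", "windy": "no", "play": "yes"},
    {"outlook": "rain", "windy": "yes", "play": "no"},
    {"outlook": "overcast", "windy": "no", "play": "yes"},
]

def entropy(values):
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def info_gain(attr, target="play"):
    """Relation 1: class vs. attribute (the classic split criterion)."""
    base = entropy([r[target] for r in rows])
    rem = 0.0
    for v in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == v]
        rem += len(subset) / len(rows) * entropy(subset)
    return base - rem

def mutual_info(a, b):
    """Relation 2: attribute vs. attribute, usable to cluster
    redundant attributes before tree induction."""
    return (entropy([r[a] for r in rows]) + entropy([r[b] for r in rows])
            - entropy([(r[a], r[b]) for r in rows]))

for attr in ("outlook", "windy"):
    print(attr, round(info_gain(attr), 3))
for a, b in combinations(("outlook", "windy"), 2):
    print(a, b, round(mutual_info(a, b), 3))
```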
Tatiana Gavrilova, Vladimir Gorovoy, Ekaterina Bolotnikova
361 - 378
This article presents an approach aimed at creating teaching strategies for e-learning based on the principles of ontological engineering and cognitive psychology. For the last ten years, visual representation of knowledge has been topical in e-learning methodology, and it is closely connected to ontology design and development. The proposed framework aims to develop a methodology in which the design of an ontology is evaluated by assessing its structure with several quantitative metrics. This can scaffold the process of knowledge structuring and of orchestrating teaching ontologies for courseware design. The refining procedure is essential for ontology design and development, and illustrative domain ontologies are helpful in the learning process. The main idea is to use the visual techniques of mind mapping and concept mapping as a powerful learning tool. Cognitive bias and some results of Gestalt psychology form a general guideline, and the ideas of balance, clarity, and beauty are applied to the ontology evaluation procedures. Such an approach can be widely used in enterprise knowledge management systems and in education processes, and can help educators and students create ontologies of high quality.
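The abstract does not list the quantitative metrics used; as an illustration only, here is a sketch computing three plausible structural metrics (depth, average branching factor, and a depth-balance ratio) over a toy teaching ontology:

```python
# Toy concept hierarchy: concept -> subconcepts. The metric definitions
# below are illustrative examples, not the article's actual metrics.
ontology = {
    "Course": ["Theory", "Practice"],
    "Theory": ["Definitions", "Theorems"],
    "Practice": ["Labs", "Projects"],
    "Definitions": [], "Theorems": [], "Labs": [], "Projects": [],
}

def depth(node):
    """Length of the longest path from node to a leaf, counting nodes."""
    return 1 + max((depth(k) for k in ontology[node]), default=0)

def avg_branching():
    """Mean number of children over non-leaf concepts."""
    inner = [n for n, kids in ontology.items() if kids]
    return sum(len(ontology[n]) for n in inner) / len(inner)

def balance(node):
    """1.0 when all subtrees under a node reach the same depth —
    one way to quantify the 'balance' criterion mentioned above."""
    kids = ontology[node]
    if not kids:
        return 1.0
    depths = [depth(k) for k in kids]
    return min(depths) / max(depths)

print(depth("Course"), avg_branching(), balance("Course"))
```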
Vincenzo Moscato, Antonio Picariello, Angelo Chianese
379 - 394
The definition of ontologies within the multimedia domain still remains a challenging task, due to the complexity of multimedia data and the related knowledge. In this paper, we present a novel framework (MOWIS) that aims at realizing a system for building Multimedia Ontologies from Web Information Sources. In particular, we propose: i) a multimedia ontology model that combines both low-level descriptors and high-level semantic concepts; ii) the automatic construction of ontologies using the Flickr web services, which provide images, tags, keywords and sometimes useful annotations describing both the image content and interesting personal information. Finally, we describe an example of automatic ontology generation in a specific domain and present some preliminary experimental results.
Adjuncts contribute to the spatiotemporal anchoring of events and situations described by linguistic expressions, while arguments describe the participants of events and situations. We show how a deterministic parser, based on the recovery of asymmetric relations, enables the identification of adjuncts as possible antecedents to pronouns, whether they are located in the domain of the sentence or in the domain of the discourse. We suggest that information technologies, and natural language processing more generally, should rely on pronominal anaphora resolution based on asymmetric agreement in order to improve their performance.
Roberto Revetria, Alessandro Catania, Barbara Catania, Bruno Filippo Mazzarello
413 - 423
The aim of the paper is to describe a project concerned with the development of a daily monitoring system for elderly people living alone. The system relies on a new non-invasive type of communication based on devices commonly owned by elderly people, in order to reduce the initial cost of deployment. All collected data can then be analyzed by a medical doctor to monitor the current situation of the patient, using open source instruments to generate analyses, reports and data mining tasks.
Igino Genuini, Alessandra D'Ambrosi, Elisa Silvetti, Claudio De Lazzari, Domenico M. Pisanelli, Francesco Fedele
424 - 429
Every year in Italy more than 1000 people under 35 years old are victims of sudden cardiac death (SCD). Such an incidence is comparable to that of other Western countries. Pathologies responsible for SCD should be diagnosed as early as possible in order to enable effective prevention. The pilot study described in this paper is a particular application of telecardiology. It exploits the teleconsulting paradigm in order to perform a screening among high school students aimed at preventing sudden cardiac death.
Application programming for modern heterogeneous systems which comprise multiple accelerators (multi-core CPUs and GPUs) is complex and error-prone. Popular approaches, like OpenCL and CUDA, are low-level and offer no support for the two most complicated issues: 1) programming multiple GPUs within a stand-alone computer, and 2) managing distributed systems that integrate several such computers. In particular, distributed systems require application developers to use a mix of different programming models, e.g., MPI together with OpenCL or CUDA. We propose a uniform approach based on OpenCL for programming both stand-alone and distributed systems with GPUs. Our implementation of the approach comprises two parts: 1) the SkelCL library for high-level application programming on heterogeneous stand-alone computers with multi-core CPUs and multiple GPUs, and 2) the dOpenCL middleware for transparent execution of OpenCL programs on several stand-alone computers connected over a network.
Both SkelCL and dOpenCL are built on top of the OpenCL standard which ensures their high portability across different kinds of processors and GPUs. The dOpenCL middleware extends OpenCL, such that arbitrary computing devices (multi-core CPUs and GPUs) in a distributed system can be used within a single application, with data and program code moved to these devices transparently. The SkelCL library offers a set of pre-implemented patterns (skeletons) of parallel computation and communication which greatly simplify programming for multi-GPU systems. The library also provides an abstract vector data type and a high-level data (re)distribution mechanism to shield the programmer from the low-level data transfers between a system's main memory and multiple GPUs. This paper describes dOpenCL and SkelCL and illustrates how they are used to simplify programming of heterogeneous distributed systems with accelerators.
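SkelCL itself is a C++/OpenCL library; as a language-neutral illustration of the skeleton idea it embodies (the programmer supplies a scalar function, the skeleton handles distribution of an abstract vector across devices), here is a Python sketch with a block-distributed vector and map/zip skeletons. All names and the chunking scheme are assumptions for illustration, not SkelCL's API:

```python
class Vector:
    """Abstract vector, block-distributed across simulated 'devices'."""
    def __init__(self, data, devices=2):
        self.devices = devices
        k = (len(data) + devices - 1) // devices
        # Each "device" owns one contiguous chunk of the data.
        self.chunks = [data[i * k:(i + 1) * k] for i in range(devices)]

def map_skel(f):
    """Map skeleton: apply f elementwise, chunk by chunk (per device)."""
    def run(vec):
        out = Vector([], vec.devices)
        out.chunks = [[f(x) for x in chunk] for chunk in vec.chunks]
        return out
    return run

def zip_skel(f):
    """Zip skeleton: combine two equally distributed vectors with f."""
    def run(a, b):
        out = Vector([], a.devices)
        out.chunks = [[f(x, y) for x, y in zip(ca, cb)]
                      for ca, cb in zip(a.chunks, b.chunks)]
        return out
    return run

def gather(vec):
    """Collect the distributed chunks back into one host-side list."""
    return [x for chunk in vec.chunks for x in chunk]

a = Vector([1, 2, 3, 4])
b = Vector([10, 20, 30, 40])
add = zip_skel(lambda x, y: x + y)
sq = map_skel(lambda x: x * x)
print(gather(sq(add(a, b))))  # [121, 484, 1089, 1936]
```

In SkelCL proper, the customizing function is OpenCL code, the chunks live in GPU memory, and the (re)distribution between devices happens transparently; the sketch only mirrors the programming model.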
Teresa Murino, Riccardo De Carlini, Giuseppe Naviglio
445 - 456
Nowadays, inventory management is a central issue for companies. On the one hand, stock levels that are too high with respect to real requirements may lead to the immobilization of circulating capital; on the other hand, low stock levels may cause loss of sales and consequent customer dissatisfaction. The aim of this paper is the definition of a model based on a customized AHP (Analytic Hierarchy Process) for evaluating the optimal procurement policy for standard items (screws, bolts, screw nuts, etc.). These items are the last sub-codes within the complex bills of materials (BOM) of products manufactured by an important make-to-order company. The model allows assigning to each BOM item the right inventory management policy, depending on its priority and criticality according to the customer requirements.
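A minimal sketch of the AHP priority computation, with an invented 3x3 pairwise comparison matrix (the criteria names and judgments are assumptions, not the paper's): row geometric means approximate the principal eigenvector, and the consistency ratio checks the coherence of the judgments:

```python
import math

# Hypothetical pairwise comparisons over three criteria on Saaty's 1-9
# scale: A[i][j] says how much criterion i dominates criterion j.
criteria = ["criticality", "priority", "cost"]
A = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]

# Approximate the principal eigenvector by normalized row geometric means.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]

# Consistency check: lambda_max via A*w, then CI and CR (RI = 0.58 for n=3).
n = len(A)
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lam - n) / (n - 1)
CR = CI / 0.58  # below 0.1 means the judgments are acceptably consistent

print(dict(zip(criteria, (round(w, 3) for w in weights))), round(CR, 3))
```

In the paper's setting, such weights would then rank BOM items so that each can be assigned a suitable inventory policy.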
The aim of this study is to compare, in terms of cost and safety, two of the most important maintenance policies: Incidental Maintenance and Condition Based Maintenance. This has been accomplished through a simulation-based approach. An optimization process has also been carried out in order to choose the optimal maintenance policy. A scenario analysis of the model's key parameters has been carried out, finding significant results for several production contexts.
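A toy Monte Carlo sketch of the comparison, with invented costs and failure probabilities (the paper's simulation model is far richer): incidental maintenance pays the full breakdown cost on every failure, while condition-based maintenance pays for monitoring but usually catches degradation early:

```python
import random

random.seed(42)

FAIL_P = 0.02          # per-day degradation/failure probability (assumed)
BREAKDOWN_COST = 500   # unplanned repair after run-to-failure
INSPECT_COST = 5       # daily condition-monitoring cost
PLANNED_COST = 100     # planned repair triggered by a condition signal
DETECT_P = 0.9         # chance monitoring catches degradation in time
DAYS = 1000

def incidental():
    """Run to failure: every failure incurs the full breakdown cost."""
    return sum(BREAKDOWN_COST for d in range(DAYS) if random.random() < FAIL_P)

def condition_based():
    """Pay daily monitoring; most degradations become planned repairs."""
    cost = 0
    for d in range(DAYS):
        cost += INSPECT_COST
        if random.random() < FAIL_P:          # degradation event
            if random.random() < DETECT_P:
                cost += PLANNED_COST          # caught early
            else:
                cost += BREAKDOWN_COST        # missed, runs to failure
    return cost

print(incidental(), condition_based())
```

With these assumed parameters, condition-based maintenance is cheaper on average; the point of the paper's scenario analysis is precisely that this ordering depends on the production context.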
Mosè Gallo, Paola Aveta, Giuseppe Converso, Liberatina Carmela Santillo
475 - 496
This paper proposes a System Dynamics approach to managing supply risks. In the Supply Chain Risk Management field, tools such as modeling and computer simulation are assuming an increasingly important role in supporting strategic, tactical and operational business decisions. In particular, they are applied in the risk sharing phase and to evaluate appropriate management and mitigation strategies. From a company's perspective, an SD approach provides a key for interpreting reality through an analysis of how policies and decisions influence and are influenced by the environment, and how these impact the dynamics of available resources. In this context, a model to assess supply risks has been defined. This model considers the operational characteristics of a generic company operating in its supply chain environment, and their interactions. The problem has been approached through several steps with an increasing level of detail: we start by decomposing the supply risk problem into its subparts, in order to understand the mechanisms that make this system complex. The simulation-based approach proposed here allows carrying out an analysis and subsequent mitigation of supply risks, and furthermore provides some counterintuitive advice on how to maximize the company's profitability.
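A minimal stock-and-flow sketch in the System Dynamics spirit, with invented parameters: an inventory stock fed by deliveries and drained by demand, a supplier outage window (the supply risk), and catch-up shipments afterwards:

```python
dt = 1.0            # simulation step (days)
horizon = 30
demand = 10.0       # outflow: units consumed per day
order_rate = 10.0   # normal delivery rate, units/day
inventory = 50.0    # initial stock level
trace = []

for day in range(horizon):
    if 5 <= day < 12:
        delivery = 0.0        # supplier outage: inflow stops
    elif day >= 12:
        delivery = 15.0       # catch-up shipments after the outage
    else:
        delivery = order_rate
    # The stock integrates (inflow - outflow); it cannot go negative.
    inventory = max(0.0, inventory + (delivery - demand) * dt)
    trace.append(inventory)

print(min(trace), trace[-1])
```

Even this toy run exhibits the stockout the paper's far richer model is built to anticipate: the buffer absorbs part of the outage, then hits zero before recovering.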
The aim of this work is to create a simulation model of a manufacturing system operating within the supply chain using a system dynamics approach, taking into account the dynamics of the company system and the factors that may affect performance, so that management can have a useful tool for decision support. The results have shown interesting correlations between management choices and the system outputs.
Raffaele Di Micco, Daniela Rita Montella, Giuseppe Naviglio, Elpidio Romano
518 - 537
In this work, Design of Experiments (DOE) was applied to the verification of a Single Stage Multi Product Kanban System. The study began with the construction of a simulation model in Arena and continued with optimization in OptQuest of the number of kanban cards on a kanban board placed between two stations of a production plant. Finally, the model was tested for system robustness using the DOE approach. Specifically, the aim of this work is to demonstrate the effectiveness of a methodology that, through DOE analysis, identifies the parameters that most influence the output of a simulation model previously implemented in Arena.
In recent years, there has been an increasing interest in sustainable development, which can be regarded both as a new constraint and as an opportunity to achieve a competitive advantage. Several tools, such as the implementation of an Environmental Management System, the adoption of Life Cycle Assessment and Eco-Labelling, may be used to enhance companies' competitiveness. In this context, policy makers are challenged to design effective policies and organizations for exploiting the opportunities that increasing environmental awareness provides. To make this task easier, a holistic and systemic approach based on modeling may be used, so that users understand both the structure and the dynamics of the complex systems in which they operate.
The purpose of this study is to undertake a review of the seaborne coal supply chains for two important power stations in the Mediterranean Sea. We considered four different scenarios. An important aspect of this study has been the consultation with the coal supply chain participants, involving interviews, site visits and workshops, to share current information, medium-term plans, objectives and expectations. The design of experiments (DOE) approach has been employed. DOE is important as a formal way of maximizing the information gained from available resources. It has more to offer than “one change at a time” experimental methods because it allows a judgment on the significance to the output of input variables acting alone, as well as of input variables acting in combination with one another. Using the results of the simulations, a regression analysis has been performed, providing multi-dimensional response surfaces expressing the dependency of throughput and demurrage days on the independent variables considered. The simulation results are described in the final section of the paper.
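A sketch of the DOE-plus-regression idea on a 2^2 full factorial design, with invented factors and a hypothetical throughput response (the study's actual factors and data are not reproduced here): with coded +1/-1 levels, main effects and the interaction are estimated directly from the runs:

```python
# A 2^2 full-factorial design in coded units: every combination of two
# factors at two levels is run once.
runs = [(-1, -1), (1, -1), (-1, 1), (1, 1)]

def response(x1, x2):
    # Hypothetical "throughput" surface with an interaction term;
    # in the study this would come from a simulation run.
    return 50 + 8 * x1 + 3 * x2 + 2 * x1 * x2

ys = [response(x1, x2) for x1, x2 in runs]

# Because the coded design is orthogonal, each regression coefficient of
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 is a simple signed average.
n = len(runs)
b0 = sum(ys) / n
b1 = sum(y * x1 for (x1, _), y in zip(runs, ys)) / n
b2 = sum(y * x2 for (_, x2), y in zip(runs, ys)) / n
b12 = sum(y * x1 * x2 for (x1, x2), y in zip(runs, ys)) / n

print(b0, b1, b2, b12)  # recovers 50, 8, 3, 2 exactly
```

This is the mechanism behind the response surfaces mentioned above: the fitted coefficients quantify how strongly each input, alone or in combination, drives throughput or demurrage days.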
The purpose of this study is to undertake a review of the seaborne coal supply chains for two important power stations in the Mediterranean Sea (associated with two important companies). We have studied four scenarios, each developed in terms of ocean freight assessment and preliminary operating cost estimation. The aim is to identify, analyse and make recommendations on key issues and potential bottlenecks that might result in capacity constraints and/or supply chain inefficiency, thus leading to unnecessary additional costs of the coal delivered to the power plants.