This paper explores the use of augmented reality, and its ability to reveal deep (hidden) technologies, as a means of creating more effective tools for developers and for students of embedded computing, a field typified by topics such as the Internet of Things, pervasive computing and robotics. This approach aims to enrich the experience of developers and learners by constructing a meaningful view of the invisible technologies around us. We therefore propose and explain a general model, called ViewPoint, which comprises several components: a learning design specification, collaborative environments, augmented reality, physical objects and centralised data. To support the proposed model, we present a 4-dimensional learning activity task (4DLAT) framework, which has helped us structure our research into several phases, allowing us to scale up from single-learner discrete tasks to group-learner sequenced tasks, based on the proposed scenario. As a first step towards these ambitions, this study focuses on developing Internet-of-Things systems based on a small self-contained eco-system of networked embedded computers known as Buzz-Boards.