Concurrent Engineering (CE) has been a major topic over the last few decades, aiming at cost and time reduction as well as quality improvement. CE is achieved through the collaboration of various activities, ranging from design disciplines, manufacturing and assembly, and marketing and purchasing all the way to the end users. In this respect, collaboration among people in different activities at different locations is crucial to the success of CE. To support collaboration between two distant sites, various communication tools are available today, including business video conference systems such as Polycom and proprietary Voice over IP services such as Skype. Even though these tools enable remote communication, physically separated participants do not experience a meeting in the same way as a face-to-face one, largely because of a lack of presence and immersion. This study proposes body movement-based interaction during a remote meeting so that participants can feel the presence of remote participants and experience immersion in the virtual space. According to the literature, four movement features potentially influence immersion in virtual space: natural control, mimicry of movements, proprioceptive feedback, and physical challenge. This study focuses on two types of body movement, hand gesture and head movement, to implement body movement-based interaction. Hand gesture covers natural control and mimicry of movement towards a distant object. This study uses a pair of acceleration sensors and an inclined plastic panel for movement-based interaction. The acceleration sensors detect hand motion and position, and the detected signal is interpreted by a signal-recognition algorithm. The plastic panel supports both hands during manipulation, providing comfortable and easy operation.
The panel also projects the hand gesture in three-dimensional space onto a two-dimensional gesture, which makes the detection algorithm easier to design [23]. In a bench-level experiment, hand motion was confirmed to control a distant object in simple manipulation tasks. Head movement covers the physical challenge as well as the mimicry of head movement using a physical object. This study uses a gyro sensor attached near the ear to detect the subject's head movement during conversation. The signal detected by the sensor was used to control a remote robotic arm, which holds a smartphone for Voice over IP communication, so as to mimic the subject's head movement at the remote site. As a result, the presence of the remote subject was perceived significantly more strongly when the phone moved physically than when it was held in a fixed position. The idea of body movement-based interaction was thus proposed and implemented for two types of movement: hand gesture and head movement. Based on the preliminary experiments for these two types, the feasibility of the idea is discussed in this paper.
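The projection of a three-dimensional hand gesture onto the two-dimensional panel plane can be illustrated with a minimal sketch. The code below is not the authors' actual algorithm; the function names, the threshold-based swipe classifier, and the choice of panel normal are all hypothetical, introduced only to show how removing the out-of-plane acceleration component simplifies gesture recognition.

```python
import numpy as np

def project_to_panel(accel_xyz, panel_normal):
    """Project 3-D accelerometer samples onto the 2-D panel plane.

    accel_xyz: (N, 3) array of acceleration samples.
    panel_normal: normal vector of the inclined panel.
    Returns the in-plane components, with the part along the normal removed.
    """
    n = panel_normal / np.linalg.norm(panel_normal)
    # a_planar = a - (a . n) n  for each sample a
    return accel_xyz - np.outer(accel_xyz @ n, n)

def classify_swipe(planar_accel, threshold=0.5):
    """Toy gesture recognizer: label a swipe by the dominant in-plane axis."""
    net = planar_accel.sum(axis=0)
    if abs(net[0]) < threshold and abs(net[1]) < threshold:
        return "none"
    if abs(net[0]) >= abs(net[1]):
        return "right" if net[0] > 0 else "left"
    return "up" if net[1] > 0 else "down"
```

For example, a burst of samples dominated by the +x axis (after projection onto a horizontal panel) would be classified as a rightward swipe; a real system would use a more robust recognizer, but the 3D-to-2D reduction step is the same.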
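The head-movement channel can likewise be sketched in a few lines: integrate the gyro's angular rate into a head angle, then clamp it to the robotic arm's safe range before mimicking it at the remote site. This is a minimal illustration under assumed units (degrees per second, a fixed sample interval, a hypothetical ±60° servo limit), not the paper's implementation.

```python
def integrate_yaw(rates_dps, dt):
    """Integrate gyro yaw rate (deg/s) over samples taken every dt seconds
    into an accumulated yaw angle (deg)."""
    angle = 0.0
    for rate in rates_dps:
        angle += rate * dt
    return angle

def mirror_to_servo(angle_deg, limit=60.0):
    """Clamp the integrated head angle to the robotic arm's safe range and
    return the servo command that mimics the remote subject's head turn."""
    return max(-limit, min(limit, angle_deg))
```

In a real setup the integration would also need drift compensation (e.g., a complementary filter), but the pipeline shown here, rate integration followed by range-limited actuation, is the essential loop that lets the phone-holding arm echo the subject's head movement.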