Takashi Yoshinaga

Ph.D in Engineering

Augmented Reality, Visualization, Motion Sensing

LinkedIn, Facebook, Twitter


Let's Talk!

I'm open to talking with everyone! If you are interested in my work, feel free to contact me.


AR Echography

I've proposed using AR (Augmented Reality) technology to support echography by superimposing the scanned cross section and internal organs.

The shape of the internal organs and the probe position/angle, recorded in advance by a skilled physician, are visualized. This system was confirmed to help unskilled physicians acquire echograms.

The cross section is superimposed in the view of a physician wearing an HMD (HoloLens/Magic Leap). This is based on marker-less tracking of the probe and real-time transmission of its position/angle together with the ultrasound image.

AR system for remote instruction of echography. The left image is the interface on the remote doctor's side, and the right image is the AR interface on the patient's side. Instruction of probe operation was achieved with visual information alone.

Dynamic body mark for echography. The procedure for acquiring an echogram is recorded and visualized using volumetric video. This technology is useful not only for archiving medical skills but also for remote collaboration in tele-medicine.

Wearable Motion Sensing

I've developed motion sensing systems using wearable sensors such as IMUs, EMG sensors, and cameras to support sports training and rehabilitation.

IMU sensors and SLAM technology were used together for motion capture. The IMUs provide occlusion-free sensing of joint angles, and the SLAM sensor tracks the spatial position of the human body. See also the demos of full-body tracking and sports sensing.
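As a rough illustration of how the two streams combine: the root position of the body comes from SLAM, the joint angles come from IMUs, and forward kinematics places the limbs. A minimal 2D two-link sketch in Python (the link lengths, angle conventions, and function names are simplifications for illustration, not the actual system):

```python
import math

def arm_endpoint(root, shoulder_angle, elbow_angle, upper_len=0.3, fore_len=0.25):
    """Compute the 2D hand position of a two-link arm.

    root           -- (x, y) shoulder position in world space (from SLAM)
    shoulder_angle -- shoulder joint angle in radians (from an IMU)
    elbow_angle    -- elbow angle relative to the upper arm (from an IMU)
    """
    # Elbow position: rotate the upper arm about the SLAM-tracked root.
    ex = root[0] + upper_len * math.cos(shoulder_angle)
    ey = root[1] + upper_len * math.sin(shoulder_angle)
    # Hand position: the forearm angle accumulates on top of the shoulder's.
    total = shoulder_angle + elbow_angle
    return (ex + fore_len * math.cos(total), ey + fore_len * math.sin(total))

# Arm fully extended and level: the hand sits ~0.55 m ahead of the root.
hand = arm_endpoint((1.0, 2.0), 0.0, 0.0)
```

Because the IMUs only give angles, any drift in absolute position is corrected by the SLAM root, which is the point of using the two sensors together.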


This system was developed to make rehabilitation training of hand motion fun for patients. The virtual character moves forward while the patient's arm is kept horizontal and turns when the hand is bent left or right.
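The control mapping described above can be sketched as a small function: a level arm drives the character forward, and a left/right wrist bend steers it. A minimal Python version (the angle names and thresholds are assumptions for illustration, not the system's actual tuning):

```python
def control_command(arm_pitch_deg, wrist_bend_deg,
                    pitch_tolerance=15.0, bend_threshold=20.0):
    """Map sensed joint angles to a character command.

    arm_pitch_deg  -- arm angle from horizontal (0 = level)
    wrist_bend_deg -- left/right wrist bend (negative = left)
    """
    if abs(arm_pitch_deg) > pitch_tolerance:
        return "stop"          # arm not held level: no forward motion
    if wrist_bend_deg <= -bend_threshold:
        return "turn_left"
    if wrist_bend_deg >= bend_threshold:
        return "turn_right"
    return "forward"
```

Keeping the mapping this simple is what makes the exercise double as training: holding the arm level is itself the rehabilitation task.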

Hobby & Prototyping

Project HoloBox: A perspective effect is achieved on an iPad without a dedicated 3D display. In addition, interaction between the virtual world inside the iPad's display and an AR device such as HoloLens 2 is available. Related works [Link]

Prototype of a system that synchronizes a real door with its portal representation. I believe that a more advanced Mixed Reality will be achieved by synchronizing real and virtual objects as in this demo.

HoloTuberKit. This system enables us to broadcast point clouds over the internet and to visualize them on XR devices. The point cloud viewer is compatible with HoloLens, Nreal Light, Meta Quest, and ARCore devices. You can download these apps from GitHub. [WebRTC] [YouTube]
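Broadcasting a point cloud means serializing each frame into a compact byte stream before handing it to the transport. A hypothetical packing scheme in Python, not the kit's actual wire format: XYZ as 32-bit floats plus RGB as bytes, 15 bytes per point.

```python
import struct

POINT_FMT = "<fffBBB"  # x, y, z (float32) + r, g, b (uint8)
POINT_SIZE = struct.calcsize(POINT_FMT)  # 15 bytes per point

def pack_cloud(points):
    """Serialize [(x, y, z, r, g, b), ...] into bytes for transmission."""
    return b"".join(struct.pack(POINT_FMT, *p) for p in points)

def unpack_cloud(data):
    """Restore the point list on the receiving (viewer) side."""
    return [struct.unpack_from(POINT_FMT, data, offset)
            for offset in range(0, len(data), POINT_SIZE)]

cloud = [(0.1, 0.2, 0.3, 255, 128, 0), (1.0, -1.0, 2.0, 0, 255, 64)]
payload = pack_cloud(cloud)      # 30 bytes for two points
restored = unpack_cloud(payload)
```

A fixed-size binary layout like this keeps per-frame bandwidth predictable, which matters when streaming over a real-time channel such as WebRTC.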

Remote communication system based on volumetric video streaming and AR/VR technology. Users can talk remotely while sharing operations on virtual objects (2D images, video, and 3D models). For this demonstration, Nreal Light was used on both sides.

Spatial clipper. This system extracts the part of the spatial mesh that lies inside a clipping sphere. Furthermore, it can transmit the mesh to another device [Demo]. This enables hands-free scanning and sharing of spatial shapes with remote people.
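The clipping step itself can be sketched simply: keep only the triangles whose vertices all fall inside the sphere, then re-index the surviving vertices. A minimal Python version (the real system operates on HoloLens spatial meshes; the names here are illustrative):

```python
def clip_mesh(vertices, triangles, center, radius):
    """Return the sub-mesh whose triangles lie entirely inside the sphere.

    vertices  -- list of (x, y, z)
    triangles -- list of (i, j, k) indices into vertices
    """
    r2 = radius * radius

    def inside(v):
        return sum((a - b) ** 2 for a, b in zip(v, center)) <= r2

    keep = [t for t in triangles if all(inside(vertices[i]) for i in t)]
    # Re-index so the clipped mesh only carries the vertices it uses,
    # which also shrinks the payload sent to the remote device.
    used = sorted({i for t in keep for i in t})
    remap = {old: new for new, old in enumerate(used)}
    new_vertices = [vertices[i] for i in used]
    new_triangles = [tuple(remap[i] for i in t) for t in keep]
    return new_vertices, new_triangles
```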

Tried to create AR fireworks. An iPad was used as a controller to place the origin of the fireworks, and HoloLens 2 was used as a viewer.

Sending digital data to AR space by shaking a smartphone. A shaking gesture, detected by the smartphone's accelerometer, triggers sending 2D/3D images to the HoloLens. I developed it because I was inspired by SF movies.
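A shake trigger like this is typically detected by watching the magnitude of acceleration beyond gravity and counting threshold crossings within the trace. A hedged Python sketch (the threshold and peak count are illustrative, not the app's actual tuning):

```python
GRAVITY = 9.81  # m/s^2

def detect_shake(samples, threshold=15.0, min_peaks=3):
    """Return True if the accelerometer trace looks like a deliberate shake.

    samples   -- list of (ax, ay, az) readings in m/s^2, gravity included
    threshold -- excess magnitude over gravity that counts as a peak
    min_peaks -- number of peaks that qualifies as a shake
    """
    peaks = 0
    above = False
    for ax, ay, az in samples:
        mag = (ax * ax + ay * ay + az * az) ** 0.5
        if mag - GRAVITY > threshold:
            if not above:        # count each crossing once (rising edge)
                peaks += 1
                above = True
        else:
            above = False
    return peaks >= min_peaks
```

Requiring several peaks rather than one spike keeps an ordinary bump or pickup of the phone from accidentally firing the transfer.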

Seeing a remote environment through a finger frame. Leap Motion is used to detect the gesture and to resize the window, which is shown as an AR image on Meta 2. Please also see the related work. [Link]

Sharing an AR experience among multiple users and devices. In this video, not only the virtual object but also its operation is shared between the HoloLens users and the smartphone that recorded the scene, using bidirectional communication.

Real-time AR Coloring. A square frame is recognized to clip the coloring area, and the textured object is visualized on AR devices. This is available on HoloLens, ARCore devices, Aryzon, and Looking Glass.

Test development of half-mirror AR with interaction between a virtual character and the user's hand. I feel this style of AR enables all users, especially children, to experience optical see-through AR more easily without wearing an HMD.

Transforming an ordinary smartphone into a controller for Looking Glass just by reading a QR code. The devices are linked via Wi-Fi, and the data sent from the smartphone is relayed through a WebSocket server.
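The pairing idea, minus the networking: the QR code carries a session ID, the phone and the Looking Glass app both register under that ID, and the server forwards controller messages to the display. A pure in-memory Python sketch of the routing logic only (the actual system uses a WebSocket server; the class and method names are hypothetical):

```python
class SessionRelay:
    """Route messages from controllers to displays sharing a session ID."""

    def __init__(self):
        self.displays = {}  # session_id -> messages queued for the display

    def register_display(self, session_id):
        """The Looking Glass app shows session_id as a QR code and listens."""
        self.displays[session_id] = []

    def send_from_controller(self, session_id, message):
        """The smartphone, having scanned the QR code, pushes input events."""
        if session_id not in self.displays:
            return False  # no display paired under this ID
        self.displays[session_id].append(message)
        return True
```

Scanning the QR code is the whole pairing step: the phone never needs an app install beyond the browser page the code points to, and the session ID keeps multiple displays from receiving each other's input.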

AR shooter using HoloLens 2 and an EMG sensor. A virtual bullet is shot when the hand is clenched strongly. Wearable sensors like EMG enable new interactions that cannot be achieved by image processing alone.
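A strong grab can be read from the EMG signal by rectifying it, smoothing it into an envelope, and firing once on each rising edge over a threshold. A hedged Python sketch (window size and threshold are illustrative, not the demo's actual values):

```python
def emg_triggers(signal, window=5, threshold=0.5):
    """Return sample indices where a strong grab begins.

    signal    -- raw EMG samples (signed, arbitrary units)
    window    -- moving-average length for the rectified envelope
    threshold -- envelope level that counts as a strong grab
    """
    triggers = []
    firing = False
    rectified = [abs(s) for s in signal]
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1):i + 1]
        envelope = sum(chunk) / len(chunk)
        if envelope > threshold and not firing:
            triggers.append(i)   # rising edge: shoot one bullet
            firing = True
        elif envelope <= threshold:
            firing = False
    return triggers
```

The edge detection matters: without the `firing` latch, a sustained clench would fire a bullet on every sample instead of once per grab.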

Turning a flat wall into a touch panel using depth sensing and image processing. This demo shows interactive control of a virtual peephole for seeing through to the next room, whose image is captured by a web camera.

Scanning the environment using the iPad/iPhone LiDAR sensor, and linking the real world with the virtual world. For example, the avatar of a user in the virtual world appears in the real room, and the placement of objects is synchronized between the two environments.

GitHub Repositories

Hands-on Tutorial Seminars

I've held more than 100 hands-on tutorial seminars on AR content creation in Fukuoka, Japan since 2013. Some of the slides are translated into English.