Virtual reality headsets such as head-mounted displays have become ubiquitous, bringing immersive experiences to individual users. People outside the virtual world may want to share the scenes shown on the headset's screen. It is therefore important to merge the real and virtual worlds into a single environment in which physical and virtual objects coexist and interact in real time. We propose the shared augmented virtual environment (SAVE), a mixed reality (MR) system that overlays the virtual world with real objects captured by a Kinect depth camera. We refine the depth map and exploit a Graphics Processing Unit (GPU)-based natural image matting method to extract real objects from cluttered scenes. In the synthetic MR world, we render real and virtual objects in real time and handle depth from both worlds properly. The advantage of our system is that we connect the virtual and real worlds with a bridge controller mounted on the Kinect, so the whole system needs to be calibrated only once before use. Our results demonstrate that the proposed SAVE system can create high-quality 1080p live MR footage, enabling realistic virtual experiences to be shared among many people in potential applications such as education, design, and entertainment.
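The depth handling described above can be illustrated with a minimal per-pixel compositing sketch. This is not the system's actual pipeline; it is a hypothetical NumPy illustration in which `real_depth` and `virt_depth` are assumed depth maps of the matted real footage and the rendered virtual scene, `alpha` is an assumed matte from the matting stage, and the real pixel is blended over the virtual frame only where the real surface is nearer:

```python
import numpy as np

def composite(real_rgb, real_depth, alpha, virt_rgb, virt_depth):
    """Depth-aware compositing sketch: where the real surface lies in
    front of the virtual one, blend the matted real pixel over the
    virtual frame using its alpha; elsewhere keep the virtual pixel."""
    real_in_front = (real_depth < virt_depth)[..., None]  # HxWx1 mask
    a = alpha[..., None] * real_in_front                  # effective alpha
    return a * real_rgb + (1.0 - a) * virt_rgb

# Toy 2x2 frames (RGB in [0, 1], depth in metres) for illustration only.
real_rgb = np.full((2, 2, 3), 0.8)
virt_rgb = np.zeros((2, 2, 3))
real_depth = np.array([[1.0, 3.0],
                       [1.0, 3.0]])       # left column nearer than virtual
virt_depth = np.full((2, 2), 2.0)
alpha = np.ones((2, 2))                   # fully opaque matte

out = composite(real_rgb, real_depth, alpha, virt_rgb, virt_depth)
# Left column (real in front) shows the real pixels;
# right column (real behind) keeps the virtual background.
```

In the actual system this per-pixel depth test would run on the GPU alongside the matting step; the sketch only shows how occlusion between the two worlds can be resolved once both depth maps share one coordinate frame, which is what the one-time calibration via the bridge controller provides.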