When designing interactive architectural systems and environments, the ability to gather user feedback in real time provides valuable insight into how a system is received and ultimately performs. However, physically testing or simulating user behavior with an interactive system outside its actual context of use is challenging: time constraints intervene, and the substituted conditions rarely reflect the real social, behavioral, or environmental setting. Coupling evidence-based, user-centered design practices from human-computer interaction (HCI) with emerging architectural design methodologies creates new opportunities for improving system performance and design usability in interactive architecture. This paper presents a methodology for developing a mixed reality computational workflow that combines 3D depth sensing and virtual reality (VR) to enable iterative, user-centered design. Using an interactive museum installation as a case study, point cloud data of visitors is observed in VR at full scale and in real time, creating a new kind of design feedback experience. Through this method, designers can virtually position themselves among the installation's visitors, observe actual behaviors in context, and make design modifications on the spot. In essence, the designer and the user share the same prototypical design space in different realities. Experimental deployment and preliminary results of the shared reality workflow demonstrate the viability of the method for the museum installation case study and for future interactive architectural design applications. Contributions to computational design, technical challenges, and ethical considerations are discussed with a view to future work.
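To make the core of the workflow concrete, the sketch below illustrates the central technical step the abstract describes: back-projecting each depth frame into a 3D point cloud and streaming it to a VR client for full-scale viewing. This is a minimal, library-agnostic illustration, not the paper's implementation; the camera intrinsics (FX, FY, CX, CY), the synthetic depth frame, and the VR_HOST UDP endpoint are all assumptions introduced here for demonstration.

```python
# Minimal sketch: convert a depth frame to a point cloud and stream it
# to a VR client over UDP. Sensor, engine, and protocol are illustrative
# assumptions, not the toolchain used in the paper.
import socket

import numpy as np

# Assumed pinhole intrinsics for a 640x480 depth sensor (hypothetical values).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
VR_HOST = ("127.0.0.1", 9000)  # hypothetical VR client endpoint


def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an (H, W) depth image in meters to an (N, 3) point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - CX) * z / FX  # standard pinhole back-projection
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading


def stream_frame(sock: socket.socket, points: np.ndarray) -> None:
    """Send points as float32 XYZ triplets, chunked under the 64 KB UDP limit."""
    data = points.astype(np.float32).tobytes()
    chunk = 4096 * 12  # ~4096 points per datagram (12 bytes per point)
    for i in range(0, len(data), chunk):
        sock.sendto(data[i:i + chunk], VR_HOST)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Stand-in for a live frame from the installation's depth sensor.
    fake_depth = np.random.uniform(0.5, 4.0, size=(480, 640))
    stream_frame(sock, depth_to_points(fake_depth))
```

In a deployed version of such a pipeline, the synthetic frame would be replaced by live sensor frames and the receiving VR application would render the incoming points at 1:1 scale in the designer's headset each frame, which is what allows designer and visitor to occupy the same prototypical space in different realities.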