About this project
VReactable is a research project conducted at Hogeschool voor de Kunsten Utrecht. Supervised by XR researcher Joris Weijdom as our client, we created a tool designed to inspire XR artists and captivate people without prior experience in XR. The team consists of a programmer, a 3D artist, and an interaction designer. We developed the tool from scratch, covering target-group interviews, product analysis, product design, development, and playtesting. I contributed to the project as a Python developer and participated in every stage of product development. The outcome of this project has been publicly released on GitHub and has received approval from both the client and the supervisors.
VReactable is an interface that serves as a bridge between the virtual reality (VR) realm and the physical world. The tool runs on Resonite, a VR platform that lets users customize their own spaces and write scripts inside them. VReactable provides a tangible interface through which users manipulate a Resonite world by moving cubes in the real world, opening up a new way for people outside VR to interact with VR users.
The tool transforms cube states into parameters. In Resonite, users can apply these parameters to a wide range of uses: for example, a cube's yaw rotation can drive the intensity of a light, the volume of music, or even a color, making it possible to build a dynamic exhibition in the virtual world.
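As a minimal sketch of this idea (hypothetical helper, not the actual VReactable code), a cube's yaw angle can be wrapped and rescaled into whatever range a Resonite parameter expects:

```python
# Hypothetical sketch: map a tracked cube's yaw angle to a normalized
# parameter that could be bound to light intensity, music volume, or a
# color channel in Resonite. Names and ranges here are illustrative.

def yaw_to_parameter(yaw_degrees: float, lo: float = 0.0, hi: float = 1.0) -> float:
    """Map a yaw angle to the range [lo, hi].

    The yaw is first wrapped into [0, 360) so one full turn of the cube
    sweeps the parameter exactly once through its range.
    """
    wrapped = yaw_degrees % 360.0
    return lo + (hi - lo) * (wrapped / 360.0)

print(yaw_to_parameter(90.0))          # quarter turn -> 0.25
print(yaw_to_parameter(450.0, 0, 80))  # wraps to 90 degrees -> 20.0 (e.g. volume)
```

The same wrapping trick works for any periodic input, so turning the cube past a full rotation simply cycles the parameter rather than clamping it.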
To track the objects, we use ArUco markers via OpenCV's aruco module. ArUco markers are a category of fiducial markers designed for easy recognition by computer vision systems, and they are widely used for pose estimation in robotics. These markers consist of black-and-white square patterns with unique IDs that cameras can readily identify.
Recognizing that running Python scripts from the command line might not be intuitive for non-developers, I created a user-friendly GUI for the cube tracker in Python. I also developed a camera calibration tool to improve camera tracking quality.
The tool was presented at several exhibitions inside HKU. We developed a preset that transforms cubes into virtual objects inside the virtual world, exemplifying the potential for multiple users to work together. In this scenario, we constructed a space with various 3D models. To delineate the roles of users on both sides, cube users take responsibility for the large objects in the space and the environment settings, assessing the overall presentation from a top-down view, while VR users focus on the visual experience from a first-person perspective, placing smaller objects such as characters and animals within the space. This collaborative approach empowers users to craft a well-balanced virtual space together in an engaging way.

During the exhibitions, some visitors expressed keen interest in the tool's interaction with the virtual world and proposed intriguing ideas based on its capabilities. For instance, a game design student suggested using the tool for storytelling as a Dungeon Master in a Dungeons & Dragons (DnD) game, and another visitor envisioned using it to control music parameters. Our observations revealed that many artists drew inspiration from the tool during the showcase.