Architects have long sought to create spaces that relate to, or even induce, specific emotional conditions in their users, such as states of relaxation or engagement. Dynamic or calming qualities have traditionally been given to these spaces by controlling form, perspective, lighting, color, and materiality. The actual impact of these complex design decisions has been challenging to assess, both quantitatively and qualitatively, because neural empathic responses, defined in this paper by feature indexes (FIs) and mind indexes (MIs), are highly subjective experiences. Recent advances in virtual procedural environments (VPEs) and virtual reality (VR), supported by powerful game engine (GE) technologies, provide computational designers with a new set of design instruments that, when combined with brain-computer interface (BCI) and eye-tracking (E-T) hardware, can be used to assess complex empathic reactions. As the COVID-19 health crisis showed, virtual social interaction has become increasingly relevant, and the social catalytic potential of VPEs can open new design possibilities. The research presented in this paper introduces the cyber-physical design of such an affective computing system. It focuses on how relevant empathic data can be acquired in real time by immersing subjects in a dynamic VR-based VPE and assessing their emotional responses while controlling the generative parameters via a live feedback loop. A combination of VR, BCI, and E-T solutions integrated within a GE is proposed and discussed. By using a VPE inside a BCI system that can be accurately correlated with E-T data, this paper aims to identify morphological and lighting factors that, alone or combined, have an empathic effect expressed by the relevant responses of the MIs.
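The live feedback loop described above (BCI signals mapped to mind indexes, which in turn steer the VPE's generative parameters) can be sketched in outline. The following is a minimal illustrative sketch, not the paper's implementation: the band-power-ratio mapping, the `light_intensity` parameter, and all function names are hypothetical placeholders for whatever signal processing and generative controls the actual system uses.

```python
def mind_indexes(eeg_sample):
    """Map a (simulated) EEG sample to hypothetical mind indexes (MIs).

    eeg_sample is assumed to be a (beta_power, alpha_power) pair;
    the engagement/relaxation ratio used here is a common heuristic,
    not the FI/MI definition from the paper.
    """
    beta, alpha = eeg_sample
    engagement = beta / (alpha + beta)
    relaxation = alpha / (alpha + beta)
    return engagement, relaxation


def update_generative_params(params, engagement, target=0.5, gain=0.1):
    """Nudge a hypothetical generative parameter (lighting intensity)
    toward the level that keeps the subject's engagement near a target,
    clamped to [0, 1]."""
    params["light_intensity"] += gain * (target - engagement)
    params["light_intensity"] = min(1.0, max(0.0, params["light_intensity"]))
    return params


def feedback_loop(samples, params):
    """One pass of the closed loop: BCI sample -> MIs -> VPE parameters.

    In the real system this would run continuously inside the game
    engine's update cycle, alongside eye-tracking correlation.
    """
    for sample in samples:
        engagement, _ = mind_indexes(sample)
        params = update_generative_params(params, engagement)
    return params


# Example: high simulated engagement drives the light intensity down.
state = feedback_loop([(0.8, 0.2), (0.7, 0.3)], {"light_intensity": 0.5})
# → state["light_intensity"] == 0.45
```

The sketch only conveys the control-loop structure (sense, index, actuate); the actual system replaces each placeholder with real BCI feature extraction and GE-side generative controls.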