


    Last Edited: 12 days ago

    From Intuition to Immersion: The Resonant Future Self Framework Makes Foresight Felt

    What if we could test-drive our future selves?

    Most scenario workshops stay in the head. The Resonant Future Self Framework flips that script by letting participants embody possible futures through immersive narrative experiences - while their brains and bodies react in real time. Peaks of coherence in EEG and heart-rate variability become “north-star signals” that guide purpose discovery. The method weaves together generative AI, emotional storytelling, and biometric sensing to open a new path toward deeply felt, personal foresight.

    First public presentation – Vienna, May 2025
    The framework was introduced at the Futures4Europe Conference 2025 – Exploring Future-Oriented Collective Intelligence (Vienna, 15–16 May), organised by the AIT Austrian Institute of Technology as part of the Eye of Europe flagship initiative. The presentation focused on the conceptual foundations of the method, shared results from the first single-participant pilot study, and outlined how emotional resonance - measured through real-time physiological signals - can enrich scenario work with an embodied, intuitive layer of decision-making.

    Next stop – AIMEDIA, Venice, July 2025
    From 6–10 July, the project will be featured in the AI in Immersive Media track at the First International Conference on AI-based Media Innovation (AIMEDIA 2025) in Venice.

    The presentation will cover the generative narrative pipeline that produces the six personalised future-self videos, share expanded pilot-study findings (including EEG and HRV analysis), and preview ongoing efforts to partially automate the scripting, visual generation, and video assembly process using tailored AI workflows.
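The three pipeline stages named above (scripting, visual generation, video assembly) can be sketched as a simple chain of functions. Everything here is an illustrative assumption: the stage functions, the `FutureSelfVideo` fields, and the six scenario labels are invented placeholders, not the project's actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class FutureSelfVideo:
    scenario: str
    script: str = ""
    visuals: list = field(default_factory=list)
    assembled: bool = False

def write_script(video: FutureSelfVideo) -> FutureSelfVideo:
    # AI-assisted scripting step (placeholder text generation).
    video.script = f"Narration for the '{video.scenario}' future self."
    return video

def generate_visuals(video: FutureSelfVideo) -> FutureSelfVideo:
    # AI visual-generation step (placeholder: one shot per script sentence).
    video.visuals = [f"shot_{i}" for i, _ in enumerate(video.script.split("."))]
    return video

def assemble(video: FutureSelfVideo) -> FutureSelfVideo:
    # Assembly/editing step; human review and QC would follow this stage.
    video.assembled = bool(video.script and video.visuals)
    return video

# Hypothetical scenario labels standing in for the six personalised futures.
scenarios = ["artist", "researcher", "teacher", "founder", "healer", "explorer"]
videos = [assemble(generate_visuals(write_script(FutureSelfVideo(s))))
          for s in scenarios]
```

The chained-function shape makes the "partial automation" point concrete: each stage can be swapped for a more automated implementation without touching its neighbours, while the human stays in the loop after `assemble`.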

    While human editing and quality control remain essential, these steps point toward a future in which the method becomes faster, more scalable, and easier to adapt across educational, coaching, and wellbeing contexts.

    Read more
    To explore more about the method, goals, and future plans, visit the dedicated project page (link below).

    Posted on: 08/06/2025


Resonant Future Self Framework
February 2025 - May 2027

    AI-Enhanced Immersive Foresight for Purpose Discovery

An experimental methodology that lets people feel alternative life-paths instead of merely thinking about them, by combining AI-generated personal future narratives, immersive audio-visual simulation, and real-time bio-feedback to spot “inner resonance”.

The Resonant Future Self Framework is an experimental methodology that merges strategic foresight, immersive narrative design, biometric sensing and applied neuroscience to help people feel their way into personally meaningful futures. Participants co-create six AI-generated “future-self” stories based on a deep self-inquiry, then step inside those stories. While they watch - and sometimes star in - the simulation, EEG, heart-rate variability, respiration, skin conductance and peripheral temperature are recorded in real time to detect moments of coherence or dissonance between the body and the imagined life path. These physiological “resonance markers” complement post-session journaling to reveal which futures evoke a profound sense of rightness and which trigger resistance or cognitive overload.
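One of the simplest markers of this kind, heart-rate variability, is derived from successive inter-beat (RR) intervals. The sketch below uses RMSSD, a standard time-domain HRV measure; the function name and the sample interval data are illustrative, not taken from the study.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals (ms),
    a standard time-domain heart-rate-variability measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative beat-to-beat intervals (ms). Higher RMSSD is commonly
# associated with calmer, parasympathetically dominated states.
calm = [820, 860, 810, 870, 800, 880]
tense = [700, 705, 698, 702, 699, 703]
assert rmssd(calm) > rmssd(tense)
```

In a setup like the one described, a rise in a measure such as RMSSD during a scenario could contribute to flagging it as a moment of coherence, alongside the EEG-derived signals.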

    The practice unfolds in nine interconnected steps: 

    1. Discover personal strengths, conflicts and aspirations; 

    2. Reflect by generating diverse narrative futures with generative-AI support; 

    3. Select the two scenarios that feel most alive; 

    4. Embody them in immersive media while gathering biometric data; 

    5. Review the combined subjective/objective signals; 

    6. Activate the preferred future through weekly micro-practices in daily life; 

    7. Backcast concrete milestones linking present and desired identity; 

8. Revisit the experience after six to eight weeks; 

9. Test again under identical biometric conditions to measure shifts in alignment.
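The nine steps above form an ordered protocol, so they can be represented as a simple progress tracker. This is purely an illustrative data structure under assumed one-word step names, not the project's software.

```python
# One-word labels for the nine protocol steps (illustrative shorthand).
STEPS = [
    "Discover", "Reflect", "Select", "Embody", "Review",
    "Activate", "Backcast", "Revisit", "Retest",
]

def next_step(completed):
    """Return the next step in the protocol, or None once all nine are done.
    `completed` is the ordered list of steps finished so far."""
    return STEPS[len(completed)] if len(completed) < len(STEPS) else None
```

Encoding the sequence explicitly makes the protocol's shape visible: for example, `next_step(STEPS[:4])` returns `"Review"`, and `next_step(STEPS)` returns `None`.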

    An April 2025 single-participant pilot demonstrated that personalised immersive future-self simulations can indeed generate rich, differentiated biometric signatures - and that those signatures align closely with felt meaning. During two iterations of an AI-generated, audio-visual journey into imagined future identities, EEG, heart-rate, respiration, skin conductance and peripheral temperature were recorded continuously. The aggregated signal showed alternating phases of elevated gamma activity and transient heart-rate acceleration (markers of emotional arousal) followed by stretches of sustained alpha rhythms and increased heart-rate variability (often linked to calm, integrative attention). Immediately after each immersion, the participant completed reflective journaling; the qualitative entries mapped almost point-for-point onto the physiological timeline, suggesting that this form of embodied resonance mapping can translate subjective alignment into measurable data.
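Band-specific activity like the alpha and gamma phases described above is conventionally quantified as spectral power within a frequency band of the EEG signal. A minimal NumPy-only sketch, using a synthetic alpha-dominated signal rather than any real recording:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power of `signal` (sampled at `fs` Hz) in [f_lo, f_hi)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].mean()

# Synthetic 'EEG': a dominant 10 Hz (alpha-band) oscillation plus weak noise.
fs = 256  # assumed sampling rate, Hz
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, 8, 12)   # 8-12 Hz: alpha band
gamma = band_power(eeg, fs, 30, 80)  # 30-80 Hz: gamma band
assert alpha > gamma
```

Comparing band powers over sliding windows is one straightforward way such alternating alpha/gamma phases could be made visible on a timeline alongside the journaling entries.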


    The session also surfaced practical insights - such as the need to calibrate narrative intensity to avoid cognitive overload and to provide structured debrief protocols - that now inform the forthcoming group trials. Although limited to a single case, these early findings validate the technical pipeline and bolster the central hypothesis that combining AI-personalised storytelling with real-time biofeedback offers a powerful pathway to purpose discovery.

Building on that proof-of-concept, the research progresses through three phases: 

(1) Individual immersive foresight to refine protocols; 

(2) Group-based resonance testing to explore social amplification and shared insight; and 

(3) Scalability studies that port the method to mobile-friendly, lightweight XR so schools, universities and coaching programmes can adopt it at low cost.

    Posted on: 08/06/2025