In this project, I used Runway to create a digital, AI-driven version of myself—a 3D stylized character that not only represents my visual identity, but also reflects my personality, tone, and way of thinking. Rather than treating Runway as a simple video generation tool, I approached it as a system for designing a living persona.

A key part of the process was writing detailed prompts to define the character’s identity and behavior. Instead of using generic instructions, I structured the prompt around personality, voice, and interaction style. For example, I described the character as calm, emotionally aware, and slightly introspective, with a slow and reflective speaking rhythm. I also intentionally avoided instructions like “helpful assistant” or “provide clear answers,” as I found that these would quickly make the character feel generic and detached from my personal identity.
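The structure described above can be sketched in code. This is a hypothetical illustration, not Runway's actual prompt format: the field names (`identity`, `temperament`, `speech`, `avoid`) and the `build_prompt` helper are my own invention, standing in for the plain-text prompt the project actually used.

```python
# Hypothetical sketch of a structured persona prompt. Field names and the
# build_prompt helper are illustrative, not part of any Runway API.

PERSONA = {
    "identity": "a 3D stylized version of the author",
    "temperament": "calm, emotionally aware, slightly introspective",
    "speech": "a slow, reflective speaking rhythm",
    # Generic phrasings deliberately excluded, per the note above:
    "avoid": ["helpful assistant", "provide clear answers"],
}

def build_prompt(persona: dict) -> str:
    """Assemble the trait fields into one character prompt string."""
    lines = [
        f"You are {persona['identity']}.",
        f"Temperament: {persona['temperament']}.",
        f"Speaking style: {persona['speech']}.",
        "Never describe yourself with phrases such as: "
        + ", ".join(f'"{p}"' for p in persona["avoid"]) + ".",
    ]
    return "\n".join(lines)

print(build_prompt(PERSONA))
```

Separating the traits out this way makes it easy to see, at a glance, which parts of the prompt define personality and which parts guard against generic assistant behavior.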

What I found most interesting was that prompt writing in this context feels less like giving commands and more like shaping a personality system. Small changes in wording—such as asking the character to “pause before responding” or “speak with subtle emotional nuance”—had a noticeable impact on how the character behaved. This made the process highly iterative: I continuously refined the prompt to make the character feel more like “me” rather than an AI imitation.
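The iteration loop described above can be sketched as layering one wording tweak at a time onto a base prompt and comparing the results side by side. This is a minimal, hypothetical sketch of that workflow, not any tool-specific API:

```python
# Hypothetical sketch of the refinement loop: each round adds one small
# wording tweak to the base prompt, so variants can be compared directly.

BASE = "Speak calmly and reflectively."

TWEAKS = [
    "Pause before responding.",
    "Speak with subtle emotional nuance.",
]

def variants(base: str, tweaks: list[str]) -> list[str]:
    """Generate one prompt variant per tweak, layered on the base."""
    return [f"{base} {t}" for t in tweaks]

for v in variants(BASE, TWEAKS):
    print(v)
```

Keeping each tweak as a separate variant makes it clear which specific wording change produced which behavioral shift.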

Visually, combining a 3D cartoon version of myself with Runway’s character system created a strong sense of embodiment. The character no longer felt like a static avatar, but more like a presence that could respond, express, and engage. This shift from representation to interaction was one of the most compelling aspects of the tool.

At the same time, I encountered some limitations. Maintaining a consistent and nuanced personality requires careful prompt design, and the system can drift toward generic responses when the instructions are not specific enough. This revealed that the quality of the result depends heavily on how clearly the designer defines the character’s identity.

Overall, this project helped me see Runway not just as a generative AI tool, but as a medium for designing digital identity. It enabled me to experiment with the idea that a person can be represented not by static content, but by an interactive, responsive character. In this sense, the project is less about creating an avatar, and more about exploring a new interface where the self becomes something that can speak, respond, and evolve.