Resonant, dwelling, unfading
Full-body Shadow Interaction for Collaborative Performance
This work explores the use of full-body shadows as a means of non-traditional sound-based instrumentation. We explore the affordances of shadow-based interaction for performance and as a tool for embodied, immersive dialogue with self and environment. Such affordances include: multi-location, in which one's shadows may be cast onto surfaces at varying distances; morphology based on source angle, position, and reflection, as well as target surface typology; and an immediacy of visible feedback.
In collaboration with: Annais Jasmin Paetsch, Adrian Montufar.
Advised by Professor Kimiko Ryokai of UC Berkeley School of Information.
Shadow Synth
Collaborative Embodied Shadow Synthesizer
The Shadow Synth is a tangible user interface for creating soundscapes with one's shadow. Photocell arrays along the front face gather light values in real time, which are mapped via Ableton to specific track levels. The relationship between light intensity and volume is inverse: the less light a photocell receives, the louder the corresponding track. In this way, shadows become sound-generating.
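The inverse light-to-volume mapping can be sketched as a small function. This is a minimal illustration, not the actual Ableton mapping; the sensor range (0–1023, a typical analog-read range) and function names are assumptions.

```python
def light_to_volume(reading, min_light=0.0, max_light=1023.0):
    """Map a raw photocell reading to a track volume in [0.0, 1.0].

    The mapping is inverse: less light (a deeper shadow) yields a
    louder track. The sensor range here is an assumed analog range.
    """
    # Normalize the raw reading into [0, 1], clamping out-of-range values.
    normalized = (reading - min_light) / (max_light - min_light)
    normalized = min(max(normalized, 0.0), 1.0)
    # Invert: full shadow (no light) -> volume 1.0; full light -> 0.0.
    return 1.0 - normalized

# Example: one photocell array, moving from bright to fully shadowed.
readings = [1023, 700, 300, 0]
volumes = [light_to_volume(r) for r in readings]
```

Each volume would then be sent to its corresponding track level, so a dancer's shadow sweeping across the array fades tracks in and out.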
In collaboration with: Annais Jasmin Paetsch, Billy Kwok, Akash Mahajan, Salih Berk Dinçer, Tee O'Neill, Nichole Chi, Forrest Brandt.
Project for NWMEDIA C262: Theory and Practice of Tangible User Interfaces, taught by Professor Kimiko Ryokai in Fall 2021 at UC Berkeley School of Information.
The Bee Box
Mixed Reality, Nature-based Educational Storytelling
The Bee Box is a mixed reality, nature-based educational storytelling experience meant to be interacted with by two players. The Bee Box consists of a physical, miniature garden box with real soil and moss, and a virtual rendering of this garden box to be explored within a VR headset.
This is a design prototype of meaningful collaborative pathways between physical and digital environments in education contexts. This experience is enabled technologically by bidirectional sensor-based feedback, in which actions done on the physical board result in meaningful changes in the virtual environment, and vice versa.
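The bidirectional feedback loop can be sketched as a shared-state mirror between the physical board and its virtual twin. This is a minimal conceptual sketch; the class and event names are illustrative assumptions, not the project's actual implementation.

```python
class GardenSync:
    """Mirror state changes between a physical garden board and its VR twin."""

    def __init__(self):
        self.physical = {}  # state as sensed on the physical board
        self.virtual = {}   # state as rendered in the VR scene

    def on_physical_event(self, key, value):
        # A sensor on the board fires; propagate the change into VR.
        self.physical[key] = value
        self.virtual[key] = value

    def on_virtual_event(self, key, value):
        # An action in VR fires; drive an actuator/light on the board.
        self.virtual[key] = value
        self.physical[key] = value

sync = GardenSync()
sync.on_physical_event("moss_patch", "touched")  # player touches the real moss
sync.on_virtual_event("bee", "landed")           # VR bee lands; board responds
```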
In collaboration with: Qianyi Chen, Akash Mahajan, Yuting Wang, and Yiyao Yang.
Final project for NWMEDIA C262: Theory and Practice of Tangible User Interfaces, taught by Professor Kimiko Ryokai in Fall 2021 at UC Berkeley School of Information.
Multimodal Learning Analytics for Embodied Design
Using Machine Learning and Recurrence Quantification Analysis
Analyzing how students' sensorimotor behavior shifts over time in response to task-based perception reveals behavioral patterns that indicate phase shifts in conceptual understanding, reflecting the complex and dynamic nature of embodied content-based learning. In applying predictive machine learning techniques to telemetry data, multimodal learning analytics aims to contribute to research-informed education design, as well as improve educational support interventions such as adaptive tutoring systems.
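The core object in recurrence quantification analysis is the recurrence matrix, which marks when a system revisits a previous state. The sketch below is a simplified one-dimensional version (real analyses typically use delay embedding); the radius and the toy signal are illustrative assumptions.

```python
import numpy as np

def recurrence_matrix(series, radius=0.05):
    """Binary recurrence matrix: R[i, j] = 1 when states i and j
    lie within `radius` of each other."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distances
    return (dist <= radius).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points -- a basic RQA measure."""
    return R.mean()

# Toy sensorimotor signal: erratic movement that settles into a
# stable phase, as might happen when a strategy is discovered.
signal = [0.0, 0.9, 0.1, 0.8, 0.5, 0.5, 0.5, 0.5]
R = recurrence_matrix(signal)
rr = recurrence_rate(R)
```

A block of recurrent points in the lower-right of `R` corresponds to the stable phase; tracking such structure over time is what signals a phase shift in behavior.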
In collaboration with Julien Putz.
Final project for Data 244: Data Mining & Analytics, taught by Professor Zachary Pardos in Fall 2021 at UC Berkeley School of Education.
Health-Aware Recipe Generation
Using Long Short-term Memory Recurrent Neural Networks
A novel machine-learning-driven culinary recipe generator that moves beyond simply inputting ingredients. We extend a long short-term memory (LSTM) recurrent model's input to incorporate a given nutritional profile as well as a sample of ingredients. The result is not only a relevant and sensible recipe, but one tailored to an individual's specific dietary needs.
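One common way to condition a sequence model on a fixed profile is to concatenate that profile onto every timestep's input vector. The sketch below shows this input construction only (no model); the shapes, names, and profile features are illustrative assumptions, not the project's exact architecture.

```python
import numpy as np

def build_conditioned_inputs(ingredient_embeddings, nutrition_profile):
    """Concatenate a fixed nutritional profile onto each timestep's
    ingredient embedding, forming an LSTM's conditioned input sequence.

    Shapes: ingredient_embeddings (T, d_embed), nutrition_profile (d_nutr,).
    """
    T = ingredient_embeddings.shape[0]
    # Repeat the profile so every timestep sees the dietary target.
    tiled = np.tile(nutrition_profile, (T, 1))              # (T, d_nutr)
    return np.concatenate([ingredient_embeddings, tiled], axis=1)

# Toy example: 3 ingredient tokens with 4-dim embeddings, plus a
# 2-dim profile (e.g. normalized calorie and protein targets).
emb = np.random.rand(3, 4)
profile = np.array([0.3, 0.8])
inputs = build_conditioned_inputs(emb, profile)  # shape (3, 6)
```

Because the profile is present at every step, the recurrent model can shape each generated token toward the dietary target rather than only the ingredient context.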
In collaboration with Anirudhan Badrinath and Noah Reiner.
Final project for Data 244: Data Mining & Analytics, taught by Professor Zachary Pardos in Fall 2021 at UC Berkeley School of Education.
A tool for developing emotional granularity
Feeels is an interactive emotion wheel that supports emotional expression, intelligence, tracking, and self-awareness.
Full-stack, serverless web app. Built using React, D3.js, Next.js, Node.js, Express, MongoDB, Styled Components, ChakraUI, and ESNext. Hosted on Vercel.
For making sense of it all
The Insight Map is an ongoing collaboration between Mikey Siegel and myself, emerging from a need to make sense of the myriad tools, techniques, approaches, and experiences that make up our lives and quest for well-being. This tool is intended as an aid in understanding associations between emotional, somatic, and mental experiences, actions, and contexts, by visualizing the relationships of vocal and written notes as a network topology.
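One simple way to derive such a network topology is to link any two notes that share a tag. The sketch below builds an edge list this way; the tag vocabulary and note names are illustrative assumptions, not the Insight Map's actual data model.

```python
from itertools import combinations

def build_note_graph(notes):
    """Link notes that share at least one tag.

    `notes` maps note id -> set of tags (emotional, somatic, mental,
    action, or context labels). Returns (a, b, shared_tags) edges.
    """
    edges = []
    for (a, tags_a), (b, tags_b) in combinations(notes.items(), 2):
        shared = tags_a & tags_b
        if shared:
            edges.append((a, b, sorted(shared)))
    return edges

# Toy notes with hypothetical tags.
notes = {
    "morning-walk": {"somatic", "calm"},
    "journal-entry": {"mental", "calm"},
    "breathwork": {"somatic"},
}
edges = build_note_graph(notes)
```

The resulting edge list can then be handed to a force-directed layout (e.g. in D3.js) so clusters of related experiences become visible.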
Full-stack, serverless web app. Built using React, Next.js, and D3.js.
What does joy feel like for you? Where in the body do you feel joy? Can you taste joy? Hear, smell, touch it? What images emerge when you think of joy?
I am fascinated by the spectrum of what emotions feel like for each of us. How diverse the experiences of joy, of sadness, of love, of fear can be. We are all human, yet we embody and embrace our humanness uniquely as our own.
Full-stack, serverless web app. Built using React, Next.js, and D3.js.
The Dharma Action Network is a space to connect and explore the breadth of possible Dharma responses to our climate crisis.
JAMStack site using Contentful as a headless CMS. Built using React, Gatsby.js, Styled Components, Node.js, and ESNext. Hosted on Netlify.
The San Francisco Dharma Collective is a sangha-led Buddhist community, located in the Mission district of San Francisco.
JAMStack site with dynamic functionality using Netlify serverless functions. Built using React, Gatsby.js, Styled Components, Node.js, and ESNext. Hosted on Netlify.
XyloFun is a non-profit organization working to make music education more accessible to the most under-resourced schools in South Africa.
JAMStack site using Contentful as a CMS. Built using React, Gatsby.js, Emotion, Rebass, ThemeUI, and ESNext. Hosted on Netlify.