Song Layerer is a small Node.js-based tool I made for experimental audio composition. Inspired by Every Recording of Gymnopédie 1, it time-stretches multiple audio tracks to a uniform duration and layers them together, producing dense, overlapping textures.
Key features include automatic download of top search results from YouTube, support for user-selected audio files, time-stretching capabilities, and layering of multiple audio files into a single output.
Example outputs include:
Clair de lune
Philip Glass's Metamorphosis
Holst's The Planets: Jupiter
Built using Node.js, ffmpeg, yt-dlp, and Rubberband. Hosted on GitHub.
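Under the hood, the core step is a stretch-then-mix pass. A minimal sketch, assuming ffmpeg/ffprobe and the rubberband CLI are on PATH and inputs are already WAV; the file names and the average-duration target are illustrative, not the actual source:

```js
// Illustrative sketch of the stretch-and-layer core, not the actual source.
import { execFileSync } from "node:child_process";

// Ask ffprobe for a file's duration in seconds.
function duration(file) {
  const out = execFileSync("ffprobe", [
    "-v", "error", "-show_entries", "format=duration",
    "-of", "csv=p=0", file,
  ]);
  return parseFloat(out.toString());
}

function layer(inputs, outFile) {
  // Uniform target duration: here, the average input length (an assumption).
  const target = inputs.reduce((sum, f) => sum + duration(f), 0) / inputs.length;

  // Time-stretch every input to exactly `target` seconds.
  const stretched = inputs.map((file, i) => {
    const wav = `stretched-${i}.wav`;
    execFileSync("rubberband", ["--duration", String(target), file, wav]);
    return wav;
  });

  // Layer the stretched tracks into a single mix.
  execFileSync("ffmpeg", [
    "-y",
    ...stretched.flatMap((f) => ["-i", f]),
    "-filter_complex", `amix=inputs=${stretched.length}:duration=longest`,
    outFile,
  ]);
}

layer(["take1.wav", "take2.wav", "take3.wav"], "layered.wav");
```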
Pipeline for gathering music recommendations from Reddit posts
This project is a Next.js application that scrapes music-related information from Reddit posts and comments. It uses AI to analyze and extract artist, album, and song names from the scraped content.
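The extraction step could be sketched with the Vercel AI SDK's structured output, roughly as below; the schema fields, prompt, and model choice are illustrative rather than the app's exact code:

```js
// Illustrative sketch of the AI extraction step; schema, prompt, and model
// are assumptions, not the app's exact code.
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const recommendationSchema = z.object({
  recommendations: z.array(
    z.object({
      artist: z.string(),
      album: z.string().optional(),
      song: z.string().optional(),
    })
  ),
});

export async function extractRecommendations(redditText) {
  const { object } = await generateObject({
    model: openai("gpt-4o-mini"),
    schema: recommendationSchema,
    prompt:
      "List every artist, album, and song recommended in this Reddit " +
      `thread:\n\n${redditText}`,
  });
  return object.recommendations;
}
```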
This is a work in progress. I want to expand the scope of the project to include more features, such as the ability to look up further information about the artists, albums, and songs, and link to playback sources.
Built using Next.js, the Vercel AI SDK, and the OpenAI API. Hosted on Vercel.
Browser extension integrating RateYourMusic with Spotify
This browser extension seamlessly integrates RateYourMusic with Spotify, allowing users to easily find and add albums from RYM lists to their Spotify playlists. It adds Spotify buttons next to album entries on RYM list pages, enabling quick searches and direct access to albums on Spotify.
Key features include Spotify authentication, RYM page integration, album search functionality, and direct Spotify access. The extension supports both Chrome and Firefox using Manifest V3.
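The page-integration idea can be sketched as a content script like the one below. The DOM selectors are hypothetical stand-ins for RYM's actual markup, and this simplified version opens a plain Spotify search URL rather than going through the extension's authenticated API flow:

```js
// Illustrative content-script sketch: add a Spotify button to each album
// entry on an RYM list page. Selectors are hypothetical.
function addSpotifyButtons() {
  document.querySelectorAll(".list_entry").forEach((entry) => {
    const artist = entry.querySelector(".list_artist")?.textContent?.trim();
    const album = entry.querySelector(".list_album")?.textContent?.trim();
    if (!artist || !album) return;

    const button = document.createElement("button");
    button.textContent = "Open in Spotify";
    button.addEventListener("click", () => {
      const query = encodeURIComponent(`${artist} ${album}`);
      window.open(`https://open.spotify.com/search/${query}`, "_blank");
    });
    entry.appendChild(button);
  });
}

addSpotifyButtons();
```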
Future improvements include implementing playlist creation directly from RYM lists, adding support for other streaming services, and enhancing the UI/UX.
Built using JavaScript, WebExtension API, Spotify Web API, and browser-polyfill. Compatible with Chrome and Firefox.
Resonant, dwelling, unfading
Full-body shadow interaction for collaborative performance
This work explores the use of full-body shadows as a means of non-traditional, sound-based instrumentation. We examine the affordances of shadow-based interaction for performance and as a tool for embodied, immersive dialogue with self and environment. Such affordances include: multi-location, in which one's shadow may be cast on multiple surfaces at varying distances; morphology shaped by source angle, position, and reflection, as well as target surface typology; and an immediacy of visible feedback.
In collaboration with: Annais Jasmin Paetsch, Adrian Montufar. Advised by Professor Kimiko Ryokai of UC Berkeley School of Information.
The Shadow Synth is a tangible user interface for creating soundscapes with one's shadow. Photocell arrays along the front face capture light values in real time, which are mapped via Ableton to individual track levels. Light intensity and volume are inversely related: the less light a photocell receives, the louder its track plays. In this way, shadows become sound-generating.
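The mapping itself is simple to sketch. In the snippet below, `readPhotocell` and `setTrackVolume` are hypothetical stand-ins for the sensor read and the message into Ableton, where the real mapping lives:

```js
// Illustrative sketch of the inverse light-to-volume mapping.
const NUM_CELLS = 8; // assumed size of the photocell array

function lightToVolume(light, min = 0, max = 1023) {
  // Normalize the raw reading, clamp, then invert:
  // less light (deeper shadow) -> louder track.
  const normalized = (light - min) / (max - min);
  return 1 - Math.min(Math.max(normalized, 0), 1);
}

// Hypothetical stubs so the sketch runs on its own.
const readPhotocell = () => Math.floor(Math.random() * 1024);
const setTrackVolume = (cell, vol) =>
  console.log(`track ${cell} volume -> ${vol.toFixed(2)}`);

setInterval(() => {
  for (let cell = 0; cell < NUM_CELLS; cell++) {
    setTrackVolume(cell, lightToVolume(readPhotocell(cell)));
  }
}, 50); // poll ~20x per second
```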
In collaboration with: Annais Jasmin Paetsch, Billy Kwok, Akash Mahajan, Salih Berk Dinçer, Tee O'Neill, Nichole Chi, Forrest Brandt. Project for NWMEDIA C262: Theory and Practice of Tangible User Interfaces, taught by Professor Kimiko Ryokai in Fall 2021 at UC Berkeley School of Information.
The Bee Box
Mixed reality, nature-based educational storytelling
The Bee Box is a mixed reality, nature-based educational storytelling experience designed for two players. It consists of a physical, miniature garden box with real soil and moss, and a virtual rendering of the same garden box explored through a VR headset.
This design prototype explores meaningful collaborative pathways between physical and digital environments in educational contexts. Technologically, the experience is enabled by bidirectional sensor-based feedback, in which actions performed on the physical board produce corresponding changes in the virtual environment, and vice versa.
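As a rough illustration of that loop, the two directions can be thought of as events on one bridge; the event names and handlers below are hypothetical, not the project's actual implementation:

```js
// Illustrative sketch of the bidirectional physical/virtual feedback loop.
import { EventEmitter } from "node:events";

const bridge = new EventEmitter();

// Physical -> virtual: a sensor reading on the garden box updates VR state.
bridge.on("physical:moisture", ({ level }) => {
  console.log(`VR: update soil moisture visuals to ${level}`);
});

// Virtual -> physical: an action in the headset drives an actuator.
bridge.on("virtual:bee-landed", ({ zone }) => {
  console.log(`Board: light up LED in zone ${zone}`);
});

bridge.emit("physical:moisture", { level: 0.4 });
bridge.emit("virtual:bee-landed", { zone: 2 });
```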
In collaboration with: Qianyi Chen, Akash Mahajan, Yuting Wang, and Yiyao Yang. Final project for NWMEDIA C262: Theory and Practice of Tangible User Interfaces, taught by Professor Kimiko Ryokai in Fall 2021 at UC Berkeley School of Information.
Multimodal Learning Analytics for Embodied Design
Using machine learning and recurrence quantification analysis
Analyzing how students' sensorimotor behavior shifts over time in response to task-based perception reveals behavioral patterns that indicate phase shifts in conceptual understanding, reflecting the complex and dynamic nature of embodied, content-based learning. By applying predictive machine learning techniques to telemetry data, multimodal learning analytics aims to contribute to research-informed education design and to improve educational support interventions such as adaptive tutoring systems.
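At its core, recurrence quantification analysis starts from a recurrence matrix over a time series. A minimal sketch, with an illustrative 1-D signal, distance function, and threshold:

```js
// Illustrative RQA sketch: build a recurrence matrix from a telemetry
// series and compute the recurrence rate.
function recurrenceMatrix(series, threshold) {
  const n = series.length;
  const R = Array.from({ length: n }, () => new Array(n).fill(0));
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      // Two time points "recur" if their states are within the threshold.
      R[i][j] = Math.abs(series[i] - series[j]) <= threshold ? 1 : 0;
    }
  }
  return R;
}

function recurrenceRate(R) {
  const n = R.length;
  let recurrent = 0;
  for (const row of R) for (const v of row) recurrent += v;
  return recurrent / (n * n); // fraction of recurrent point pairs
}

// e.g. a short stream of hand-position samples (illustrative data)
const samples = [0.1, 0.12, 0.5, 0.52, 0.11, 0.49];
console.log(recurrenceRate(recurrenceMatrix(samples, 0.05)));
```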
In collaboration with Julien Putz. Final project for Data 244: Data Mining & Analytics, taught by Professor Zachary Pardos in Fall 2021 at UC Berkeley School of Education.
Health-Aware Recipe Generation
Using long short-term memory recurrent neural networks
A novel machine-learning-driven culinary recipe generator that moves beyond simply inputting ingredients. We extend a long short-term memory (LSTM) recurrent model's input to incorporate a target nutritional profile alongside a sample of ingredients, generating recipes that are not only relevant and sensible but tailored to one's specific dietary needs.
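The conditioning idea can be sketched as below, using TensorFlow.js purely for illustration; the project's actual framework, architecture, and hyperparameters may differ:

```js
// Illustrative sketch of conditioning an LSTM on a nutritional profile.
import * as tf from "@tensorflow/tfjs";

const VOCAB = 5000;  // assumed token vocabulary size
const SEQ_LEN = 60;  // assumed recipe-token window
const NUTRIENTS = 6; // e.g. calories, protein, fat, carbs, sugar, sodium

// Input 1: the recipe-so-far as token ids.
const tokens = tf.input({ shape: [SEQ_LEN] });
// Input 2: the target nutritional profile.
const nutrition = tf.input({ shape: [NUTRIENTS] });

const embedded = tf.layers
  .embedding({ inputDim: VOCAB, outputDim: 64 })
  .apply(tokens);

// Repeat the profile at every timestep and concatenate it with the token
// embeddings, so the LSTM sees the dietary target throughout generation.
const repeated = tf.layers.repeatVector({ n: SEQ_LEN }).apply(nutrition);
const combined = tf.layers.concatenate().apply([embedded, repeated]);

const hidden = tf.layers
  .lstm({ units: 128, returnSequences: true })
  .apply(combined);

// Predict the next token at each position.
const output = tf.layers
  .timeDistributed({
    layer: tf.layers.dense({ units: VOCAB, activation: "softmax" }),
  })
  .apply(hidden);

const model = tf.model({ inputs: [tokens, nutrition], outputs: output });
model.compile({ optimizer: "adam", loss: "sparseCategoricalCrossentropy" });
model.summary();
```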
In collaboration with Anirudhan Badrinath and Noah Reiner. Final project for Data 244: Data Mining & Analytics, taught by Professor Zachary Pardos in Fall 2021 at UC Berkeley School of Education.
The Insight Map is an ongoing collaboration between Mikey Siegel and me, emerging from a need to make sense of the myriad tools, techniques, approaches, and experiences that make up our lives and our quest for well-being. This tool is intended as an aid to understanding associations between emotional, somatic, and mental experiences, actions, and contexts by visualizing the relationships among vocal and written notes as a network topology.
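A toy sketch of the underlying structure: notes become nodes, shared associations become edges, and the resulting graph feeds a network layout. The data shape and tags here are illustrative, not the tool's actual format:

```js
// Illustrative sketch of the note graph behind the visualization.
const notes = [
  { id: "n1", text: "tightness in chest before meetings", tags: ["anxiety", "body"] },
  { id: "n2", text: "walking eased the tightness", tags: ["body", "walking"] },
  { id: "n3", text: "morning walks, lighter mood", tags: ["walking", "joy"] },
];

function buildGraph(notes) {
  const edges = [];
  for (let i = 0; i < notes.length; i++) {
    for (let j = i + 1; j < notes.length; j++) {
      // Connect any two notes that share an association.
      const shared = notes[i].tags.filter((t) => notes[j].tags.includes(t));
      if (shared.length) {
        edges.push({ source: notes[i].id, target: notes[j].id, shared });
      }
    }
  }
  return { nodes: notes.map(({ id }) => ({ id })), edges };
}

// The resulting { nodes, edges } structure feeds a force-directed layout.
console.log(buildGraph(notes));
```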
Where in the body do you feel joy? Can you taste joy? Hear, smell, touch it? What images emerge when you think of joy?
I am fascinated by the spectrum of what emotions feel like for each of us. How diverse the experiences of joy, of sadness, of love, of fear can be. We are all human, yet we embody and embrace our humanness uniquely as our own.