Hello
I'm Amber, a third-year environment and communication design major at Carnegie Mellon University. I like projects that make people think or feel differently, and that are worth more than their measurable results.
Initial research For this project, I was interested in the sky and its invisible trace of sound. Although the sky is generally a visual experience, I wanted to experiment with ways it might become auditory as well. To begin my image-to-sound research, I looked into a color-to-sound device, an article on different types of background noise, and Yuri Suzuki's Face the Music project.
Concept An installation that showcases generative music derived from images of the sky.
See images and read more https://amberlee.me/sound-of-the-sky
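To make the image-to-sound idea concrete, here is a minimal sketch of one way a sky photo could become tones: each vertical strip of the image plays a short note whose pitch follows the strip's average hue and whose loudness follows its brightness. This is not the installation's actual pipeline; the filename, the hue-to-pitch mapping, and all parameters are assumptions for illustration.

```python
# Minimal sketch: sonify a sky image by mapping hue -> pitch and brightness -> loudness.
# "sky.jpg" is a hypothetical input filename.
import wave

import numpy as np
from PIL import Image

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.25
N_STRIPS = 32

img = Image.open("sky.jpg").convert("HSV")
hsv = np.asarray(img, dtype=np.float32) / 255.0           # H, S, V scaled to [0, 1]
strips = np.array_split(hsv, N_STRIPS, axis=1)             # split image left to right

t = np.linspace(0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
samples = []
for strip in strips:
    hue = strip[..., 0].mean()                              # average hue of the strip
    brightness = strip[..., 2].mean()                       # average brightness
    freq = 220.0 * 2 ** (hue * 2)                           # map hue to roughly 220-880 Hz
    samples.append(brightness * np.sin(2 * np.pi * freq * t))  # brighter strip = louder tone

audio = np.concatenate(samples)
pcm = (audio * 32767 * 0.8).astype(np.int16)                # convert to 16-bit PCM

with wave.open("sky.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())
```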
Concept Art and music are frequently found hand-in-hand at festivals to create rich experiences for attendees. My project explores the benefits of at-home streaming, browsing, and sharing to create a contemporary digital experience in the form of a website.
Images and demo video See attached and embedded below!
Design for Environments Studio IV is a course for juniors (third-year undergraduates) taking Carnegie Mellon School of Design's Environments Track. We explore design, behavior, and people's understanding, in physical, digital, and 'hybrid' environments. The course comprises practical projects focused on investigating, understanding, and materializing invisible and intangible qualitative phenomena, from intelligence to social relationships, through new forms of probe, prototype, speculative design and exhibit.
In spring 2020, our first set of projects focused on Autographic Visualizations. The term (by Dietmar Offenhuber) refers to things that create a record or trace of something that's happened to them—how they are used by people, or how something in the surrounding environment affects them (they are forms of indexical visualization, or qualitative interface). Sometimes the trace is accidental or incidental, or solely a side-effect of a property of the materials. But it can be designed intentionally, and perhaps can reveal patterns which would otherwise be invisible.
We took this prompt as a starting point, with an intro to data materialization from Marion Lean, and created a wide range of interactive projects which build on and expand the idea, addressing everything from physicalizing a person's heartbeat through ferrofluid, to a VR physicalization of race and privilege. We have sonification of sunsets, new kinds of educational building blocks, ways to experience laughter, 3D printing as a real-time visualization of a human-machine conversation, synchronized breathing, a physicalization of the thread of a conversation, and design fiction extrapolating from your digital traces.
In the second half of the semester, we were forced by circumstances to pivot to an online form of distributed studio—which offered challenges but also some interesting new ways of responding to the environments of our everyday lives: the interior world, the endless Zoom, and the connections across space and time zones. These Virtual Environments Studio projects are a diverse mix of ideas ranging from new ways of experiencing music and art online, to remote therapy environments, some addressing aspects of the current COVID-19 situation directly, and others looking to a world beyond or above our contemporary distress.
In working on the projects, and 'showing' them internally, we explored combinations of Discord, Figma, and other tools to see whether we could partially replicate the feeling of 'stopping by someone's desk' in a physical studio—I don't think we're quite there yet, but this group of students have the sensitivity to the design of physical and digital environments, and the talent, to be at the forefront of evolving design practice in the new environments of the lives ahead of us.
90%. That's the failure rate for non-profit fundraisers on GoFundMe, the biggest crowdfunding platform for such purposes. I looked into this issue and also tried running a fundraiser myself. This project is about how to frame the ideas of sharing and associating oneself with a cause differently within the donation environment.
Whenever it shows up in Iron Man, I can't help but marvel at Jarvis, the holographic AI that reacts to Tony Stark's gestures. I wanted to explore creating visuals that react to my own gestures in real time. I used TouchDesigner and a little After Effects to create the visuals.
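For readers curious what "visuals reacting to gestures" can mean in practice, below is a rough, generic sketch of the idea using OpenCV frame differencing from a webcam, rather than the TouchDesigner network actually used in the project; the shape, mapping, and parameters are illustrative only.

```python
# Generic sketch: drive a simple visual from camera motion.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                  # default webcam
_, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)    # where the image changed = movement
    prev_gray = gray

    motion = float(diff.mean())            # overall amount of movement
    canvas = np.zeros_like(frame)
    radius = int(20 + motion * 10)         # more movement -> bigger circle
    center = (frame.shape[1] // 2, frame.shape[0] // 2)
    cv2.circle(canvas, center, radius, (255, 255, 255), 2)

    cv2.imshow("reactive visual", canvas)
    if cv2.waitKey(1) & 0xFF == 27:        # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```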
I also wanted to explore creating a surreal experience in a space filled with lights and intriguing shapes, playing with the audience's emotions. This is a VR experience, created using Unity, After Effects, and Mandelbulb 3D.
My name is Davis Dunaway. I'm a student at CMU studying Environment Design and Game Design. Below are the two projects I did for my Environments Studio this semester.
The first is a project about human conversation. I generated unique 3D prints as two participants took a Turing Test. Participants could watch how the print changed based on their answers in real time and got to take the artifact with them as soon as the experience ended.
The second project is a design for a single-piece face shield. Due to the increased demand for PPE, it's important that face shield designs be cost-, time-, and material-efficient. 3D-printed solutions can take up to 5 hours just to produce a single shield. I worked to design a shield that could be produced in under 5 minutes using a single piece of Mylar and some clever folding techniques borrowed from origami. I am currently working to get the shield into mass production.
Thanks for taking the time to look at these projects; I'm happy to answer any questions you might have about them.
The Turing Test was developed by Alan Turing in 1950 and is used to assess a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. The Loebner Prize is an annual artificial intelligence competition in which judges award the computer program they find most human-like.
Computers keep advancing, but some argue that humans' ability to communicate is regressing. We wanted to create an experience to visualize and reflect on the human-ness of conversations today, so we created The Most Human Human. The process is as follows: there are two human participants, a judge and a competitor, as well as one bot. The human competitor and the bot compete to prove their "human-ness", and the judge determines, to the best of their ability, which of the two responses came from the human. As the test progresses, a 3D print is generated in real time.
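The project's actual print pipeline isn't detailed here, but as a hedged sketch of how answers might drive a growing form, the snippet below stacks one ring per question round and widens or narrows it depending on whether the judge identified the human; it emits OpenSCAD source that could then be exported to STL for printing. The scoring rule and dimensions are invented for illustration.

```python
# Sketch: turn a sequence of judge decisions into a stacked, printable form.
LAYER_HEIGHT = 4.0      # mm of height added per question round (assumed)
BASE_RADIUS = 12.0      # mm starting radius (assumed)

def layers_to_scad(judged_human_per_round):
    """judged_human_per_round: list of bools, one per round; True = judge picked the human."""
    parts = []
    radius = BASE_RADIUS
    for i, judged_human in enumerate(judged_human_per_round):
        radius += 2.0 if judged_human else -2.0     # widen when the human is identified
        radius = max(radius, 4.0)                   # keep the form printable
        parts.append(
            f"translate([0, 0, {i * LAYER_HEIGHT}]) "
            f"cylinder(h={LAYER_HEIGHT}, r={radius:.1f}, $fn=64);"
        )
    return "union() {\n  " + "\n  ".join(parts) + "\n}\n"

# Example: five rounds, with the human identified in rounds 1, 2, and 4.
print(layers_to_scad([True, True, False, True, False]))
```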
The end product is a conversation piece and a tangible artifact of the experience. At the end of the experience, we hope to leave users with a few takeaway questions: Will this change the way you approach conversations in the future? How do you think human conversation will change in the future? Where does computer conversation fit within this?
I'm a student in CMU's Master of Human-Computer Interaction program. Before this program, I worked professionally as a software engineer for several years in the Bay Area. I consider myself a design technologist. My passions include design systems, generative art, and ethical technology.
With Stage, I'm building a concert livestreaming application for the post-COVID era. As many musicians move to virtual concerts under quarantine restrictions, Stage pushes the boundaries of what a digital concert livestreaming platform can do. Check out the teaser video!
Stage incorporates direct artist-fan interactions, a virtual currency to reward the most loyal fans, in-room AR experiences, and more.
When building Stage, my main focus was on nailing the visual branding. I was meticulous about building the tone and voice of the app, from the moodboard to the "brand words" to the app name to the wordmark. There are some peeks at the behind-the-scenes process in the project images.
Hope you enjoy my work on this project! I would love any comments and feedback during the showcase, or through Twitter. My portfolio is still in progress. :)
This is a dystopian design fiction inspired by the newly implemented health QR codes in China. As a tool for public health monitoring, and potentially for future mass surveillance, it pushed me to think about the consequences of such a massive data system. While these surveillance measures can give people more security and may come from goodwill, we give up our responsibility for self-discipline and let others, whether authorities or technology, dictate our health practices and even our daily lives. Especially with rapidly evolving sensing technology, how much autonomy is left to us beyond pressing the "yes" and "no" buttons?
The first part of the fiction presents an omnipotent, phone-hosted IoT system that monitors your health. In 2025, your smart home analyzes different sensor inputs and generates health indexes, which are sent to other platforms not only to monitor your environment but also to provide compensation when needed. It is dedicated to creating a supportive, unbiased smart-home system that provides you with the health care you need.
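The fiction leaves the system's inner workings deliberately opaque, but as a toy sketch of what "analyzing sensor inputs into a health index" might look like, here is a hypothetical scoring function; the sensor names, reference ranges, and weights are all invented for the story.

```python
# Toy sketch: combine hypothetical sensor readings into a 0-100 health index.
def health_index(readings):
    """readings: dict of sensor name -> measured value (all names are invented)."""
    reference = {
        "body_temp_c":  (36.1, 37.2),   # assumed 'normal' ranges
        "resting_hr":   (55, 85),
        "room_co2_ppm": (400, 1000),
        "sleep_hours":  (7, 9),
    }
    score = 0.0
    for name, (low, high) in reference.items():
        value = readings.get(name)
        if value is None:
            continue                                        # missing sensor: no credit
        mid = (low + high) / 2
        half_range = (high - low) / 2
        deviation = min(abs(value - mid) / half_range, 2.0)  # how far from 'normal', capped
        score += max(0.0, 1.0 - 0.5 * deviation)             # 1 = perfectly centered
    return round(100 * score / len(reference))

print(health_index({"body_temp_c": 37.8, "resting_hr": 72,
                    "room_co2_ppm": 900, "sleep_hours": 6}))
```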
The second part of the fiction tells the story of Josh, a first-time user of the health monitoring system, and how he gets through his first day of self-quarantine with the help of the machine. :)
This breathing pillow is a physicalization of breathing. It is inspired by the moment when you lie on a loved one's chest and feel it rise and fall. To simulate a similar intimacy, we built a pair of breathing pillows: you lie on one pillow and feel the breathing of the person lying on the other through the rise and fall of your own.
We used linear actuators to push air in and out of the pillows and stretch sensors to detect the expansion and contraction of each person's chest.
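As a rough illustration of the electronics, here is a minimal MicroPython-style sketch of one half of the pair; it assumes a Raspberry Pi Pico, a stretch sensor wired as a voltage divider into ADC pin 26, and an actuator driver that accepts a PWM duty cycle on pin 15. The pin numbers and driver are assumptions, and the link that carries one person's breathing to the partner pillow is left out, so this sketch simply echoes its own sensor locally.

```python
# Sketch of one pillow's loop: read chest stretch, drive the actuator.
import time
from machine import ADC, PWM, Pin

stretch = ADC(26)                # stretch sensor across the wearer's chest (assumed wiring)
actuator = PWM(Pin(15))          # PWM signal into the actuator driver (assumed)
actuator.freq(1000)

while True:
    reading = stretch.read_u16()         # 0..65535, rises as the chest expands
    # In the real pair this value would be sent to the partner pillow; here we
    # echo it to the local actuator so a single board can be tested on its own.
    actuator.duty_u16(reading)           # more stretch -> actuator pushes more air
    time.sleep_ms(50)                    # roughly 20 updates per second
```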