Week 8 Update: Generative Research and Future Visions of Portland Public Schools

We began this week with a guest lecture from Adam Cowart, a PhD candidate in the transition design program. He introduced us to the concept of CLA (Causal Layered Analysis). We used this framework to better understand the landscape of our problem space at Portland Public Schools. Adam described different facets of the problem space through the lens of “litany filters.” To recognize what futures are feasible, we need to understand the triad of history, present, and future, and what elements in our landscape pull, push, or weigh down progress.

We took some time in class to reframe our insights through this framework, and began synthesizing potential elements to build a bridge toward the future vision created by Prospect Studio. This process began slowly, but after some heavy lifting we were filling out the diagram with great enthusiasm! It was refreshing to revisit our secondary research (which was already categorized under a STEEP-V framework). It was revealing to see visually how much further we have advanced our understanding of this problem space since our literature review and background reading.

Outside of class, our team was busier than ever — working to adapt and overcome the obstacles we’ve encountered in our generative research phase has not been easy. I’ve struggled to support these efforts. The external factors of my personal and professional life have been an ongoing source of strain. I feel so much gratitude for the support and encouragement I’ve received from this team, and this week I felt a great deal of pressure to reciprocate.

Sample of generative research protocols

This effort to pay back the generosity I received (when I needed it most) began with a comprehensive draft of our protocols for generative research, and the specifications for our workshop. Working with Carol, we delivered this to the team ahead of schedule. It was necessary for us to draft new protocols and workshop exercises to include a broader audience outside of Portland Public Schools. We found that last week was somewhat of a dead end for seeking participation from our intended stakeholders (administrators and educators at PPS).

For our workshop, we wanted to know how different stakeholders perceive their relationships with counterparts, learn what different stakeholders prioritize and why, gain a deeper understanding of how educators think about the future of public education, and explore and define preferred futures.

We conducted three separate workshop sessions with educators outside of PPS. This included a neighboring district (Gresham-Barlow) as well as out-of-state educators. This approach allowed us to glean insights about what is common across the US public school system and what is more specific to Portland. While this adaptation carries some risk of skewing our data, it is far preferable to remaining without any additional insights beyond our primary research activities.




Screenshots of workshop activity

This was my first experience executing participatory design with stakeholders, and it has been such a rollercoaster of emotions. Since Carol and I worked on the protocol together, it was only logical that we also create the visual and interactive components for the workshop. We iterated on our initial concept by practicing with our own team, with each member taking a turn roleplaying as a participant. This helped us to work out the kinks and refine details before putting anything in the hands of our participants.

The first workshop with a real participant was very revealing. We had access to their thought process in real time; their visual associations, priorities, and ideas about the future were peeled back in layers, taking us deeper into their lived experiences than we ever got through conversational interviews. Even the generation of simple sketches gave us glimpses into their inner worlds. I now question how important it was to conduct traditional interviews in the first place. Workshops are just so much more dynamic and active than interviews, and I consistently came away feeling more connected to the participants and their experiences.

Sketches.png

This weekend was highly reflective. With new insights in hand, we spent over five hours evaluating what we discovered. There was so much for us to consider and it was only once we had the chance to pick it all apart together as a team that we could begin to make sense of it all. Many of our initial assumptions were blown out of the water. Our newfound perspective gave us a real sense of how important relationships are in the field of teaching. We also learned that technology is probably the least important factor for educators — with the exception of a desire for students to have high-speed internet at home, there was little to no interest in improving access to technology generally.

I’m still getting used to applying so many different approaches and methods so quickly.  I feel like I’m only occasionally operating with a sense of clarity. There has been prolonged fuzziness that’s difficult to describe or ignore. It seems as though new insights provoke deeper questioning, while offering little in the way of certainty. I think this is just the experience of progressively revealing collective and individual ignorance. Before learning enough to act decisively, we must first gaze into the vast abyss of what we still do not know.

Week 7: Expanding the scope of generative research

This week, our in-class sessions were dominated by guest lecturers who provided insights into our current work in progress. On Monday, Stacey Williams and Richard The asked us for our team’s “elevator pitch” and then asked us a few questions about the work we were doing:

  • Is the artifact(s) part of the intervention, or just a representation?

  • Is there a conceptual map that anybody should be working on to provide a system?

  • Can we design a process that will unify the decision making process at PPS?

  • Can we create space where they can reflect on their own lives and experiences, and present a different model for education?

Carol was quick to respond regarding the relational mapping from our last presentation, explaining how our understanding of the relationships between administrators and other stakeholders has revealed a potential leverage point for meaningful interventions, while noting that the artifact should be something that inspires change.

Peter added that we’re separating the artifact from the process, but will develop an artifact that is representative of the depth of our research and understanding of the problem space. We then spent some time brainstorming out loud about how some form of “ARC Institution” could, in the future, help to achieve the goals outlined in the Prospect Studio brief. A couple of interventions we may want to prioritize:

  • Leadership development curriculum, teaching design and reflexivity.

  • Summer courses that are paid separately from the 9-month salary.

Peter reminded us that “future is fiction” and that it is our job as designers to bring that fiction into high enough fidelity that we make a persuasive argument through form. This ultimately means that we must situate the proposal within a fiction, and build from there.

Richard The wanted to know what other communication materials might inspire this. While not suggesting that we need to answer such questions with any degree of immediacy, we should put on our horizon a few questions around how the ARC Institute might talk about these goals. For example, this could be a poster that shows what lifelong learning looks like.

Stacey’s other comments tied in well with the reading that Peter provided (Rutger Bregman). Specifically, this strange mismatch between education and the typical way we encounter work: i.e., in school, each subject is divided and compartmentalized, whereas in our work, often we must apply mastery of multiple subjects and do not have the luxury of flattening our problems into a single subject matter. Stacey pointed out that we (meaning educators, but also society) are boxed into binary thinking whereas other cultures have non-duality, non-binary ways of thinking.

HomoLudens

Knowing that this entanglement is an obstacle to change, we must also consider what other sudden changes (from external factors, such as a pandemic or climate change) might present opportunities.

On Wednesday, Liz Sanders ran us through a series of role-playing exercises, where we considered the differences in priorities for stakeholders. This was confusing at first, but eventually we synced up and began negotiating as if we were in fact those different people in a school system. I was representing the thought process at a district level, while Carol played a student. I recognized that there were basic needs that were not at all addressed in our hypothetical scenario (a hackathon to create new and sustainable transportation for the future).

This was eye-opening and made our team think differently about our own approach to generative research…

Oh, our research. It has been challenging these last two weeks, and we’re worried about getting stuck. Despite so much cold calling and emailing of acquaintances, we’ve found that right now in particular is a bad time to solicit participation. PPS is migrating to a hybrid model, and teachers have expressed a great deal of concern about safety. Additionally, this next week is their spring break, so any activities that require reflection on their daily lives will not capture work activity. This is also the only week of respite they will be afforded before summer break.

Nevertheless, there is some scintilla of joy to be extracted from this obstacle. I’ve had more motivation to reach out to people I haven’t been in touch with since graduation. Some of them are doing really great, others not. Some are starting families, others are starting careers. Much to my surprise, two acquaintances are actually in the process of becoming K-12 educators. This was not expected, but it was heartening to know that such alignments exist.

Our team is also struggling with external pressures: wrapping up mini courses, midterm expectations, job hunting and interviews, design challenges, personal struggles, and more. One of the things we specified in our team contract was transparency for such events. My team has been supporting me the best they can while I navigate these struggles and diversions. I too have been supporting them the best I can.

This weekend was very productive, as we generated new protocols and refined our workshop to include a broader range of participants. I’m especially excited to try out some of the techniques we’ve been considering, including: “Thing From The Future” based on the work of Stuart Candy, prioritization card sorts, and relational mapping. That last exercise was directly inspired by our conversation with Liz Sanders.

TFTF

Thanks to a 20 oz. can of Red Bull, I was able to power through my very packed Wednesday, and I’m glad I made it to that session, since we ended up monopolizing Liz in our breakout room — she seemed to be genuinely interested in our project, which was very, very humbling.

On the personal side of things, I’m glad to have my job interview and design challenge behind me. It’s been difficult to juggle so much, especially while still grieving the loss of a family member. I’ve been more emotionally raw, and feel less focused than I’d like. Some of this is due to a loss of sleep and not the workload. I seem to be “fine” during the day time, but when the sun sets, and the world gets quiet, I still think of him. I miss you, Uncle Ron. I’m sorry I won’t be there to send you to your final resting place. Like so many we’ve lost this year, you deserved better than this, and sending flowers to those left behind feels insufficient in the face of so much loss.

We’re about to cross the vital half-way mark in the semester. Normally this would include a spring break of our own, but due to concerns about increased student travel, we instead have pre-scheduled “off days” to (at least in theory) provide some periods of rest. It is something like having a nap instead of a full night’s sleep. We can make do, but that doesn’t mean we need to like it.

Week 6: Planning and coordinating generative research

“If we have to wait for the next pandemic to bring about big change, then we’re in big trouble!”

—Peter Scupelli

This week, our team presented our exploratory research findings to clients from Prospect Studio. This was something we did a “dry run” for the week prior. The feedback we received was generally very positive. In particular, I was pleased to learn that the “ARC” concept was aligned with the client’s understanding, and they even suggested that they would adopt this terminology for themselves! There was a lot of back and forth on this concept and it was incredibly validating. By recognizing the overlap and potential integration of these attributes (Adaptive, Resilient, and open to Change), and addressing them as a single verb rather than three distinct adjectives, we’ve reframed our inquiry to reflect actions and behaviors.

Fiona appreciated that we identified the multiple roles of educators who must address their own social and emotional needs, while also supporting students. She pointed out that teachers also need tools for communication.

Our Collaboration Structure diagram was a success with Jenny Hoang. There was some confusion around Board members and their placement within districts. Carol was able to clarify this well for the entire team, and I continue to be grateful for her contributions. I’m very fortunate to be working with a team that has a nearly two-year working relationship—we’ve developed a beautiful shorthand together, and we recognize each other’s cues.

Administrators as a leverage point resonated with both Jenny and Fiona, and this is promising for the next phase of our research. Jenny questioned our scope under the MLP: the national level might be too broad for some contexts, and there was a lack of distinction between state government policy makers and national/federal-level policy makers. This is something we will clarify going forward. Otherwise, the mapping of structured interactions was a huge success.

A question raised as we outlined this structure: what leverage points are we considering, and what insights can we glean from the COVID adaptations made to facilitate learning? We are in the midst of a grand experiment in remote learning, but what are the lessons or takeaways from this experience?

We’re especially interested in the role of technology in facilitating communication. Video conferencing is only one small part of this. Thinking about organizational structures, we want to improve the modes and means of communication between administrators, educators, and other stakeholders. During our critique, Cat explained that open communication surfaces problems within a framing that leads to practical solutions. Being able to express needs for things like a mid-day break can have a profound impact on the day-to-day quality of life for educators. Jenny concurred and, from her experience exploring PPS, believes there is a lot of desire in this realm.

After our Wednesday workshop, our team met to discuss these important next steps. It was also Cat’s birthday!

Screen Shot 2021-03-12 at 15.05.09.png

We had a good time, but still got a lot of work done! We had some imbalance in the distribution of work while preparing for this presentation, and we’ve amended our team contract to (hopefully) improve delegation of future tasks. We’re also rethinking the responsibilities for team members who are not assigned facilitator or notetaker for a given week. One challenge is that some tasks end up being more involved than originally thought. When splitting up the work, it can be like cutting a pizza while blindfolded: everyone gets a slice, but there’s no guarantee that those slices will be anywhere near the same portion. In the future, when we find that a task is too big or too small, we can further split and break down tasks (where possible) to keep everyone productive but not overburdened.

After addressing our coordination for this next phase, we began mapping our current questions and considering what we wanted to learn. What we realized through this exercise was that almost any available method of active research could provide insights into our questions, so we simply needed to prioritize what would work best for us and work from there to design experiences that will illuminate these areas.

Screen Shot 2021-03-14 at 22.24.11.png

Our next steps will include generative research and workshops, and our hope is to gain more insights into this aspect of interpersonal and organizational communication. Through our primary research, and framing under ARC, we’ve identified a few key aspects of effective communication:

  • Problem-solving mindset

  • Active listening

  • Maintaining open communication and feeling heard

Other areas of consideration regarding collaboration structures:

  • How do educators coordinate their efforts to bring change?

  • How do they support or hinder adaptations or changes?

  • What visions do administrators see for the future of PPS, their roles, and the roles of educators?

Peter recommended that we also consider future contexts, and think about relevant trainings and preparation. Pandemics are not frequent, but when COVID-19 arrived, there wasn’t any plan in place. This put districts in an especially difficult position—reacting to sudden change is never easy, and they had no prior practice. Other sectors (especially government sectors) often need to prepare for scenarios that are unlikely to happen but are potentially very disruptive.

Thinking about this point reminds me of a very grim reality: school shootings in the United States have become so frequent that schools began holding drills. I was a high school student in 1999, when Eric Harris and Dylan Klebold committed a horrific massacre at Columbine High School in Colorado. It’s difficult to describe what a profound impact this had on my experience with public education. Growing up as a teenager in rural Utah, in close proximity to this tragic event, I saw an immediate reaction. My school began conducting “random” locker searches. Teachers and counselors began interrogating students’ media consumption—at that time, it was believed that playing DOOM and listening to Metallica were red flags.

As a community, what we needed were meaningful policies. Instead, we were subjected to onerous and disruptive security measures, derived from alarmist and factually inaccurate claims. These responses didn’t prevent such tragedies, but they did add to the hardship of students who were already terrorized. School shooting drills have not made today’s kids any safer, because the root cause remains unaddressed. We needed policy then, and we need policy now.

Good policy, however, is only possible when there is a clear understanding of the problem. An important role of a vision of the future is to anticipate needs before they become a crisis. This can lead to preventative policy and proactive measures. To understand the present, we need to also understand the past. To understand the future we need to understand the present. To gain deeper insights beyond interviews, we’re planning to start participants with a cultural probe diary study (this might be in their chosen format or sent daily by us) and then bring a mix of administrators and teachers into a workshop.

We’re still working out the details, but our current favored approach is the “Draw Toast” exercise.

Screen Shot 2021-03-14 at 22.29.31.png

We’re nearly done with our protocols and will be contacting our participants on Monday. I’m curious about what will be confirmed (from our exploratory stage) and what will be new or contradictory to our current understanding. We’re now focusing on something specific, but there are degrees of assumption going into this next step. I’m excited (and a little nervous) to learn more from our participants and to benefit from their lived experiences.

Week 5: Primary Research on Portland Public Schools

As part of my usual Sunday routine, I write about and reflect on my process. I prefer to do this on Sunday for a few reasons:

  1. Sunday evenings tend to be a more restful time than the rest of my (sometimes hectic) week.

  2. For practical purposes, the weekly assignment is due on Sundays at midnight.

  3. It’s much more difficult to reflect on the week until I have at least some distance between myself and it.

This Sunday, however, I am finding myself much more distracted than usual. This morning I received a phone call about a death in the family. Grief cannot easily be described as what it is, but is generally understood as what it does. I am grieving, and I am experiencing grief, and grief is doing its thing—like the way cabin pressure changes on an airplane, I feel it and know what I need to do and how to cope with it, but I cannot truly escape the discomfort.

Earlier today I could have written with confidence about the state of our research or on the topic of Portland Public Schools. In fact, our interviews went very well and we found ourselves very excited at the end of each one. We were practically (((VIBRATING))) with enthusiastic observations about what we were hearing and how it related to our secondary research or insights from prior conversations, but now a sort of numbness has muted all of that. I feel like a stranger to my own words. Instead of dwelling on my own thoughts and experiences, I’m going to instead take some time to talk about the work of my peers. Of course I’d prefer to instead write about my own experiences, as I usually do, but today is different. I’m a million miles away, and cannot be reached.

In preparing for our presentation on Monday, our team met over Zoom for two separate work sessions this weekend. It was necessary for us to reexamine our interview notes and insights from secondary research, and to identify relevant patterns for PPS and the subject of educator essentials.

Up until this weekend, we’ve thought about the Prospect Studio vision in quite literal terms, and have regarded their goal of teachers who are “adaptive, resilient, and open to change” as three distinct attributes. What we have found through our interviews, however, is that these attributes do not exist in isolation, but are more like dimensions of a general shape of things to come. We further realized that the “it” of these attributes is less important than the “how.” These goals are not about states of being, but about actions, choices, and decision making—both individually and collectively. This is a problem that is just as much a matter of policy as it is a matter of social science. The question, then, is not how an educator might be these things, but what conditions will encourage or discourage the act itself.

We shifted our focus to treat these attributes as a single verb: “ARC” (Adaptive, Resilient, and Open to Change). By thinking about “arcing” as an action, we can think about and frame these attributes in more practical terms: how ARC is being put into action will help us to consider what relevant artifact we might ultimately create.

Screen Shot 2021-03-08 at 00.15.19.png

I usually begrudge the process of reporting on work in progress—it feels like reporting on a cake before putting it in the oven—but this time I feel very differently about the matter. I think that in this particular case, I am able to see the value to our process and how this periodic break to present work at different milestones just makes sense. We are not presenting on arbitrary dates, but during key moments in process.

In addressing what we need to outline in our presentation, we are also being invited to reflect on our work so far, and to see what has and has not changed for us as a team. We have had very active discussions as a group, about what new discoveries have reshaped our understanding or challenged our assumptions. We reviewed our interview recordings and began picking apart common threads.

In past experiences at CMU, our group work usually did not last more than seven or eight weeks at most. Working at this depth, over an entire semester, has given us the time necessary to work through a more recursive and generative process. This level of analysis, meta-analysis, and reflection has helped put some distance between our individual perceptions and the information that is available to us.

Working in this way has revealed the importance of process and some of the exercises completed in previous coursework. Affinity mapping continues to be one of the more valuable approaches to sorting uncategorized insights and to promoting pattern recognition. We have used affinity mapping as an iterative tool, especially for refining our interview questions.

Week 4: Portland Public Schools — Process Reflection

“Before moving on to the next step in your futuring process, take a moment to lean back and consider what you’ve built up to this point.”

How to Future, Ashby & Smith

This week, our team conducted the first round of interviews. We have not yet interviewed PPS educators, but instead collected general insights from a school administrator based in California. While this interview should properly be regarded as one outside of our research context, it is nevertheless relevant to our primary research objectives. Given the breadth of our exploration, we have many questions that can be answered by outside stakeholders.

Some questions we’re asking are indeed very broad in their scope. After speaking with Peter about the state of our project on Monday, he recommended that we frame some of our questions to account for the larger context of a not-too-distant future:

  • How is learning in the “new world” of the future different from the world we are living in now?

  • Where are we right now?

  • Where are we headed?

  • What agents of change connect the present and the future?

  • How might we design something that aligns with those forces?

  • What are the forces from the past?

  • What are the hindrances to change?

Another intervention into our process came through meeting with our counterparts (another team that is focusing on PPS). We exchanged resources and compared some of our initial concepts. While there wasn’t too much overlap between groups, it was still quite helpful to expand our questions to account for the larger system of public education in the United States.

While these abstract considerations were useful in formulating the questions our team was interested in asking participants, we also recognized some very practical constraints. Namely, we decided to prioritize our interview questions under the assumption that any public school teacher is likely to be stretched thin for time, and thus would only be able to provide between 30 and 45 minutes of interview time. This constraint helped us to focus on questions that would be most likely to illuminate current unknowns regarding the present state of PPS, and the future needs of educators.

On Wednesday, Peter and Stef continued providing feedback as we refined our interview questions. Some things we still needed to resolve (a work in progress), but one particular pain point thus far has been: how are we defining our terms?

We returned to our Miro board (which has served us well as a virtual whiteboard for our team to reference).

Screen Shot 2021-02-28 at 23.50.42.png
Screen Shot 2021-02-28 at 23.51.22.png

Peter also recommended that we consider incorporating generative research processes; e.g., ask participants to invent a “magical device” that would solve current problems. Through this process of making (participants are not required to address technical functions), they can articulate their priorities, wants, and needs. In other words, it’s not about the device itself, but the story that the device helps the participants to tell. For interviews, we also consulted with Hajira and Sofia for feedback, and the ultimate outcome of this cross-critique was a prototype of our primary research protocols. Our hope is to incorporate this refinement method with future interviews, to strengthen the value of our research with each subsequent iteration. 

Motivation has been a continuing struggle. Last week was Confluence, and this week we had a day off (Tuesday), which in theory would have allowed for some rest, but instead just made the entire week feel a bit (?)… off? It was jarring to start the week with a busy Monday, an absent Tuesday, and then continue with a “regular” week. To be honest, nothing has felt quite right since switching to online/remote learning. Beyond the workload of graduate school, there is a cumulative impact from the prolonged isolation of quarantine and social distancing.

If pandemics were easy, then perhaps we would have them more often.

I’m curious about what these experiences might mean for our team’s exploration into the goal of resilience. When do routines become a rut? When do repetitions become rituals that have overstayed their welcome? How might we balance the need for variety with the desire for stability? What kinds of interview questions will illuminate these concerns?

I’m anxious to learn more from our participants. We have two additional interviews scheduled for our fifth week, with a few others still in the pipeline of confirmation.

Week 3: Portland Public Schools — Reflection on Researching Educator Essentials For a Vision of Teachers Who Are Resilient, Adaptive, Open to Change

“In sum, if you can set yourself up with a definite question for every day in the field, find a solid, reliable way to get the data you need to answer it, and feel confident in the insight that emerges- you will get where you need to be in the long run.”

—Christena Nippert-Eng

This week, our team took a deep dive into secondary research. Using the STEEP analysis framework, we assembled a large collection of articles, relevant URLs, case studies, and much, much more within a relatively short period of time—the power of scale is in play for reasons I’ll illuminate soon. Close readings of these texts were then distilled into short summary statements. Hat tip to Dr. Elaine Gregersen for this wonderful article on how to make use of spreadsheets for research. This approach had several advantages:

1. A clear division of labor.

Specifically, our team was able to divide our secondary research along discrete domains/categories while also sharing any incidental discoveries. This “yes, and” approach to research lowered the stakes and allowed for maximum contribution by every member of our team.

2. Expanded exploration and discovery.

Each of us took on a specific focus of our own choosing, based entirely on our affinities, curiosities, and professional backgrounds. A clear advantage of having such a diverse group was our ability to apply personalized knowledge toward the information-gathering process.

PResQsAffinity.png

3. Rapid synthesis.

After gathering our sources and insights, and taking time to discuss our findings as a group, it was easy to recognize patterns and apply our newfound information to the task of formulating dozens of relevant interview questions. This process set us on a clear path from secondary research and lit review to primary and ethnographic research.

Mapping.png

4. Clarity and transferability.

This information has been collected in a manner that will potentially benefit other teams; the indexical structure of the information we’ve collected, when paired with short summary statements, will enable others to quickly browse a significant amount of research in a relatively short period of time. It’s a buffet of relevant information!

We’re on the precipice of a convergent process, and we can now begin to glean some visions of the future of PPS beyond what was offered in the brief. The most dramatic insight is that “The Great Reset” brought upon us by COVID-19 is revealing unseen potential futures. We often cannot see what is possible until it happens, and the sudden shift to work/study from home is no exception. American schools are strained by unique technological and social needs. People are isolated, but also finding new and compelling ways to communicate and collaborate. We are working within the context of a novel problem and circumstance, and in doing so revealing new methods of organization and interaction.

There is a window of opportunity that I fear might be closing as vaccine rollout accelerates and we embrace a return to “normalcy” (a pre-pandemic world that we want to believe, desperately, still exists). If we return to this sleepy shadow of what once was, we risk a deep and terrible slumber that our children will never forgive us for—a good crisis is a terrible thing to waste. If we return to old habits and old ways of thinking, we will do so at the expense of those most negatively impacted by COVID-19. The underlying power structures and inequality that we cannot ignore under current conditions will be something we’ll be very tempted to sweep back under the rug once people are able to return to work without a deadly virus burning through our communities unchecked.

We need clear visions of the future; we need that clarity so much more now than before the pandemic.

Next week, our plan is to set up times for interviews. Now that we have a general landscape of what is known and documented, we have lots of questions to ask and new insights to gain. I’m very pleased with the work our team has been doing and have absolute confidence in our ability to make these interviews a success. The curiosity is palpable at the moment and we’re eager to begin connecting general and specific knowledge. These first-hand insights will fill so many gaps if we can just ask the right kinds of questions.

The current pace seems to be sustainable and the progress that we are making has been very satisfying, but I’ll admit to having symptoms of “Confluenza.” The opportunities afforded by a job fair are not something I can ignore, and while I have done my best to take advantage, I do find the experience a needless distraction. Last year’s “open studio” was downright nauseating. The contradiction of values and actions was disturbing and felt like an intrusion into an important space: the studio was a haven for critical thinking and offered a high degree of psychological safety. The presence of so many “talent seekers” and alumni felt like an intrusion in 2020. This year, those same people were viewing me from a camera inside my home.

Simply put: from a personal perspective, the online/remote format of 2021’s Confluence wasn’t an improvement. The people I spoke with were professional and generous with their time and engagement, but I could feel their fatigue through the screen. There’s just a cloud of general burnout and I admire the way so many people manage to push back against it.

Our team selected Educator Essentials because we recognized the value of educators as vital tissue, making the rest of the body of education whole and capable of movement, growth, and change. Knowing that our ultimate goal is to produce an artifact that inspires an image of educators that are resilient, adaptive, and open to change, I am both grateful and terrified of the flood of countless examples I see every day, through every interaction I share across cameras and screens. I see people who work diligently, compassionately, through these screens.

If you want to get some sense of what I really mean by this (because it is always better to show than to tell), then just watch how these children self-organize when an educator is temporarily absent from Zoom. The teacher, Emily Pickering of El Paso, Texas, exhibits these traits, and it is evident in how her students responded in her absence. The future is now and we should marvel at the efforts we are seeing in our daily lives. This moment is so much bigger than all of us. The future isn’t something we can wait in line for. It is something thrust upon us with all of its dazzles and horror. What we are seeing from educators and students is just one piece of a larger picture.

team.gif

We are not “making the best of this”; we ARE the best of this. All of us. For better or worse, everyone is doing the best they can. This was true before the pandemic, but it’s easier to see it now.

Simple file encryption for MacOS

A complete step-by-step recording, if you’d prefer to just follow along instead.

During a Thursday morning research methods lecture, our professor brought up the issue of data privacy and ethnographic research. Data privacy is an important issue and something I’ve written about before. One particular challenge (amplified in the context of remote work and COVID-19) is storing and accessing sensitive data. How can you safely store sensitive information, such as recorded interviews? While no method is 100% safe, there are some very basic precautions that are highly effective and easy to use.

For folks who are using MacOS (or OS X), here’s a simple method for storing and accessing encrypted files.

Step 1: Open Disk Utility

To open Disk Utility, press Command (⌘) + [Space] to open Spotlight Search, type “Disk Utility,” and press Return [Enter].

Screenshot of Disk Utility on MacOS 11 (Big Sur)

Step 2: Make a New Blank Image

We’ll use this program to create an encrypted disk image that automatically resizes itself depending on how much data it contains. Go to File -> New Image -> Blank Image… or use the keyboard shortcut Command (⌘) + N.

Blank.jpg

The image on the left shows the default blank disk image settings. The image on the right shows the settings we’re going to use. This will create a disk image with virtually unlimited space and encryption that is very tough to crack.

Default settings for a new blank disk image.

100 TB sparse bundle disk image with 256-bit AES encryption

Step 3: Set Image Format To Sparse Bundle Disk Image

Click on Image Format and change it from read/write disk image to sparse bundle disk image.

This image format has many advantages. We’re using it because it is compatible with Macs running OS X 10.5 (Leopard) or newer, plays well with Time Machine backups, supports APFS (the latest disk format, used on Macs running 10.13 (High Sierra) and newer), and it will allow us to create a massive disk image that automatically expands as files are added while only occupying as much physical space as it needs.

SparseBundle.jpg

Step 4: Set Image Size

Click on the field for Size: and change from the default of 100 MB to 100 TB.

This is a staggering amount of data, equivalent to over 20,000 DVD movies. Don’t worry about filling up your hard drive. This number represents the theoretical capacity, and not the actual size.

Size.jpg
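
If you want to sanity-check the DVD comparison, a couple of lines of Python will do it. The 4.7 GB-per-DVD figure is my assumption (a single-layer DVD), not something from the settings above:

    # Rough sanity check of the "over 20,000 DVD movies" comparison.
    # Assumes a single-layer DVD holds about 4.7 GB.
    image_capacity_gb = 100 * 1000              # 100 TB expressed in GB
    dvd_capacity_gb = 4.7
    print(image_capacity_gb / dvd_capacity_gb)  # ≈ 21,276 DVDs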

Step 5: Set Image Name

Click on the field for Name: and change from the default of “Untitled” to SuperSecret.

This is just an example; feel free to get creative. Repeat this step under Save As: and rename “Untitled” to SuperSecret.

Name.jpg

Step 6: Set Encryption

Click on Encryption: and change from the default of “none” to 256-bit AES encryption (more secure, but slower).

No encryption scheme is perfect, but 256-bit AES is pretty darn good. Brute-force attacks using the world’s most powerful supercomputers would take an absurd amount of time and energy. Using a single desktop PC to brute force it would require more time than the eventual heat death of the universe.

AES256.jpg
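
To put some rough numbers behind that claim, here is a back-of-envelope calculation in Python. The guess rate is an assumption on my part (a wildly generous one trillion keys per second), not a benchmark of any real machine:

    # Back-of-envelope: how long would it take to try every 256-bit AES key?
    keyspace = 2 ** 256                  # all possible 256-bit keys (~1.16e77)
    guesses_per_second = 1e12            # assumed rate: one trillion keys/second
    seconds_per_year = 60 * 60 * 24 * 365

    years = keyspace / guesses_per_second / seconds_per_year
    print(f"{years:.2e} years")          # ≈ 3.67e+57 years

For comparison, the universe is currently somewhere around 1.4e10 years old.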

Step 7: Set Password

As is often the case, the password is going to be the weakest link in this security scheme. Password strength, password length, storing and managing passwords, etc. are lengthy subjects beyond the scope of this guide. There are password managers like Keychain, and strategies for composing memorable passwords, but keep in mind…

if you forget or lose this password, your data will be lost forever.

Password.jpg

Step 8: Double-Check Your Settings

Disk Utility can do some spectacularly stupid things. It’s a good idea to double-check your settings. Make sure:

  • You’re saving the disk image to some place that is easy for you to find.

  • You’ve set the correct disk image Size, Format, Encryption scheme, and Image Format

If everything looks good, then save your disk image.

Complete.jpg

Step 9: Create Disk Image

Click “Save” and wait for the disk image to be created. This should take less than a minute, depending on your system. After it finishes, click “Done” and quit Disk Utility via File -> Quit or by pressing Command (⌘) + Q.

Create.jpg
Done.jpg

Step 10: Inspect Disk Image

On your desktop, you should see an icon for the mounted disk image alongside the .sparsebundle file. Right-Click on the .sparsebundle file and click Get Info. As you can see, the file size is only 56 MB, roughly the size of a short video file.

Mounted.jpg
56+MB.jpg
GetInfo.jpg

Step 11: Verify Disk Image Capacity

The mounted disk “SuperSecret” is a rewritable volume with up to 100 TB capacity. To verify this, Right-Click on the mounted disk and click Get Info. The capacity shows 100 TB, but is limited by the physical disk’s free space.

Compare.jpg
Trash.jpg

Step 12: Unmount Disk

The disk “SuperSecret” is mounted automatically after image creation. After you move your sensitive data to the image, you’ll want to unmount the disk. In general, it is a good idea to only mount the disk image when you’re planning to read or write sensitive data. Leaving the disk mounted 24/7 leaves the data vulnerable to unwanted access. Simply Click and Drag the mounted disk to the Trash to Eject.

 

Step 13: Mount Disk Image

In the future, when you need to read/write sensitive data, simply Double-Click the .sparsebundle disk image to Mount the encrypted volume. You’ll need to enter your password to access, but you can store this in Keychain to make this process automatic.

OpenPass.jpg
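
If you’d rather skip the clicking, the same kind of image can be created and mounted from the Terminal with Apple’s hdiutil command. The sketch below is a minimal Python wrapper around hdiutil using flags that, to the best of my knowledge, mirror the settings above (the SuperSecret names are just the example names from this guide); double-check them against man hdiutil on your own system before trusting it with real data:

    #!/usr/bin/env python3
    """Minimal sketch: create, mount, and unmount an encrypted sparse bundle
    with Apple's hdiutil, mirroring the Disk Utility settings above.
    Treat this as an illustration, not a hardened tool."""
    import getpass
    import subprocess

    IMAGE = "SuperSecret.sparsebundle"   # example name from this guide
    VOLUME = "SuperSecret"               # volume name, also from the guide


    def create_image(password: str) -> None:
        # 100 TB sparse bundle, APFS file system, 256-bit AES encryption.
        # -stdinpass reads a null-terminated passphrase from standard input.
        subprocess.run(
            ["hdiutil", "create", "-size", "100t", "-type", "SPARSEBUNDLE",
             "-fs", "APFS", "-encryption", "AES-256", "-volname", VOLUME,
             "-stdinpass", IMAGE],
            input=password.encode() + b"\0", check=True)


    def mount_image(password: str) -> None:
        # Equivalent to double-clicking the .sparsebundle file.
        subprocess.run(["hdiutil", "attach", "-stdinpass", IMAGE],
                       input=password.encode() + b"\0", check=True)


    def unmount_image() -> None:
        # Equivalent to dragging the mounted disk to the Trash.
        subprocess.run(["hdiutil", "detach", f"/Volumes/{VOLUME}"], check=True)


    if __name__ == "__main__":
        pw = getpass.getpass("Disk image password: ")
        create_image(pw)
        mount_image(pw)
        unmount_image()

The same hdiutil commands also work directly in the shell; the Python wrapper just keeps the create, mount, and unmount steps together in one place.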

Week 2: Portland Public Schools Territory Mapping, Personal Reflection

“It always seems impossible until it’s done.”

—Nelson Mandela

On Wednesday we presented our territory maps to the class and Fiona—our guest from Prospect Studio. Having completed the MA last year, and having worked through the Service Design seminar course, I feel somewhat confident in the quality of our combined efforts and the contributions I personally had to offer. We worked through several iterations, and presented our first draft on Monday for Peter’s review. The feedback we received was helpful. In particular, Peter asked that we include systems-level goals, specify equity-centered outcomes, and leverage personas and vignettes to aid in narrowing to a single area of focus. We were also encouraged to focus our primary and secondary research on the current state of things at PPS.

First draft territory map, as it was presented on Monday

Stef recommended that we begin our presentation with an explanation of why we chose PPS and our chosen area of focus. Thinking about how we might explain these choices revealed for our team how these different aspects are connected, and helped us begin recognizing some common threads. On Tuesday our team met for an informal work session. These kinds of meetings have been very helpful in keeping us on track and motivated—especially during this prolonged quarantine, working independently can be a real drag. Working together remotely and having real-time communication with one another has also been useful in ensuring we reach consensus on key decisions.

For example, at the beginning of our meeting on Tuesday, we spent roughly fifteen minutes clarifying our intentions and area of focus, and defining the various terms outlined in the brief: what does it mean for an educator to be “resilient, adaptive, and open to change”? This was important for the next phase of refining our territory map. We converged from exploring three distinct areas in the brief to a single section (educator essentials). We decided on a general format for the territory map (based on examples shared in Hajira’s morning presentation) with “who,” “what,” and “how” as distinct categories, arranged from the inside out. Initially, the concentric arrangement proved limiting, so our team opted to transition to an elliptical arrangement, exploiting the typical 16:9 widescreen format.

While I cannot speak for the entire team, I am most proud of our willingness to embrace ambiguity and to continue working through high degrees of uncertainty, toward something more coherent. I have no doubt that the outcome of our collective efforts was far superior to anything we could have made by working individually. Our collective intelligence was on display Wednesday, and it seemed that Fiona and Peter were pleased with the work.

Ultimately, we produced a fairly comprehensive map; this first deliverable represents many different perspectives, and our combined understanding of the brief. We managed to produce something that is visually appealing, and which communicates our ideas on this topic. There is still quite a bit of work to be done before we can validate these categorical relationships, but our team was very successful at representing the current state of our research, and how our process will unfold over time.

Final draft of our team’s territory map, as presented on Wednesday.

One unsettled question: how can we maximize the impact of our field research and ethnographic approach in the context of a pandemic? We shared a few options, and while we could not settle on an exact methodology just yet, we did agree to pool our resources and purchase some small tokens of gratitude. This idea was inspired by personal experience with volunteering: I know that “thank you” goes much further when it includes a tasty treat or tangible artifact to commemorate the experience.

We met again this weekend and debriefed on our experience with the presentation. We continued working in a variety of collaborative digital environments and management tools (Figma, Trello, Miro, Google Docs, Slack), but upon Chris’ recommendation, we have decided to add another tool to the mix: Framer. Our team contract remains intact at this time, but we amended our debriefings to include a “rose, bud, thorn” framing. This made it much easier for the team to share their thoughts and feelings about the work so far. We also added a “shoutout” option for meaningful expressions of gratitude. This worked very well for our first debrief, and will likely be standard going forward.  

The Future of Portland Public Schools — Week 2, Team Update

This week, our team posted a bio on our course homepage. Chris put it best, “we cute as hell, just saying:”

the_mccc_collab-2.gif

Making the best of remote collaboration

We also drafted a team contract to establish expectations and roles. Among these are:

Goals for this project:

  • Sharpen our skills for…

  • Research-based design

  • Remote collaboration

  • Providing meaningful artifacts for a client

  • Something great for our portfolios

Expectations

  • Meetings scheduled to avoid conflict and maximize productivity

  • Established deadlines

  • Delegation of tasks

  • Regular updates and communications for peers

  • Time to celebrate accomplishments

Policies:

  • Commitment to ongoing research to strengthen and refine ideas, concepts.

  • Flexibility and understanding (because there’s still a pandemic and bad things can and will happen)

Consequences and conflict:

  • Call people in, not out. If something is not going well, we all have a problem and can work together to solve it.

  • Transparency: let’s communicate when there are problems, we all have valuable perspectives and may recognize something that others do not

  • When possible, table discussions if we cannot easily make decisions. Decide when/if to address problems as a group or between group members

Custom rules for meetings:

  • Rotation of responsibilities so folks don’t burn out on repetitive tasks.

In addition to structuring our collaborative efforts, we also completed a first draft of our research team’s territory map.

First draft version of territory map for PPS, based on the brief provided by Prospect Studio

Our professor, Peter Scupelli, provided feedback on our initial presentation draft:

  • Need to include systems level goal

  • Need to specifically include equity-centered systems and outcomes

  • Leverage personas and vignettes to aid in choosing area of focus

  • Recommend researching the current state with a focus on equity

To improve the focus and process of our research, we’re also reviewing text from How to Future, a book by Madeline Ashby and Scott Smith. Chapter 3 describes the process for sensing and scanning information, which will be useful for our secondary research and literature review, as well as for synthesizing insights from primary research and participant interviews.

Since the scope of this project is fairly broad, and because we are working on it for 15 weeks, we’ve also decided to adopt the project management tool Trello to help us track our individual and collective progress, assign/delegate tasks, and note when we pass key milestones.

Using the feedback we received from our professor and our excellent TA, Stefania La Vattiata, we set more specific goals for completing our territory map.

Summary of Decisions/Surfacing Perceived Alignments:

  • Converged from three slices to one section of the outcomes. The current territory map should focus exclusively on Educator Essentials: adaptive, resilient, and open to change

  • To frame our future, we are currently considering the possibilities and methods by which adults can learn to partner with students around what happens after graduation and their long-term goals

  • New Format for territory map based on Hajira’s work. From the center outward: Who-> What -> How -> Future experiences

  • Future experience should be structured within the futures scenario we are currently exploring

  • For the outermost rim of future experiences, each scenario ideally aligns with an accompanying assumption or question about the present, i.e., Future: VR explorations of a proposed new schoolhouse; Past: How/Can people see these plans today?

To do list for Wednesday’s Presentation:

  • Update deck with new (1) focus, (2) research, (3) Territory Map/Reasoning

  • Add a section in the deck for the proposed future scenario we are exploring

  • Deck should be finished no later than Wednesday at 11am, share any changes

Which area are we truly focusing on? Educator Essentials? Graduate Portrait? We’re a bit hung up on adaptive, lifelong learning. Our hope is that through this process, it will be easier to narrow our focus to the educator essentials.

UPDATE:

Our team completed their second revision of the territory map for the Educator Essentials vision of Adaptive, Resilient, and Open to Change.

1*bAAER_Lboz9o7XOi2HVFaA.png

Reflecting on the first week (Studio II and Research Methods)

On Monday we kicked off a design research project for Studio II. Peter began the brief with a quote:

“We see things not as they are, but as we are.”

—H.M. Tomlinson, Out of Soundings

This has stuck with me throughout the week as our team began exploring two public school districts as candidates for our focus. Portland Public Schools is an obvious choice because I bring unique insights to this domain—I have a couple of years’ experience volunteering at Ockley Green K-8 and NAYA’s after-school youth mentorship program. I’m able to contribute lived experience and perspective. I know PPS educators and know many of the challenges they face. I also feel very much indebted to this community, as they opened their doors to me and helped me at a critical stage of my journey into the field of design. It is my sincerest wish and goal to contribute to a more positive future for the children of the district.

Having reviewed the briefs for Portland Public and for Santa Clara, I am reminded of the work I did over the previous summer, interning as a communication designer for Dezudio (Ashley and Raelynn’s design studio, here in Pittsburgh). This was a fantastic opportunity to apply recently acquired skills and knowledge from service and communication design coursework to address the challenges of Brooklyn LAB Charter School in the context of COVID-19. Central to this work was understanding the needs of historically marginalized communities that could already be described as “in crisis.” These conditions were amplified by COVID-19, but also presented an opportunity to justify a significant overhaul of this institution and its approach to supporting students’ academic needs.

I’m excited to work with our assigned team. Cat, Carol, and Chris were all members of our MA cohort, and we have good rapport from previous projects and our time together in the studio. Additionally, this team comprises a plurality of individual experiences and perspectives. Cat attended a private school in D.C.; Carol is from Taiwan and has no direct or personal experience with public education in the United States. Chris and I both come from hyper-conservative and religious homes, and this impacted the way in which our parents made choices regarding education. For Chris, this meant a combination of home schooling as well as traditional enrollment. For me, this meant several gaps and educational deficiencies (e.g., attending a rural school district with trailers in lieu of a school building) and eventually dropping out altogether. I did not finish high school, and instead passed the G.E.D. when I was still seventeen years old. These experiences contribute to a strong negativity bias on my part, which I hope will balance out some of the more positive experiences of my peers—I know all too well what doesn’t work in public education.

For the first week of our Research Methods course, we read excerpts from Alan Cooper and a paper by Branka Krivokapic-Skoko. A few points I found helpful from Cooper:

I agree with many of his sentiments, but his definitions of design, expertise, and stakeholders, and the entire framing of “users,” feel very outdated.

I’m not sure that I agree with him on this statement:

  • Users of a product should be the main focus of the design effort.

I take issue with this framing because, as Cooper points out, the user is not always the same person as the customer. This is certainly the case in public schools, where students are not paying for a service, but still have specific needs. And are students “users” of a product? This market-based framework seems much more useful in a for-profit context.

When he says that, “it is important to speak to both current and potential users, that is, people who do not currently use the product but who are good candidates for using it in the future because they have needs that can be met with the product and are in the target market for the product” I question why this is the goal. Is it good to grow the market for growth’s sake? What if I’m designing iron lungs and that JERK Salk is trying to push me out of the market? 

  • Product and competitive audits: Also in parallel to stakeholder and SME interviews, it is often quite helpful for the design team to examine any existing version or prototype of the product, as well as its chief competitors.

This one seems pretty obvious, but last semester it was also very important that our research include lots of exploration into the same product/system space. Knowing what is out there helped us to recognize new potentials for existing applications and solutions.

On Wednesday, Stef (our TA for the class) shared her team’s project to give us a better sense of what to expect in terms of process and crafting our deliverables. This was useful for priming ideas about how to approach the somewhat open-ended prompt to create an artifact to represent daily life for PPS in the year 2035. Based on this impression, I began thinking about individual goals for this project and thinking about what I can hope to improve or learn throughout this process:

  • Remote collaboration

  • Research-based design

  • Providing meaningful artifacts for a client

  • Producing something great for a portfolio

I inserted these items into our team contract. For Monday, we need to complete a first draft of a territory map for PPS. This process has been a bit slower than usual, and Cat mentioned how much she misses having access to a physical whiteboard. Even though we’ve been working in this “new normal” for nearly a full year, it is impossible to ignore what we have lost by switching to a remote learning context.

Brooklyn Laboratory Charter School - Designing For An Academic Year Under The Context of COVID-19

This summer has only just begun, and I am already involved in two separate projects related to educational institutions and their response to COVID-19. Working with Dezudio and members of my CMU Design cohort, we are consulting with a handful of teams to help them develop their strategy and documentation for Brooklyn Laboratory Charter Schools (LAB).

In the first week of this project, Brooklyn Lab teams presented their strategies for the 2020/2021 school year. There was a lot of information to sort through, and many different ways to interpret the key terms (e.g., “A” and “B” shifts, virtual, online, in-person, “brick and mortar,” traditional, etc). Additionally, all stakeholders are confronted with multiple layers of complexity. This impedes decision making and increases stress for all involved. I believe that it is highly appropriate to view these policies through the lens of a navigation system.

For students and their parents, this navigation involves when, where, how, and with whom they will receive an education. For instructors and staff, the questions are when and where they will perform their most common tasks, how they will interact with the students they serve, and when and where they will fulfill their professional obligations beyond the classroom. For administrators working to support a highly modified school year format, there is a clear need for mapping to help them maintain “the big picture.”

To achieve successful navigation, we may want to leverage the familiar look and feel of MTA maps, and adopt a language to reflect this navigation mindset. Instead of calling different delivery formats “shifts” we can call them “tracks” with different activities as “stations.” This metaphor can help reduce the cognitive load for stakeholders, enabling them to make decisions faster, and with more clarity.


Maps are useful for reducing cognitive load; navigating a city this size requires abstraction and timed decision making, and maps provide scaffolding for making those decisions.



I agree with Klaus’ assessment of the classroom diagrams: simple shapes and colors can be used to identify the most common categories (students, teachers, etc.), with a key to help reinforce the symbols’ meaning. I’ve included some sketches and prototypes from last week to show what these concepts might look like.

Concept sketch to explain the multiple channels; a student’s schedule might include a combination of in-person, alternative location, and online/in-home instruction.


Using familiar conventions as metaphor will help parents, teachers, and students understand these new policies.


Maintaining high standards of rigorous academics is a challenge even under the most ideal conditions. Mapping the relationship between leadership, teachers, students, and the different education delivery formats.


A key with simple colors and shapes can help readers understand the meaning of words like “Hybrid.”


Kinetic-friendly spoon project Mega Post

That’s a wrap! It’s certainly been an interesting semester, but now I am ready to put it behind me. Reflecting on the spoon project, I have some final thoughts and observations. First, I want to thank the fine folks at the CMU School of Design. From the amazing, hardworking faculty to my graduate student cohort, I have received nothing but inspiration and encouragement throughout this entire process, despite the obvious challenges of working remotely.

Rendering of sixth and final (?) spoon design. I pulled the kitchen design (Pierre Gilles) and bowl (Damogran Labs) from GrabCad.com. The spoon and coffee mug are mine.


This project was divided into two parts: the first part focused on exploring different ways of prototyping and making. This was described to me as an informal way of A/B Testing for methods. The second part involved the deliberate iteration of prototypes through user testing — a challenge in the context of a global pandemic and social distancing. To make the most meaningful design choices possible given limited resources, I decided to leverage the power of physical simulation to supplement the making of physical prototypes.

There are a variety of 3D software tools that offer some degree of physical simulation. For this project, I selected Maxon Cinema 4D R20 (Educational License) and Blender as my two ways of making. I chose these because I am already familiar with Cinema 4D and know how to manage a workflow in that context, because Blender is open source and free for anyone to use, and because both programs run under macOS and Windows (my rendering workstation is a Hackintosh with multiple operating systems, which grants the flexibility to overcome certain technical limitations). My initial experiments with Cinema 4D were… not great.

My very first (and failed) attempt to simulate fluids in Cinema 4D. Carnegie Mellon University School of Design Prototyping for Interaction Spring 2020

As you can see, there are “physics” happening here, but they are not anything close to the physics of the real world. This is not “real world” physics, this is Asshole Physics:

Zachary "Spokker Jones" Gutierrez and I came up with the term "Asshole Physics" when we were discussing the game and the physics models it employed. Basically there's a lot of crap you can knock over and kick around, including dead bodies, buckets, cans, and little sections of drywall which are standing around in the middle of rooms for no obvious reason. Zachary casually mentioned, "I have made it a point to knock over every fucking thing in that game. I am living out my fantasies of being a giant asshole," and I responded by stealing his "asshole" comment and claiming that I made it up. Thus "Asshole Physics" was born.

Without more sophisticated plugins to simulate fluid, Cinema 4D R20 is only capable of non-Newtonian semisolids “out of the box.” I can make stuff bump around and “squish.” I can have a 3D character micturating on the side of a building. I can create the appearance and illusion of something like a fluid, but with such restrictions, I could not realistically evaluate my spoon designs. I explored my options and found that Next Limit’s RealFlow plugin would meet my basic needs. Best of all, they offer a free 30-day trial! My initial excitement quickly waned after the plugin failed to install and activate on my system…

(This email chain is long and covers a week of back and forth with customer service. I am including the entire conversation as a way to recreate my experience. While this may not directly relate to the scope of this project, I still believe that there is value in documenting the unexpected problems that crop up when trying to do something new.)

Mail_02.png
Mail_03.png
Mail_04.png
Mail_05.png
Mail_06.png
Mail_07.png
Mail_08.png
Mail_09.png

It took a week to finally get everything sorted with the demo. During that time, I began to explore option B: Blender.

Blender is a free, powerful, open source 3D creation tool. Best of all, it includes the Mantaflow fluid simulation engine (since version 2.82). I have worked with Cinema 4D on other projects and have become fairly comfortable with its interface, so given my experience with Fusion 360, Inventor, and C4D, I knew that I would need to overcome a learning curve before I could use Blender to meet my needs for this project. Fortunately, I was able to find a spectacular tutorial series for beginners.

If you want to read more about my experience with the tutorial, click here.


This tutorial was ideal because it involved exercises that helped me learn how to use the interface, and covered several different workflows. I was really impressed with Blender’s node-based material system and procedural textures. You can work strictly with parametric modeling, or you can discretely modify mesh geometry to create highly organic and imperfect forms. I’m excited to work with Blender on future projects. It’s a very exciting time to be working in 3D.

While working through these tutorials, I began sketching and working in Fusion 360 to craft my first spoon designs for part 2 of this project. You can read more about this experience here.

Takeaways from Part 1

I really appreciated the responsiveness of the team at Next Limit. Clearly there are problems with the implementation of their product’s copy protection, which is an all-too-common problem in the world of software. Programmers gotta eat just like everybody else, and we certainly should make sure that the talented and hardworking folks behind the code are able to put food on their table at the end of the day. Piracy can deprive a small business of the revenue necessary to keep the lights on, so I am absolutely sympathetic to this reality and to the risks involved in releasing software for demo purposes. Getting people to pay for something that they can easily get for free is a challenging proposition. At the same time, you cannot realistically expect customers to pay for software if they cannot try it first.

Ultimately, this week of back and forth with customer support was a critical loss: I never completed a side-by-side comparison of fluid simulations. While I did eventually succeed at installing and using RealFlow for fluid simulations (and was honestly impressed with how easy it was), I did not have enough time to set up a comparable simulation to evaluate spoon designs. My trial expired about a week ago, and I see this aspect of the project as a lost opportunity. If Next Limit adopted licensing practices similar to Maxon’s (verification through a .edu email address), they could offer an educational package for their RealFlow plugin.

Blender really came through for me. The learning curve was aggressive, but not impossible. While I found mantaflow to be a respectable and entirely capable fluid simulator, it was not without its own share of issues. I spent a lot of time making granular tweaks to improve the fidelity of my simulations, while also using the observations from my simulations to inform design decisions for my spoons in part 2 of this project.

Part 2: Design Iterations Based on User Testing

While this project required user testing and design iterations based on feedback, I decided to limit the user evaluations to handle shape and the spoon’s overall dimensions. This was not an arbitrary decision or an excuse to focus on physical simulation of fluid dynamics (with user testing as an aside). No, this decision was based on the nature of the course for which it was assigned: Prototyping for Interaction Design. This semester I have been focusing on designing for interaction (arguably, all designers focus on this aspect at some point in their process). When thinking about the tools we use to eat food as a system, it is important to consider the touchpoints involved. The handle of a spoon is a non-trivial component. It can take on many forms, and naturally includes affordances. How someone holds a spoon, and how easy it is for them to use it, are central to the evaluation of the design.

The design iterations were highly generative in nature, informed by both user evaluations and physical simulations. I maintained a homeomorphic continuity: treating the initial shape as an elastic form to be molded and reshaped to maximize performance. Knowing how a concave shape might be optimized to perform under rapid movement (the fluid simulations provided a means of evaluating exactly this) is only one aspect of a more complicated interaction, and that test alone could not fully address human needs. When physical form is designed to improve user interaction, and physical properties are given equal consideration, it is possible to create a truly useful tool. I realize that this is a very technical description, but it is easier to understand when properly visualized. I have rendered a compilation sequence to show how this spoon shape evolved to its final (?) form (I am still considering a physical prototyping stage for this project over the summer).

A sequence of fluid dynamics tests designed to evaluate fluid retention of concave forms. Carnegie Mellon University, School of Design, Prototyping for Interaction, Spring 2020.

Toward the latter half of this sequence, you will notice a change in colors (for both the liquids and spoons). I decided to differentiate the final rendering sequences because these were based on user evaluations. The colors chosen for these final sequences are based on the color tags used for the user test:

These printouts are derived from DXF vector images exported from Fusion 360. The designs shown are oldest (top) to newest (bottom). The fifth design (blue) is rendered with a blue body and green liquid.


I printed and mailed the paper prototype to a potential user suffering from ongoing hand tremors (my partner’s mother). I sent this without written instructions. Instead, I only provided different color tags to facilitate feedback. My user let me know that the red spoon handle was in the “Goldilocks” zone in terms of size and shape: not too big, not too small, not too curvy, not too straight. Using this feedback I constructed the sixth and final (?) form — see the first image of this post.

The user test included a direct side-by-side comparison with existing dinnerware.


Before developing these simplified paper prototypes, I also experimented with ways of making more three-dimensional forms that could be sent in the mail. While this novel approach showed some potential, I was concerned with how user error might complicate or (even worse) bias feedback. Still, these paper prototypes helped me to better understand and interpret the scale of my 3D models.

PaperPrototype_01.jpg

Final Thoughts

This project still feels somewhat incomplete. Perhaps this is because the generative design process itself can always demand further iteration, or maybe it is because I have not yet created a physical prototype that can actually be tested as an eating instrument. Maybe it is only because there were still a few “rogue droplets” (grrrrrr) that I simply could not keep contained with the completion of my sixth iteration. Whatever the net effect might be from these various shortcomings, I am pleased with the learning opportunities that were presented throughout this exploration of design.

Were I to continue with this process, the next steps would be to 3D print the latest shape using a food-safe material (there are a few third-party vendors that offer this service). I would then ship that latest design for further user evaluation. I believe that there are still many additional iterations necessary before I could defend having created something that satisfies the criteria I set out with this project (i.e., a spoon that overcomes the challenges of involuntary muscle movements and essential tremors).

If I were to collaborate with others, I would also want to evaluate the ecological and economic impact of such a device. How might we go about manufacturing at an appropriate scale? How might additional user tests with a wider audience influence the existing form? Many questions remain unanswered, and I have come away with a newfound respect for the power of generative design.

Bugs in the Blender

I have continued to have luck exploring fluid simulations in Blender, but the process has not been without its quirks. I recently encountered a strange issue related to the Particle Radius setting:

Particle Radius

The radius of one liquid particle in grid cells units. This value describes how much area is covered by a particle and thus determines how much area around it can be considered as liquid. A greater radius will let particles cover more area. This will result in more grid cells being tagged as liquid instead of just being empty.

Whenever the simulation appears to leak or gain volume in an undesired, non physically accurate way it is a good idea to adjust this value. That is, when liquid seems to disappear this value needs to be increased. The inverse applies when too much liquid is being produced.

What does this look like in practice? My most recent simulation actually seems to produce fluid as the scene progresses.

Nevertheless, I was able to gain critical insights into this form and will continue to iterate new designs. This is being done in conjunction with paper prototyping. These forms are less sophisticated, but still provide valuable information about how users will experience and interact with this flatware.

SpoonPrototype.jpg

Spoonfuls of updates

This week was packed full of progress on multiple projects. I received feedback for my group’s birth control information app “MyGallery.” Our work was even featured on CMU’s Design page.

Crafting an iconographic representation for the withdrawal method was my proudest moment.


I’ve continued to explore fluid simulations with Blender. I’ve run into some technical hurdles: Blender 2.82 uses a variety of APIs to leverage GPUs for rendering and computation. It offers an AI-driven denoiser (OptiX), CUDA path tracing, and OpenCL. My MacBook Pro has an AMD Radeon Pro 5500M GPU as well as the option to plug in a Radeon Frontier Edition (first-generation Vega) eGPU over Thunderbolt 3. There is plenty of GPU compute power in either configuration, but there is a snag: macOS 10.15 (Catalina) has deprecated OpenCL in favor of Metal 2+, while CUDA and OptiX are proprietary to NVIDIA GPUs, and Apple hasn’t shipped a Mac with an NVIDIA GPU since the Kepler era (GeForce 700 series). Blender supports AMD ProRender, but I found it terribly unstable.

I could easily slip into a tangent about how unfortunate the breakup between Apple and nVidia truly is, but I will spare you.

My current workflow involves queuing some tasks to my desktop, running Windows 10. The GPUs are dual Radeon VIIs. Unfortunately, I found that rendering on Blender is unstable when both GPUs render in parallel. No problem, since I can free up the other GPU for Folding@Home (a hobby of mine that has exploded in response to COVID-19). Who would have guessed that a global pandemic would boost a distributed computing project to exascale?

Despite these obstacles of platform compatibility, I have made significant progress on my simulation-based research. It is difficult to overstate how exciting this project has been for me. For some context: the ASCI Red supercomputer (at Sandia National Laboratories) was built in 1996 and was the fastest supercomputer in the world until 2000. It was the first computer to achieve true terascale computing (one trillion floating-point operations per second). I built my first terascale computer in 2013, shortly after leaving my job at Intel. There was something very gratifying about building a computer with a CPU I helped manufacture. GLaDOS G4 (you can see the project here, scroll down to “Everything Else”) was built with a GeForce GTX 780 GPU and an Intel Core i7 4770k overclocked to 4.5 GHz. It ran nearly silent and fit inside an upcycled Apple Power Mac G4 (microATX equivalent) case.

The ASCI Red supercomputer was designed to simulate nuclear weapons tests. Today, I am using a system roughly ten times more powerful to simulate soup spilling out of a spoon. I was inspired to approach this problem by two projects. The first was a 2013 project from Portland State University (my alma mater) to make a coffee cup for zero-gravity environments. They used drop cages and 3D printing to iterate several designs until they had a shape that held liquid. “It wasn’t needed, but it was requested.”

The other project hit me right in the heart.

The S’up Spoon is the embodiment of good design. The design was inspired by deep empathy for a user’s problem, and the solution involves as little design as possible. There are few technologies in this world that we trust enough to put in our mouths. If you can make it in this space, you can make it (almost) anywhere. During the fall semester, Moira and I visited the Carnegie Museum of Art. They had an exhibition on accessibility design, and I was brought to tears by stories of innovation and vibrant improvements to quality of life for people with disabilities. Technology, at its very best, empowers people to realize their fullest potential. We can easily get lost in the exhilaration of the complex, but this impulse must not dampen our ability to appreciate the elegance of simplicity. Some problems are best solved by form. I saw many incredible solutions in that exhibition, but this spoon has really stuck with me.

My goal is not to make something better, but perhaps a little bit different. The shape of the S’up spoon is intuitive, and if we had never seen a spoon before, we might conclude that it is the better design over more traditional forms. Under our current cultural context, however, it is a strange thing to behold. It looks more like a wizard’s pipe or a warrior’s horn. It is beautiful and ergonomic, and I do not intend to improve on those qualities. Instead, my goal is to make a spoon that is inconspicuous while still achieving similar results for users who suffer from motor movement difficulties.

How has my first design fared under simulation?

While I can certainly see the appeal of a long hollow channel, I’ve become increasingly concerned with how this shape may be difficult to keep clean. I can imagine objects getting wedged toward the back, depending on what is being consumed. I have begun to work on a second iteration with a shallower channel. Still, this first iteration does fairly well. It manages to retain most of the 15 ml (i.e., one tablespoon) of fluid under rapid movement.

I enjoyed this simulation so much that I decided to make a rendering:

I have not yet gotten back into Cinema 4D to evaluate RealFlow. Despite the challenges regarding compatibility, I am truly impressed with how powerful Blender has become with this latest release.

Now that I have established this workflow, I can easily switch out revised designs to test under identical conditions. I’m still not sold on the current handle shape, and I think I can improve liquid retention by tweaking the angle of the lips. The flat-bottomed (Chinese-style) spoon does fairly well, with its obtuse-angle walls. Next, I will try a concave structure with a wider base for the handle and a more aggressive descending angle.

Prototyping – Part 2

Working with Blender has continued to go well.

I have also been looking at some of the existing solutions in this space:

KFS Easy Eat

http://www.eating-help.com

Liftware, by Verily

https://www.liftware.com

EliSpoon

https://elispoon.com

Ornamin - Supportive Cutlery (Parkinson’s)

https://www.ornamin.co.uk/shop/cutlery-set?number=SW24

S’up Spoon

https://www.youtube.com/watch?v=C8nNlWw6KbA

Apex Medicine Spoon

https://www.riteaid.com/shop/apex-medicine-spoon-0233706

I have been sketching and studying these forms in consideration for my own designs.

Sketch.jpg
Sketch 1.jpg
Some shapes are unappealing because of their associations. These still deserve consideration, as they function well in this space.



This week I will begin iterating designs in Fusion 360. Hopefully, I will finally be able to make use of my RealFlow trial license. I’m curious to see how the “out of the box” settings function with these geometries.

Fluid Simulation in Blender

“Throw off your fears let your heart beat freely at the sign that a new time is born.” — Minnie Riperton

I’ve completed my workflow design for fluid dynamics testing in Blender. Here’s a proof of concept:

Now I just need to figure out RealFlow in C4D…

Blender: Time To Make The Donuts

I’m very happy with the results of my first project in Blender.

White Frosting


Classic Pink


And Nekkid!


After struggling with Cinema 4D, I decided to use a plugin rather than trying to kludge together some handcrafted fluid simulation. Unfortunately, I ran into a licensing problem with Next Limit’s RealFlow plugin. I’ve been emailing back and forth all week, and they finally got a fix for me — SUCCESS! While I waited for that to get sorted out, I decided to give Blender a try. I began following this tutorial, but quickly found myself getting lost. I needed to become more familiar with the software and interface, so I decided to follow a beginner’s tutorial to get my feet wet. I chose this particular video series because it employs procedurally generated elements and covers all the basic modeling commands, node handling, and the complete stack of scene construction. And also: donuts are delicious. The world is terrifying, and we could all use something sweet.

I am honestly shocked that Blender is a free program. Many 3D programs are horrifically expensive; without student licensing, I could never afford to touch most of the 3D tools I’ve been learning over the years. Yet Blender seems to be very capable as a 3D program. I have only scratched the surface, but it is very impressive. Now that I have familiarized myself with Blender, and have a working fluid simulator plugin for C4D, I am in good shape to begin A/B testing.

Prototyping Cutlery

For one of my final projects this semester, I’m interested in creating a set of eating tools that account for involuntary muscle movements (e.g., Parkinson's disease or tremors) and other mobility difficulties that limit the enjoyment and consumption of food. I'm interested in exploring simple solid shapes, living hinges, and assembly forms derived from the explicit advantages of additive manufacturing techniques.

[I want to make a really nifty spoon.]

Fabricating physical prototypes will be a challenge (…)

Seriously: fuck you, COVID-19.

This is not the only challenge, however. Finding access to food-safe materials, conducting a series of user tests, iterating forms, and self-directed research will also require creative workarounds to overcome the limitations of working while under “shelter-in-place” orders due to global pandemic.

I have decided to go 100% digital. Instead of building various forms and testing their ability to hold fluids under rapid motion, I will conduct a series of simulated physics tests to evaluate forms. For the first part of this project, I am required to conduct an A/B test or evaluation. I have decided to conduct dual testing using two different 3D programs.

Method 1:

Maxon Cinema 4D includes a variety of physical simulation abilities—including particles and fluid dynamics. I intend to leverage this software’s capacity to test various designs and forms. Tests will be designed to evaluate fluid retention under repeated multi-axial movements. Cutlery designs will be tested against traditional forms (e.g., standard soup spoons).

Method 2:

Blender is a free, open source platform for creating 3D models, rendering, animation, and more. Among the built-in features is a fluid simulator. Combined with rigid body and gravity physics, it should be possible to evaluate a variety of spoon shapes and (potentially) even different forms of cutlery.

Considerations:

By using two different simulations, it should be possible to more thoroughly evaluate a design’s fluid retention abilities.

Timeline:

Week 1 — Cinema 4D Workflow: Since I am already familiar with Cinema 4D, I have decided to begin this project by constructing my first simulation with this software. I will use Fusion 360 to generate original spoon designs, as well as a “traditional” spoon shape to compare performance.

Week 2 — Blender Workflow: Using the assets from week 1, I will spend week 2 developing and executing a comparable test running under Blender’s fluid simulation engine.

Resources:

Blender Tutorial - Realistic Fluid Simulation: https://www.youtube.com/watch?v=zmw-BTCbWMw

Cinema 4D Tutorial - Water simulation Animation: https://www.youtube.com/watch?v=JehbYBAZw7c

What does Day 1 look like?

Let’s just say I have a lot to learn.

Interactive Design Prototyping

THE TIME HAS COME TO…PUSH THE BUTTON

Wireless communication between Arduino #1 and #2


My current project in IxD Prototyping involves physical computing (i.e., “interactive systems that can sense and respond to the world around them”). I have worked with Arduino before (Restricted Area, 2017), but this newest project is expected to see daily use. In my head, I keep a long list of annoying technology interactions—this gets updated frequently. We are saturated with unsatisfying technology and devices that cause more problems than they solve. We have inconveniences stacked upon inconveniences, and if you were to step outside of this environment, you would inevitably conclude that most electronics are made to punish their buyers. I am looking to improve just one such interaction.

Back in 2012 I bought an HD video projector. If you love to watch movies, there is something magical about having “the big screen” at home. I love it. Do you know what I don’t love? Using an infrared remote control on a device that is mounted above and behind me. Seriously, Epson: what were you guys (and yes, I’m assuming it was a team of men, with their dumb penises getting in the way of common sense) thinking?! The primary function of the remote control is to simply turn the projector on and off. I would gladly give up the remote control entirely if I could simply move the power button to the armrest of my couch. Instead, I must contort my arm in Kama Sutra fashion just to find the right angle to get the sensor to recognize the POWER-ON command from the remote.

Getty Images: the various methods for turning on an Epson HD Projector.


My girlfriend’s method of bypassing the remote is more elegant: she retrieves a step-stool from our utility closet and presses the ON/OFF button on the projector chassis. This works well, but … well, let’s just say, it ruins the mood. I began to explore other options and realized that the primary issue is that IR remotes are directional. The IR sensor is part of the projector assembly and cannot be relocated. An Arduino is capable of IR communication; it is also capable of RF communication. Radio frequency is far less dependent on line-of-sight, especially within the context of indoor and residential use. Imagine what WiFi would be like if it worked over infrared. Consider also that Apple abandoned its IR remote interface for the Mac.

Enter the Arduino

I found a few open source projects that utilize IR and RF communication:

https://learn.sparkfun.com/tutorials/ir-communication/all

https://www.electroschematics.com/ir-decoder-encoder-part-2-diy-38-khz-irtr-module/

https://create.arduino.cc/projecthub/electropeak/use-an-ir-remote-transmitter-and-receiver-with-arduino-1e6bc8

https://learn.adafruit.com/using-an-infrared-library/hardware-needed

https://www.sparkfun.com/datasheets/Components/nRF24L01_prelim_prod_spec_1_2.pdf (PDF Warning)

https://www.deviceplus.com/arduino/nrf24l01-rf-module-tutorial/

https://forum.arduino.cc/index.php?topic=421081.0

https://howtomechatronics.com/tutorials/arduino/arduino-wireless-communication-nrf24l01-tutorial/

All of these resources are excellent. I want to call attention to one more link: https://create.arduino.cc/projecthub/muhammad-aqib/nrf24l01-interfacing-with-arduino-wireless-communication-0c13d4

I have a bone to pick with this one. Take a look at the wiring diagram:

Diagram created by /u/Muhammadaqibdutt



Note the LED pin-out for the receiver. This diagram shows the positive leg of the LED connecting to Pin 3.

Now, let’s take a look at the code:

SOURCE.png

The devil is in the details: the “digitalWrite(6, HIGH)” call turns the LED on. Pin 3 does nothing.

This made for some very “fun” troubleshooting. I’ve since ironed out all the kinks, and have successfully pirated the IR remote signal from an Epson brand projector (on loan from the Design Office at CMU), and have moved on to making an enclosure. Will I 3D print or laser cut? I have not yet decided.

Here is some sample code for my RF triggered IR emitter:

(NOTE: this code is just one half of the project, and by itself cannot do anything. You’ll also need IR and RF libraries to make this code work on your Arduino)

#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>
#include <IRLibAll.h>

RF24 radio(9, 10);                  // nRF24L01 module on CE = 9, CSN = 10
const byte address[6] = "00001";    // pipe address shared with the transmitter
boolean button_state = 0;
const int led_pin = 6;              // status LED is wired to pin 6 (not pin 3, as in the tutorial diagram)
IRsend mySender;                    // IR emitter from IRLibAll

void setup() {
  pinMode(led_pin, OUTPUT);
  Serial.begin(9600);
  radio.begin();
  radio.openReadingPipe(0, address);  // listen on the shared address
  radio.setPALevel(RF24_PA_MIN);
  radio.startListening();             // configure the module as a receiver
}

void loop()
{
  if (radio.available())
  {
    // The transmitter sends a text message followed by the button state.
    char text[32] = "";
    radio.read(&text, sizeof(text));
    radio.read(&button_state, sizeof(button_state));
    if (button_state == HIGH)
    {
      digitalWrite(led_pin, HIGH);
      Serial.println(text);
      // Arduino Remote On/Off button code
      mySender.send(NEC, 0xffa25d);
    }
    else
    {
      digitalWrite(led_pin, LOW);
      Serial.println(text);
    }
  }
  delay(5);
}
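
For completeness, here is a minimal sketch of what the transmitter half might look like, mirroring the two-payload pattern the receiver expects (a text string followed by the button state). It follows the nRF24L01 tutorials linked above; the push button on pin 2 and the message strings are assumptions for illustration, not necessarily the exact wiring on my couch-side Arduino.

#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(9, 10);                 // nRF24L01 module on CE = 9, CSN = 10
const byte address[6] = "00001";   // must match the receiver's pipe address
const int button_pin = 2;          // assumed: momentary push button on pin 2
boolean button_state = 0;

void setup() {
  pinMode(button_pin, INPUT);
  radio.begin();
  radio.openWritingPipe(address);
  radio.setPALevel(RF24_PA_MIN);
  radio.stopListening();           // configure the module as a transmitter
}

void loop() {
  button_state = digitalRead(button_pin);
  if (button_state == HIGH) {
    // Send a text payload, then the button state, matching the receiver's two reads.
    const char text[] = "Button pressed";
    radio.write(&text, sizeof(text));
    radio.write(&button_state, sizeof(button_state));
  } else {
    const char text[] = "Button released";
    radio.write(&text, sizeof(text));
    radio.write(&button_state, sizeof(button_state));
  }
  delay(5);
}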