Yes, Virginia, trans rights are human rights

The Easter Resurrection: Not of Christ, but of Common Decency

Dear Editor: I am 42 years old, and some people say you should remain silent when family members casually dehumanize entire groups of people. These same people probably think chocolate bunnies are an appropriate substitute for standing up for basic human dignity.

I recently had a conversation with someone—let's call her "Virginia," because nothing says "protecting the identity of transphobes" like using the name of an entire commonwealth—who expressed views about transgender people that would make Tucker Carlson blush with professional jealousy.

After ending our call with the diplomacy of someone evacuating a building that smells like gas, I received a series of texts that could generously be described as "I'm doubling down!" First came the classic non-apology ("I'm sorry YOU don't believe I should FEEL this way"), followed by a YouTube video about a transgender swimmer. The video was deeply unserious, offering nothing more than subjective anecdotes, misrepresentation of institutional policies, slippery slopes, straw men, false equivalences, confirmation bias, intentional misgendering, and overgeneralizations.

In short, it was an interview with a bigot who wants to be seen as a victim and not a perpetrator. The crescendo was a cherry-picked news story about a rape case where—surprise!—pronouns were somehow framed as the real villain.

Today is Easter Sunday. It is a holiday celebrating a guy who was literally known for hanging out with society's outcasts and telling self-righteous people to stop being such massive jerks—a position deemed so extreme that the state politely replied by torturing him to death in public.

So, Virginia, this isn't a debate. I'm not "just asking questions" or "having a dialogue" about whether some humans deserve basic dignity and legal protection. I'm writing this to explain why you're wrong—not just factually wrong (though hoo boy, are you ever), but wrong in the deeper, "this-is-not-who-you-were-raised-to-be" sense of the word.

Consider this my Easter gift to you: an opportunity for resurrection. Not of the 2,000-year-old preacher kind, but of your basic human empathy. After all, if there's one thing Easter teaches us, it's that profound transformation is possible—even from those we've given up for dead.

The Drag Queen Paradox

There was a story on the news last night showing life imitating an old children's riddle. It seems that a truck got stuck at the entrance to the Holland Tunnel. Too high for the clearance. Well, for hours, the experts tried to find some way to unwedge the vehicle, but to no avail. Finally, a ten-year-old girl in a passing car suggested simply letting the air out of the truck's tires, thus lowering it to the clearance level, which they did. And it worked.

Working Girl (1988)

Remember that time in elementary school when your teacher asked a riddle that stumped all the adults but seemed blindingly obvious to the kids?

That's exactly what we're dealing with when it comes to anti-trans arguments. The argument goes something like this: "I have no problem with a man wearing a dress, but I don't think that they should be doing it around kids! I heard that some schools are even letting these drag queens visit schools! They're reading stories to children and indoctrinating them with gender ideology!"

Easter Bunny costume worn at a public event.

A deviant radical rabbit impersonator spreading their bunny ideology, allegedly.
source: https://bellevuecollection.com/easterbunny/

Here's the paradox at the heart of Virginia's position (and those of the countless talking heads who've monetized this particular flavor of outrage): If gender truly is fixed, biological, and immutable, then dressing up in drag is simply theater—a harmless performance, no different from wearing any other costume. It can't "change" or "influence" a child's gender identity any more than dressing as the Easter Bunny can turn a child into a rabbit.

This is not a minor point. The biological essentialists love to clutch their pearls and claim their bigotry is rooted in concern—"Think of the children!"—while gleefully referring to trans women as "men wearing dresses." By this logic, we've already established that drag (and by extension, social transition) is just a presentation of fabric and makeup. The person underneath remains unchanged because, remember, gender is supposedly immutable!

But if drag is somehow "dangerous" or influential to children's gender identities, then gender must actually be fluid, socially constructed and influenced by culture, environment, or role models. This would mean gender isn't strictly binary or biologically fixed—and if gender is malleable, there's no basis for discriminating against people whose identities differ from traditional norms. To suggest that merely witnessing drag performances could somehow alter a child's gender identity is to admit that gender isn't fixed after all. Oops!

So which is it? Is gender an unchangeable biological reality, in which case drag is just harmless dress-up? Or is gender so fragile and impressionable that it can be altered by exposure to someone wearing sequins and reading "The Very Hungry Caterpillar"? Because it literally cannot be both.

The truth is that neither position justifies bigotry. If gender is fluid, then we should respect transgender identities as valid, part of the human continuum of gender and sexuality. If gender is fixed, then a trans person's existence is not a threat to anyone else's gender identity. Either way, the only logical conclusion is respect and acceptance.

This isn't advanced calculus. It's not even long division. It's the kind of basic reasoning any third-grader could follow. And that's what makes it so maddening—the failure isn't one of intellectual capacity, but of willful blindness. Because the point was never logical consistency. The point was always to find a socially acceptable veneer for plain old prejudice.

The Time-Wasting Machine: Why "Debating" Human Rights Is A Trap

A man walks into a pawn shop and asks, "How much for this gold chain?" The pawnbroker, seasoned and skeptical, takes the chain and performs his usual test—scratching it with a file. As expected, it's fake.

The next day, the same man walks in and presents another supposed gold chain. Still skeptical, the pawnbroker applies acid to this chain, and finds that it is also a fake. Day after day, the same person brings in gold chains. Each time, the pawnbroker tests them, and each time, they turn out to be counterfeit.

Eventually, the pawnbroker stops testing altogether, dismissing every chain as fake without a second glance. It's a waste of time, and he has better things to do.

—The Parable of The Pawnbroker

Virginia, those garbage YouTube videos you sent me are counterfeit gold chains. And frankly, I've run out of acid. The fact is that you've had plenty of time to research this topic in earnest, to listen to queer and transgender voices, and to learn with empathy and an open mind. You've chosen instead to pick low-hanging fruit from low-effort propagandists on the dregs of the internet. Dismissing expertise doesn't make you a rebel; it makes you a fool.

As YouTuber Thought Slime (aka Mildred) brilliantly explains in their video "Fascists Will Waste Your Time," debates with bigots aren't actually debates at all. They're time-theft operations. While you're out there living your life (working, caring for loved ones, enjoying hobbies, and, presumably, having sex), the professional bigot has nothing better to do than argue endlessly about whether certain humans deserve basic dignity.

Their strategy isn't to win these arguments. It's to have them in the first place. To keep the question perpetually open. To make it seem like human rights are up for reasonable discussion between well-meaning adults instead of what they actually are: non-negotiable table stakes for participating in civilization.

And, I'm paraphrasing Mildred here, ask yourself this: If I came to your house for Easter dinner and spent the entire meal loudly questioning whether your best friend Amanda deserves basic human rights—"I'm just saying, what if Amanda shouldn't be allowed to use public bathrooms? I'm just asking questions!"—how long would I remain at your table? Ten seconds? Maybe fifteen if you needed time to put down your fork?

Yet somehow when the target of this rhetorical harassment is an entire marginalized group of Amandas, we're all expected to entertain these arguments ad nauseam. We're labeled "extreme" if we don't patiently explain, for the eleventy millionth time, why transgender people deserve legal recognition, fair access to public spaces, housing, healthcare, and the basic courtesy of being addressed by their correct names. I call bullshit.

It's an exhaustion tactic. In today's saturated outrage media environment, nobody has time to research and formulate thoughtful responses to every half-baked theory pushed by rage-merchants on Facebook. We're drowning in information, and the anti-trans crowd knows it. They don't need to convince you with valid arguments—they just need to overwhelm you with so many bad ones that you throw up your hands and say, "I don't know what to believe anymore."

Description: a Nazi loser doing Nazi loser shit.
source: https://www.youtube.com/watch?v=JN1oBfg0fwI

This strategy is particularly effective on people who feel economically or socially vulnerable. When you're worried about paying medical bills or keeping your job, it's oddly comforting to believe there's a simple enemy responsible for your struggles. The billionaire class certainly isn't going to volunteer for that role, so they make sure the spotlight stays firmly fixed on marginalized groups who make convenient scapegoats.

The fascist doesn't debate to discover truth. Truth is irrelevant. They debate to normalize the idea that some people's humanity is questionable in the first place. Every minute you spend arguing whether trans people are "real" is a minute where you've implicitly accepted that this is something reasonable people can disagree about.

It isn't.

So the next time someone tries to drag you into one of these "debates," remember the pawnbroker. Remember that you can simply say, "Shut up, you're being an asshole." It's not closed-minded—it's time management. It's recognizing that some positions aren't worthy of debate, some gold chains are obviously fake, and some YouTube videos are just thinly-veiled hate wrapped in a trench coat of pseudo-intellectualism.

Simply put: A fascist will always have endless bad faith arguments, but reasonable people are not obligated to entertain every shitty thing a Nazi has to say.

Pass the Easter ham, please.

History Doesn't Repeat, But It Sure Does Rhyme

“Never believe that anti-Semites are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words.

The anti-Semites have the right to play. They even like to play with discourse for, by giving ridiculous reasons, they discredit the seriousness of their interlocutors. They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert.

If you press them too closely, they will abruptly fall silent, loftily indicating by some phrase that the time for argument is past.”

― Jean-Paul Sartre, Anti-Semite and Jew

There's a reason the words "Never Again" still echo through Jewish communities worldwide. History has taught us that atrocities don't begin with violence—they begin with words. With dehumanization. With the quiet acceptance that some people's dignity is up for debate.

Before we dive into this section, let me acknowledge something: comparing anything to Nazi Germany risks hyperbole. The totalitarian machinery of the Third Reich was uniquely horrific. But understanding how that machinery was built—brick by rhetorical brick—isn't alarmist; it's essential pattern recognition.

Loser Nazi assholes destroying books taken from the library of Dr. Magnus Hirschfeld
Source: https://daily.jstor.org/90-years-on-the-destruction-of-the-institute-of-sexual-science/

Many people don't realize that when Nazi forces came to power in 1933, one of their earliest targets wasn't Jewish people; it was the transgender and homosexual community in Berlin. At the time, Berlin was home to the Institut für Sexualwissenschaft (Institute for Sexual Science), founded by Dr. Magnus Hirschfeld—a pioneering research center that conducted groundbreaking studies on human sexuality, including some of the earliest work supporting transgender individuals. On May 6, 1933, just months after Hitler became Chancellor, Nazis raided this institute. Days later, they burned its library of over 20,000 books and research materials in a public bonfire, destroying decades of irreplaceable knowledge about gender and sexuality. This wasn't a random act of violence—it was a methodical erasure of knowledge that contradicted their ideology.

As historian W. Jake Newsome documents, Nazi officials like Wilhelm Frick declared that "unnatural fornication between men must be prosecuted with all severity as this vice will lead to the downfall of the German people" [Pink Triangle Legacies Project]. Sound familiar? The rhetoric of "protecting children" and "saving civilization" has always been the cover for targeting marginalized groups.

The pattern is distressingly predictable:

  1. First comes the dehumanizing language: labeling a group as "degenerate," "unnatural," or a "threat" to society.

  2. Then come legal restrictions: laws that push the targeted group out of public life.

  3. Finally comes violence—sometimes state-sanctioned, sometimes merely state-tolerated.

After the initial book burnings, the Nazi regime dramatically expanded persecution under Paragraph 175 of the German Criminal Code, which criminalized homosexuality. Between 1933 and 1945, approximately 100,000 men were arrested for homosexuality, with roughly 50,000 officially sentenced. Of these, between 5,000 and 15,000 were sent to concentration camps, where they were forced to wear the infamous pink triangle. Many were subjected to torture, medical experimentation, and execution. Survivor testimonies, like that of Josef Kohout (who wrote under the name Heinz Heger), detail unimaginable cruelty—beatings, rape, and public humiliation specifically targeting gay prisoners.

This persecution wasn't just the work of fanatics at the top. It required the quiet complicity of ordinary citizens—people who might have personally known someone gay or transgender but who chose to look away, to accept the new normal, to believe that maybe there was something to what the authorities were saying about "those people."

I can hear you now, Virginia: "But that was Nazi Germany! That's not what's happening here!" And you're right—we aren't living in a totalitarian dictatorship... yet. But that's precisely why these early warning signs are so important to recognize.

In Florida alone, the past few years have witnessed a staggering acceleration of anti-transgender legislation. In May 2023, Governor Ron DeSantis signed a package of bills that banned gender-affirming care for minors, restricted it for adults, prohibited transgender people from using bathrooms matching their gender identity in government buildings, and created new restrictions on drag performances. As DeSantis himself put it, "We are going to remain a refuge of sanity and a citadel of normalcy" [BBC]—implying, of course, that transgender identities are neither sane nor normal.

In 2024, HB 1639 (dubbed the "Trans Erasure Bill") passed committee in the Florida House, aiming to ban transgender Floridians from accessing accurate driver's licenses and IDs. As Equality Florida noted, the bill exists "for the purpose of bullying transgender Floridians out of public life entirely" [Equality Florida]. These aren't isolated actions—they're part of a coordinated nationwide campaign. The year 2025 is already on track to break records for anti-LGBTQ+ legislation, with nearly 120 bills filed across the country before the year even began [Truthout].

When a transgender Floridian named Amy Lundberg sought gender-affirming surgery, she discovered that the University of Miami had stopped providing these services altogether—not officially because of the legislation, but because "the state's holding back funding for any institution that does anything" supportive of transgender people [NBC Miami]. This isn't about protecting children; it's about erasing an entire community from public life.

Like in 1930s Germany, the persecution isn't being carried out by monsters. It's being enabled by ordinary people—people like you, Virginia—who may not harbor deep hatred in their hearts but who have been convinced that there's something dangerous about transgender existence. People who wouldn't personally harm anyone but who vote for politicians promising to "restore normalcy" by excluding those who don't fit narrow definitions of gender. People who might say, "I have no problem with them, but..." and then proceed to explain why basic human dignity should come with conditions attached.

"Nice people made the best Nazis. My mom grew up next to them. They got along, refused to make waves, looked the other way when things got ugly and focused on happier things than 'politics.'

They were lovely people who turned their heads as their neighbors were dragged away. You know who weren't nice people? Resisters."

― Naomi Shulman

The lesson of history isn't that we're doomed to repeat it. The lesson is that we have a choice. We can recognize these patterns early, speak out against dehumanization in all its forms, and refuse to be complicit in the gradual erasure of our neighbors' humanity. Or we can look away until the machinery of hate has gained too much momentum to stop—when non-violent resistance is no longer a viable option and we must throw ourselves against the gears until they seize.

As federal judge Robert Hinkle wrote when permanently blocking Florida's anti-transgender healthcare law in June 2024: "Transgender opponents are of course free to hold their beliefs. But they are not free to discriminate against transgender individuals just for being transgender. In time, discrimination against transgender individuals will diminish, just as racism and misogyny have diminished" [LA Times].

History may not repeat exactly, but it rhymes. And right now, Florida's laws are rhyming with some very dark chapters of our past. The question isn't whether you'll end up on the right side of history—it's whether you'll get there before more damage is done.

Let's be crystal clear about something that right-wing politicians and professional idiots love to obscure: there is no actual scientific debate about the validity or necessity of gender-affirming care. Every major medical organization in the United States supports gender-affirming care as medically necessary and often life-saving. This includes:

  • The American Medical Association, which has explicitly stated that "gender-affirming care is medically necessary" and "has been linked to dramatically reduced rates of suicide attempts" [AMA, 2023]

  • The American Academy of Pediatrics, representing over 67,000 pediatricians

  • The American Psychological Association

  • The American Psychiatric Association

  • The American Academy of Child and Adolescent Psychiatry

  • The Endocrine Society

  • The World Professional Association for Transgender Health

  • The American College of Obstetricians and Gynecologists

  • The American College of Physicians

Collectively, these organizations represent more than 1.3 million doctors across the United States [HRC, 2025]. The medical consensus is overwhelming. This isn't a "both sides" issue where reasonable people can disagree—it's a case where politicians are overriding the recommendations of literally every relevant medical expert simply because it's politically convenient.

What's At Stake: This Isn't Academic

“If you can convince the lowest white man he's better than the best colored man, he won't notice you're picking his pocket. Hell, give him somebody to look down on, and he'll empty his pockets for you.”

― Lyndon B. Johnson

The culture wars over transgender rights aren't just Twitter arguments—they have devastating real-world consequences. People's lives, literally, hang in the balance.

A 2024 study published in the peer-reviewed journal Nature Human Behaviour found that anti-transgender laws directly caused increases in suicide attempts among transgender and nonbinary youth by as much as 72%—not just correlation, but causation, established by tracking over 61,000 trans and nonbinary youth across five years [Trevor Project, 2024].

The CDC's Youth Risk Behavior Survey found that approximately 26% of transgender students attempted suicide in the past year, compared to just 5% of cisgender male students [CDC, 2023]. These aren't just statistics—they're children, siblings, and friends driven to the brink by a society that treats them as political talking points instead of human beings.

And it's not just the heightened suicide risk. Anti-transgender laws create cascading social and economic harms that ripple throughout communities—often the same communities that right-wing politicians claim to champion. Here's where it gets deeply ironic: the very people who rail against transgender rights are often hurting themselves in the process.

Consider the case of Trevor, featured in Jonathan Metzl's award-winning book "Dying of Whiteness." Trevor was a conservative white man in rural Tennessee with severe liver failure who needed medical care. Because Tennessee repeatedly blocked Medicaid expansion under the Affordable Care Act, he couldn't access the lifesaving care he needed. When asked if he supported Obamacare, Trevor told researchers: "Ain't no way I would ever support Obamacare or sign up for it. I would rather die." [Boston Review]. His reason? He didn't want his tax dollars "paying for Mexicans or welfare queens."

This is the tragic irony that Metzl documents extensively: conservative white Americans often support policies that literally shorten their own lives. In Tennessee, resistance to the Affordable Care Act meant that white Americans who stood to benefit most from healthcare reform were "loath to support Medicaid expansion" because they didn't want minorities to benefit [Boston University]. Metzl's research found that this opposition to expanded healthcare "cost every single white resident of the state 14.1 days of life" [Metzl].

The same dynamic plays out in the fight over transgender rights. By supporting politicians who demonize trans people, many rural Americans are backing leaders who are simultaneously gutting their healthcare, defunding their schools, and dismantling economic protections that would benefit them directly.

Rural communities and hospitals in states that refused to expand Medicaid have suffered disproportionately. According to the Kaiser Family Foundation, expansion was associated with "a large reduction in hospital closures" [KFF]. When rural hospitals close, entire communities lose access to healthcare—not just transgender people.

Meanwhile, transgender people face extraordinary economic challenges. According to the Williams Institute, transgender people are four times more likely than the general population to be living below the poverty line, with more than 25 percent reporting an annual household income of less than $20,000 [Center for American Progress]. In states that expanded Medicaid under the ACA, the uninsured rate among low- and middle-income LGBT communities dropped by 10 percentage points, compared to only 6 points in non-expansion states. This translates to real lives saved.

When politicians block healthcare access to score political points against transgender people, they're not just hurting the transgender community—they're hurting everyone who needs affordable healthcare, especially in rural areas. When they cut education funding while raging about "gender ideology," they're not just making schools less welcoming for transgender students—they're depriving all children of quality education.

Consider current statistics: According to the Human Rights Campaign, as of August 2024, 39.4% of transgender youth (about 118,300 teenagers) live in the 26 states that have passed bans on gender-affirming care [HRC]. These bans don't just affect transgender teens—they often include provisions that restrict public funds for healthcare across the board, limit what doctors can discuss with any patient, and interfere with the doctor-patient relationship for everyone.

Virginia, you might think the culture war over transgender rights doesn't affect you personally. But the politicians using transgender people as scapegoats are the same ones implementing policies that hurt your community, your healthcare, your schools, and your family's future. This isn't a coincidence—it's a deliberate strategy to distract you from the real sources of economic insecurity.

The medical evidence is overwhelming: gender-affirming care saves lives. A 2022 study in the journal JAMA Network Open found that gender-affirming care was associated with 73% lower odds of suicidality among transgender youth [JAMA Network Open]. When politicians override medical consensus, they're playing politics with people's lives—including the lives of their own constituents.

The fact is, the states with the most aggressive anti-transgender laws also have some of the nation's worst health outcomes, highest poverty rates, and most underfunded social services. This isn't helping anyone—it's hurting everyone except the politicians who ride the wave of manufactured outrage to power.

So the next time someone tells you that transgender people are the reason your community is struggling, remember: the real threat isn't the transgender teenager trying to live authentically or the drag queen reading stories at the library. The real threat is the cynical politician using them as distractions while picking your pocket and dismantling your healthcare.

Transphobia isn't just morally wrong—it's a scam by the wealthy that none of us can afford.

Beyond "Tolerance": The Problem With Being A Christian Bigot

I've tried to appeal to your compassion by showing the devastating harm caused by these laws. I've tried to appeal to your self-interest by showing how politicians use transphobia to undermine policies that would benefit your community directly. Now, let me appeal to your sense of religious values.

As we mark Easter Sunday, it's worth remembering what Jesus actually taught. He didn't tell us to protect "traditional gender roles." He didn't tell us to enforce conformity in others. He told us to love our neighbors as ourselves. He surrounded himself with society's outcasts—people who violated every social norm of his time. The Jesus of the Gospels would be sitting with transgender youth today, not calling for their erasure from society.

In fact, our modern concept of "tolerance" falls far short of what Jesus actually taught. "Tolerance" suggests putting up with something unpleasant—like tolerating a toothache or tolerating a boring dinner guest. Jesus called for love, not tolerance. And love doesn't say "I'll put up with your existence as long as you stay out of my sight." Love says "I see you fully as you are, and I cherish what I see."

Remember the story of the Good Samaritan? When Jesus was asked "Who is my neighbor?" he didn't respond with "Only the people who look and act like you." He told a story about a despised foreigner showing mercy to someone from a group that hated him. The point was radical: your "neighbor" isn't defined by similarity—it's defined by shared humanity.

Florida's policies don't reflect these values. They don't reflect compassion or mercy. They don't even reflect basic American principles of equal protection under the law. What they reflect is fear—the same fear that has driven persecution throughout history.

So Virginia, I'm asking you to ask yourself: What would Jesus do? Would he support laws that drive children to suicide? Would he support erasing people from public life? Or would he sit with the marginalized, heal the wounded, and rebuke those who use religion as a cover for cruelty?

The Easter story isn't just about resurrection. It's about transformation. It's about the possibility of radical change. It's about seeing the world anew.

We all have biases. We all inherit prejudices. But we don't have to be defined by them. We can choose to grow. We can choose to learn. We can choose to see people—all people—as fully human, deserving of dignity, respect, and love.

That's my Easter prayer for you, Virginia. Not just tolerance, but transformation. Not just reluctant acceptance, but genuine celebration of the beautiful diversity of human experience.

The resurrection we need isn't of some ancient religious figure. It's the resurrection of our collective humanity. It's time to roll away the stone of prejudice and step into the light of compassion.

Yes, Virginia, trans rights are human rights. And that's not up for debate.

Bibliography

  1. YouTube video. “UPenn Swimmer speaks out against Lia Thomas | The Story of Paula Scanlan.” Accessed April 20, 2025. https://youtu.be/BhhSvWwlaA0.

  2. Willis, Brianna. “Pronoun Use at Center of Rape Case Involving Former Prisoner in California.” ABC7 Chicago, December 23, 2024. https://abc7chicago.com/post/pronoun-use-center-rape-case-involving-former-chowchilla-central-california-womens-facility-prisoner-tremaine-carroll/15696730/.

  3. YouTube video. “Working Girl | #TBT Trailer | 20th Century FOX.” Accessed April 20, 2025. https://www.youtube.com/watch?v=va1UqFivi6A.

  4. Turban, Jack L., Dana King, Jason J. Li, and Alex S. Keuroghlian. “Social Transition for Transgender and Gender Diverse Youth, K-12 Harassment, and Adult Mental Health Outcomes.” Accessed April 20, 2025. https://pmc.ncbi.nlm.nih.gov/articles/PMC8612964/.

  5. Thought Slime. “Fascists Will Waste Your Time.” YouTube video, accessed April 20, 2025. https://www.youtube.com/watch?v=tZzwO2B9b64.

  6. YouTube video. “Elon Musk makes 'Nazi-style salute' at Donald Trump's inauguration parade.” Accessed April 20, 2025. https://www.youtube.com/watch?v=e2bbb-6Clhs.

  7. YouTube video. “Anti-Trans Matt Walsh Confronted for Spreading Disinformation to Joe Rogan.” Accessed April 20, 2025. https://www.youtube.com/watch?v=Ly4nFCs99Rw.

  8. YouTube video. “I Debunked the Entire Manosphere.” Accessed April 20, 2025. https://www.youtube.com/watch?v=BgO25FTwfRI&t=377s.

  9. Magnus‑Hirschfeld.de. “Institut für Sexualwissenschaft.” Accessed April 20, 2025. https://magnus-hirschfeld.de/ausstellungen/institute/.

  10. Newsome, W. Jake. “The Nazi Persecution of Queer and Trans People.” Pink Triangle Legacies Project. Accessed April 20, 2025. https://www.pinktrianglelegacies.org/newsome.

  11. United States Holocaust Memorial Museum. “Paragraph 175 and the Nazi Campaign against Homosexuality.” Holocaust Encyclopedia. Accessed April 20, 2025. https://encyclopedia.ushmm.org/content/en/article/paragraph-175-and-the-nazi-campaign-against-homosexuality#:~:text=Paragraph%20175.

  12. United States Holocaust Memorial Museum. “Documenting Nazi Persecution of Gays: The Josef Kohout/Wilhelm Kröpfl Collection.” USHMM Collections. Accessed April 20, 2025. https://www.ushmm.org/collections/the-museums-collections/curators-corner/documenting-nazi-persecution-of-gays-the-josef-kohout-wilhelm-kroepfl-collection.

  13. BBC News. “Florida Governor Signs Anti‑Transgender Bills Cementing State’s Status as ‘Citadel of Normalcy’.” BBC News, August 24, 2023. https://www.bbc.com/news/world-us-canada-65627756.

  14. Equality Florida. “Statement on Anti‑Transgender Bill Passing Final Florida House Committee.” Equality Florida. Accessed April 20, 2025. https://www.eqfl.org/statement-anti-transgender-bill-passes-final-florida-house-committee.

  15. Erin in the Morning. “Nearly 120 Anti‑LGBTQ Bills Filed across the US Before 2025 Has Even Started.” Truthout, December 21, 2024. https://truthout.org/articles/nearly-120-anti-lgbtq-bills-filed-across-the-us-before-2025-has-even-started/.

  16. NBC Miami. “Anti‑Trans Laws in Florida Devastate People Seeking Gender‑Affirming Care.” NBC Miami, January 15, 2025. https://www.nbcmiami.com/news/local/anti-trans-laws-in-florida-devastate-people-seeking-gender-affirming-care/3478401/.

  17. Rosenblatt, Kalhan. “A Florida Law Blocking Treatment for Transgender Children Is Thrown Out by a Federal Judge.” Los Angeles Times, June 11, 2024. https://www.latimes.com/world-nation/story/2024-06-11/a-florida-law-blocking-treatment-for-transgender-children-is-thrown-out-by-a-federal-judge.

  18. Project 2025 Observer. “Project 2025.” Project 2025 Observer. Accessed April 20, 2025. https://www.project2025.observer.

  19. Metzl, Jonathan M. “Dying of Whiteness.” Boston Review, November 20, 2019. https://www.bostonreview.net/articles/jonathan-m-metzl-dying-whiteness/.

  20. Temple, Kelly. “Judge’s Ruling Turns Spotlight on Tennessee’s Worn‑Torn Safety Net.” Tennessee Lookout, September 10, 2024. https://tennesseelookout.com/2024/09/10/judges-ruling-turns-spotlight-on-tennessees-worn-torn-safety-net/.

  21. The Trevor Project. “Anti-Transgender Laws Cause up to 72% Increase in Suicide Attempts among Transgender and Nonbinary Youth, Study Shows.” The Trevor Project, June 2024. https://www.thetrevorproject.org/blog/anti-transgender-laws-cause-up-to-72-increase-in-suicide-attempts-among-transgender-and-nonbinary-youth-study-shows/.

  22. Centers for Disease Control and Prevention. “Youth Risk Behavior Survey.” MMWR 73 (2023). https://www.cdc.gov/mmwr/volumes/73/su/su7304a6.htm.

  23. Human Rights Campaign. “Attacks on Gender‑Affirming Care by State Map.” HRC.org. Accessed April 20, 2025. https://www.hrc.org/resources/attacks-on-gender-affirming-care-by-state-map.

  24. Human Rights Campaign. “Get the Facts on Gender‑Affirming Care.” HRC.org. Accessed April 20, 2025. https://www.hrc.org/resources/get-the-facts-on-gender-affirming-care.

  25. American Medical Association. “Advocating for the LGBTQ Community: AMA Policy on Gender‑Affirming Care.” American Medical Association, February 10, 2023. https://www.ama-assn.org/delivering-care/population-care/advocating-lgbtq-community.

A Practical Guide to "Vibe Coding" with Claude and MCP Tools

Late last year I began experimenting with "Vibe Coding" using Claude and MCP (Model Context Protocol) Tools. Since then, I've explored this new frontier of human-computer collaboration, where the most critical skill hasn't been mastering syntax or memorizing APIs, but communicating clearly and logically in an ongoing dialog with an LLM.

Read More

Jimmy Carter

One of the strangest things about adulthood is noticing how many prominent and influential people are fading away. I’m an “elder millennial,” so the first U.S. president I remember was Ronald Reagan. When he passed, I felt conflicted. I generally thought poorly of his administration, but I also felt sad. It was like a little piece of my childhood died with him.

Jimmy Carter was before my time, and I always knew him as a “former president.” What stood out—so very, very admirably—was his unwavering dedication to serving others after leaving office. All my life, Jimmy Carter was out there: looking for ways to help vulnerable people, promoting peace, and building homes for those in need.

That he once held the title of President of the United States of America seemed almost secondary, an interesting line on a résumé. For months, we had known Carter was in hospice care and that his life was coming to an end.

Now that he’s gone, I find myself admiring him even more. He passed peacefully, having accomplished more than most could ever imagine. And in his passing, there seems to be broad consensus that he was the best former president of our lifetime.

I’m also sad because the world was better with him in it. People like Jimmy Carter are extraordinarily rare. I can’t imagine any former president following their political career with such generosity, humanity, or humility.

A Bicycle for the Mind

Amid the chaos of a world in crisis, I’ve found hope in an unexpected place: coding. With tools like Claude.ai and MCP, I’ve been building a web app to help food pantries serve their communities better—automating inventory, breaking language barriers, and streamlining processes. This isn’t just about code; it’s about turning anxiety into action, using technology to create something meaningful. If you’ve ever wondered how AI can amplify human effort, this is a story for you.

Read More

Learning Through Building: Adding Sort & Translation Features to a Food Pantry Web App

Today was a journey of wins and "learning opportunities" as I worked on improving William Temple House’s food pantry management system. The morning started great - implementing sortable table headers was surprisingly smooth — I’m continually impressed by the coding capabilities of Claude.ai. Working with both Claude and ChatGPT, we created a modular code structure, which made it easy to add sorting without breaking existing features.

But then came the interesting part. When I began tackling multi-language support, my API requests to OpenAI exploded into the tens of thousands. The goal seemed simple: translate food items into our client community's languages. William Temple House serves many non-English speaking clients: Spanish, Russian, Ukrainian, Chinese, Arabic, etc. I’ve integrated OpenAI's gpt-4o-mini API for translations, and everything worked beautifully... until it didn't.

Turns out a rate limit is easier to hit once the database of food items reaches critical mass. Who knew sending thousands of translation requests over a period of just a few hours would hit OpenAI's daily cap? Probably everyone who bothered to read the documentation first. :-)

It's actually kind of funny watching the error logs fill up with "please try again in 8.64 seconds" messages. Claude as an AI coding assistant was invaluable throughout this process. When I got stuck on implementing sort functionality, it helped refactor the code while maintaining the existing event-driven architecture. Later, when we hit the translation rate limits, it suggested implementing batch processing and queuing systems - solutions I wouldn't have considered immediately.
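The batching-and-backoff idea Claude suggested can be sketched in a few lines. This is an illustrative toy, not the app's actual code: the names `chunk`, `translate_all`, and `translate_batch` are hypothetical, and a real version would catch the OpenAI client's specific rate-limit exception rather than a generic `RuntimeError`.

```python
import time

def chunk(items, size):
    """Split a list of items into batches of at most `size`."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def translate_all(items, translate_batch, batch_size=50, max_retries=3, initial_delay=1.0):
    """Translate items in batches, backing off exponentially when rate-limited.

    `translate_batch` is any callable that takes a list of strings and
    returns a list of translations (e.g., a wrapper around an API call).
    """
    results = []
    for batch in chunk(items, batch_size):
        delay = initial_delay
        for attempt in range(max_retries):
            try:
                results.extend(translate_batch(batch))
                break  # batch succeeded; move on to the next one
            except RuntimeError:  # stand-in for the API's rate-limit error
                time.sleep(delay)
                delay *= 2  # exponential backoff before retrying
        else:
            raise RuntimeError("batch failed after retries")
    return results
```

Sending 50 items per request instead of one request per item cuts the request count by 50x, which alone would have kept me well under the daily cap.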

Key takeaways:

  • Small wins matter: The sorting feature works great in my Test UI

  • Read API docs before sending 10,000 requests

  • Sometimes the best solution is to wait 24 hours for rate limits to reset; I should be taking a break on a holiday weekend anyway

  • Having an AI pair programmer helps spot potential issues before they become problems

Next steps? Take a brief pause while OpenAI's rate limit resets, then return with a smarter approach to batch translations. As frustrating as the rate limit issue was, it's pushing us to build a more robust system.

I’m going to use some of this down time to reflect more on the process of building this system. I’ve learned a lot about how best to leverage AI for software development, and hope others will benefit from what I’ve found along the way.

Video Lecture: 3D Modeling Basics for Beginners – Techniques, AR Tips, and Intro to AI Tools

I have some exciting news! On October 23rd, 2024, I was once again invited to guest lecture at the CMU School of Design. I decided to follow up with a recorded version to share. In this recording, made after the original lecture session, I cover the essentials of 3D modeling with a focus on beginner-friendly techniques. You'll find practical insights into mesh modeling, workflow tips for Blender, and an introduction to preparing models for augmented reality. The full lecture video is embedded below, followed by detailed notes that offer a step-by-step breakdown of theory and techniques for anyone new to 3D design. Dive in, explore, and start building your own 3D modeling skills.

Principles of Mesh Modeling

A Note on Why This Lecture Focused Primarily on Mesh Modeling:

Meshes are the standard 3D model type used in real-time 3D engines—like Unity, Unreal, and virtually every AAA video game title in the last 30 years, going all the way back to Quake, by id Software in 1996.

Key Principles:

  1. Use Quad Faces Whenever Possible: Design your shape faces with quads instead of triangles and ngons.
    Reason: Quads are infinitely divisible, making it easier to adjust geometry resolution as needed. Tris and Ngons are not as flexible, which can lead to undesirable artifacts and poor topology.
    3D games primarily use triangles (tris) instead of quads because triangles are the simplest polygon and always planar (a flat surface), making them computationally faster to render in real time, which was crucial for the underpowered hardware of early gaming systems. Essentially, triangles require less processing power to calculate and display on screen than quads, which have more vertices and edges.
    On modern hardware we can get away with more complex geometry, and it's generally a better trade-off to build mesh models from quads: the computational costs are vastly outweighed by the benefits of evenly divisible face geometry and more manageable topology. Lastly, quads are easily converted into tris by producing diagonal edges between the four vertices.

  2. Work from the Lowest Possible Polygon Count: Always start with the lowest polygon count (i.e., resolution) for your model. You can increase resolution later with subdivision modifiers, but it's not as easy to reduce the resolution later.
    Reason: Editing a high-resolution mesh is more difficult than working with a low-resolution one, which offers greater control and flexibility. It also takes much more processing power and memory, which will slow down Blender and increase the risk of crashes.

  3. Keep Base Shapes Simple: Keep your base shapes as simple as possible. When adding details, create those elements as separate objects. When you hit a milestone, consider duplicating a model or a collection of models to a new instance for further refinement.
    Reason: This approach makes 3D modeling more manageable, allowing for easier adjustments and maintaining clean geometry.

  4. Use Modifiers and Non-Destructive Editing Whenever Practical: Designing a symmetrical shape? Cut it in half and use a Mirror Modifier to cut your editing time in half. Keep in mind that the most complex designs can ultimately be derived from very basic shapes: Spheres, Cones, Toruses, and Cubes.

  5. Work From Reference Images, Even If Just A Few Basic Sketches: Press Shift + A to open the Add menu. Navigate to Image > Reference. Select the image file you want to use from your computer. The reference image will be added to your 3D Viewport, where you can position, scale, and rotate it as needed for your modeling task.

  6. Build The Overall Form First, and Then Separate into Smaller Objects: This will ensure that your designs are cohesive and edges are properly aligned. When you're ready to divide into separate objects, duplicate the objects into a new Collection.

  7. Experiment, Tinker, Explore, and Start Over: You're unlikely to get the design right on the first attempt. It's often necessary to work through the problem, and then start over from scratch once you've had enough time to explore the form. Reason: Your second draft will almost certainly be better than the first.
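The quad-to-tri conversion mentioned in point 1 is simple enough to sketch in plain Python. This is a toy illustration of the idea (a face here is just a tuple of vertex indices, and `quads_to_tris` is a hypothetical name, not a Blender API call); in practice, Blender's Triangulate modifier or export settings do this for you.

```python
def quads_to_tris(faces):
    """Convert quad faces into triangles by adding a diagonal edge.

    Each face is a tuple of vertex indices. Quads split into two tris
    along the a-c diagonal; faces that are already tris pass through.
    """
    tris = []
    for face in faces:
        if len(face) == 4:
            a, b, c, d = face
            tris.append((a, b, c))  # first half of the quad
            tris.append((a, c, d))  # second half, sharing the diagonal
        else:
            tris.append(tuple(face))
    return tris
```

One quad becomes exactly two tris, which is why a quad-based model's triangle count roughly doubles its face count at render time.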

Blender Quality of Life Recommendations:

  1. Save Your Project Files Early and Often: Use Blender's "Save Incremental" (⌥+⌘+S) (Option + Command + S) to manage version control. Doing this will give you the freedom to fearlessly tinker and explore (as mentioned in the previous point) before settling on a final design.

  2. Crank Up The Number of Undo Steps: Open Edit from the top menu. Select Preferences to open the Blender Preferences window. In the Preferences window, click on the System tab. Scroll down to find the Undo Steps setting.

    Increase the value (the default is 32). If you have enough system memory, set it to 256 for more flexibility in undoing actions. Close the Preferences window to save your changes.

  3. Consider Using A Material Library: Blender has a basic built-in material library, but it's not very useful. Look into large libraries, such as PBR Material Asset Library + OneClick Add-on for Blender (https://shardsofred.gumroad.com/l/CfOnY). Creative Commons License (CC0) materials can be used for basically anything, and will save you time.

  4. Remember to Perform a UV Unwrap on Your Model Geometry for Best Results When Texturing: The most realistic textures in the world won't help you if your model doesn't have good UV Mapping. Remember the chocolate Santa Claus example? Proper wrapping is essential for creating realism with your models. https://docs.blender.org/manual/en/latest/modeling/meshes/uv/applying_image.html

  5. Recommended Extensions and Add-ons:

    • VDM Brush Baker: Helps you create and bake Vector Displacement Maps directly in Blender.

    • Bool Tool: Boolean operations for complex shape creation.

    • Node Wrangler: Enhances node editing management.

    • Rigify: Automated rigging solution for character animation.

    • Loop Tools: Useful for organic modeling (with some bugs appearing in Blender 4.2—be sure to keep this add-on updated!).

  6. Other Useful Add-ons: Auto Mirror, F2, Extra Mesh/Curve Objects, Extra Grease Pencil Tools, Copy Attributes Menu, and MeasureIt.

    Bonus: Need furniture? Most of IKEA's catalog of products has 3D models available. Search for "IKEA" under Extensions and you can easily search for and import 3D models into your scenes.
    Note: Ensure 'Allow Online Access' is enabled in Blender's System Preferences for add-on updates.

Create Augmented Reality Experiences for iOS with Xcode Developer Tools, Reality Composer, and USDZ File Format

Once you've finalized your form, added necessary details, and applied your materials, you should be ready to export your model.

Step-by-Step Instructions for Preparing 3D Assets for Export to USDZ:

  1. Duplicate Your 3D Assets and Collections: Create a new instance of your 3D assets specifically for export.

  2. Apply All Transforms: Hit A to select all visible objects, then press ⌘ + A (Command + A) and select All Transforms to apply.

  3. Apply All Modifiers: Apply all modifiers in the same order they were added to each model—except for subdivision, as tessellation data can (usually) be included without applying it directly to the models.

  4. Join All Components: Hit A to select all visible objects, then press ⌘ + J (Command + J) to perform a join operation.

  5. Export the File: Go to File > Export > Universal Scene Description (usd*).

  6. Configure Export Settings:

    • Include: Check Visible Only and Selected Only.

    • Blender Data: Select Custom Data.

    • Namespace: Use the default setting (UserProperties).

    • Blender Names: Enable this option.

    • File References: Set to Relative Path.

    • Convert Orientation:

      •  Z = Forward Axis

      • Y = Up Axis

        Note: Many other 3D tools, including Xcode's tools, interpret 3D models with a different axis orientation than Blender. If you don't apply this conversion, you'll find your model improperly rotated following import. If this happens to you, double-check these settings.

    • Use Settings for Render: Enable this option.

    • Object Types: Select Mesh, Volumes, Curves.

    • Geometry: Enable UV Maps, Rename UV Maps, Normals.

    • Subdivision: Set to Best Match.

    • Rigging: Enable Armatures (if you have rigged and animated your model).

    • Materials: Select USD Preview Surface Network and Export Textures.

    • USDZ Texture Downsampling: Set to 1024px, or up to 2048px (the largest size acceptable for iOS QuickLook).

  7. Update File Extension: Change the export file name extension from .usdc to .usdz.

  8. If no issues are encountered after export, you should be able to view your model in Augmented Reality on any iOS device. Open your exported file from iCloud, or send it via email, text, or AirDrop to another device to view it.
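The Convert Orientation step above boils down to a simple coordinate remap. Here's a toy sketch of converting a point from Blender's Z-up convention to the Y-up convention expected by Xcode's tools (the function name is illustrative; the exporter handles this for you when Convert Orientation is set correctly):

```python
def z_up_to_y_up(point):
    """Remap an (x, y, z) point from Blender's Z-up convention to Y-up.

    This is a -90 degree rotation about X: the old Z (up) becomes the
    new Y (up), and the old Y becomes the new -Z.
    """
    x, y, z = point
    return (x, z, -y)
```

If your model imports lying on its back or face-down, this mismatch between up-axes is almost always the reason.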

Setting Up Xcode and Reality Composer:

The latest version of Xcode doesn't include Reality Composer, as Apple has shifted their focus to the Vision Pro. You can still access the Augmented Reality Tools for iOS devices, with some additional steps.

Step-by-Step Instructions:

  1. Download the Latest Version of Xcode 14: Download from this link: https://developer.apple.com/download/all/

    NOTE: You'll need to create an Apple Developer Account (it's free) to access the above link, or use this direct link: https://download.developer.apple.com/Developer_Tools/Xcode_14.3.1/Xcode_14.3.1.xip

  2. Extract and Rename The Older Version of Xcode: Rename Xcode.app to Xcode14.app and place it in your Applications folder.

  3. Open Terminal on Your Mac.

  4. Open the Applications Folder in Finder.

  5. Drag the Xcode14 App into Terminal: This will automatically add its path.

  6. Add to the Path: Next to the path, add: /Contents/MacOS/Xcode.

  7. Full Command Example: The command will look like:

    /Applications/Xcode14.app/Contents/MacOS/Xcode

  8. Run the Command: Press Enter to run the command.

  9. You should now have access to Reality Composer in Xcode. Click on the Xcode menu in the menu bar, then click Open Developer Tool, and then click on Reality Composer.

    Learn more about using Reality Composer here: https://developer.apple.com/documentation/realitykit/realitykit-reality-composer
    Learn more about Apple Reality Kit and ARKit here: https://developer.apple.com/augmented-reality/tools/

BONUS: Generative AI and 3D

Tripo AI (https://www.tripo3d.ai/app) is an advanced generative AI tool that allows for both text-to-3D and image-to-3D model generation. This tool offers users an intuitive way to create complex 3D assets with minimal manual input, simply by describing what they need or providing a reference image.

Key features:

  • Text-to-3D and Image-to-3D Conversion: Users can input a detailed description or upload an image, and within seconds, the AI generates a draft model ready for refinement.

  • Prompt: "A pineapple-hedgehog with spiky fruit armor and leafy quills."

    https://tripo3d.ai/preview?share=9a57357e-6262-469c-afb1-c7af74d92c93

  • Prompt: "A 1980s sci-fi robot stylized as a Nintendo NES product."

    https://tripo3d.ai/preview?share=a08a55cd-9e66-48a5-be3d-85a26160e461

  • High-Speed Generation: Tripo’s AI processes are optimized for efficiency, allowing users to generate detailed models in a matter of seconds, ideal for prototyping or quick visualizations.

  • Customization Tools: After generating a model, users can adjust topology for increased details, or apply stylization, such as voxels.

  • Seamless Integration: Tripo3D supports a variety of export formats like .usdz .obj and .fbx, making it easy to import models into Blender and other software for further editing.

  • Generate full texture maps with PBRs: includes generation of PBR textures, adding even greater details beyond the geometry.

  • Automatic rigging and basic animations: Applies a basic animation rig to generated models and simple animations, such as a running character, to the model geometry.

Downsides:

  • Imprecise generation: just like AI image generators, results are unpredictable and often wrong.

  • Costs: Using this tool requires a membership plan with limited monthly credits, which restricts heavy usage.

CREDITS:

Thanks to all of these wonderful educators and content creators who continue to inform and inspire me throughout my 3D journey. Preparing this lecture required lots of time and consideration for how to condense what I’ve learned over the last five years into something I could demonstrate in under 2 hours. This wasn’t easy, but I had many fantastic resources to pull from.
If I’ve left anyone out, please leave a comment so I can include them here:

YouTube Creators:

Reference Files:

Robot model created with Tripo AI

Robot model with corrected orientation

Reality Composer demo file

Interactive USDZ demo file

Note: Due to a bug, the robot walking animation doesn’t playback in QuickLook AR for iOS.

HAVE QUESTIONS? ASK PHIL

Have questions about CAD, Fusion 360, or the Portland maker scene? Ask Phil! He’s a Principal Software Engineer at Autodesk, Inc. and teaches CAD at Portland Community College. He’s also the host of the Community Conversations series: Getting Started with 3D Modeling in Fusion 360

You can reach him at phil.eichmiller@autodesk.com

Phil Eichmiller — Principal Software Engineer at Autodesk, Inc.

TUTORIAL: How to use ultra realistic Quixel Mixer materials with Fusion 360 [Part 2]

Welcome back! In Part 2, we’ll explore adding Quixel Materials to your designs in Fusion 360 and setting up a rendering scene. If you haven’t already, review Part 1 and install Quixel Mixer. You’ll want to create and export a mix for use in Fusion 360 prior to the steps in this tutorial, or download an example material set here.

First, let’s create a new project in Fusion 360:

  1. Create a new Fusion 360 Project

After you open Fusion 360, Click “Save” and give your project a name. In this example I used “QuixelMaterialDemo.”

After you save your project, we’ll want to create a new component and make it active.

2. Create a new Component

This is generally a good practice with Fusion 360, because we can more easily manage changes made to the design when the timeline is broken up by individual component histories. Name your component “Floor” and then make sure “Activate” is selected (should be by default), click “OK” to continue.

Next, we’ll want to create a sketch to define the floor’s dimensions. Click “Create” and make a Center Rectangle on the bottom plane.

3. Create a Floor

Make your sketch 3 meters x 3 meters in size, with the Origin at the center. Click “Finish Sketch” to continue. If you’ve done everything right, then you should have a sketch that is fully constrained (i.e., you’ll see black lines instead of blue lines for the outer dimensions of your sketch).

Next, we’ll extrude the sketch below the plane. This will create a new body, based on our sketch dimensions.

Click Create and then Extrude. Then, extrude the sketch -1mm below the plane and click “OK.”

Next, Save the design. You’ve created your first body and now would be a good time to save your progress.

Note the reason for your save and Click “OK.”

Next, we’ll want to change the Appearance of our floor. Click Modify Appearance to bring up the Appearance Window.

4. Add material

Here we can see the default material for the Floor body. We’ll want to replace that material with our Quixel Mix. To do that, let’s start by downloading a similar material.

Note: in general, you’ll find it is easier to add Quixel Mixer materials when you adapt an existing material in Fusion 360 with similar attributes. In this case, we can use the existing Asphalt Material.

After the download finishes, click and drag the Asphalt material into your design.

We can then replace the default material with the Asphalt.

5. Replace Fusion 360 Material with Quixel Mix

Next, we can begin modifying the Fusion 360 Asphalt material with the Quixel Mix.

As mentioned in Part I, the materials in Fusion 360 are made up of individual map image files:

Albedo/Diffusion/Color — the color a material reflects

Normal and/or Height Maps — the bumps and imperfections along a surface

Roughness — the smoothness of a surface (ranging from a sharp reflection to fuzzy/diffuse)

Reflectance/Specular/Metalness — the reflectiveness of a surface (ranging from mirror finish to a dull surface)

Ambient Occlusion — the soft shadows along a surface where geometry blocks ambient light

Refractive — how light bends through a surface

Emissive — how much light a surface emits (glow)

Translucency/Opacity — how transparent a surface is to light

If you’re using the included sample images, you’ll find some but not all of these maps. Depending on what materials you’re mixing, you’ll need different image maps. The sample image package includes:

Floor_Diffuse.png — Color (placed in Parameters)

Floor_Roughness.png — Roughness (placed in Parameters)

Floor_Specular.png — Reflectance (placed in Parameters)

Floor_Normal.png — Normal (placed in Relief Pattern (Bump))

Floor_AO.png — Ambient Occlusion (placed in Advanced Highlight Controls)

By replacing and adding these map files to the Fusion 360 Asphalt material, you can transform it to the Quixel mix. To start this replacement process, open the Appearance window, double-click the Asphalt material and then click “Advanced…”

Rename the material to “Quixel_Asphalt” to distinguish the material from the original Fusion 360 Asphalt.

Under Parameters, we can add three (3) image maps. First, we’ll apply the diffusion/color map to the Image input in Fusion 360. Click on the Image filename 1_mats_surface_asphalt_color.jpg and navigate to your replacement images.

Select your Albedo/Color/Diffuse map file. If you’re using the sample images, it’s the file named Floor_Diffuse.png. Click Open to replace the default image file.

Next, we’ll repeat the process with the Reflectance and Roughness maps. By default, these two material attributes are set as Slider values, click the drop down arrow and then select Image to replace the slider value with an image map.

Next, select the Metallic/Specular image map if you’re using the sample images, select Floor_Specular.png and click Open.

Next, repeat the same steps for the Roughness value. Select Image and then select your Roughness Map. If you’re using the sample images, select the Floor_Roughness.png.

Now that we’ve completed the three Parameter maps, we can move on to the Relief Pattern (Bump) map. Once again, we’ll replace the default image file (1_mats_surface_asphalt_.jpg) associated with the material. Note: Fusion 360 supports both bump and normal maps. If you want to know more about these two approaches to texturing a 3D model, then click here.

Next, we need to change the Relief Pattern from a Height Map to a Normal Map. To do this, we need to Edit the image.

Next, scroll down to Advanced and change Data Type to Normal Map.

Next, we need to ensure that all of our maps use the same Sample Size, and that all of our maps have Linked texture transforms: check Link texture transforms under the Transforms section of the Texture Editor. Be sure to repeat both of these steps for every image map.

These steps are important, because they ensure that all of the image map data are aligned equally to the material in Fusion 360. After you’ve verified these settings, you can click “OK” to finalize the changes to this material.

Now that the material has been updated you can Close the Appearances window.

To check and validate our new material, we need to switch to the Render Workspace in Fusion 360. Click on the Workspace button, and change it from DESIGN to RENDER.

6. Test render scene

Next, let’s save the design to capture the new material settings in your Fusion 360 Timeline. Click File and Save.

Fusion 360 will prompt you to describe your save point. Let’s name this save “Quixel Material Added” and click OK.

Before we can test our new material, we need to edit the SCENE SETTINGS from the SETUP Menu. Open the SCENE SETTING Window and Click+Drag “Dry lake bed” to the Current Environment and then Click Close.

We also need to change the IN-CANVAS RENDER settings to FAST, so that we can easily see the material’s performance during rendering. To do this, click on the IN-CANVAS RENDER SETTINGS icon and Click on the Fast tab. Then, Click OK to update the rendering method.

Next, we can preview the rendering, and see how the various maps work together under different lighting conditions. To do this, start the In-Canvas Rendering and then open Scene Settings, click on the Position Icon to bring up the Rotation and Scale Sliders. By changing the rotation, you can see how the surface of your floor object casts shadows at different angles, corresponding to the surface material.

Make sure to save your project to retain your rendering settings. If you’ve made it this far, then congratulations! You now have all of the information necessary to import Quixel Mixer materials in Fusion 360. In Part 3, we’ll explore some techniques for applying these materials to complex geometries, and how to post-process your images for additional realism. In Part 4, we’ll take these realistic models and generate Augmented Reality experiences for iOS.

Stay tuned!

Core77 Design Awards 2022

I’m very pleased to see Stuart Candy’s project “Imagination is a Commons” is the winner for Core77’s award for Speculative Design, 2022. Back in March of 2021, I received a somewhat unusual paid request — for studio photography services. One year into the COVID-19 pandemic, when vaccines were still out of reach for many, and facilities and institutions remained shuttered, I was suffering from cabin fever and isolation, and grieving the death of my uncle.

I was finishing up my second Master’s degree, and Pittsburgh had only begun to thaw after a long and difficult winter. Without access to the campus photography studio, Stuart had reached out to his network at Carnegie Mellon, seeking alternatives. As luck would have it, during my undergraduate studies I invested in my own studio photography setup.

My barebones digital photography setup

Scrappy resilience was a constant theme throughout 2020. Students without studio space were constantly finding ways to make do. This was one of those rare moments where few compromises were necessary, and I had everything I needed on hand. Imagine my surprise when I was handed a bag full of artifacts from the future…

T-shirts from a coding festival in the year 2030

Social distancing and staying home (for the better part of a year) had distorted my sense of time. In the first year of the pandemic, there were days and weeks that seemed to vaporize, and weekends that lasted a month. To hold these artifacts, and to focus on them through a viewfinder, I felt as though I had stepped completely outside of time and space. This was a perfect diversion from my mundane existence, and a reminder that this too shall pass.

Thank you, Stuart and Ceda. And congratulations!

Scripting in Blender

For folks who use Blender to create animated characters for Unreal Engine, you might find the process of creating a rig fairly tedious. You may also discover that some constraints in Blender cause problems with your exported skeleton in Unreal.

If you need to sanitize your bone constraints, this work can be labor intensive if done manually.

Here’s a simple script that will automate that process. (shoutout to manuDesk for providing this)

import bpy

# Remove every constraint whose name contains 'Constraint Name'
# from the selected pose bones. Iterate over a copy of the constraint
# list, since removing items from the live collection mid-loop can
# skip entries.
for bone in bpy.context.selected_pose_bones:
    for c in list(bone.constraints):
        if 'Constraint Name' in c.name:
            bone.constraints.remove(c)  # Remove constraint

Replace “Constraint Name” with the name of the constraint you wish to remove from your rig, and then run the script. This could save you hours of work.

Suppose you need to copy constraints between a control rig and a deform rig. CGDrive has provided an entire video tutorial on this process. Below, is the code used in their script. Watch the video for a basic workflow, as there is some preparation necessary for this script to work.

import bpy

sel = bpy.context.selected_pose_bones
for bone in sel:
    name = bone.name
    bpy.context.object.pose.bones[name].constraints["Copy Transforms"].subtarget = name

Summer Internship: mid-July Update

It’s been nearly a full month since my last update, so what have I been working on?

The last few weeks have been incredibly busy. I’ve had several interviews and job application submissions. My gallery and design pages received some helpful updates. I’ve successfully built a standardized data format for a visualization project and introduced some of my fellow interns to the history and purpose of data visualization, and its role in matters of social justice and equality. I’ve also continued to make leaps and bounds on my 3D character work.

I’ve resolved the primary issue that was plaguing me early on in this project: mesh topology.

To be efficient for a game engine, it’s important to resolve mesh geometry with minimal faces. Unreal Engine 5 supports a new level-of-detail system called “Nanite,” which allows 3D modelers to create elements with a virtually unlimited number of faces (geometric detail). It does not seem to work with animated characters, however; if you know a way around this, please send me a message!

A “face,” in this context, refers to a plane with 3 (tris) or more points. Ideally, the geometry should have faces with 4 points (quads). While modern engines and graphics hardware have advanced significantly since the early days of 3D gaming, it’s still important to avoid models with excessive polycounts (the total number of faces in a model).
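To put rough numbers on that (the helper below is my own illustration, not part of any engine’s API): renderers triangulate each 4-point quad into two triangles, so a quad-built mesh roughly doubles its face count at render time.

```python
# Hypothetical helper (not an engine API): estimate the triangle count a
# renderer will actually draw, given a model's quad and tri face counts.
def rendered_tri_count(quads: int, tris: int) -> int:
    # Each quad is triangulated into two triangles at render time.
    return quads * 2 + tris

# A mesh of 5,000 quads and 200 tris is drawn as 10,200 triangles.
print(rendered_tri_count(5000, 200))
```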


The challenge with my model is that it requires several tangent geometries that intersect with a central body. Imagine you were modeling a tree character, and you needed to cover the various branches with leaves, blossoms, or pine needles. Blender can easily generate these elements using weighted vertex mapping and particle systems (hair). The difficulty, however, comes from trying to export this data for use in Unreal Engine.


Unreal Engine supports FBX models for import, but FBX does not support shape keys, and by default Blender treats the Particle System modifier as a shape key on export. You can, however, convert particle instances into their own individual mesh bodies in Blender prior to export. This works, but Unreal will interpret the model as several individual bodies with shared coordinates, and animation rigs cannot deform those free meshes: your leaves (or whatever) will stay in place while the rest of the tree animates.

You may already be thinking to yourself, “why not just combine the meshes?” And since the beginning of this project, I anticipated that this might be necessary at some point. The challenge then is how to create a combined mesh that is free of visual artifacts. Additionally, the boolean modifier in Blender only allows you to select a single target body to combine (union).

Thankfully, there is a Blender add-on for this last issue: BoolTool.

Suppose you want to combine all of your individual leaf meshes to the main tree body, but you know it would be insanely time consuming to go through the tedious sequence of selecting each leaf, one at a time, applying the modifier, creating a new modifier, and repeating that x700.

You could try writing a script to automate this process, but you’d need to write it in Python, and that would be a can of worms unto itself. You can already imagine the mission creep setting in…
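For the curious, here is a minimal sketch of how such a script might begin, assuming the main body is the active object and every leaf mesh is selected. Treat it as an assumption-laden sketch, not a production tool; the import is guarded so the pure naming helper can run outside Blender.

```python
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

def union_modifier_names(target_name, mesh_names):
    """Pure helper: one Boolean-modifier name per mesh to be unioned."""
    return ["Union_" + n for n in mesh_names if n != target_name]

if bpy is not None:
    target = bpy.context.active_object  # the main tree body
    leaves = [o for o in bpy.context.selected_objects if o is not target]
    for leaf in leaves:
        # Add a Boolean (Union) modifier on the target aimed at this mesh,
        # apply it, then delete the now-merged leaf object.
        mod = target.modifiers.new(name="Union_" + leaf.name, type='BOOLEAN')
        mod.operation = 'UNION'
        mod.object = leaf
        bpy.ops.object.modifier_apply(modifier=mod.name)
        bpy.data.objects.remove(leaf)
```

Either way, BoolTool wraps this whole dance into a single menu action, which is why I reached for it instead.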

BoolTool makes this process dead simple. You just select the mesh body you want everything to join with (active object), and then select all of those other meshes. Go to the BoolTool menu and select “union.” You may think that Blender has crashed, but give your computer a break. Grab a cup of coffee, meditate, embrace the here-and-now… *BING* [Your mesh combine operation is complete!]


Assuming you created an efficient mesh, primarily out of quad faces, with intersecting meshes containing the same number of vertices, then you should have a single mesh geometry that is ready for rigging and animation.

I’ve tried dozens of other approaches so far, and this method produces the best results all-around: faster, easier to execute, high fidelity, and with the lowest poly count.

I’m still working out the kinks on my current model, and the animation sequences will be a next-level creative challenge for me (I have a lot of inspiration and ambition on this front), but I feel confident in the current direction, and early tests have been quite promising.

I’m also starting to make simple prototypes for augmented reality, and will add some notes about that later this month.

Week 3 and 4 Update: 3D content migration woes

This update is coming in late, as I’ve been trying to come up with some way to explain the difficulties I’m facing with this project while also respecting the client’s privacy and IP. I’m leaning heavily on metaphor, but…

Imagine there are three people sitting at a bar. One of them speaks Dutch, English, and German (but only okay German, not great). Another speaks Spanish, some English (okay-ish grammar, few nouns), and fluent German. The last person at the bar is a little bit unusual. They’re a rapper from Japan; they speak fluent Japanese and a little English. As a child they were an exchange student in Germany, but they’ve forgotten most of what they learned. The rapper also insists on speaking-as-quickly-as-their-mouth-can-run. They never slow down. Speed is everything.

They all have some overlap, however imperfect, in their spoken languages, but none can understand each other perfectly. This is what it feels like to develop assets for a still-in-beta realtime engine, while leveraging a parametric modeler, an open source 3D creator tool, and adhering to standards from a professional VFX and CGI-specific 3D tool.

This week my primary focus has been on getting Blender and Unreal Engine to talk to each other. Unreal prefers the FBX file format (Autodesk’s interchange format, native to tools like Maya). Blender can export most data in this format, but there are a few catches:

  • No shape keys

  • Subdivision data is limited to a single iteration

  • Dynamic animation can only be exported with baked physics, and is limited to vertex and face transformations (kind of, depending on what you’re doing).

Additionally, Unreal doesn’t quite understand Blender’s material system. Blender will still export textures and UV map data, but you’ll need to rebuild the material blueprint in Unreal to recreate whatever you had in Blender. It is far from being a 1:1 exchange.

There are also many weird quirks:

In Blender, the default unit is meters; in Unreal, it is centimeters. Before exporting from Blender, you need to set the Unit Scale to 0.01. If you switch Blender’s units to centimeters but leave the Unit Scale at the default of 1.0, you’ll experience strange anomalies with things like collider bodies, skeletal meshes, etc.
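To make the mismatch concrete (a toy sketch of the arithmetic, not Blender’s actual exporter code):

```python
# Blender's default unit is the meter; Unreal's is the centimeter.
# A correctly scaled export therefore multiplies lengths by 100.
def blender_meters_to_unreal_units(length_m: float) -> float:
    return length_m * 100.0  # 1 m == 100 cm == 100 Unreal units

# A 1.8 m tall character should arrive in Unreal about 180 units tall.
print(blender_meters_to_unreal_units(1.8))
```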

When naming rigging elements (IK skeletons, etc.), DO NOT name any bone “root,” because Unreal will assume a hierarchy that may differ from your hierarchy (parenting) in Blender. You may, however, rename the Armature container to “root” to conform to Unreal’s hierarchy.

Lastly, the only rigging data that reliably translates between Blender and Unreal Engine are deform bones and baked animations/transformations.

I’ve reached a stumbling block with my current 3D character. I can rig the character to animate, and even output that data in a manner which Unreal Engine can interpret. This comes at the expense of a vital visual element that was procedurally generated. What comes next is a difficult choice:

I can either integrate procedurally generated elements into a single mesh geometry (potentially compromising some rendering performance and visual fidelity) but without having to rework existing animation and rigging, or I can attempt to recreate the procedural mesh instancing I developed inside Blender but natively within Unreal. The former will be labor intensive, but I understand the tools well enough to work consistently toward a known output. The latter involves many unknowns, but I will also gain a deeper understanding of workflows within Unreal Engine. I’m attracted to this latter option, but I don’t know if it is best for the client and their expectations.

Week 2: Summer Internship

This week I continued working on 3D asset creation. My basic approach so far has been to start with a simplified geometry from Fusion 360, export that design as an .FBX file (Autodesk’s interchange format), then import the FBX into Blender for UV mapping, materials, and motion rigging. There’s probably a more streamlined way to generate this content, but from a feasibility standpoint, this approach allows me to stay flexible and use different tools for discrete tasks. This week I will be importing these combined assets into Unreal Engine.

This week was also my final week of the term at PCC, where I have enrolled in their online course for Advanced Fusion 360. I’ve been working on a group project, designing assemblies for use in a solar projector system. The design is based on COTS (commercial off-the-shelf) parts, which required me to draft profiles to meet engineering specifications.

Picatinny rail specification downloaded from wiki-commons.


The final deliverables are due this coming Saturday, and there is still a good bit of work to be done before we get graded on this project. Nevertheless, I am very pleased with the current state of things. I’ve been using Quixel Mixer to produce more realistic rendering materials than the library included with Fusion 360. I say “more” realistic because Fusion 360 already has some excellent materials. Take a look at this rendering of a Bushnell 10x42 monocular (one of the components in this project):

Bushnell Legend 10x42 Ultra HD Tactical Monocular rendering

I haven’t yet added any details, but as you can see, the rubberized exterior and textured plastic hardware are fairly convincing. Now, take a look at the mounting hardware rendered with Quixel textures:

Picatinny rail bracket rendering

An important component in photorealism is the inclusion of flaws. Real-life objects are never perfectly clean, perfectly smooth, or with perfect edges. Surface defects, dirt, scratches, and optical effects play an important role in tricking the eye into believing a rendering. With Quixel Mixer, it is possible to quickly generate customized materials. While this product is intended for use with Unreal Engine and other real-time applications, it does an amazing job when coupled with a physically based renderer.

Picatinny rail set with hardware and bracket.


I’m excited to see what can be done with these materials in a real-time engine, especially given the advanced features of Unreal Engine 5. Fusion 360’s rendering is CPU driven, whereas Unreal is GPU accelerated. With both Nvidia and AMD now selling GPUs with built-in raytracing support, it won’t be long before we see applications that offer photorealistic rendering simultaneously within modeling workflows.

Additionally, GPUs work extremely well as massively parallel computing units, ideal for physical simulations. This opens up all kinds of possibilities for real-time simulated stress testing and destructive testing. It wasn’t that long ago that ASCI Red was the pinnacle of physical simulation via supercomputer. Today, comparable systems can be purchased for less than $2,000.

Of course, this price assumes you can buy the hardware retail. The current chip shortage has inflated prices more than 200% above MSRP. Fortunately, with crypto markets in decline and businesses reopening as vaccination rates exceed 50% in some regions, there are rays of hope for raytracing-capable hardware being in hand soon.

Week 2 mini-update: outputting video from Unreal Engine 5

Usually, I only update my blog on Sunday nights — I like to reflect after the week is done and I’ve had a full dose of daylight to consider what matters.

I’m breaking that rule because I’ve learned something that I think might be useful to others. Last week, Epic Games released a preview of Unreal Engine 5. If you haven’t looked at this tech, it’s worth your attention. We’re rapidly approaching a point where individual creatives (equipped with modern hardware) will be capable of producing photorealistic graphics in realtime. This is due to a convergence of procedurally-generated content, open libraries providing physically based materials, templates, and raytracing technology.

I’m a huge advocate for 3D technology. Being able to show something instead of telling it is huge. Consider all of the times in your life that you had an idea, something that you could clearly, vividly see inside your mind, but found difficult or impossible to describe. What if you had the tools to quickly take that idea and represent it visually, with no limits on fidelity or realism? These tools exist, and they are getting better every day. Additionally, many of these tools are free and have a wealth of community-led learning and support.

Today I was asked to come up with a way to capture video from Unreal, and I discovered a great way to do it in Unreal Engine 5. Here’s how!

Geigertron’s Very Practical Guide to Exporting Video From Unreal Engine 5

For this example, I’m using a sample project based on MetaHuman Creator. You can download the UE5 preview yourself; it’s all free!

1) After opening the sample project, click on the clapperboard icon (“Cinematics”).

2) In this example, there’s already a sequence (MetaHumanSample_Sequence), so we’ll select that. To learn more about creating a cinematic sequence, click here.


3) Within the sequencer window, there is another clapperboard icon on the top row menu. Click this to open the “Render Movie Settings” window.

4) From the Render Movie Settings window, you can select the output format for image and audio, resolution, compression, and output file location. After setting up, click the “Capture Movie” button in the lower right and wait for UE5 to finish rendering and saving your output file.


This operation completed in near realtime and the output is pristine. If your scene contains audio, then you’ll need to merge/combine it with the video output. I did this in After Effects, but other programs would probably work equally well.

Week 1: Summer internship

This week I kicked off an internship with Astrea Media. We began with a general meeting via Zoom, with each intern and staff member introducing themselves. I’ll be generating some 3D content for an exciting new project, and so far the work is going well. This is not my first experience working with a realtime engine; I’ve done some work in Unity, but never to a polished degree. Our team is working with similar technologies, and I’m eager to learn more about this process.

This week I began optimizations for an existing 3D model. I discovered several hindrances to efficient rendering: an absurdly high poly count, textures in needlessly large image formats, non-solid mesh geometry, etc. Rather than attempting to rework this model, I decided to explore a new design. Due to the organic shapes of this asset, I decided to use Fusion 360’s Sculpt workspace. The Sculpt workspace enables designers to create complex shapes with smooth surfaces (e.g., car bodies). There are a few trade-offs to this approach:

  • Non-parametric design

    • Sculpt bodies are based on t-splines (I’ll say more about this later), and do not offer design history/timeline functions

  • Symmetry functions

  • Rapid mesh generation

  • Efficient generation of organic shapes

  • T-splines

    • 3D modeling is achieved through a variety of mathematical models for defining shapes. T-splines allow for the creation of freeform surfaces that are defined by a matrix of control points. While meshes tend to be defined by triangles, t-splines work best when all faces are defined by 4 vertices (T-shapes and rectangles).

This last point is important.


This sphere seems like a perfectly logical shape. Indeed, if you click “FINISH FORM” in the upper right corner, Fusion will compute this body and create a solid. Seems simple, right? There’s just one problem: the top of that sphere contains faces with only 3 vertices. As mentioned earlier, t-splines work best when faces are made of 4 vertices. The sphere computes just fine, but as soon as you begin to manipulate this shape, there’s a very good chance that all of those converging 3-point faces at the top and bottom will begin overlapping each other. For example, see what happens when I attempt to apply symmetry to this shape:


Instead of maintaining the converging vertices, the solver calculated something like the iris on a camera.

To avoid problems like this, there’s another option: quadballs.


As you can see, this sphere doesn’t have the same aggressive converging vertices as the other model. The advantage here is that each face can also be split diagonally, efficiently creating triangle mesh faces with minimal distortion.
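As a toy illustration of that diagonal split (this is the generic fan triangulation with made-up vertex indices, not Fusion’s actual mesher):

```python
# Split a quad face, given as four vertex indices in winding order,
# into the two triangles that share one diagonal (a-c).
def split_quad(quad):
    a, b, c, d = quad
    return [(a, b, c), (a, c, d)]

print(split_quad((0, 1, 2, 3)))  # two tris sharing the 0-2 diagonal
```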


When exporting this geometry for use in a realtime engine, the mesh conversion produces a high-fidelity representation of the t-spline body, preserving shape details.

Week 15: Final Project Update

This will be my final update for the Studio II project. I feel a complex blend of emotions as I write this. I am relieved to be done. I am also sad to know that my time with this team has come to an end. I consider myself incredibly lucky to have spent so much time working with some truly amazing designers. I don’t know if I will ever experience anything like this again, but I hope so.

Remote collaboration has few perks, and I was lucky to be working with folks who helped to make this experience so much fun


The work we have done this week feels different for many reasons. We had to prepare something for a large and diverse audience, not all of whom were familiar with the context of our work. We also needed to use this time to tie up remaining loose ends: we needed to reach an end state where our process could feel somewhat conclusive.

Our efforts were just as collaborative as ever as we divided up the labor of our remaining tasks. It was incredibly reassuring to know my team members’ strengths and capabilities, and to know who was working on each task. For my part, I was busy scrubbing through a timeline in After Effects, rapidly assembling visual representations and edited footage to make a convincing newscast from the future. Considering the constraints of remote collaboration, I’m very pleased with the final product.

I have continued to ruminate over this notion that the future is something we cannot predict, but rather something we build through imperfect knowledge. I question the power our team has to influence this process, not because I lack confidence in our shared abilities (as I’ve said earlier and often, I’ve been working with an amazing team) but out of concern for the consequences of inspiration. Our process was far from perfect. The vagaries of a pandemic distorted every effort. The educators we sought to connect with were terribly busy. Our own team suffered from fatigue and sleeplessness as we juggled future careers and other academic expectations. The complexity of this topic is well beyond the scope of fifteen weeks of diligent inquiry.

I cannot speak for the entire team, but I know that for me personally our exploratory research was the most intimidating phase. It was immediately clear that we were engaging in a very difficult problem. Education intersects with so many other areas of study. It is a problem of policy, culture, funding, methodologies, and it is weighed down by a history of systemic inequality and racism. Generative research methods were the biggest surprise. I was astonished by what could be gleaned through a participatory process. Including educators in the generation of concepts was exciting, and I wish we had more time to engage in this work.

Our final concepts are a reflection of many perspectives and early prototypes generated by K-12 educators


Every phase also felt too short. We needed to move on before we could fully digest what we were learning. Nevertheless, I stand behind the work we have done, because I know it represents the best we had to offer. I’ve known that design is a messy process since long before my time at CMU, but I now have a much clearer sense of what it means to engage with that mess and to assemble something coherent. This work is not easy, and it is never, ever truly complete. The deadlines of a design project function like the layers of silt in a fossil record. Each stratum represents a progression with no clear beginning or ending. We can always dig deeper.

I hope these artifacts will inspire others as they have inspired us.


Our team has assembled a project homepage. There you will find more comprehensive information about this work, the final outcome and documentation. Check it out!

Week 14 update: The Late Edition

The final push is now upon us. This past week I’ve been working nearly around the clock with my team, pushing to bring about our future vision. One of the most labor intensive, yet rewarding parts of this project has been the production of a newscast from the future. We’ve made countless script revisions, scraped stock images, sound, footage, and crafted motion graphics elements to bring this story to life. It’s been challenging, but I’m excited to see the final results.

What’s working: our approach to generating a video is deeply grounded in research. We’re incorporating concepts generated with participants — public educators who so generously gave us their time and perspectives on the present and future state of teaching in American schools. We’re also building our story to represent several systems-level shifts, including national legislation, teachers union contracts, and individual school reforms. We used several different futuring frameworks to develop these narratives, including: cone of possibility, backcasting, STEEP+V, Multilevel Perspective mapping, affinity mapping, and worldview filters.


This process has been anything but precise. The future is something we build, not something we predict through careful measurements of trends. Understanding this truth has been very reassuring. Now that we are approaching a conclusion, I feel as though I have been on a long drive through undeveloped territory. The daylight of exploratory research gave way to the twilight of generative research and in the pitch of night we evaluated concepts. With only one headlight, we squinted off into the distance, to read the signs. Sometimes the precipitation of a pandemic obscured everything, but we relished the intermittent moments of clarity.

Those moments of clarity were by far the most exciting. “Oh, oh, what if…” was a common preamble to productive yet heady conversations with peers over Zoom, as we scrambled together various visual representations in Miro and Figma.


This workflow has been essential to synthesizing content and a visual language for our video, which we’ve been iterating on through various stages of prototyping. I’m concerned about the overall fidelity and recognize that this will be important to suspension of disbelief for our intended audience — policymakers and various stakeholders connected to PPS must find this artifact compelling enough to act and bring these concepts into a shared reality.


On the technical side, video editing and motion graphics are computationally intensive tasks. I built a beefy workstation prior to starting at CMU, and this machine has been essential to so many tasks and assignments. Nevertheless, I’ve found that this work has strained my system’s capacity. I’ve purged files to make room for temporary caching and rendering outputs. I’ve reset my router in a desperate effort to speed up the transfer of data to Google Drive, and ran my system in a barebones state to maximize resources available to Adobe CC’s memory-hungry apps.

The stress I place upon the tools I use to design are complemented by the stress I’ve applied to myself. My sleep has been intermittent. I take short naps on the couch and found myself on more than one occasion this week working through the sounds of birds before the break of dawn. These late night hours are quiet and free of distraction, but tend to make the day that follows less than appealing. I’m staying awake through this last week of lectures, but finding my mind trailing off into thoughts about the timeline and how I might optimize frame rates for nominal render times. I’m obsessed with getting this video done, but know that this pace is not sustainable.

Week 13: Artifact Generation

We’ve begun to generate assets for our final artifacts. This should be an exciting time for us. For the last 13 weeks, we’ve been living and breathing the problem space. The future of Portland Public Schools is not a matter of fate; it is something that will be built — not only designed, but also transformed by external forces and deliberate interventions. This work and our team’s research are only one tiny piece of this larger unfolding process, and we cannot know what impact (if any) will come from what we have done.

On some level, I cannot help but feel a little bit sad as we conclude this work. I have a very real sense of the scope of this issue and understand that fifteen weeks cannot generate anything conclusive. Nevertheless, we must honor this process and the deliverable. There is an underlying contradiction in this work: what this project calls for is “bold humility.” We know that our research is not conclusive, but we also know that without bold presentation, we cannot inspire meaningful change or advance the greater vision set out by Prospect Studio.


Our primary concept is a news story about PPS holding their first ARC summit, and what it means for the future of Portland schools and teachers. We can use this medium to communicate the most salient details while glossing over the more bureaucratic aspects of our system level thinking. For secondary artifacts, we’re thinking about “swag” that is typical for a professional conference, as well as a custom logo for the ARC council.


I’m feeling a lot of pressure to resolve these artifacts to the highest fidelity possible. I know that the success of this project rests somewhat on our ability to persuade others, and we cannot know how this work will be interpreted if the artifacts are not convincing or feel too generic. I’m also worried that we have spent so much time working on the particulars that we haven’t given ourselves room for making these things.

I wish that we had a better sense of what is expected, and how craft will be factored into our grade. This is the first time that I’ve taken a studio class where nothing was made until the last two weeks. I expect that our team will be evaluated on the strength of our research and the clarity of our concepts, but as a studio class, I cannot shake this feeling that we should have been crafting prototypes along the way.


My hope for this week is that the momentum of making and the joy of purely creative pursuits will have a feedback effect to keep us motivated through this final push. I’m excited about the potential for the project even though we are still grappling with an incredibly high degree of uncertainty.