The Problem With “I Was There”
Your Lived Experience Is a Primary Source, Not a Verdict
One disorienting thing about being a historian in the year of our lord 2025 is that you will occasionally be informed, with great confidence, and an alarming amount of punctuation, that you are wrong about verifiable history because someone else “was there.”
I was reminded of this yesterday when I saw this tweet that someone had posted a screenshot of on Facebook (God, that sentence made me feel old):
Now, no offense to Brittany Wilson here. I get what she was saying and even agree with it to some degree. Some young whippersnapper (God, I feel even older) telling me what actually caused the 2008 housing market collapse or something else they weren’t even alive to see would be annoying as fuck.
However, as a historian I have to push back a little, because there are literally people who tell me they know more than I do simply because they lived through the mid-20th century. That they are right and I am wrong.
Not “I’ve read differently” or “that’s not how I remember it,” which are normal human sentences. But wrong. Incorrect. Invalid. Because they personally experienced an event, their memory is being offered not as evidence but as a winning hand. End of discussion. Pack it up, JSTOR. An octogenarian says it happened differently when he was five years old.
Here’s the issue: I actually love lived experience. Historians love lived experience. Diaries, letters, oral histories, memoirs, scrapbooks, home movies, gossip columns, marginalia, give me the messy humanity. That’s the good stuff. But in the historian’s toolbox, lived experience is not The Final Word. It’s a primary source. And primary sources are precious precisely because they are limited, partial, biased, situated, and honest in ways people don’t mean to be.
I literally wrote a whole article about it last week:
Your memory is real. Your experience matters. It is not, however, the whole historical record.
And lately, especially online, the phrase “I was there” has started functioning like a magic spell that turns one person’s perspective into universal truth. It’s not just disorienting. It’s how we get nostalgia myths, flattened histories, and “actually it wasn’t that bad” takes that only make sense if you ignore who it was bad for.
Which brings me to Paul Krugman.
A few years ago, he tweeted a take about 9/11 that I’m going to paraphrase rather than litigate word-for-word: essentially, that Americans took 9/11 “pretty calmly,” and that there wasn’t a mass outbreak of anti-Muslim sentiment and violence — which could have happened — and that to George W. Bush’s credit, he tried to calm prejudice rather than feed it.
And here’s the thing: you can see the shape of a truth in there. Bush did make public statements attempting to separate Islam from terrorism. The “Islam is peace” messaging existed, and plenty of people remember it.
But there’s a difference between a public statement and a public reality. And there is a difference between “it did not become a full-on genocidal pogrom” and “there wasn’t a mass outbreak of anti-Muslim sentiment and violence and a spike in hate crimes.”
Because there was.
There was harassment. There were hate crimes. There were workplace firings. There were mosque attacks. There was the sudden sense, for millions of people, that the air had changed: that you could be a citizen one day and a suspicious figure the next, that your name, your beard, your scarf, your skin tone, your last name, your lunch, your accent, your airport security experience could turn into a liability. There was violence against people who weren’t even Muslim, but were perceived to be. There were families who learned the hard way that “mistaken identity” doesn’t keep you safe when the country is looking for someone to blame.
If you remember 9/11 as a moment of national unity and calm, I’m not here to call you a liar. I am, however, here to say that what you’re describing might be your social location more than the era.
Because here’s what historians do that Twitter rarely encourages: we zoom out. We ask, “Calm for whom?” We ask, “Whose calm?” We ask, “What do the patterns show when you stop using one person’s memory as the measuring stick for a nation of hundreds of millions?”
And once you do that, the rose-colored version collapses.
Even if we’re generous and say the backlash wasn’t universally violent, it was absolutely widespread. It was measurable. It was documented. It was the kind of thing you can track through official data and through the non-official record: community reporting, advocacy groups, local news, oral histories, the kinds of stories that never make it into the memory of people who didn’t have to worry.
This is the danger of “I was there” as a flex: it confuses a perspective with a panorama. It treats history like one camera angle.
And if you think this is just about politics and punditry, let me introduce you to the most personal version of this phenomenon: the guy who tried to mansplain the 1950s to me.
We were arguing about the decade, not as a Pinterest board of gingham and gelatin molds, but as a lived social and political period shaped by Cold War paranoia, racial segregation, legal discrimination, domestic containment, gendered labor, and a relentless propaganda campaign selling “normal” as a moral obligation.
He informed me that my research was wrong, because his lived experience beat my extensive research.
His evidence was charming, almost cinematic: safe neighborhoods, friendly neighbors, kids riding bikes until dark, doors left unlocked, moms at home, dads with stable jobs, everybody respectful, everything orderly. In his telling, this wasn’t just his childhood. This proved the decade itself was fundamentally fine. Sure, there were “some problems,” but the way historians talk about the 1950s (racism, sexism, redlining, domestic violence, anti-communist repression, queer persecution) was “overblown.” It couldn’t possibly have been that bad because, in his memories, it wasn’t.
And in that moment, you can practically see the mechanism at work. The “I was there” argument isn’t really about evidence. It’s about authority. It’s about insisting that your memory should outrank someone else’s scholarship because it’s more emotionally immediate. It feels more “real.” It’s yours.
But… sir. You were in elementary school.
A child’s experience is not worthless; it’s actually incredibly revealing. But it reveals something different from what he wanted it to reveal. A kid in the 1950s is a near-perfect witness to the surface narrative of the decade: the curated story adults wanted children to live inside. The part that’s supposed to feel safe. The part designed to reproduce the world as it is by making it feel natural and good. Children are often protected from the things that would complicate their version of reality. That’s the whole point.
So when he tells me the 1950s were safe and stable because he remembers them that way, what I hear is: the system worked for the kind of family he was in. It worked well enough that he could be a child and stay a child. It worked well enough that the violence of the system was kept out of his line of sight. It worked well enough that he took the comfort of being sheltered as proof that the world was comfortable.
But the “safe neighborhood” memory depends on who was allowed to live there. The “stable job” memory depends on who had access to those jobs. The “mom at home” memory depends on whose labor was unpaid and treated as destiny. The “doors unlocked” memory depends on who the police were protecting, and who they were policing.
What he was describing wasn’t “the 1950s.” He was describing his family’s bubble inside the 1950s, a bubble many Americans remember fondly precisely because it was not available to everyone, and because it required a whole bunch of exclusions and silences to sustain.
This is why nostalgia is so powerful and so dangerous. Nostalgia often isn’t a memory of the past; it’s a memory of your position in the past. It’s a story about where you fit, who you felt protected by, and what you were allowed not to notice.
And that’s the part that makes being a historian feel like stepping through a funhouse mirror lately: the job is increasingly not “teach people things they don’t know.” It’s “gently remind people that their memories are not universal, and that history is bigger than what felt true within their own radius.”
So why do people cling to memory like it’s a notarized document?
Because memory isn’t just information. Memory is identity. It’s story. It’s the mental scrapbook where you keep the evidence of who you are, what you survived, what shaped you, and what you believe the world is allowed to be. When someone challenges your version of an event, it doesn’t feel like they’re disputing a detail. It can feel like they’re disputing you. And most people do not respond to that by calmly opening a spreadsheet.
We spent literally weeks discussing the importance of this fact when I taught Community History. The difference between memory and history is real.
There are a few overlapping forces at play here, and none of them require anyone to be malicious, just…human.
First: availability. The things you saw and felt most vividly are the things your brain can retrieve fastest, so your brain starts treating them as most important. If your post-9/11 experience was candlelight vigils and “United We Stand” bumper stickers, that’s what comes to mind when you think “America after 9/11.” If your 1950s experience was bikes and block parties, that becomes “the 1950s.” It’s not that your memories are fake; it’s that your mind mistakes what’s most accessible for what’s most representative.
Second: scale. People are terrible at it. We’re built to understand our immediate environment: family, neighborhood, workplace, maybe our city. That’s already a lot. But history is often about systems operating at scales your nervous system can’t intuit. Racism doesn’t always show up as a villain twirling his mustache on your front lawn. Sometimes it shows up as who got a mortgage and who didn’t. Who could enroll in which school. Who got surveilled. Who got hired. Who got believed. And if you didn’t live on the receiving end of those structures, or if you were protected from seeing them, your memories will naturally skew toward “everything seemed fine.”
Third: the emotional logic of nostalgia. Nostalgia is not an objective relationship with the past. It’s a coping mechanism. It’s the warm blanket you pull over your head when the present feels confusing, unstable, or humiliating. A lot of people are not actually nostalgic for a decade; they’re nostalgic for a time when they personally felt secure, needed, and in control, or when the world made sense because it was smaller and they had fewer responsibilities. So when you challenge their rosy version of that era, you’re not just challenging history. You’re trashing their comfort object, dragging their baby blanket through the mud. And they will snatch it back like you’re trying to take it away.
Fourth: moral self-defense. If someone believes “the 1950s were good,” they are often also defending a chain of comforting conclusions: that their parents were good, that their upbringing was good, that their society was good, that the rules made sense, that the people who benefited deserved it. The moment you introduce evidence that the decade was “good” for some by being brutal to others, you force a moral rearrangement. And that can feel like an accusation: Were you complicit? Were your parents complicit? Was your comfort bought at somebody else’s expense? Even if you never said it that way, people hear it. And it’s easier to call you dramatic than to sit with that discomfort.
Fifth: status and authority. “I was there” is also a power move. It’s a way to claim seniority, credibility, relevance. In a world where a lot of people feel ignored or disrespected, especially men who grew up expecting their opinions to be treated like a public service announcement, “lived experience” becomes a way to reassert dominance in the conversation. The fact that you have expertise, training, and evidence can actually make them more annoyed. Expertise isn’t received as useful; it’s received as a threat to their ranking.
And finally: the internet has trained us into courtroom mode. People don’t talk anymore to refine understanding; they talk to win. Everything is a debate, everything is a dunk, everything is “receipts,” and the easiest “receipt” is your own memory. It’s always available. You don’t need to read anything. You don’t need to sit with ambiguity. You can just say, “I was there,” and act like you have delivered the mic-drop of the century.
But here’s the historian’s problem: this turns genuine human experience into a bludgeon against complexity.
When “I was there” becomes “therefore I know,” you erase everyone whose “there” looked different. You erase people who were there but didn’t feel safe. You erase people who were there but weren’t believed. You erase people who were there but were too busy surviving to write op-eds about national mood. You erase people for whom the defining feature of the era was not candles and unity, but surveillance and suspicion; not bikes and block parties, but exclusion and control.
Which is why, as a historian, the disorientation isn’t just that people argue with you. People have always argued with historians. The disorientation is that they argue by insisting their single perspective should be treated as the whole record—because acknowledging the wider record would require them to admit: my memory is real, but it is partial.
Closing: the historian’s problem isn’t memory — it’s monopoly
Here’s what I wish people understood: historians are not telling you your memories are fake. We’re telling you they’re not the only evidence that matters.
If anything, historians take memory more seriously than most people do. We care about what people remember, how they remember it, what they forget, what they never saw, what they were taught to normalize, what they had to hide, what they couldn’t say out loud at the time. Memory is a treasure trove.
It’s just not a crown.
The disorienting part of this era is watching “lived experience,” a phrase that should invite humility and plural perspectives, get used as a cudgel to shut down inquiry. The internet has encouraged this weird shortcut where someone’s personal recollection becomes not a contribution to the record, but the record itself. As if being present in a time period automatically made you the historian of it. As if proximity equals accuracy. As if your childhood or your workplace or your suburb or your university campus can stand in for an entire nation.
But “I was there” doesn’t mean “I saw everything.” It means “I saw something.” That something matters. And the responsible thing to do with it is what historians do with every primary source: place it in context, compare it with other accounts, check it against broader evidence, and ask who is missing from the story.
Because the past is not a single perspective. It’s a chorus. And some voices were amplified on purpose while others were muted by force.
So when someone tells me their lived experience is better than my research, I hear what they’re really saying: I want my version of the past to be the comfortable one. I want it to be morally uncomplicated. I want it to confirm that the world I remember was basically just and decent, because if it wasn’t, I have to ask harder questions about what I benefited from and what I ignored.
Which, yes, is a deeply human impulse. But it is also exactly how we get myths that harm people.
The point isn’t to shame anyone out of their memories. The point is to stop treating memory like a veto.
History isn’t a single camera angle.
It’s the whole damn film, book, and graphic novel adaptation.