Mitigating ELUSIVE COMET Zoom remote control attacks

This post describes a sophisticated social engineering campaign using Zoom’s remote control feature and provides technical solutions to protect organizations against this attack vector.
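The mitigation the post describes centers on denying Zoom the macOS Accessibility permission that its remote control feature depends on (for example, through managed PPPC profiles). As a minimal sketch of that idea, and not the post's exact tooling, an admin could revoke any existing grant with the tccutil utility that ships with macOS; us.zoom.xos is Zoom's bundle identifier:

    import subprocess

    # Hedged sketch: reset Zoom's Accessibility grant on macOS. Remote
    # control requires this permission, so revoking it blocks remote input
    # until a user explicitly re-grants it.
    subprocess.run(
        ["tccutil", "reset", "Accessibility", "us.zoom.xos"],
        check=True,  # raise CalledProcessError if the reset fails
    )

A one-off reset like this is weaker than a managed profile, since users can simply re-grant the permission the next time Zoom asks.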

Cursor’s AI-powered tech support vibe-codes a customer revolt


Cursor is an AI-enhanced code editor. You click and an LLM auto-completes your code! It’s the platform of choice for “vibe coding,” where you get the AI to write your whole app for you. This has obvious and hilarious failure modes.

On Monday, Cursor started forcibly logging out users if they were logged in from multiple machines. Users contacted support, who said this was now expected behaviour: [Reddit, archive]

Cursor is designed to work with one device per subscription as a core security feature. To use Cursor on both your work and home machines, you’ll need a separate subscription for each device.

The users were outraged at being sandbagged like this. A Reddit thread was quickly removed by the moderators — who are employees of Cursor. [Reddit, archive, archive]

Cursor co-founder Michael Truell explained how this was all a mistake and Cursor had no such policy: [Reddit]

Unfortunately, this is an incorrect response from a front-line AI support bot.

Cursor support was an LLM! The bot answered with something shaped like a support response! It hallucinated a policy that didn’t exist!

Cursor’s outraged customers will forget all this by next week. It’s an app for people who somehow got a developer job but have no idea what they’re doing. They pay $8 million each month so a bot will code for them. [Bloomberg, archive]

Cursor exists to bag venture funding while the bagging is good — $175 million so far, with more on the way. None of this ever had to work. [Bloomberg, archive]


Funding Expires for Key Cyber Vulnerability Database


A critical resource that cybersecurity professionals worldwide rely on to identify, mitigate and fix security vulnerabilities in software and hardware is in danger of breaking down. The federally funded, non-profit research and development organization MITRE warned today that its contract to maintain the Common Vulnerabilities and Exposures (CVE) program — which is traditionally funded each year by the Department of Homeland Security — expires on April 16.

A letter from MITRE vice president Yosry Barsoum, warning that the funding for the CVE program will expire on April 16, 2025.

Tens of thousands of security flaws in software are found and reported every year, and these vulnerabilities are eventually assigned their own unique CVE tracking number (e.g. CVE-2024-43573, which is a Microsoft Windows bug that Redmond patched last year).

There are hundreds of organizations — known as CVE Numbering Authorities (CNAs) — that are authorized by MITRE to bestow these CVE numbers on newly reported flaws. Many of these CNAs are country and government-specific, or tied to individual software vendors or vulnerability disclosure platforms (a.k.a. bug bounty programs).

Put simply, MITRE is a critical, widely-used resource for centralizing and standardizing information on software vulnerabilities. That means the pipeline of information it supplies is plugged into an array of cybersecurity tools and services that help organizations identify and patch security holes — ideally before malware or malcontents can wriggle through them.
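To make that pipeline concrete, here is a hedged sketch of the kind of lookup those downstream tools perform, querying NIST's public National Vulnerability Database API (one consumer of the CVE list) for the Microsoft bug cited above. The endpoint and JSON shape are NVD's 2.0 API; no API key is needed for low-volume queries.

    import json
    import urllib.request

    # Look up a single record by CVE ID via NVD's REST API, which ingests
    # the MITRE-run CVE list and enriches it with severity scores.
    cve_id = "CVE-2024-43573"
    url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?cveId={cve_id}"

    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    record = data["vulnerabilities"][0]["cve"]
    description = next(
        d["value"] for d in record["descriptions"] if d["lang"] == "en"
    )
    print(record["id"], "-", description)

If new CVEs stop being assigned after April 16, lookups like this keep working for existing IDs, but newly discovered flaws would have no ID to query.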

“What the CVE lists really provide is a standardized way to describe the severity of that defect, and a centralized repository listing which versions of which products are defective and need to be updated,” said Matt Tait, chief operating officer of Corellium, a cybersecurity firm that sells phone-virtualization software for finding security flaws.

In a letter sent today to the CVE board, MITRE Vice President Yosry Barsoum warned that on April 16, 2025, “the current contracting pathway for MITRE to develop, operate and modernize CVE and several other related programs will expire.”

“If a break in service were to occur, we anticipate multiple impacts to CVE, including deterioration of national vulnerability databases and advisories, tool vendors, incident response operations, and all manner of critical infrastructure,” Barsoum wrote.

MITRE told KrebsOnSecurity the CVE website listing vulnerabilities will remain up after the funding expires, but that new CVEs won’t be added after April 16.

A representation of how a vulnerability becomes a CVE, and how that information is consumed. Image: James Berthoty, Latio Tech, via LinkedIn.

DHS officials did not immediately respond to a request for comment. The program is funded through DHS’s Cybersecurity & Infrastructure Security Agency (CISA), which is currently facing deep budget and staffing cuts by the Trump administration.

Former CISA Director Jen Easterly said the CVE program is a bit like the Dewey Decimal System, but for cybersecurity.

“It’s the global catalog that helps everyone—security teams, software vendors, researchers, governments—organize and talk about vulnerabilities using the same reference system,” Easterly said in a post on LinkedIn. “Without it, everyone is using a different catalog or no catalog at all, no one knows if they’re talking about the same problem, defenders waste precious time figuring out what’s wrong, and worst of all, threat actors take advantage of the confusion.”

John Hammond, principal security researcher at the managed security firm Huntress, told Reuters he swore out loud when he heard the news that CVE’s funding was in jeopardy, and that losing the CVE program would be like losing “the language and lingo we used to address problems in cybersecurity.”

“I really can’t help but think this is just going to hurt,” said Hammond, who posted a YouTube video to vent about the situation and alert others.

Several people close to the matter told KrebsOnSecurity this is not the first time the CVE program’s budget has been left in funding limbo until the last minute. Barsoum’s letter, which was apparently leaked, sounded a hopeful note, saying the government is making “considerable efforts to continue MITRE’s role in support of the program.”

Tait said that without the CVE program, risk managers inside companies would need to continuously monitor many other places for information about new vulnerabilities that may jeopardize the security of their IT networks. As a result, he said, software updates could be mis-prioritized more often, leaving companies running hackable software for longer than they otherwise would.

“Hopefully they will resolve this, but otherwise the list will rapidly fall out of date and stop being useful,” he said.


Is There Really an Autism Epidemic? Understanding the Rise in Diagnoses with Historical and Human Context


Let’s Start with a Conversation

If you’ve found yourself wondering what’s going on with the rising numbers of autism diagnoses — especially after hearing headlines about an “autism epidemic” — you’re not alone.

This is one of those topics that stirs up a lot for many in our community: confusion, fear, curiosity, even anger. And understandably so. For decades, autism was portrayed as something rare and tragic — something that stole children away from their families.

Now, many of us are discovering we’ve been Autistic all along — raising Autistic kids, working alongside Autistic colleagues, or slowly peeling back the layers of our own identities and realizing: Oh … this is me.

Many of us (though not all) embrace this as a valid identity — one we don’t feel needs to be cured. Which is why these narratives can understandably activate our threat response, crowd in on our sense of safety, and spark our activism energy.

I’m a big context person. I find it grounding to step back — to understand the historical moment, the social conditions, and the layered textures we’re all moving within. So that’s what I’ll offer here: some context. And maybe, through that, a bit of clarity around these questions that keep coming up.

Why Are Autism Diagnoses on the Rise?

So what changed?

Did something in the environment shift? Is autism becoming more common? Or are we simply getting better at recognizing something that’s always been here?

Before we dive into the data, I want to pause on something that often gets lost in these conversations:

Autism is not new. Recognition is.

And the way we talk about this matters — especially when public figures use phrases like “autism epidemic,” framing our existence as a crisis to be solved.

What if we told a different story?

Not one rooted in fear or urgency, but one grounded in context — in history, science, systems, and lived experience. That’s what I hope to offer here. A chance to zoom out, to slow the spiral of misinformation, and to look with clarity and care at what’s actually driving these numbers.

Because this isn’t just about data. It’s about people. And for many of us — including myself — it’s a story we’ve lived in our bodies long before it showed up on a chart.

So let’s begin.

Is There Really an Autism Epidemic?

Let’s address the elephant in the headline: RFK Jr. recently called autism a “catastrophic epidemic” and pledged to find its cause by September. This kind of language may grab attention, but it distorts reality — and often causes real harm along the way.

First, let’s be clear about the data. Yes, autism diagnoses have increased over the past few decades. Dramatically so. The CDC now estimates that 1 in 36 children in the U.S. are identified as Autistic. That’s a steep climb from the 1 in 2,500 estimate just a few generations ago (CDC, 2024).

Figure: “A Rapidly Shifting Landscape,” showing autism diagnoses in U.S. children rising steadily from 1 in 150 in 2000 to 1 in 36 in 2020. Source: CDC.

But numbers don’t tell the whole story.

This isn’t a case of autism suddenly spreading like a virus.  (We’re not in the middle of some special interest–ravaged pandemic.) It’s a case of finally putting the right name to an experience that’s always been here — but was previously misdiagnosed, misunderstood, or missed altogether.

Calling it an epidemic suggests contagion, pathology, a crisis to be solved. But what if what we’re actually seeing is a long-overdue correction? What if this “epidemic” is really a reckoning with how narrowly we’ve defined autism in the past?

The truth is, we’ve radically shifted the way we define, diagnose, and even think about autism — especially in the last 30 years. We’ve moved away from a rigid, deficit-based view and toward a broader, spectrum-based understanding that includes a wide range of experiences and support needs.

That shift matters. It means more people are getting identified — not because autism is suddenly appearing out of nowhere, but because our frameworks have changed. Our context has changed, and that’s reflected in the numbers we see. 

So when RFK Jr. frames this as a mystery to solve or a crisis to stop, it misses the point. Autism isn’t something we’re catching. It’s something we’re recognizing — often for the first time, in ourselves or in the people we love.

And that recognition?

For many of us, it isn’t catastrophic. It’s clarifying. It’s naming something that’s been there all along. It’s relief. It’s resonance. It’s the beginning of understanding.

And for some … it’s more complicated than that.

For some, discovering they — or their child — is Autistic brings grief, fear, or uncertainty alongside the insight. It can stir up questions about support, about belonging, about how life might need to change. The meaning of diagnosis often unfolds slowly, shaped by each person’s context, needs, and access to care.

So while many of us experience this recognition as a kind of homecoming, others may feel like they’ve just stepped into unfamiliar terrain. Both experiences are valid. And both deserve space in this conversation.

But back to history — and back to the story of autism.

Because to understand where we are now, we need to understand where we’ve been.

A Brief History of Autism

 Before autism had a name, it had always been here.

Because autism didn’t start with the DSM.

The traits we now recognize as Autistic — deep focus, sensory intensity, pattern recognition, social divergence — have been part of the human story for millennia. Long before diagnostic manuals. Long before “awareness months.”

Some researchers even trace these traits back to the Ice Age.

Cave art found across Europe — especially in the Chauvet and Lascaux caves in France, and Altamira in Spain — has led to speculation that these layered, detailed animal depictions may have been created by Autistic minds (Humphrey, 1998; Spikins, 2018). The kind of attention to detail, pattern, and movement we now associate with autism could have been a gift to early human communities: helping track animals, understand rhythms of nature, or express meaning beyond language.

Fast forward a few thousand years, and we begin to see these traits show up in written form. In ancient Mesopotamia, clay tablets described people with distinct communication styles, intense interests, and different social rhythms. Some were seen as wise or spiritually gifted. Others were cast out.

Which reminds us: neurodivergence has never been just about biology. It’s also about how culture makes meaning of difference.

Autism in the 20th Century

It wasn’t until the early 20th century that autism began to emerge as a medical category. The term itself was first coined by Swiss psychiatrist Eugen Bleuler in the early 1900s to describe a withdrawal into one’s inner world — what he observed in patients with schizophrenia. For decades, autism would remain entangled with schizophrenia.

It wasn’t until the 1940s that clinicians like Leo Kanner and Hans Asperger began describing autism as a distinct condition. Kanner called it “early infantile autism,” framing it as an inborn difference in emotional connection and communication. Asperger described “autistic psychopathy,” observing children with more fluent language but notable challenges in social reciprocity and focused interests.

Still, both of their early definitions were narrow — centered on white boys with visible, externalized traits and limited variation in communication or support needs. And because of that, many of us were never even considered.

And they weren’t the first.

Russian psychiatrist Grunya Sukhareva had published detailed case studies of autistic children nearly two decades earlier — describing sensory sensitivities, focused interests, and even gender differences in how autism presents. But her work was overlooked in Western research for most of the 20th century.

It feels fitting, in a bittersweet way, that one of the first to document autism was a woman — whose work, like that of so many Autistic people, remained hidden in plain sight.

That early framing — who got seen, who didn’t, and who got written out — shaped public perception for decades.

That’s how the stereotype took root.

But then it got more complicated.

Shortly after Kanner’s research took off, so did psychodynamic and family systems theory — schools of thought that viewed psychological difference through the lens of relational or emotional wounding. In that context, autism began to be seen not as a natural neurodevelopmental difference, but as a response to parental failure.

Kanner himself observed that many parents of Autistic children appeared emotionally distant, and in a 1949 paper, he speculated whether this detachment might be contributing to the child’s behaviors (Kanner, 1949). His observations weren’t meant as blame — but they laid the groundwork for what came next.

In the 1950s, psychoanalyst Bruno Bettelheim expanded on these ideas and popularized what became known as the “refrigerator mother” theory: the belief that autism was caused by cold, unloving mothers who failed to properly bond with their children (Waltz & Shattock, 2004).

And with that, the stigma deepened. It became shameful to have an Autistic child — because it suggested you were cold, withholding, or even to blame.

And I can’t help but wonder — how many of those mothers were undiagnosed Autistic themselves?

Not cold. Just different.

Offering love and connection in quieter, more private ways that didn’t match neurotypical expectations.

But in a world that couldn’t see those gestures for what they were, misunderstanding turned into myth.

That’s how the stereotype took root. And then how the stigma grew.

Infographic: “A Brief History of Autism,” a timeline from the 1940s through 2013. 1940s–1960s: autism viewed through the lens of schizophrenia and considered rare (1 in 25,000 children diagnosed). 1980s: the DSM adds “Autism Disorder,” separating it from schizophrenia and marking the start of modern autism research. 1990s: Asperger’s Syndrome is included in the DSM and ICD, allowing Autistic individuals without intellectual disabilities to be more easily recognized. 2013: the DSM-5 introduces Autism Spectrum Disorder as an umbrella term, removing Asperger’s and PDD-NOS and allowing ADHD and autism to be diagnosed together. Today: 1 in 36 children identified as Autistic.

From Misunderstood to Miscounted

How stigma, stereotypes, and a narrow lens shaped who got seen — and who didn’t.

That early misunderstanding didn’t just stick. It snowballed — shaping decades of diagnostic practices and public perception.

Autism became defined in narrow terms that excluded most of us. Girls, genderqueer people, people of color, adults, and anyone whose traits weren’t overt or disruptive enough — we were overlooked. Misdiagnosed. Or left wondering why life always felt just a bit off-script.

It wasn’t until the 1990s that the idea of autism as a spectrum entered the clinical conversation. And only in the past two decades have we begun to see more expansive definitions — ones that include those of us on the “invisible” end of the spectrum: those who masked, coped, compensated, or collapsed in silence.

This matters, because when people ask, “Why are autism rates going up?” — part of the answer is this:

We’ve widened the lens. We’ve expanded the definition. And we’ve begun to recognize what was already there.

So, no — autism isn’t a modern epidemic.
It’s a long-running story that’s just now getting a fuller telling.

What Is Causing High Rates of Autism?

 (Spoiler: It’s not an epidemic. It’s recognition catching up to reality.)

Let’s take a closer look at a question that comes up again and again — “What is causing the rise in autism?”

Depending on who you ask, you might hear anything from environmental toxins to screen time to conspiracy theories about vaccines. But the real answer is simpler — and more hopeful. What we’re seeing isn’t a surge in autism itself. It’s a shift in how we define it, recognize it, and respond to it.

In other words, diagnoses are going up because recognition is finally catching up to reality.

Here are five key reasons why:

#1. Broadening Diagnostic Criteria

The Diagnostic Criteria Changed — A Lot

Autism wasn’t recognized as its own diagnostic category until the 1980s. Before then, Autistic people were often misdiagnosed with childhood schizophrenia — or simply overlooked altogether.

The shift began with the DSM-III in 1980, which was the first to list autism as a distinct neurodevelopmental disorder. For the first time, autism was recognized as its own diagnostic category — separate from childhood schizophrenia and no longer buried under vague behavioral labels.

This shift was huge. It meant that Autistic individuals could begin to be understood on their own terms, rather than being miscast in diagnostic categories that didn’t quite fit. It also marked the beginning of modern autism research and clinical recognition — though the criteria remained narrow, based largely on observable traits in young children, especially white, cisgender boys, whose presentation shaped early diagnostic norms.

From there, the definitions continued to evolve:

  • 1994: The DSM-IV expanded the criteria to include Asperger’s Syndrome and PDD-NOS, allowing more individuals — especially those without intellectual disabilities — to be recognized.

  • 2013: The DSM-5 unified these subtypes under a single umbrella: Autism Spectrum Disorder (ASD). This marked a shift toward a spectrum model, acknowledging the wide range of experiences and support needs. In the process, previous categories like Asperger’s and PDD-NOS were removed — leading to mixed responses from those who identified with those terms.

  • A quiet but significant update: the DSM-5 also allowed clinicians to diagnose autism and ADHD together. Before 2013, they were told to pick one — meaning many of us (especially those with more internalized traits) were diagnosed with ADHD while our autistic traits went unnoticed.

These evolving definitions didn’t change who was Autistic. But they did change who got seen. And for many of us, that shift opened the door to long-overdue recognition.

#2. More Screening = More Identification

In 2007, the American Academy of Pediatrics (AAP) recommended universal autism screening during well-child visits at 18 and 24 months. This policy shift significantly increased early identification, especially among families with access to healthcare.

But that’s just one piece of a larger pattern. Other social and structural changes have also driven up diagnosis rates:

  • Greater access to developmental evaluations and pediatric specialists

  • School systems increasingly requiring formal diagnoses for IEPs and support services

  • A growing “diagnosis-for-access” dynamic, where families pursue identification in order to secure educational accommodations

  • Increased awareness among parents, educators, and pediatricians

  • Public campaigns and cultural shifts, including the United Nations designating April 2nd as World Autism Awareness Day in 2007, and the Autism Self-Advocacy Network reframing it as Autism Acceptance Month in 2011

As awareness grew, so did identification. So understandably we see a sharp increase in numbers: 

  • In 2000, the CDC reported 1 in 150 children as Autistic
  • By 2016, it rose to 1 in 54
  • In 2020, it reached 1 in 36

This isn’t over-diagnosis. It’s improved detection. It’s more people being seen, named, and — ideally — supported.

At the same time, access is still far from equal. Disparities in who gets screened, referred, and believed remain a major barrier — especially for BIPOC families and those in under-resourced communities.

But the rise in numbers reflects something bigger than just policy shifts or pediatric screenings. It reflects a cultural shift — a slow but growing ability to recognize autism when it shows up.

#3. We’re Getting Better at Recognizing Underrepresented Groups

Early autism research focused heavily on white, cisgender boys. That narrow prototype shaped decades of clinical assumptions and public awareness.

Now, we’re slowly expanding the frame. Today’s diagnostic practices are better at recognizing autism in:

  • Girls and women

  • BIPOC individuals

  • Genderqueer folx

  • Verbally fluent or high-IQ individuals

  • Those with co-occurring ADHD, OCD, trauma, or eating disorders

These groups were always autistic. We just didn’t know how to see them.

Research shows that girls often need more pronounced behavioral challenges or intellectual delays to be diagnosed. Many are identified later than boys — if at all — and are frequently misdiagnosed with anxiety or depression first. 

Similarly, BIPOC children are less likely to be referred for autism evaluations, require more medical visits to be identified, and are more often given behavioral or conduct-related labels instead. 

I explore these disparities more in my infographics on BIPOC, trans, and female autism experiences, as well as in my video series on Substack for Autism Awareness Month.

#4. The Internet Changed Everything

The internet gave Autistic people what traditional institutions didn’t: mirrors.

For decades, most of us only saw autism represented through clinical checklists, deficit-based language, or outdated stereotypes. But then came blogs. Forums. Tumblr. YouTube. TikTok. Online communities where Autistic people — many of us undiagnosed — began sharing our inner worlds in our own words.

And suddenly, we started recognizing ourselves in each other’s stories.

Not in diagnostic manuals, but in posts about sensory overload. Scripts for navigating social fatigue. Photos of stim toys and weighted blankets. Poetic reflections about being “too much” and “not enough” at the same time. It was in these corners of the internet that many of us felt something click:

Wait … that sounds like me.

For some, that self-recognition became a path toward formal diagnosis. For others, it led to self-identification — especially when barriers like cost, race, gender bias, or geography made formal assessment inaccessible. Either way, it offered language, community, and a way to re-narrate our pasts with compassion.

This wasn’t just an increase in numbers. It was the rise of a culture.

Autistic people began finding one another, creating shared rituals and vocabulary, coining terms like “masking” and “neurodivergent,” and building spaces that prioritized sensory safety and authentic expression. We weren’t just identifying ourselves — we were creating belonging.

So when people look at the data and ask, “Why are more people identifying as Autistic now?” — this too is a big part of the answer.

Because we found each other.

Because we saw ourselves.

Because someone else said it out loud, and suddenly… we weren’t alone.

And from that recognition came something more: belonging.

A sense of us.

Not just diagnosis, but community.

Not just traits, but culture.

This wasn’t just awareness — it was awakening.

And it didn’t come from institutions.

It came from us.

#5. Autistic People Grow Up, Fall in Love — and Have Kids

Here’s a less-discussed reason for rising numbers: we pass our traits on… by having sex. We grow up, form relationships, and have children. Sometimes with one another.

Autism is heritable — not in a single-gene way, but through clusters of traits like sensory sensitivity, deep focus, and unique social rhythms. These patterns often run in families — sometimes subtly, sometimes unmistakably.

As more adults recognize their own autism — often prompted by their child’s evaluation — they begin to spot the echoes in themselves, their siblings, their parents. That kind of intergenerational recognition is one reason the numbers are rising.

And there’s more: The rise of online communities and dating apps has made it easier for neurodivergent people to find each other. Shared sensory needs, communication styles, or niche interests can create the foundation for deep, resonant relationships. Research even suggests that Autistic-Autistic pairings may experience higher levels of mutual understanding and rapport than mixed neurotype couples.

So yes, Autistic people are meeting each other online … and sometimes having babies. And maybe — just maybe — that’s contributing to a small, actual rise in prevalence. Not because autism is spreading, but because we’re connecting.

When we see more autistic kids, it’s not an epidemic. It’s an echo. A reflection of Autistic adults finding each other, building families, and continuing a neurodivergent lineage that’s always been here — just hidden.

So … Is There Really an Autism Epidemic?

No. There’s no epidemic. There’s a long-overdue shift in recognition.

Autism isn’t new. What’s new is how we talk about it. How we see it. Who gets seen.

Yes, we’ve watched the numbers climb — from 1 in 150 in the year 2000 to 1 in 36 today. That kind of statistical change can look alarming at first glance. But when we zoom out, we see something more grounded and hopeful:

  • We’ve broadened the diagnostic criteria.

  • We’ve improved screening and early identification.

  • We’re beginning to recognize those who were historically left out.

  • We’ve built community and language online, helping people name lifelong experiences.

  • And many of us are raising neurodivergent children — not because autism is “spreading,” but because we exist, we connect, and sometimes … we pass our traits on.

So when people ask, “Why are there so many Autistic kids now?”
The answer, in part, is simple:

Because we exist.
Because we find each other.
Because we have kids.
Because we’re no longer invisible.

Could there be a small rise in true prevalence due to environmental or epigenetic factors? Possibly. We can’t completely rule that out. But the overwhelming evidence points to social, clinical, and cultural shifts as the primary drivers behind the rise in diagnoses.

So instead of sounding the alarm, what if we made space?

Space for more accurate narratives.
Space for nuance.
Space for Autistic people to show up fully — in all our diversity, our complexity, our humanness.

The numbers don’t signal a crisis. They reflect a collective awakening.
A long-silenced story finally being told — in full color and full volume.

And maybe, if we keep listening, that story can become a path toward understanding, equity, and care.

Interested In Learning More?

Explore our new course on The Lost Generation of Autistic Adults.
While it’s designed for clinicians, many late-identified adults have found it insightful for their own journey.


References

About. (n.d.). CDLI. Retrieved January 14, 2025, from https://cdli.mpiwg-berlin.mpg.de/about

Autism in ancient history. (n.d.). NeuroLaunch. https://neurolaunch.com/autism-in-ancient-history/

Autism in the DSM, 1952–2013. (n.d.). Uoregon.edu. Retrieved October 18, 2024.

Barnard-Brak, L., Richman, D., & Almekdash, M. H. (2019). How many girls are we missing in ASD? An examination from a clinic- and community-based sample. Advances in Autism, 5(3), 214–224.

CDC. (2024, July 19). Data and statistics on autism spectrum disorder. Autism Spectrum Disorder (ASD). https://www.cdc.gov/autism/data-research/index.html

Constantino, J. N., Abbacchi, A. M., Saulnier, C., et al. (2020). Timing of the diagnosis of autism in African American children. Pediatrics, 146, e20193629.

Crompton, C. J., Ropar, D., Evans-Williams, C. V., Flynn, E. G., & Fletcher-Watson, S. (2020). Autistic peer-to-peer information transfer is highly effective. Autism: The International Journal of Research and Practice, 24(7), 1704–1712.

Crompton, C. J., Sharp, M., Axbey, H., Fletcher-Watson, S., Flynn, E. G., & Ropar, D. (2020). Neurotype-matching but not being autistic influences self and observer ratings of interpersonal rapport. Frontiers in Psychology, 11, 586171.

Donvan, J., & Zucker, C. (2016, January 6). The early history of autism in America. Smithsonian Magazine.

Durkin, M. S., Maenner, M. J., Baio, J., et al. (2017). Autism spectrum disorder among US children (2002–2010): Socioeconomic, racial, and ethnic disparities. American Journal of Public Health, 107(11), 1818–1826.

Dworzynski, K., Ronald, A., Bolton, P., & Happé, F. (2012). How different are girls and boys above and below the diagnostic threshold for autism spectrum disorders? Journal of the American Academy of Child and Adolescent Psychiatry, 51(8), 788–797.

Happé, F., & Frith, U. (2020). Annual research review: Looking back to look forward – changes in the concept of autism and implications for future research. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 61(3), 218–232.

Humphrey, N. (1998). Cave art, autism, and the evolution of the human mind. Cambridge Archaeological Journal, 8(2), 165–191.

Johnson, C. P., Myers, S. M., & American Academy of Pediatrics Council on Children With Disabilities. (2007). Identification and evaluation of children with autism spectrum disorders. Pediatrics, 120(5), 1183–1215.

Kanner, L. (1949). Problems of nosology and psychodynamics of early infantile autism. The American Journal of Orthopsychiatry, 19(3), 416–426.

Kanner, L., & Lesser, L. I. (1958). Early infantile autism. Pediatric Clinics of North America, 5(3), 711–730.

Mandell, D. S., Ittenbach, R. F., Levy, S. E., et al. (2007). Disparities in diagnoses received prior to a diagnosis of autism spectrum disorder. Journal of Autism and Developmental Disorders, 37, 1795–1802.

Milton, D. E. M. (2012). On the ontological status of autism: The ‘double empathy problem.’ Disability & Society, 27(6), 883–887.

Rivet, T. T., & Matson, J. L. (2011). Review of gender differences in core symptomatology in autism spectrum disorders. Research in Autism Spectrum Disorders, 5(3), 957–976.

Sher, D. A., & Gibson, J. L. (2023). Pioneering, prodigious and perspicacious: Grunya Efimovna Sukhareva’s life and contribution to conceptualising autism and schizophrenia. European Child & Adolescent Psychiatry, 32(3), 475–490.

Spikins, P., Scott, C., & Wright, B. (2018). How do we explain ‘autistic traits’ in European Upper Palaeolithic art? Open Archaeology, 4(1), 262–279.

Waltz, M., & Shattock, P. (2004). Autistic disorder in nineteenth-century London: Three case reports. Autism: The International Journal of Research and Practice, 8(1), 7–20.

Wiggins, L. D., Durkin, M., Esler, A., Lee, L. C., Zahorodny, W., Rice, C., Yeargin-Allsopp, M., Dowling, N. F., Hall-Lande, J., Morrier, M. J., Christensen, D., Shenouda, J., & Baio, J. (2020). Disparities in documented diagnoses of autism spectrum disorder based on demographic, individual, and service factors. Autism Research, 13(3), 464–473.


Confirmed: Microsoft stops new data centres worldwide


What do you call it when an economic bubble stops growing?

In February, stock analysts TD Cowen spotted that Microsoft had cancelled leases for new data centres — 200 megawatts in the US, and one gigawatt of planned leases around the world.

Microsoft denied everything. But TD Cowen kept investigating and found another two gigawatts of cancelled leases in the US and Europe. [Bloomberg, archive]

Bloomberg has now confirmed that Microsoft has halted new data centres in Indonesia, the UK, Australia and the US. [Bloomberg, archive]

The Cambridge, UK site was specifically designed to host Nvidia GPU clusters. Microsoft also pulled out of the new Docklands Data Centre in Canary Wharf, London.

In Wisconsin, US, Microsoft had already spent $262 million on construction — but then just pulled the plug.

Mustafa Suleyman of Microsoft told CNBC that instead of being “the absolute frontier,” Microsoft now prefers AI models that are “three to six months behind.” [CNBC]

Google has taken up some of Microsoft’s abandoned deals in Europe. OpenAI took over Microsoft’s contract with CoreWeave. [Reuters]

But bubbles run on the promise of future growth. That’s looking shaky.

Joe Tsai of Chinese retail and IT giant Alibaba warned that AI data centres might be looking like a bubble! You know, like the data centre bubble that had already been happening for some time in China. [Bloomberg, archive]


The VCR’s Last Stand


It’s pretty much the answer to a trivia question at this point, but there was once a version of VHS that looked better than DVDs. Really.

The VCR’s Last Stand
Today in Tedium: In the late 1990s, it seemed like the future of video was set in stone. Discs were where things were going—and tapes were starting to feel old hat, even if they were more capable of recording things off the screen than a DVD ever was. The VHS tape, which had already survived a format war, needed something fresh to give it a chance in a 21st century world. Simply put, if it was going to stand a chance in a world of DVDs, it needed an upgrade. And so, JVC, the Japanese company that developed the original VHS format, gave it one. It was doomed, but it was better than you might guess. Today’s Tedium ponders the D-VHS. — Ernie @ Tedium

“I was amazed. Visually D-Theater is not just an improvement over DVD. It leaves DVD in the dust, as difficult as that might be for DVD’s growing legion of fans to visualize.”

— Mike Snider, a writer for USA Today, reviewing the D-VHS format in 2002. At the time of the review, the D-VHS format was capable of delivering 1080i-quality video at a time when 480p was the norm in DVD-land. For a couple of years, it was the highest-quality consumer video format in the land.

Want to record video from a satellite receiver? This was the format for you. (Wikimedia Commons)

Why there should have been a market for D-VHS in the late 1990s

I don’t think it was necessarily a given that we were going to switch to discs. Sure, it became obvious by the turn of the 21st century that DVDs were going to be the film format du jour, holding on even better than Blu-Rays did.

Part of that was inertia. We were already comfortable with DVDs, so why upgrade, even with all the technical advantages that a higher-resolution format had to offer? If you look at the data, Blu-Rays never even came close to topping the DVD market—per CNBC, the peak year for Blu-Rays in the U.S. came in 2013, and was roughly one-seventh of the DVD’s peak year.

In other words, the DVD was more versatile than we gave it credit for, and that helped with its staying power. Perhaps the problem with the Blu-Ray was that it wasn’t different enough—which meant, while it was successful, it was no match for the streaming revolution.

To me, that is the best surface-level explanation of why the D-VHS never took off, despite arguably being better than the DVD at all the things people say they care about, like video quality. With D-VHS, the VHS format put up a legitimate fight, and it arguably did better than anyone might give it credit for today. But it wasn’t a reinvention, and I think consumers were ready for one.

The one knock against disc-based formats was the very knock D-VHS was well-positioned to knock out. It was able to record video at a high quality. On top of that, it was actually better than DVD at high-definition video, and in its highest-end format, could store as much data as a dual-layer Blu-Ray.

And on top of all that, it was backwards-compatible, meaning that if you had a large collection of VHS tapes already in your library, you could still use them with just one device, limiting entertainment center clutter.

To be clear, this wasn’t JVC’s first go-around with a higher-resolution take on videotape. The company’s W-VHS, released in Japan in 1993, was the first consumer video format capable of displaying images in 1080i, easily the highest resolution available to traditional consumers. But that was still analog. D-VHS was digital, and digital was ambitious.

But when it launched, it certainly felt like an uphill battle. As Popular Mechanics noted in 1998 in an article titled “For Videophiles Only,” it actually came to the market before all the HDTV signals did:

The first digital products included computers and compact disc players. Within the last few years, digital camcorders, DSS (digital satellite systems), and DVD (digital video discs) have burst onto the electronics scene. Next year will bring digital television and high-definition television (HDTV) programming to market, now that the FCC has given final approval of channel allocation to the 1600 or so television stations across the country.

But you don’t have to wait until next year to enjoy the incredible clarity and stunning definition of digital video. You don’t have to wait a year or more to turn on your television and enjoy images totally free of distortion, snow, interference, or picture noise. Trouble is, no television station will be generating these great video images for you in the near future. You’ll have to generate them yourself—from a digital videocassette recorder.

That’s right: At first, its most prominent feature was useless to the average person.

But even if you weren’t recording in digital, D-VHS had the advantage of being a format that could go on for miles. It was possible to record a day and a half of programming on a single tape in its lowest-quality mode—without having to change the tape. Plus, for people who wanted to record digital signals from their computer, D-VHS allowed you to do so with another then-emerging technology: FireWire.

Put another way, this was a dream machine for people committed to recording stuff for hours and hours on end, who wanted better quality than you could get out of a standard analog tape.

Some of these people would go to great lengths to get more out of these players. A common hack during the early 2000s was to modify either the tapes or the players, so they could use S-VHS tapes to record in D-VHS players. Because this was a format for nerds, it meant they were willing to go above and beyond to save a little money. Some of those nerds determined that blank D-VHS tapes only differed from S-VHS models because of the placement of a plastic hole.

As one AVS Forum commenter put it in 2003: “Whatever the tape and DVCR manufacturers say, I am convinced this hole is the only difference in the tapes.”

Was the quality of D-VHS good enough to validate this kind of trickery? Let’s go to the tape. The YouTube video archivist ENunn has uploaded dozens of videos of D-VHS captures onto his various YouTube channels, and they feature some of the best quality you’ve probably ever seen when it comes to re-uploaded commercials from 20+ years ago. The above clip, from 2003, would be nothing special if it originated from a PC. But pulled off videotape? It’s nothing short of spectacular.

And it’s all the more impressive in higher resolution, as a 2007 clip from a PBS broadcast shows. I don’t think regular people necessarily wanted something like this—we were fine with our recorded-over videotapes, thank you very much—but if you were a video nerd or amateur archivist, this kind of quality was hard to top.

Someone had to think ahead and grab all this stuff when it was originally on the air, and it’s honestly impressive to look at in retrospect. The problem was, few people invested in this technology. And you might be wondering why.


2004

The year that the Federal Communications Commission created a requirement for cable providers to offer FireWire to customers who wanted it. This was sold as a benefit largely for D-VHS owners, who could record direct digital signals from their cable boxes onto high-quality tapes with zero compression. In reality, it also turned out to be a perk for computer owners, who could turn their computers into makeshift DVRs—though this use case didn’t last, because many cable providers scrambled their broadcasts. AnandTech has one such example of this in action, involving a Mac Mini.

On the surface, it looks like any old VHS. Inside, it was a beast. (via eBay)

Don’t make me think: The reasons D-VHS didn’t catch on feel simple in retrospect

For the past three decades, a specific dynamic has played out in content distribution: When it comes to physical media, less digital rights management is better. It’s a complicating factor, and makes it harder to use the devices we paid for by creating arbitrary limits.

Many turn-of-the-century disc-based formats, such as Super Audio CD, had restrictive copy protection, put in at the behest of content companies. These formats cropped up everywhere for a while. But they forced hardware manufacturers to lead with consumer-unfriendly messaging and confusing feature sets, and that was their downfall. Consumers immediately realized that digital formats like MP3s were far easier to use, and just ignored the format war entirely.

And that, in many ways, is the story of D-VHS. The complicated rules around the format’s digital rights management meant recording digital video, or even trying to choose the right player, was complex and time-consuming.

It’s largely forgotten today, but DVD players succeeded partly because of the quick demise of the format’s DRM scheme, the Content Scrambling System. The crack that broke it, DeCSS, created legal headaches for years and arguably gave birth to modern-day piracy. But it also made DVDs the go-to medium for physical film distribution in the computer era.

D-VHS, meanwhile, was one of the few ways to capture encrypted digital video without converting it to analog first. That meant, if you wanted to capture the live feed of a satellite signal, you had to use one of these machines. Making things worse: the video was difficult to convert to another format from that point because of content protection. It used High-bandwidth Digital Content Protection (HDCP), the same copy-protection tech used by HDTVs and a key part of the ubiquitous HDMI cable format.

Plus, the sheer size of the content was initially believed to limit any potential piracy concerns, as a piece in Wired suggested in 2001:

JVC introduced the new D-VHS tape at the Consumer Electronics Show (CES) along with a high definition television (HDTV) set that protects high definition content from being copied. Video on D-VHS tapes is uncompressed, so it’s enormous. A 75GB hard disk would only hold around 30 minutes of the video, according to company officials, making the trading of HD content over the Internet impossible.

(To which I say, LOL, sure Jan. Someone didn’t consider that video compression was about to become an arms race.)
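A quick back-of-the-envelope check shows how fragile that claim was. The per-mode figures below are the commonly cited D-VHS bitrates, an assumption on my part rather than numbers from the Wired piece:

    # Wired's "75 GB holds ~30 minutes" implies a near-uncompressed HD stream:
    implied_mbps = 75 * 8e9 / (30 * 60) / 1e6
    print(f"implied bitrate: {implied_mbps:.0f} Mbit/s")  # ~333 Mbit/s

    def hours_held(capacity_gb, mbps):
        """Hours of video that fit on capacity_gb gigabytes at a constant bitrate."""
        return capacity_gb * 8e9 / (mbps * 1e6) / 3600

    # D-VHS actually recorded compressed MPEG-2 transport streams, so a tape
    # in the dual-layer-Blu-Ray class (~50 GB) went a long way:
    print(f"HS mode (~28.2 Mbit/s): {hours_held(50, 28.2):.1f} h")  # ~3.9 h of 1080i
    print(f"LS3 mode (~4.7 Mbit/s): {hours_held(50, 4.7):.1f} h")   # ~23.6 h

Once broadcasters and pirates alike settled on compressed streams, the “too enormous to trade” defense stopped making sense, which is exactly the arms race being mocked above.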

The format, which initially didn’t rely on pre-recorded media, eventually got its own D-Theater releases—which were the best you could do with an HDTV without using a set-top box or a digital tuner.

But even with the growing interest in theater-quality video, some studios were looking at D-Theater and thinking to themselves, “Wait, doesn’t this just undermine what we’re doing with DVDs?” That led some home video distributors, like Warner Home Entertainment and the Sony-backed Columbia TriStar Home Entertainment, to ignore the market entirely. The latter’s then-president, Ben Feingold, suggested tape-based mediums were old hat.

“As far as we’re concerned, D-VHS is not a commercial product,” Feingold told Variety in 2002. “The enormous success of DVD leads us to believe, both intuitively and practically, that there’s a strong preference for a disc-based product.”

At the same time, though, you can clearly see the potential. This D-Theater demo tape, also captured by the aforementioned ENunn, looks pretty mind-blowing even now, despite the graphics looking somewhat dated. You can definitely feel the oomph of the video format in a way that even DVDs didn’t quite capture at the time.

Ironically, D-Theater created a flip of the situation that existed in the home video industry just a decade earlier: In the ’90s, the videophile format was LaserDisc and the consumer format was VHS. Now, D-Theater was trying to take over the LaserDisc market, while DVD was the VHS-like format of its time.

But D-VHS had many problems: Because it wasn’t a random-seek format, it didn’t come with the myriad extra features you could get on a DVD or LaserDisc. For most of its history, it didn’t even support additional audio tracks. Given the importance of audio commentary as a selling point for movies and TV shows at retail, it sure feels like a missed opportunity.

Then there were compatibility issues that were pretty much of the manufacturers’ making. Despite JVC and Mitsubishi each making D-VHS players, the devices were often quite different, with wildly diverging feature sets that required you to have a ton of components before you could even get going. One review I found, dating to 2002, put it like this:

If you’re familiar with a regular ol’ VHS VCR, as almost everyone is by now, you’ll understand both the Mitsubishi HS-HD2000U and JVC HM-DH30000U right away. Both have silver faceplates and standard VCR controls on their front panels. Both come with mammoth remotes; the Mitsubishi remote has a small display at its top that tells you what you’re doing. There’s nothing about their ability to record HDTV that changes their basic VCR functions.

But there’s one big difference between these decks: The Mitsubishi HS-HD2000U costs $1049, the JVC HM-DH30000U $2000. Why? The JVC is equipped with an expensive MPEG encoder/decoder. The encoder can upconvert analog signals to digital so the unit can function as a digital archiver. The decoder provides for the JVC’s HD component analog output.

In addition, the JVC is equipped to play back prerecorded high-definition movies recorded using JVC’s new, proprietary D-Theater format, which includes robust copy protection. Last year, JVC quietly won agreement from the Motion Picture Association of America to market prerecorded movies protected with D-Theater. That infuriated Mitsubishi, which, like the rest of industry, regards VHS as an open standard, meaning that any tape playable on one VHS machine should be playable on all. Nonetheless, JVC won agreement from Fox, Universal, DreamWorks, and Artisan to begin releasing D-VHS, HD movies. The studios have announced that the first films to be released in this format will be Independence Day, Die Hard, X-Men, U-571, and the two Terminator films. As of press time, none were yet available, nor had pricing been established. But to play them, you’ll have to spend almost $1000 more and buy the JVC VCR. (JVC says a less expensive version will come out soon.)

Say what you will about DVD players, but they generally worked the same between iterations. A $200 player and a $2,000 player ultimately played the same movies. But JVC’s bet on DRM to win over the film studios saddled the format with complex cruft on top of the already complex cruft the format itself created.

And then there are more practical considerations: Netflix essentially disrupted traditional video rentals thanks largely to the mechanics of the postal system. Discs were cheap to ship; tapes, not as much. That obviously put D-VHS at a disadvantage from a rental standpoint.

DRM prevented unauthorized copying, but also added comical complexity to these tools. Hell, even figuring out how to pirate movies with BitTorrent was easier than working your way through the myriad options that D-VHS offered. Compared to formats that relied on hard drives or discs, this was just an unseemly mess. Given all that, it’s not really surprising that, when Blu-Ray hit the market in 2006, D-VHS was already something of a footnote as an entertainment format.

In retrospect, D-VHS was an enthusiast format that just couldn’t get it together.

“We have two trucks that we own. We built them and we own them. They were specially built. All of the equipment was specially designed. We’ve got our own server system. We’ve got integrated backup to D-VHS and HDCam. We’ve got duplicated systems internally so we won’t have a break down.”

— Mark Cuban, in a 2002 interview with Post Magazine about the creation of HDNet, his high-resolution cable channel, which aired programming in 1080i at a time when that was fairly uncommon. It’s forgotten now, but before he became a sports franchise owner and Shark Tank regular, he gained his fortune on streaming video. After selling Broadcast.com to Yahoo for billions of dollars, he created HDNet, which leaned hard into high-resolution video, often utilizing D-VHS tape to display on his 102-inch TV screen. “The hi-def screen spoils you,” Cuban told Wired that same year. “I can’t watch regular TV anymore. It just isn’t worth the effort.” The network exists today as AXS TV, which Cuban still maintains a stake in.

These days, content on VHS tapes can be found for cheap, reflecting the format’s one-time ubiquity. You can find them at any thrift store for pennies on the dollar, often of varying quality.

That feeling when the middling Robert Altman romantic comedy you forgot about resurfaces in a format you’ve never seen before.

But D-VHS remains a frustratingly expensive format to collect for. One look at eBay shows that 1080i-quality D-Theater videos sell for more than $50 a pop—despite the films themselves not exactly being obscurities. A $99 copy of Dr. T & The Women, a film that sells on Amazon for less than $7 on DVD and $3 on VHS—and is freely available on Amazon Prime—just feels like a slap in the face. In many ways, when a film is that expensive just because of its format, you’re paying for its obscurity or technical aspects, rather than its quality.

(That’s especially true given that used players go for about $200 nowadays, with a premium on D-VHS devices that support D-Theater.)

To me, the most interesting part of D-VHS is that it technically still has value. If you want to record a digital video feed and not lose fidelity, it works—though DRM challenges and hardware complexities mean you might be better off using a DVR on your home server.

D-VHS represented a home theater fanatic’s greatest desire, a format that, in its time, worked better than anything else out there. But whether it was because it was on the bleeding edge, or because of the DRM girding the players, manufacturers forgot that regular people use this stuff, too. It left them in the dust in a way that regular VHS never did. Of course it failed.

Not to say Blu-Ray was the greatest format ever, but at least Sony was smart enough to shove it in a device the average person could understand, rather than making it so obtuse that nobody could figure it out.

There just aren’t that many people who want to record HDTV-quality commercials in 1080i.

--

Find this one an interesting read? Share it with a pal! And back at it in a couple of days.

