The major browsers natively trust a whole bunch of certificate authorities, and some of them are really sketchy:
Google’s Chrome, Apple’s Safari, nonprofit Firefox and others allow the company, TrustCor Systems, to act as what’s known as a root certificate authority, a powerful spot in the internet’s infrastructure that guarantees websites are not fake, guiding users to them seamlessly.
The company’s Panamanian registration records show that it has the identical slate of officers, agents and partners as a spyware maker identified this year as an affiliate of Arizona-based Packet Forensics, which public contracting records and company documents show has sold communication interception services to U.S. government agencies for more than a decade.
In the earlier spyware matter, researchers Joel Reardon of the University of Calgary and Serge Egelman of the University of California at Berkeley found that a Panamanian company, Measurement Systems, had been paying developers to include code in a variety of innocuous apps to record and transmit users’ phone numbers, email addresses and exact locations. They estimated that those apps were downloaded more than 60 million times, including 10 million downloads of Muslim prayer apps.
Measurement Systems’ website was registered by Vostrom Holdings, according to historic domain name records. Vostrom filed papers in 2007 to do business as Packet Forensics, according to Virginia state records. Measurement Systems was registered in Virginia by Saulino, according to another state filing.
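The "root certificate authority" role described above is easy to see firsthand. Here is a minimal sketch using Python's standard ssl module to list the roots your own machine trusts; the exact count and names depend entirely on your operating system's trust store, and roots loaded from a capath directory may not appear until used.

```python
import ssl

# A default TLS context loads the platform's trust store. Every root CA
# on this list can vouch for any website on the internet, which is why a
# single untrustworthy root is a systemic risk.
ctx = ssl.create_default_context()
roots = ctx.get_ca_certs()  # list of dicts, one per loaded root certificate
print(f"{len(roots)} trusted root CAs loaded from this system's store")

# Show the organizations behind the first few roots (a subject is a tuple
# of RDNs, each itself a tuple of (field, value) pairs).
for ca in roots[:5]:
    subject = dict(pair for rdn in ca["subject"] for pair in rdn)
    print(subject.get("organizationName", subject.get("commonName", "?")))
```

Run that and you will likely see well over a hundred organizations, any one of which your browser or OS will believe about any domain.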
Been out of commission for a bit, but I'll be back this month. Some people have been picking up and moving on from Twitter. If you want to follow me in the fediverse, you can find me on Mastodon.
In the mid-to-late 1980s, a group of undergrads met at Brown University. They were immersed in the liberal arts—semiotics, philosophy, visual art—and in any other generation would have been bound for careers as professors or editors or marketers in the media field. But this group, spurned by an unwelcoming job market and drawn by rising computer literacy among the general public, turned to the web instead and became the backbone of the popular New York tech scene known as Silicon Alley.
Among them was Steven Johnson, who joined Stefanie Syman to launch Feed, one of the earliest online magazines. The group also included Nicholas Butterworth, eventual head of MTV Interactive, who got his start creating the web-based bulletin board SonicNet. Mark Tribe was also part of that class, and he would go on to create a virtual community called Rhizome. But one of the last of this class to get started—and certainly the most enigmatic—was Rufus Griscom, who created the erotic online magazine Nerve alongside his partner Genevieve Field.
Griscom met Field at a publishing house, where they both worked as editors. They hatched an idea early on for a literary erotica website, one that leaned into nude photographs of real people’s bodies and longform articles where nothing was off limits. Griscom watched from the sidelines with interest as his friends from Brown in Silicon Alley began to shape what a site like that would even look like. Then they gave it a name: webzines. Webzines like Word and Suck and Feed were cultural products born on the web with their own editorial mannerisms, aesthetic, and attitude. Zines were pithy but often not short (word count doesn’t cost a thing when it’s digital), blending sarcasm and wit with sharp criticism of the mainstream cultural status quo.
There was something else that stopped Griscom and Field from launching their idea: the law. It wasn’t until the courts struck down the Communications Decency Act’s anti-indecency provisions in 1997—a huge victory for the open web that enabled a much wider set of content to be published without censorship—that Field and Griscom felt they could launch something official, and maybe even attract commercial investors, as so many soon-to-be dot-com ventures had already done. The day after the ruling, they launched Nerve.com.
Conceived as a spiritual successor to the Playboy of the 1950s and 1960s, Nerve’s take on the webzine featured experimental formats and bold nude photography paired with longform editorials exploring sex, gender, sexuality, and relationships. As a product of the information age, however, the content on Nerve was decidedly more cerebral and high-minded than most pornographic sites of the early web—“Playboy’s body with the New Yorker’s brain,” as a writer from Entertainment Weekly would later put it. The site—and its writers—were able to create a unique editorial voice that was friendly, open, and sex-positive. “Literate smut,” as its slogan read.
But what really set Nerve apart was the way it centered its two enigmatic founders, Griscom and Field. The first version of the site even featured a picture of their coffee table that subbed in for the site’s navigation. But that was just the beginning. Because of the explicit nature of the site’s content and the fact that the two were actually dating, Griscom and Field found themselves in the spotlight. They were interviewed by Charlie Rose, featured in Time and Newsweek, and had a whole HBO documentary produced about them.
But it was a position they soon relished. The duo would throw lavish (and occasionally obscene) parties on behalf of Nerve. And each—consciously or not—cultivated a public persona meant to match the voice of the site: out, loud, and not safe for work.
Their public exposure and popular editorials launched Nerve to huge success very quickly, leapfrogging some of Griscom’s classmates. By 1999, near the height of a dot-com bubble that would soon burst, they were able to make their first acquisition: a like-minded community known as Bianca.com.
Bianca.com was one of the earliest sites on the web, and certainly among its first official communities. It was developed by Jill Atkinson, Chris Miller, and David Thau in 1994, in the night hours after they logged off from their day jobs at Hotwired (making it the second site launched in secret by a Hotwired employee). They built the software that ran the site themselves, because nothing like it existed yet.
In its simplest form, Bianca.com was a series of chat rooms, forums, and guest books grouped into different categories where people could go looking for a bit of x-rated fun mixed in with their everyday conversation. Its community congregated primarily around topics of sex, from the mildly suggestive to the outright pornographic. But it also blended highbrow discussions of culture and media with its smut. “It was smart, high-minded, a peep of class amidst orgiastic moans of crass,” went one description from Wired.
The site used the fictional persona of Bianca as a way to give it a bit of personality. Stories about Bianca were littered all through the site. She welcomed visitors into her “home” on the homepage, and visitors could navigate around by clicking through a crude, hand-drawn map. Each “room” in the house centered on a different theme and salacious backstory. Enter the kitchen for recipes to get in the mood, and the bedroom for wildly explicit dreams, for instance. The site was maintained by Bianca’s Trolls—a word that had a different connotation than it does today—a group of moderators whose job was to “maintain the shack, but they do not govern it… [to] quietly try to provide you with the tools to accomplish all your hearts could desire.”
But it was the crowd of people that gathered at Bianca’s smut shack that made it a cult hit of the early web. It would never quite hit the traffic of publications like Playboy, or even Nerve, but it did gather a larger and larger base of loyal visitors. Between the raunchy exchanges and links to pornographic images, members would chat about books, movies, philosophy, and trade in occasional life lessons. Beneath its smut-covered veneer, Bianca was simply a place to have a conversation.
Which made it a good complement for Nerve in the late ’90s: Nerve provided the editorial and Bianca could provide the discussion. Bianca, after all, had never quite figured out how to make money. It experimented with memberships and advertising, but neither proved sustainable in the long run. In May of 1999, when Nerve acquired Bianca, it was as much a lifeline for the site as an equitable partnership.
But this was 1999, and within a year the stock market would come crashing down, bringing most of the dot-com world with it. Nerve couldn’t hold on for very long. In the aftermath of the bubble’s burst, Nerve struggled to foot the bill for the bandwidth consumed by Bianca’s large community, and it announced plans to shut the site down. Still, Bianca held on for several more years. But after years of neglect, the site was swarmed by vandals looking to spam or diminish it, and it was eventually taken offline completely.
Nerve continued to operate for several years following the dot-com collapse, grew its magazine publication, and even won a few awards in the latter half of the 2000s. But it too eventually folded under new ownership, and by the 2010s it was no longer publishing new content.
Bianca and Nerve almost struck a partnership that could’ve lasted. If only they had gotten the timing right.
When I first wrote about Copilot, I said “I’m not worried about its effects on open source.” In the short term, I’m still not worried.
That's good to know. But he then goes on to wax nostalgic about his specific experience in open source and how GitHub Copilot is something new and different.
But as I reflected on my own journey through open source—nearly 25 years—I realized that I was missing the bigger picture. After all, open source isn’t a fixed group of people. It’s an ever-growing, ever-changing collective intelligence, continually being renewed by fresh minds. … Amidst this grand alchemy, Copilot interlopes.
It reads like someone threatened by innovative technologies, gatekeeping because this isn't how he did open source over the last 25 years. Butterick throws in some vague, hand-wavy anti-Microsoft rhetoric, also outdated and trite (but sure to appeal to the FSF crowd), for good measure:
We needn’t delve into Microsoft’s very checkered history with open source...
Almost none of the people mentioned in the article he linked to, which is about Microsoft in the late 1990s and early 2000s, are still at Microsoft. Since that time, Microsoft has made a profound pivot toward open source software. The CEO, Satya Nadella, has not been implicated in any of the anti-FOSS activities at Microsoft during that era, has embraced (even promoted) the pivot toward open source software, and, for what it's worth, came to Microsoft from Sun Microsystems.
Most of the product managers, engineers, and decision-makers at Microsoft these days were barely out of high school in the early 2000s. In tech, it's ancient history.
Butterick even knocks Bill Gates for his open letter to computer hobbyists, which contained the radical idea that developers should be able to define the terms on which their software is distributed, which is the fundamental basis of modern free and open source software. Ironic, because if there are no rules on software code, then he would have no basis for his lawsuit.
Sadly, the Copilot case also seems prepared to bring open source software patent claims, something free and open source software advocates largely solved with the GPLv3 and the Open Invention Network:
I thought we were all against software patents, particularly in open source. I guess not.
Why F/OSS Advocates Should Support GitHub Copilot
The legal underpinnings of GitHub Copilot are based on two basic principles:
1. Fair use.
2. Incidental inclusion. (See also 1.)
Fair use protection is broad under US copyright law and is codified in EU copyright directives, although adoption at the member state level varies. Many other non-US and non-EU countries have similar exceptions, though the US is the broadest to my knowledge.
Fair use is a doctrine that allows the limited use of copyrighted material without obtaining prior permission of the copyright owner.
Since 2016, US law has held that automated scanning, indexing, and minor reproduction of copyrighted works, specifically Google's indexing of books for Google Books, is protected fair use.
Fair use has come into play in open source software most recently with the Google v. Oracle case, in which the courts held that Google's clean-room implementation of the Java API did not violate Oracle's copyright on those API calls.
Fair use protects the reimplementation of APIs, such as the Win32 API in ReactOS or WINE, in open source. It protects reverse engineering of proprietary applications to create open source alternatives. It also, like in the Google Books case, protects scanning copyrighted datasets to develop indices, ML models, and other derivative works.
GitHub's scanning of the source code available on its servers to develop an AI model is protected fair use for this reason.
Incidental inclusion is a legal term I have borrowed from UK copyright law; it does not exist as a separate concept in US copyright law but is embedded in the broader US fair use protections, specifically 17 U.S.C. § 107(3).
In the UK, incidental inclusion says accidentally including small bits of copyrighted material is protected:
A typical example of this would be a case where someone filming inadvertently captured part of a copyright work, such as some background music, or a poster that just happened to be on a wall in the background.
This specific carve-out is needed in the UK and other non-US countries where fair use protections are not as broad.
In the US, accidentally including small bits of copyrighted material is protected under the umbrella of its broad fair use protections, but it is worth specifically calling out this branch of fair use.
Under US fair use protections, the intent, amount, and effect of an infringement determine whether it is protected by the fair use doctrine.
GitHub Copilot is not intended to violate GitHub contributors' copyrights.
While there have been a handful of viral examples of verbatim reproduction of code by Copilot, GitHub has produced reports that state the actual rate of verbatim reproduction is exceedingly low. There is reason to believe the model will continue to improve and that rate will go down.
Finally, the effect of that verbatim reproduction is also minimal. GitHub Copilot is not currently capable of reproducing whole software projects, undermining other companies, or destroying open source communities.
It is an AI-assisted pair programmer that is great at filling in boilerplate code we all use and borrow from each other in free and open source software, protected by FOSS licenses and fair use.
While this is a general overview of the legal basis of GitHub Copilot, there are several valuable in-depth analyses that go into further detail:
It is also worth pointing out that organizations like the Free Software Foundation do not actually dispute the legality of GitHub Copilot; they just raise similarly vague concerns about it and throw in anti-Microsoft rhetoric for good measure, to appease their base. They must fundraise, after all.
What Could Happen If GitHub Loses
What are some of the potential outcomes of the GitHub Copilot litigation?
Fair use and "incidental inclusion" in open source software becomes more restrictive.
Ever copy and paste code snippets from StackOverflow? Did you remember to properly cite and add the relevant Creative Commons license to your LICENSE.md file for that code? How about borrowing some sample code from a blog or looking at a GitHub Gist and reproducing it in your code?
We all know we should apply that attribution/license, but do we always? How much of that code is running in production in your company or open source community right now?
Thankfully, that kind of usage is likely protected under fair use. If that goes away, copying code like this could open free and open source developers up to additional liability, expensive lawsuits, and even troll lawsuits.
We could see a whole industry crop up of open source software copyright trolls, going after open source projects for minor infringements of licenses.
Training ML datasets on copyrighted materials becomes more restrictive.
The ability of ML developers to train their models on copyrighted datasets under fair use is dramatically accelerating AI/ML. The advances in open source AI/ML built on otherwise-copyrighted datasets are unprecedented: in the last 12 months alone they have been extraordinary, with models and techniques that used to take years to develop now taking weeks, sometimes just days.
If training ML models on copyrighted datasets becomes more restrictive, AI/ML development will slow.
For example, I know of one AI/ML project (PDF) that scraped publicly accessible webcams during COVID lockdowns to measure social distancing. Those webcam images were copyrighted and, if fair use did not apply, could not be used without obtaining written permission from thousands of webcam owners.
Such restrictions would have profound impacts on medical research, science, models that improve accessibility for users, and other practical applications of AI/ML that improve human lives and benefit our planet.
This means more lawyers involved in model training, which will then become more expensive, and slower.
It will also likely take ML model training out of the hands of hobbyists, open source developers, and individual researchers and limit it to big corporations who can afford those compliance costs.
Individual ML developers, like individual open source developers, will suddenly face much more legal ambiguity and exposure, if we do not defend fair use.
tl;dr Based on squeamish feelings that GitHub Copilot is something new and different, and gripes about Microsoft from 20 years ago, a tech lawyer has teamed up with a class action plaintiff's law firm to sue GitHub over an incredibly helpful tool that improves open source quality and output, the potential outcomes of which could include:
Making free and open source software harder to share
Making it harder to re-implement proprietary applications, hardware, and protocols as free and open software
Making training AI/ML models more expensive, taking it out of the hands of hobbyists and researchers, limiting it only to big corporations with huge legal departments
Slowing development of real-world application of AI/ML models that will improve human life and longevity
Upending the current détente in the free software and open source communities over software patents
You do not have to love Microsoft, GitHub, or 'back them' in this case. But free and open source advocates who have concerns about GitHub Copilot should be just as skeptical of the GitHub Copilot plaintiffs, given what is at risk here.
TLDR: Software Freedom Conservancy and the Godot leadership are excited to share their decision that the Godot project has reached a level of success for which it makes sense for Godot to have its own independent foundation.
When Godot was first open-sourced in 2014, it was a very small project mostly developed by Ariel and Juan. Even after open-sourcing, contributions to Godot were almost exclusively made by volunteers. Over time, and to our surprise, many users expressed a wish to contribute financially to the project to speed up development.
Creating a foundation at the time would have been too costly and difficult, so we turned to Ton Roosendaal for advice. Ton introduced us to the Software Freedom Conservancy (SFC), which is a charity located in New York.
The SFC was a fantastic fit for Godot. It serves as a non-profit home for several high-profile FOSS projects (such as Git, Samba, and Wine) and has tested and proven rules to ensure that donations are used only for the benefit of projects, as well as rules to avoid conflicts of interest. The SFC lets open-source projects grow, prosper, and focus on their work while it handles non-profit governance, accounting, and legal issues (including successfully walking back Non-Disclosure Agreements to ensure that all our work can happen in the open); essentially, it aggregates the work required to operate a not-for-profit organization.
Ariel and Juan signed a fiscal sponsorship agreement with the SFC, allowing them to receive donations on behalf of the project. They also managed the creation and growth of the Godot PLC (Project Leadership Committee), formed by some of the most veteran contributors at the time.
Thanks to the SFC, Godot was able to become what it is today as many of its most prominent contributors were able to work part or full time, paid by donations. We were also able to meet in person thanks to their excellent policies for travel and hosting reimbursement. Without this, many contributors would not have been able to make it to events. They have also masterfully negotiated large donation grants by companies, ensuring that anything signed is beneficial to the project.
In all, from the Godot project leadership, we are immensely thankful and proud to have been part of the Software Freedom Conservancy.
Godot joined the SFC when the project was still in its infancy and its needs were fairly limited. Now the Godot project is many times larger, it employs multiple people, and it has more complex needs and aspirations. Accordingly, as the project continues to grow even more, it makes sense to have the control, independence, and flexibility in managing funds of an organization that is solely focused on Godot.
For this reason the Godot Project Leadership Committee (the PLC) and the SFC have agreed that it is time for the Godot project to leave its home at the SFC and form its own organization: the Godot Foundation. Like a lot of other Open Source projects (Blender and Krita, for example), the Foundation will be located in the Netherlands, which means Godot will be Blender’s neighbor! The structure of the Foundation is modeled after the policies of SFC, which will ensure continuity in the way Godot operates.
Why create a foundation?
As Godot keeps growing, so do our needs. Godot's size merits the flexibility of having its own organization and the opportunity to explore broader funding sources.
Examples of this are crowdfunding campaigns (like Blender or Krita do), the highly requested ability for users to sell assets on an asset library (and have a share going to the Godot Foundation), selling merchandise, and other types of funding. An independent entity will allow us to make decisions with solely Godot's benefit in mind, instead of as one of many important open source initiatives.
Having our own foundation will also project a stronger image of the Godot Project, which will allow us to have stronger footing when negotiating big donations with donors. The SFC has done a stellar job at negotiating for us so far, showing us how to ensure that the FOSS nature of Godot is never compromised in those agreements. We intend to continue with the same passion and dedication to FOSS that the SFC has shown us in all those years.
Ultimately, we want the Foundation to serve as a home for community initiatives, by allowing it to have its own funding lines (meaning initiatives can raise funding on their own for a specific goal, with the Foundation receiving and using it according to what was agreed), such as initiatives to promote education, communication, and diversity.
We plan to regularly post public reports of our financial situation and the usage of funds, similarly to what the Blender Foundation does.
What will change?
From the perspective of the Godot project, not much will change. Governance and financial decisions were previously made by the PLC with input from the advisor group. The PLC will become the Foundation’s Board of Directors, so it will be the same people, just with a different name. Additionally, the SFC will be part of the new foundation in an advisory capacity, to help us with this new adventure and to ensure to you all, the community, continuity in the way Godot is managed.
This transition does not affect in any way the technical development of the engine, which is detailed in our Governance and Teams pages. The development process of Godot will not change with the Foundation.
The Godot Foundation is dedicated to creating Free and Open Source Software and to ensuring that work on the Godot project is sustainable. The Foundation’s Mission is to “financially support the growth, initiatives and activities of the Godot Engine project, an open-source project that provides a free suite of tools and educational materials around the Godot Engine.
The Foundation strives to help the Godot Engine continue to break down barriers to video game development and make it possible for everyone to create high quality video games, regardless of who they are and where they are located.”
Additionally, the Godot Foundation is adopting most of the policies of the Software Freedom Conservancy. Those policies worked very well, and we can’t think of a better way to manage the subtleties and challenges that come with a non-profit organization. From the composition of the board with respect to conflicts of interest and transparency, to the fair usage of funding, we are committed to upholding the same ethical standards the SFC has ensured so far.
We have just started the process of moving to the Foundation. For now all of Godot’s funding and contractors are still managed by the SFC. The SFC will gradually reduce its work for Godot and the new foundation will slowly ramp up. Stay tuned for announcements in the future as we finalize the Foundation’s organizational structure and officially begin operations.
If you have any questions, please reach out to email@example.com. We will compile a list of questions and update this page with an FAQ shortly after.
In other words, while Apple will provide security-related updates for older versions of its operating systems, only the most recent upgrades will receive updates for every security problem Apple knows about. Apple currently provides security updates to macOS 11 Big Sur and macOS 12 Monterey alongside the newly released macOS Ventura, and in the past, it has released security updates for older iOS versions for devices that can’t install the latest upgrades.
This confirms something that independent security researchers have been aware of for a while but that Apple hasn’t publicly articulated before. Intego Chief Security Analyst Joshua Long has tracked the CVEs patched by different macOS and iOS updates for years and generally found that bugs patched in the newest OS versions can go months before being patched in older (but still ostensibly “supported”) versions, when they’re patched at all.