Friday, April 19, 2024

The Open Source Initiative refuses to recognize Elon Musk’s AI chatbot as open source.

Elon Musk’s artificial intelligence (AI) chatbot Grok is not as open source as he claims, and neither are the models of the many other companies making the same claim, according to the steward of the open-source definition.

Founded in 1998, the Open Source Initiative (OSI) is a not-for-profit organization that defends the term “open source”. It is currently working on updating its definition to include open-source AI.

Open source generally means the software’s source code is publicly available for anyone to use, modify, and distribute.

The OSI’s definition states the software must also comply with 10 criteria, including a well-publicized means of obtaining the source code at a reasonable cost or for free, no discrimination against persons or groups, and a license that does not restrict other software.

But AI systems are more difficult to assess against the OSI’s 10 points.

“The definition of open-source software that we’ve been using for software, it does not really apply cleanly to AI systems,” Stefano Maffulli, the OSI’s executive director, told Euronews.

AI companies that claim to be open source, such as Musk’s latest AI venture, release the weights (the numerical parameters that influence how an AI model performs) but not the data the model was trained on, nor the training process.

Maffulli says this means such a model cannot be open source, because there is no transparency about what data was used to train the weights, which can cause copyright issues and raises ethical questions about whether the data is biased.
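The distinction can be seen in a minimal sketch (purely illustrative; the names and numbers below are invented and do not reflect any real model’s release format): a weights-only release publishes the learned numbers, while the data and training process that produced them stay private.

```python
# Illustrative only: what a "weights-only" open release contains.

# Published: the learned parameters (the "weights").
released_weights = {
    "layer1": [[0.12, -0.98], [0.45, 0.33]],  # numerical parameters
    "bias1": [0.05, -0.17],
}

# Not published in a weights-only release:
# - the training data (what the model learned from)
# - the training code and process (how these numbers were produced)

# Anyone can inspect or run the published numbers...
total_params = sum(len(row) for row in released_weights["layer1"]) \
    + len(released_weights["bias1"])
print(f"published parameters: {total_params}")  # -> published parameters: 6
# ...but without the data and process, the weights cannot be
# reproduced or audited, which is Maffulli's objection.
```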

“These [the weights] are new things, and it’s a very new time in history where we have a new artefact that is still the production of human ingenuity and intelligence and creativity,” he said.

“But at the same time [it] is also the elaboration of semi-random calculation that is done by large computers.”


Musk previously said in a post on his social media platform X that his open-source AI is by far the most transparent and truth-seeking, and sued OpenAI for abandoning its original mission to benefit humanity by partnering with Microsoft. Yet Musk’s Grok does not disclose what data the weights were trained on.

In fact, companies have little incentive to do so: the moment they announce what data their models are built on is the moment they open themselves up to copyright lawsuits.

Another reason so-called open-source AI companies may not want to be fully transparent is to preserve trade secrets.

The OSI has, therefore, a difficult task on its hands when it comes to defining open-source AI. The organization began its quest two years ago after OpenAI’s ChatGPT catapulted onto the scene in November 2022.

The biggest hurdle in defining open-source AI, Maffulli said, is understanding the dependency between the data in the training sets and the model weights.

How to define open-source AI

The OSI started by assembling an initial group of experts from organizations such as the Mozilla Foundation and Wikipedia, as well as civil society groups, universities, and Big Tech companies such as Microsoft and Meta.

The working groups then assessed three generative AI models (Meta’s Llama, Illusion AI, and Bloom) as well as a non-generative AI model that uses machine learning, and voted on the minimum requirements for an AI system to be considered open source.

Maffulli said the working groups all agreed that, at the very least, there must be a minimum requirement of data transparency.

The OSI is now refining this draft definition, which it expects to release to the public over the summer. But that does not mean that, once the definition is finalized, the OSI will come after Musk or other self-proclaimed open-source AI companies.

“We are the stewards, maintainers of the definition, but we don’t really have any strong powers to enforce it,” Maffulli said.

He added that judges and courts around the world are starting to recognize that the open-source definition is important, especially when it comes to mergers, but also to regulation.

Countries around the world are finalizing how they will regulate AI and open-source software has been an issue of contention.

“The open-source definition serves as a barrier to identify false advertising,” said Maffulli.

He added that if a company says it’s open source, it must carry the values that the open-source definition carries. Otherwise, it’s just confusing.

