OpenAI is working with the Pentagon on a number of projects including cybersecurity capabilities, a departure from the startup’s earlier ban on providing its artificial intelligence to militaries.
The ChatGPT maker is developing tools with the U.S. Defense Department on open-source cybersecurity software, collaborating with DARPA on the AI Cyber Challenge announced last year. It has also had initial talks with the U.S. government about ways to help prevent veteran suicide, Anna Makanju, the company’s vice president of global affairs, said Tuesday in an interview at Bloomberg House at the World Economic Forum in Davos.
The company recently removed language from its terms of service that banned its AI from “military and warfare” applications. Makanju described the decision as part of a broader update of its policies to adjust to new uses of ChatGPT and its other tools.
“Because we previously had what was essentially a blanket prohibition on military, many people thought that would prohibit many of these use cases, which people think are very much aligned with what we want to see in the world,” she said. But OpenAI maintained a ban on using its tech to develop weapons, destroy property or harm people, Makanju said.
Microsoft Corp., OpenAI’s largest investor, has several software contracts with the U.S. armed forces and other branches of government. OpenAI, Anthropic, Google and Microsoft are assisting the U.S. Defense Advanced Research Projects Agency with its AI Cyber Challenge, which seeks software that can automatically fix vulnerabilities and defend infrastructure from cyberattacks.
The Intercept previously reported the changes to OpenAI’s terms.
OpenAI also said that it’s accelerating its work on election security, devoting resources to ensuring that its generative AI tools aren’t used to spread political disinformation.
“Elections are a huge deal,” Sam Altman, OpenAI’s chief executive officer, said in the same interview. “I think it’s good that we have a lot of anxiety.”