Artificial intelligence experts have announced they will boycott a South Korean research university that is looking into how to use AI for weapons, according to CNN.
The university, the Korea Advanced Institute of Science and Technology (KAIST), opened a research center that aims to develop AI-based missiles, submarines and quadcopters by the end of this year.
More than 50 leading AI experts signed an open letter to the university denouncing the weapons research, Fortune reported.
"At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons," the experts wrote in the letter. "We therefore publicly declare that we will boycott all collaborations with any part of KAIST until such time as the president of KAIST provides assurances, which we have sought but not received, that the center will not develop autonomous weapons lacking meaningful human control."
The letter's signatories said AI-based weapons would lead to a third revolution in warfare, one that would be hard to stop.
"They will permit war to be fought faster and at a scale greater than ever before," the letter said. "They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints. This Pandora's box will be hard to close if it is opened."
Institute president Sung-chul Shin responded to the letter, saying the school "does not have any intention to engage in (the) development of lethal autonomous weapons systems and killer robots."
"KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control," he added.
Even so, Toby Walsh, an AI professor at the University of New South Wales who signed the open letter, told CNN he still has "a few question marks about what they intend to do, but broadly speaking they have responded appropriately."
Still, as The Guardian reported, a South Korean company called Dodaam Systems already builds autonomous combat robots, stationary turrets that can detect targets from almost 3 miles away. The United Arab Emirates and Qatar have already bought that technology.
Walsh told The Guardian the threat posed by AI-based weapons should concern the entire world.
"Developing autonomous weapons would make the security situation on the Korean peninsula worse, not better," he said. "If these weapons get made anywhere, eventually they would certainly turn up in North Korea and they would have no qualms about using them against the South."
According to The Verge, world leaders have asked the U.N. to craft an international treaty to regulate AI-based weaponry. Egypt, Argentina and Pakistan have all backed the idea.
However, the U.K. and the U.S. oppose the proposal, citing the impossibility of defining what does and does not constitute human control. Many systems already have at least some autonomous capabilities, including drones and missile-defense networks, according to The Verge.