Dozens of scientists, health care professionals and academics have written a letter to the U.N. calling for an international ban on autonomous killer robots, saying recent advances in artificial intelligence "have brought us to the brink of a new arms race in lethal autonomous weapons."

The letter, which has been signed by more than 70 health care professionals and was put together by the Future of Life Institute, states that lethal autonomous weapons could fall into the hands of terrorists and despots, lower the barrier to armed conflict and "become weapons of mass destruction enabling very few to kill very many."

"Furthermore, autonomous weapons are morally abhorrent, as we should never cede the decision to take a human life to algorithms," the letter continues. "As healthcare professionals, we believe that breakthroughs in science have tremendous potential to benefit society and should not be used to automate harm. We therefore call for an international ban on lethal autonomous weapons."

In addition to the letter, a study written by Dr. Emilia Javorsky posits that the lethal autonomous weapon systems under development in a number of countries "would represent a third revolution in warfare," following gunpowder and nuclear weapons.

The effort put forth by the Future of Life Institute follows a 2018 pledge signed by more than 2,400 individuals from companies and organizations around the world. Signatories from Google DeepMind, the European Association for AI, University College London and elsewhere said they would “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.”

Past concerns

Others have also raised concerns to the U.N. about the benefits and costs of killer robots. Experts from several countries met in August 2018 at the U.N.'s Geneva offices to focus on lethal autonomous weapons systems and explore ways of possibly regulating them, among other issues.

Fully autonomous, computer-controlled weapons do not yet exist, U.N. officials said at the time. The debate is still in its infancy, and the experts have at times grappled with basic definitions. The United States has argued that it's premature to establish a definition of such systems, much less regulate them.

Some advocacy groups say governments and militaries should be prevented from developing such systems, which have sparked fears and led some critics to envision harrowing scenarios for their use.

In 2017, Tesla CEO Elon Musk and other leading artificial intelligence experts called on the United Nations to issue a global ban on the use of killer robots, a category that includes autonomous drones, tanks and machine guns. “Once this Pandora’s box is opened, it will be hard to close,” Musk and 115 other specialists from around the globe wrote in the letter.

'The biggest risk we face'

Musk has repeatedly warned about the rise of artificial intelligence, having previously called it the "biggest risk we face as a civilization." The tech exec has even gone so far as to say it could cause World War III.

Research firm IDC expects that global spending on robotics and drones will reach $201.3 billion by 2022, up from an estimated $95.9 billion in 2018.

Over the years, luminaries including Musk and the late theoretical physicist Stephen Hawking have warned against the rise of artificial intelligence.

In September 2017, Musk tweeted that he thought AI could play a direct role in causing World War III. His comments came in response to Russian President Vladimir Putin, who said that whoever "becomes the leader in this sphere [artificial intelligence] will be the ruler of the world."

In November 2017, a few months before his death, Hawking theorized that AI could eventually "destroy" humanity if we are not careful about it.

The AP and Fox News' Christopher Carbone contributed to this report.