
#autonomousweapons


📣 Call for Abstracts: "Technology assessment and future warfare: The Good, the Bad, and the Ugly".

The special topic of issue 35/1 (2026) is guest edited by
Prof. Dr. Karsten Weber and Prof. Dr. Markus Bresinsky, OTH Regensburg, DE

📅 Submit your abstract by: 11 April 2025

Further information:
lnkd.in/g7gxBh2V

#FutureWarfare, #GameTheory, #NuclearWar, #CyberWeapons, #AutonomousWeapons, #DualUse
#Zeitenwende, #TechnologyAssessment, #TAJournal

oekom Verlag
@ITAS_KIT


TEDIC and the Regulation of Autonomous Weapons on International Human Rights Day

On December 10, we commemorate International Human Rights Day, a date that underscores the importance of dignity, freedom, and justice for all. In an increasingly digitalized world, our rights face new challenges due to digital dehumanization and te…

tedic.org/tedic-and-the-regula

Army of None - Paul Scharre - bird.makeup/users/paul_scharre

This book is horrifying and important.

The section on the psychology of what it takes for a human being to take someone's life at close range is prescient. How we value human life and human dignity is significantly affected by circumstance. Without intensive indoctrination and "training", human beings tend not to actually want to kill each other. It requires a serious dose of dehumanization and otherization.

The "naked soldier problem" raised in the book highlights how soldiers will act differently if they find an enemy in a valnuerable position like for example naked in a bath or smoking a cigarette while they watch the sunset. Apparently statistically many soldiers will deliberately miss targets. At least this is the rationalization and mental gymnastics used by these arms dealers and military industrial complex stooges while selling autonomous weapons to the world. "Eliminate the moral burden of killing" and in doing so actually end conflicts faster and save lives. Yikes.

share.libbyapp.com/title/40435

#security #ai #war

#AI #Warfare #AutonomousWeapons #IntelligentWeapons: "It should be of little surprise, then, that states and civil society have taken up the question of intelligent autonomous weapons—weapons that can select and fire upon targets without any human input—as a matter of serious concern. In May, after close to a decade of discussions, parties to the UN’s Convention on Certain Conventional Weapons agreed, among other recommendations, that militaries using them probably need to “limit the duration, geographical scope, and scale of the operation” to comply with the laws of war. The line was nonbinding, but it was at least an acknowledgment that a human has to play a part—somewhere, sometime—in the immediate process leading up to a killing.

But intelligent autonomous weapons that fully displace human decision-making have (likely) yet to see real-world use. Even the “autonomous” drones and ships fielded by the US and other powers are used under close human supervision. Meanwhile, intelligent systems that merely guide the hand that pulls the trigger have been gaining purchase in the warmaker’s tool kit. And they’ve quietly become sophisticated enough to raise novel questions—ones that are trickier to answer than the well-covered wrangles over killer robots and, with each passing day, more urgent: What does it mean when a decision is only part human and part machine? And when, if ever, is it ethical for that decision to be a decision to kill?"

technologyreview.com/2023/08/1

MIT Technology Review · Inside the messy ethics of making war with machines, by Arthur Holland Michel

2018: A Global Arms Race for Killer Robots Is Transforming the Battlefield

"The meeting comes at a critical juncture. In July [2018], Kalashnikov, the main defense contractor of the Russian government, announced it was developing a weapon that uses neural networks to make 'shoot-no shoot' decisions. In January 2017, the U.S. Department of Defense released a video showing an autonomous drone swarm of 103 individual robots successfully flying over California. Nobody was in control of the drones; their flight paths were choreographed in real-time by an advanced algorithm. The drones “are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature,” a spokesman said. The drones in the video were not weaponized — but the technology to do so is rapidly evolving.

"[April 2018] also marks five years since the launch of the International Campaign to Stop Killer Robots, which called for 'urgent action to preemptively ban the lethal robot weapons that would be able to select and attack targets without any human intervention.' The 2013 launch letter — signed by a Nobel Peace Laureate and the directors of several NGOs — noted that they could be deployed within the next 20 years and would 'give machines the power to decide who lives or dies on the battlefield.'"

Read more:
time.com/5230567/killer-robots

Time · A Global Arms Race for Killer Robots Is Transforming the Battlefield, by Billy Perrigo