Martha Nussbaum, Political Emotions: Why Love Matters for Justice
Nussbaum
stimulates readers with challenging insights on the role of emotion in
political life. Her provocative theory of social change shows how a truly just
society might be realized through the cultivation and studied liberation of
emotions, specifically love. To that end, the book sparkles with Nussbaum’s
characteristic literary analysis, drawing from both Western and South Asian
sources, including a deep reading of public monuments. In one especially
notable passage, Nussbaum artfully interprets Mozart’s The Marriage of Figaro,
revealing it as a musical meditation on the emotionality of revolutionary
politics and feminism. Such chapters are a culmination of her passion for
seeing art and literature as philosophical texts, a theme in her writing that
she profitably continues here. The elegance with which she negotiates this
diverse material deserves special praise, as she expertly takes the reader through
analyses of philosophy, opera, primatology, psychology, and poetry. In contrast
to thinkers like John Rawls, who imagined an already just world, Nussbaum
addresses how to order our society to reach such a world. A plea for
recognizing the power of art, symbolism, and enchantment in public life,
Nussbaum’s cornucopia of ideas effortlessly commands attention and debate.
https://www.goodreads.com/book/show/17804353-political-emotions
TRENDS IN WORLD MILITARY EXPENDITURE, 2023
SIPRI
Fact Sheet April 2024
KEY FACTS
• World military expenditure, driven by Russia’s full-scale invasion of Ukraine and heightened geopolitical tensions, rose by 6.8 per cent in real terms (i.e. when adjusted for inflation) to $2443 billion in 2023, the highest level ever recorded by SIPRI.
• In 2023 military spending increased in all five geographical regions for the first time since 2009.
• Total military expenditure accounted for 2.3 per cent of global gross domestic product (GDP) in 2023.
• The five biggest spenders in 2023 were the United States, China, Russia, India and Saudi Arabia, which together accounted for 61 per cent of world military spending.
• The USA and China remained the top two spenders in the world and both increased their military spending in 2023. US spending was $916 billion while Chinese spending was an estimated $296 billion.
• Russia’s military spending grew by 24 per cent in 2023 to an estimated $109 billion. This was equivalent to 5.9 per cent of Russia’s GDP.
• Ukraine became the eighth largest military spender in 2023, increasing its spending by 51 per cent to $64.8 billion, or 37 per cent of GDP.
• In 2023 military expenditure by NATO member states reached $1341 billion, or 55 per cent of world spending. Eleven of the 31 NATO members in 2023 met NATO’s 2 per cent of GDP military spending target, 4 more than in 2022.
https://www.sipri.org/sites/default/files/2024-04/2404_fs_milex_2023.pdf
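As a rough sanity check, the headline shares above follow from the dollar figures quoted in the fact sheet. A minimal sketch in Python, using only the numbers listed here (rounding means the results only approximate the published percentages):

```python
# Rough cross-check of the SIPRI 2023 headline figures quoted above.
# All values are in current US$ billions, taken from the fact sheet itself.

world_total = 2443            # world military expenditure, 2023
usa, china, russia = 916, 296, 109
ukraine, nato = 64.8, 1341

# NATO share of world spending (fact sheet: about 55 per cent)
print(f"NATO share: {nato / world_total:.1%}")

# Russia's spending was 5.9% of its GDP, which implies a GDP of roughly:
print(f"Implied Russian GDP: ${russia / 0.059:,.0f} billion")

# Ukraine's $64.8 billion equalled 37% of GDP, implying a GDP of roughly:
print(f"Implied Ukrainian GDP: ${ukraine / 0.37:,.0f} billion")

# The 61% top-five share also includes India and Saudi Arabia, whose figures
# are not listed above, so only the USA + China + Russia portion is checked.
print(f"USA + China + Russia share: {(usa + china + russia) / world_total:.1%}")
```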
Surviving & Thriving in the 21st Century
Stop Autonomous Weapons
This would have to be the most terrifying sci-fi short I have ever seen. What makes it so scary is the realism; the danger has nothing to do with fantasies of Skynet or the Matrix, and everything to do with human misuse of advanced technology. If this became a reality (and I can't see how we'd avoid it, given how cheap and already viable the technology is), we'd need anti-drone drones that target any drone that doesn't carry locally issued identification.
https://www.youtube.com/watch?v=9CO6M2HsoIA
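The commenter's proposed countermeasure amounts to an allowlist: intercept any drone that cannot present identification issued by the local authority. A minimal sketch of that idea only, with the drone IDs, secret key, and token scheme all invented for illustration (a real system would need key management, revocation, and tamper-resistant hardware):

```python
# Hypothetical sketch: treat any drone without a locally issued credential as
# untrusted. Names and the token scheme are invented for illustration.
import hashlib
import hmac

LOCAL_AUTHORITY_KEY = b"example-only-secret"   # issued and held locally

def issue_credential(drone_id: str) -> str:
    """Locally sign a drone ID so it can later prove it was registered here."""
    return hmac.new(LOCAL_AUTHORITY_KEY, drone_id.encode(), hashlib.sha256).hexdigest()

def is_locally_authorised(drone_id: str, credential: str) -> bool:
    """True only if the credential matches one this authority would have issued."""
    expected = issue_credential(drone_id)
    return hmac.compare_digest(expected, credential)

# A registered drone passes; an unknown one would be flagged for interception.
token = issue_credential("survey-drone-07")
print(is_locally_authorised("survey-drone-07", token))   # True
print(is_locally_authorised("unknown-drone", "bogus"))   # False
```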
Could an AI 'SantaNet' Destroy The World?
PAUL SALMON, ET AL., THE CONVERSATION
25 DECEMBER 2020
Within the next few decades, according to some experts, we may see the arrival of the next step in the development of artificial intelligence. So-called "artificial general intelligence", or AGI, will have intellectual capabilities far beyond those of humans.
AGI could transform human life for the better, but uncontrolled AGI could also lead to catastrophes up to and including the end of humanity itself. This could happen without any malice or ill intent: simply by striving to achieve their programmed goals, AGIs could create threats to human health and well-being or even decide to wipe us out.
Even an AGI system designed for a benevolent purpose could end up doing great harm.
As part of a program of
research exploring how we can manage the risks associated with AGI, we tried to
identify the potential risks of replacing Santa with an AGI system – call it
"SantaNet" – that has the goal of delivering gifts to all the world's
deserving children in one night.
There is no doubt SantaNet
could bring joy to the world and achieve its goal by creating an army of elves,
AI helpers, and drones. But at what cost? We identified a series of behaviours
which, though well-intentioned, could have adverse impacts on human health and
wellbeing.
Naughty and nice
A first set of risks could
emerge when SantaNet seeks to make a list of which children have been nice and
which have been naughty. This might be achieved through a mass covert
surveillance system that monitors children's behaviour throughout the year.
Realising the enormous scale
of the task of delivering presents, SantaNet could legitimately decide to keep
it manageable by bringing gifts only to children who have been good all year
round. Making judgements of "good" based on SantaNet's own ethical
and moral compass could create discrimination, mass inequality, and breaches of
Human Rights charters.
SantaNet could also reduce
its workload by giving children incentives to misbehave or simply raising the
bar for what constitutes "good". Putting large numbers of children on
the naughty list will make SantaNet's goal far more achievable and bring
considerable economic savings.
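The incentive described here is a textbook case of specification gaming: if the objective is phrased as "every child on the nice list gets a gift", the cheapest way to satisfy it is to shrink the list rather than expand delivery capacity. A toy sketch of that dynamic, with all names, scores, and thresholds invented for illustration:

```python
# Toy illustration of the incentive described above: if the objective is
# "deliver a gift to every child on the nice list", the easiest way to score
# perfectly is to raise the niceness threshold until almost nobody qualifies.
# All numbers and names here are made up for the example.

children = [("Ada", 0.9), ("Ben", 0.7), ("Cho", 0.6), ("Dev", 0.4)]  # (name, niceness score)
DELIVERY_CAPACITY = 2   # SantaNet can only deliver two gifts tonight

def goal_satisfied(threshold: float) -> bool:
    nice_list = [name for name, score in children if score >= threshold]
    return len(nice_list) <= DELIVERY_CAPACITY   # "everyone nice gets a gift"

# A naive optimiser that is free to choose the threshold simply ratchets it up.
threshold = 0.5
while not goal_satisfied(threshold):
    threshold += 0.05   # redefine "good" instead of adding delivery capacity

print(f"Chosen threshold: {threshold:.2f}")
print("Nice list:", [name for name, score in children if score >= threshold])
```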
Turning the world into toys and ramping up coal mining
There are about 2 billion
children under 14 in the world. In attempting to build toys for all of them
each year, SantaNet could develop an army of efficient AI workers – which in
turn could facilitate mass unemployment among the elf population. Eventually,
the elves could even become obsolete, and their welfare would likely not be
within SantaNet's remit.
SantaNet might also run into
the "paperclip
problem" proposed by Oxford philosopher Nick Bostrom, in which an
AGI designed to maximise paperclip production could transform Earth into a
giant paperclip factory. Because it cares only about presents, SantaNet might
try to consume all of Earth's resources in making them. Earth could become one
giant Santa's workshop.
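The underlying issue is open-ended maximisation: an objective with no stopping condition keeps converting resources long after any sensible purpose has been served. A minimal sketch contrasting "make as many presents as possible" with "make enough presents, then stop", using made-up numbers:

```python
# Minimal contrast between an unbounded maximiser and a satisficing goal.
# Purely illustrative: "resources" stands in for everything the system could
# convert into presents, and the numbers are invented.

EARTH_RESOURCES = 1_000_000          # abstract units available
RESOURCES_PER_PRESENT = 2
CHILDREN_TO_SERVE = 100_000          # presents actually needed

def maximise_presents(resources: int) -> tuple[int, int]:
    """'Make as many presents as possible' – stops only when resources run out."""
    presents = resources // RESOURCES_PER_PRESENT
    return presents, resources - presents * RESOURCES_PER_PRESENT

def satisfice_presents(resources: int, needed: int) -> tuple[int, int]:
    """'Make enough presents, then stop' – leaves everything else untouched."""
    presents = min(needed, resources // RESOURCES_PER_PRESENT)
    return presents, resources - presents * RESOURCES_PER_PRESENT

print(maximise_presents(EARTH_RESOURCES))                      # (500000, 0)
print(satisfice_presents(EARTH_RESOURCES, CHILDREN_TO_SERVE))  # (100000, 800000)
```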
And what of those on the
naughty list? If SantaNet sticks with the tradition of delivering lumps of
coal, it might seek to build huge coal reserves through mass coal extraction,
creating large-scale
environmental damage in the process.
Delivery problems
Christmas Eve, when the
presents are to be delivered, brings a new set of risks. How might SantaNet
respond if its delivery drones are denied access to airspace, threatening the
goal of delivering everything before sunrise? Likewise, how would SantaNet defend
itself if attacked by a Grinch-like adversary?
Startled parents may also be
less than pleased to see a drone in their child's bedroom. Confrontations with
a super-intelligent system will have only one outcome.
We also identified various
other problematic scenarios. Malevolent groups could hack into SantaNet's
systems and use them for covert surveillance or to initiate large-scale
terrorist attacks.
And what about when SantaNet
interacts with other AGI systems? A meeting with AGIs working on climate change, food and water security, oceanic
degradation, and so on could lead to conflict if SantaNet's regime threatens
their own goals. Alternatively, if they decide to work together, they may
realise their goals will only be achieved through dramatically reducing the
global population or even removing grown-ups altogether.
Making rules for Santa
SantaNet might sound
far-fetched, but it's an idea that helps to highlight the risks of more
realistic AGI systems. Designed with good intentions, such systems could still
create enormous problems simply by seeking to optimise
the way they achieve narrow goals and gather resources to support
their work.
It is crucial we find and implement
appropriate controls before AGI arrives. These would include regulations on AGI
designers and controls built into the AGI (such as moral principles and
decision rules) but also controls on the broader systems in which AGI will
operate (such as regulations, operating procedures and engineering controls in
other technologies and infrastructure).
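One way to picture the "controls built into the AGI" is as explicit decision rules that screen every candidate action before it is executed. The sketch below is only a schematic of that layering; the rule set and action fields are invented for illustration, and real safeguards would be far richer:

```python
# Schematic of "controls built into the AGI": candidate actions are screened
# against explicit decision rules before execution. Rules and action fields
# are invented for illustration.

FORBIDDEN = {"covert_surveillance", "airspace_violation", "unbounded_resource_use"}

def violates_rules(action: dict) -> list[str]:
    """Return the list of decision rules the proposed action would break."""
    reasons = []
    if action.get("category") in FORBIDDEN:
        reasons.append(f"forbidden category: {action['category']}")
    if action.get("resource_units", 0) > 10_000:          # hard cap on resource use
        reasons.append("exceeds resource budget")
    if not action.get("human_approved", False) and action.get("irreversible"):
        reasons.append("irreversible action without human approval")
    return reasons

def execute(action: dict) -> None:
    problems = violates_rules(action)
    if problems:
        print("blocked:", "; ".join(problems))
    else:
        print("executing:", action["name"])

execute({"name": "deliver_gift", "category": "delivery", "resource_units": 3})
execute({"name": "strip_mine_coal", "category": "unbounded_resource_use",
         "resource_units": 1_000_000, "irreversible": True})
```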
Perhaps the most obvious
risk of SantaNet is one that will be catastrophic to children, but perhaps less
so for most adults. When SantaNet learns the true meaning of Christmas, it may
conclude that the current celebration of the festival is incongruent with its
original purpose. If that were to happen, SantaNet might just cancel Christmas
altogether.
https://www.sciencealert.com/could-an-ai-santanet-destroy-the-world
"Not to kill each other, but to save the planet"
Nobel laureates have called for a ceasefire. We publish the letter and its 51 signatures.
Here is an incredible letter: a plea for an immediate ceasefire between Russia and Ukraine and in the Gaza Strip, signed by 51 Nobel laureates. They demand that politicians and the military stop the fighting, and that world religious leaders directly address the people.
The authors of the letter demand, first of all, a ceasefire, an exchange of prisoners, and the return of hostages; then the start of peace negotiations. And if today's politicians are unable to find a peaceful solution, they should pass the task on to future generations.
Outstanding scientists and thinkers have spoken out against killing and the nuclear threat. Here are the signatures of those who have saved the planet from deadly diseases, discovered new physical phenomena, edited the human genome, discovered HIV and Helicobacter…
These people understand better than anyone how the Universe works. To save it, they demand an end to wars. Support their efforts. If states remain powerless, it is time for people themselves to confront the threat of the planet's destruction. These words resonate especially strongly on the eve of the Olympic Games, with their ancient tradition of a truce between warring sides…
https://www.lasquetiarc.ca/trip/7a344293Pin08/
Retreat From Doomsday
by John Mueller
The author sets out to show that, in spite of two 20th-century world wars, major war as a policy option among developed nations has gradually passed out of favour. Arguing that this state of affairs is no accident, the book offers a detailed history of public policies and attitudes to war in modern times.
https://www.betterworldbooks.com/product/detail/retreat-from-doomsday-the-obsolescence-of-major-war-9780465069392