<aside>

Summary |

Moral gambit: given the unpredictability of an automated system, a human agent decides to design, develop, or deploy the system with full knowledge that there is a chance of harmful outcomes, while standing ready to take responsibility for those outcomes if they occur.

</aside>


<aside>

Keywords -

• Tautology: an unnecessarily repetitive statement, e.g. “murder is wrong” where murder means wrongful killing, or “a free gift”

• Reinforcement Learning (RL)

• SDG: Sustainable Development Goals

</aside>

<aside> ✔️

Study Questions:

  1. What are the success conditions for doing the right thing? </aside>

Electronics Watch

Belief–desire psychology: looking at someone’s environment and behaviour and explaining their behaviour based on their environment

<aside> 🍎

AWS (Autonomous Weapons System): a value-neutral definition

an artificial agent which, at the very minimum, is able to change its own internal states to achieve a given goal without the direct intervention of another agent.

It is also able to identify, select, and attack a target without the intervention of another agent.

</aside>

Semantics of predicate logic

perceive - come to know

Omnibenevolent means being all-loving and infinitely good, a characteristic typically attributed to a deity

Pascal’s wager: you have nothing to lose by believing in God, and potentially everything to gain if the belief is true.

Knowing something logically is not enough to make one believe it fundamentally.

Convention - a usual or accepted way of behaving

<aside>

Reading notes

Accepting Moral Responsibility for the Actions of Autonomous Weapons Systems—a Moral Gambit

</aside>


Predictability is a function of past system behaviour and environment

Knowledge of predictability likelihoods is a necessary condition for the responsible deployment of artificial systems.

Persons can also be “systems that are deployed”: e.g. when hiring, you look at a candidate’s references to predict their current habits.

The same approach needs to be employed with digital systems as well.
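The claim that predictability is a function of past system behaviour and environment can be given a minimal numerical sketch: estimate, per environment, how often the system has produced its most common outcome. This is purely illustrative — the `predictability` function and the log format are hypothetical, not from the reading.

```python
from collections import Counter, defaultdict

def predictability(history):
    """Estimate how predictable a system is in each environment,
    as the relative frequency of its most common past outcome."""
    outcomes_by_env = defaultdict(Counter)
    for env, outcome in history:
        outcomes_by_env[env][outcome] += 1
    return {
        env: max(counts.values()) / sum(counts.values())
        for env, counts in outcomes_by_env.items()
    }

# Hypothetical deployment log: (environment, observed behaviour)
log = [
    ("urban", "correct"), ("urban", "correct"), ("urban", "error"),
    ("desert", "correct"), ("desert", "correct"),
]
print(predictability(log))  # urban ≈ 0.67, desert = 1.0
```

The point of the sketch: such likelihoods depend entirely on past behaviour in a given environment, which is why deploying a system in a novel environment undercuts the knowledge condition for responsible deployment.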


System ontology

Does the distinction between autonomous weapons systems and lethal weapons systems hold up with regard to the moral gambit? Why?
