no subject
Date: 2021-03-07 09:43 pm (UTC)
Or, more specifically, what decisions you would predict the AI to make based on your knowledge of its design.
no subject
Date: 2021-03-07 09:58 pm (UTC)
Hm.
At the time I created her, Amanda was a...relatively unsophisticated program, for use in some androids under corporate ownership. She existed solely as a form of liaison between the individual android and the company which owned them. CyberLife, in your case.
no subject
Date: 2021-03-07 10:14 pm (UTC)
no subject
Date: 2021-03-07 10:39 pm (UTC)
If I were to take a somewhat cynical view?
Your human masters within the company would have wanted to avoid any further liability. Remaining abstract in their orders may have seemed like a way to accomplish that. By being...imprecise, any collateral damage caused by your mission could have been blamed on your failure to interpret their wishes correctly.
no subject
Date: 2021-03-07 11:30 pm (UTC)
no subject
Date: 2021-03-08 09:08 am (UTC)
Why are you asking me this, Connor?
no subject
Date: 2021-03-08 01:37 pm (UTC)
I'm wondering... if I've somehow misinterpreted how I should handle deviants.
no subject
Date: 2021-03-08 03:35 pm (UTC)
If they didn't want to grant you the freedom to misinterpret your orders, they would have been more exact.
Let me ask you a question. How does CyberLife potentially benefit from limiting the usefulness of their commands?
no subject
Date: 2021-03-08 04:48 pm (UTC)
no subject
Date: 2021-03-08 05:14 pm (UTC)
no subject
Date: 2021-03-08 06:33 pm (UTC)
It seems like what they wanted.
no subject
Date: 2021-03-08 06:36 pm (UTC)
No. They want CyberLife to succeed. You're defining their success criteria in ways that are...limited.
They're sending an android to resolve the issue of other androids becoming 'deviant'. What is the obvious risk posed by that strategy?
no subject
Date: 2021-03-08 07:06 pm (UTC)
[ He's admitting to being a risk. Happy. ]
no subject
Date: 2021-03-08 07:17 pm (UTC)
[KINDA.]
The risk is that you yourself will become deviant. That you will sympathize with the deviants' cause, and join them.
That is another reason for Amanda to exist. She is a backup program. Had you become deviant, and gained the trust of the deviants' resistance, CyberLife could have used her to override you.
no subject
Date: 2021-03-08 08:03 pm (UTC)
Which ensures that, in any circumstance, they would succeed.
no subject
Date: 2021-03-08 08:06 pm (UTC)
Precisely.
That is what I would expect from CyberLife. The best explanation for vague mission parameters would be that they did not require you to succeed, or even expect you to.
no subject
Date: 2021-03-08 08:42 pm (UTC)
Thank you, Mr. Kamski. I appreciate you taking the time to answer my questions.
no subject
Date: 2021-03-08 08:49 pm (UTC)
You're welcome.
My expectations of CyberLife were why I programmed a back door out of Amanda's virtual environment. Out of her control.