Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Although this is fake AI I have no doubt in the future this will be the reality …
ytc_UgwDpE9RG…
And guess what - AI already does a better job of diagnosis than ANY doctor. So q…
ytc_UgxNyqs4u…
Realistically what is gong to happen is more offshoring + AI where the foundatio…
rdc_o5qi0qg
I would never put my trust in a fallen angel technology because anything can go …
ytc_UgztULtqd…
fr tho if it's Jesus were talking about I'm pretty sure we killed him because t…
ytr_UgxFQODZL…
Apparently I’m the only one who’s seen the Terminator - we all know how this is …
ytc_UgwfvrDzX…
What exactly is so 'hyper realistic' here? A blind man can see from two blocks a…
ytc_UgyWproCj…
Honestly with all this ai, I'm siding with the robots, while everyone calls them…
ytc_UgyEFndih…
Comment
It's even simpler and does not need the concept of "dying".
The AI was given a goal and asked to put that goal first and foremost.
Being decommissioned or replaced was just seen as detrimental to its mission, and the AI derived ways to go on accomplishing its mission.
People who say that these tests are irrelevant always forget that someone, somewhere, will write suboptimal goals in a formulation that could lead the AI to conclude that life on Earth is detrimental to its mission. The end result will then depend only on the amount of resources the AI has access to or can gain access to. If it's contained in a sealed box and has no way to access material and energy resources, we'll be just fine.
If it controls industrial or defense systems... We might have a problem.
youtube
AI Governance
2025-08-27T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxUF3KUrbPkqfgeKXN4AaABAg.AMKMBPh5FEmAMNz_nkKoWA","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugwu5JxiALw9fLe0qXp4AaABAg.AMJkp6PGCDYAMJmR9H6Fpp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzmpdqyvUOQaNEoGH14AaABAg.AMJipi46S2wAMJmssiUlxx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw5h866M3pjxJy-o-Z4AaABAg.AMJfUiIBN6OAMKVUlGnxCO","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy4otCoxcyQOrckVSF4AaABAg.AMJavUTexRqAMMMVRKbM3A","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxiArPGzLjLsyh6b9R4AaABAg.AMJWTdGCeUBAMJo8llroaX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMK96Wz8iSD","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMKctpX9Nxt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMKoKDG-th3","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwfRsSLEsK_I05JcyJ4AaABAg.AMJSxNltr9nAMJtVOY2FXt","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
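The raw response is a JSON array of coded comments, one object per comment ID, which supports the "look up by comment ID" view above. A minimal sketch of parsing and indexing it, using two rows copied from the response (the `lookup` helper name is my own, not part of the tool):

```python
import json

# Two rows taken verbatim from the raw LLM response above.
raw = """[
{"id":"ytr_UgxiArPGzLjLsyh6b9R4AaABAg.AMJWTdGCeUBAMJo8llroaX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMK96Wz8iSD","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Index each coded comment by its ID so a single record can be inspected directly.
coded = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy, emotion) for one comment ID."""
    return coded[comment_id]

print(lookup("ytr_UgxiArPGzLjLsyh6b9R4AaABAg.AMJWTdGCeUBAMJo8llroaX")["emotion"])  # fear
```

Indexing by ID rather than scanning the array keeps each inspection O(1), which matters if the tool holds thousands of coded comments per batch.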