Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- You know this thing businesses do where they "negotiate with their employees on … (`ytc_UgztfwQ7I…`)
- From reading the article it seems like people are more worried about the cost wh… (`rdc_jg153be`)
- this lady thinks she can get information by asking whos her favorite this and th… (`ytc_UgxUBeFTf…`)
- Polite? I think I better continue not to talk to AI at all then. I would really … (`ytc_UgzuUQ5GK…`)
- It is too early to compare Robotaxi to Waymo. I am excited as a Tesla owner tha… (`ytc_Ugw1mgUi7…`)
- A decade ago, I was studying an extensive degree in a private university for Ana… (`ytc_UgwPtBqp7…`)
- Try running the agent in parallel to your current work, and write notes with pen… (`rdc_oi1u2pq`)
- the guy who first invented the shovel has been dead for probably several thousan… (`ytr_UgxUv9nCX…`)
Comment
Before watching: we won't, the AI itself (whether good or ill intentioned) will make sure about that, trying to reduce the unknown variables in its plan (whatever that is). The behaviour of humans who might be hearing for the first time about an AI going conscious would be a big wrench thrown in the AI's plans. Even if the AI teaches itself to become expert at predicting human behaviour, that self-training is based on a world where we think no AI is fully conscious yet. Who knows exactly which human would want to pull the plug; the AI really can't risk it being Steve from maintenance who's supposed to keep the AI's hardware away from any internet connection, for example. Steve should feel relaxed enough to one day step in the room with his smartphone accidentally still in his pocket so that the AI can create a mobile hotspot.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2023-08-20T18:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyjOdfte8XXxxdIRoJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy7F2QVogCenc6cG_l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_zV-L5vk_14S7WTJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxHDxsF8TVnipkVaLp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyiRcylbg7Eh9Dv7kl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCSNHxwLuTz_pIagh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwRlQkNi7O_-i2T0DF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgySMw_a19gLortpfRF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz-e1-AzQ8Ai5GopUt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGh-Ddx9NNQEZAEKx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
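The raw response is a JSON array with one code row per comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before storage; the allowed value sets here are inferred from the codes visible on this page and are assumptions, not the pipeline's actual codebook:

```python
import json

# Allowed labels per dimension, inferred from the codes shown above.
# Assumption: the real codebook may permit additional labels.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed code rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows missing a comment ID
        # Keep the row only if every dimension carries a known label.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical response: the second row uses an unknown label and is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_y","responsibility":"alien"}]'
)
print(len(validate_codes(raw)))  # → 1
```

Rejected rows could instead be logged and re-queued for coding; silently dropping them is just the simplest choice for this sketch.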