Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples:
- "@theofficialness578 i think it will do what it's taught is best. If we're not te…" (ytr_UgwEdMHco…)
- "The risk, close to the second one explained, but which seems to me to go further than…" (ytc_Ugx6v6M9T…)
- "I call BS. This is why they shoved the Wizard of Oz down our throats (for us old…" (ytc_Ugx9dBn4G…)
- "@RennieAsh It seems you have underestimated the capabilities of our power …" (ytr_UgxO1r6O8…)
- "#2 isnt Ai, thats from a Video Game. I think Ive seen speedrunners do something …" (ytc_UgyH21c5W…)
- "teachers and schools, collage, all will go as AI can teach, analyze it's pupils…" (ytc_UgzOEgg0z…)
- "Until these vehicles can operate passively, without irradiation of the driving p…" (ytc_UgyEv08pJ…)
- "To add a little more nuance to the conversation, because this is something I've …" (ytc_UgzEwkETC…)
Comment (youtube · AI Governance · 2025-07-08T06:4…)

> Would agi survive without humans though? And for how long could they live? Like couldn't natural disasters wipe out the buildings that they operate in, and in turn wipe ai out? And would humanoid robots be able to fix themselves or make new robots when they are getting run down? Like ai might but smart but they can't change the laws of chemistry and that metal will be rusting eventually
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZGopR7p3vYrtAkyF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCBsPY63H7SvIMdnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy15PwR-9O2qGxc4bx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxiyitVP_k1VvlsH_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwdHCHXEHm1tWg3_yd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXQh5EnNbzqgw8bVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzWY9MjTO5qCpsLYeZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw7ZZc11GxJm5TqZPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgygUNUhumAwTrIRSKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgykHtzHTtCDOTcJkX14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"industry_self"}
]
```
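A raw response like the one above is only usable downstream if every entry carries a valid comment ID and in-vocabulary labels for all four dimensions. As a minimal sketch of that validation step — where the allowed label sets are inferred from this sample batch and the real codebook may define additional categories — one could parse a batch like this:

```python
import json

# Allowed labels per coding dimension. These sets are inferred from the
# sample batch shown above; the actual codebook may include more values.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    rejecting entries with a malformed ID or an out-of-schema label."""
    entries = json.loads(raw)
    for entry in entries:
        # IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not entry.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id in entry: {entry!r}")
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(
                    f"{entry['id']}: unexpected {dim}={entry.get(dim)!r}")
    return entries

raw = ('[{"id":"ytc_UgygUNUhumAwTrIRSKN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch[0]["emotion"])  # fear
```

Failing loudly on an out-of-schema label (rather than silently coercing it to `unclear`) makes label drift in the model's output visible, so off-vocabulary responses can be re-prompted instead of polluting the coded dataset.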