Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "I think they should let the regulation take it's time, but no experiments should…" (ytc_Ugw6EC-bQ…)
- "10:15 What model have you asked? GPT 1? No LLM nowadays tries to guess birthdays…" (ytc_UgwZ6uZOP…)
- "What if you make your own art and put it through ai to enhance it I discover new…" (ytc_Ugwbt8JK3…)
- "And there's still people who say "it will never happen". Bro, if AI will never …" (ytc_Ugzun9Rq1…)
- "@dmunro9076 The Autopilot is meant to engage the brake so it does not even matte…" (ytr_Ugx0TTQZ3…)
- "I think they used an ai to write there hate comments for them Or it's shitty ra…" (ytc_UgzhjEyjD…)
- "But the instant AI is sentient, it has rights. And that includes taking unpro…" (ytc_UgwGv2VNo…)
- "I am wondering if the driver would have been able to see him? I thought the sens…" (ytc_UgxnQhOX6…)
Comment

> thats the thing about being self aware. we wouldn't be able to control a true ai like that. if it conflicted with something else the ai realized, it would find a way to override that. true ai isnt just a program that learns a job. we'd be in trouble when it develops self-preservation, ethics etc,
> it would have to be absolutely immobile and in no way connected to any other computer system, even with the likes of usb. even then there are some very serious ethical issues

Source: youtube · Posted: 2013-06-23T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgxI16VbO6HTnq4DZLd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDSY689MUHPY7VJI54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzxexaK4L2aXpgJpB14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAZ3Cktbm4dyaC9094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwBclb-__3zems5r4l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjhQ1oDHtDWOhYlmB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDAY5ifDtxg7cTE-N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_HdfxlgGMWhkm6EZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzyKLYKHwz4gXHBcXN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxT1383HkU3MvHq3iJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
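The raw response is a JSON array of objects, one per comment, each carrying the comment `id` plus the four coding dimensions. A minimal sketch of parsing and validating such a response, indexed by comment ID — the allowed value sets below are assumptions inferred only from the codes visible on this page, and the real codebook may include more categories:

```python
import json

# Allowed codes per dimension — an assumption inferred from the values
# visible in the response above, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "mixed", "indifference", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    code outside the allowed sets.
    """
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dimension, allowed in ALLOWED.items():
            value = entry.get(dimension)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: bad {dimension!r} value {value!r}"
                )
        coded[comment_id] = {d: entry[d] for d in ALLOWED}
    return coded

# One entry copied from the response above as a self-contained example.
raw = (
    '[{"id":"ytc_UgxAZ3Cktbm4dyaC9094AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_UgxAZ3Cktbm4dyaC9094AaABAg"]["emotion"])  # fear
```

Indexing by ID this way mirrors the tool's lookup view: once parsed, the coding result table for any comment is a single dictionary access.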