Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyXrtg1l… — "Its bad if you use ai for entertainment or to make money but i think personally …"
- ytc_UgxFJhQUY… — "This robot has never done an interview where questions weren't preapproved. ie, …"
- ytc_Ugx8yHp9R… — "I hope they end up making it so AI outputs cannot be copyrightable or licensed, …"
- ytc_Ugwb12XEG… — "Would an airline manufacturer be allowed to test and develop their aircraft and …"
- rdc_f9f0ore — "They reuse anything they can find in Cuba. I've seen cars retooled to run off …"
- ytc_UgxxtCnGX… — "Mediocrity and ignorance are unfortunately present linked to the development of …"
- ytc_UgxHcTRbF… — "Better then the school i went to lmao i just got stoned and played on my phone a…"
- ytc_UgwFcKM35… — "Like why do and what are people talking to AI about? Is it not basically a creep…"
Comment
Excluding the human part, what about "real" AI? Something like in Blade Runner say creatures that can think, learn, be creative, mimic human behavior but they lack empathy, would creating and maintaining those robots ethical? Would it be like having a slave? Or do they stop being humans as soon as they loos their empathetic feelings?
Platform: youtube · Posted: 2013-10-13T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
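The coded dimensions above can be sanity-checked programmatically. The sketch below validates a coded record against the value sets that are visible in the records on this page; note that `ALLOWED` is inferred from those examples, not taken from the project's actual codebook, so the sets may be incomplete.

```python
# Hedged sketch: allowed values are inferred from the coded records visible
# on this page, not from the project's real codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"mixed", "fear", "outrage", "approval",
                "resignation", "indifference"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose coded value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes this check.
coded = {"responsibility": "developer", "reasoning": "deontological",
         "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(coded))  # []
```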
Raw LLM Response
```json
[
{"id":"ytc_Ugxaj0qSxj3vDQlTGVd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrKIEsMAXBfMPcpgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWn5MR2FOw_MAycW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZIp8bU550INSwa9d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzyJ5EOqeiQVxhpM-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4SRQt-sSjsYTNA0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4ccxwKaV3Gg63AqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf9Cghtmcxj_l0aeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq1HwO7cdbe8WhnnN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJJzfgi3Iw4Kz1WI94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
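Inspecting the exact model output for one coded comment amounts to parsing the raw response array and filtering by comment ID. A minimal sketch, assuming the raw response is a JSON array of records shaped like the one above (the IDs and values in `raw_response` here are illustrative, not from the real dataset):

```python
import json

# Illustrative raw LLM response in the same shape as the array above;
# these IDs and values are made up for the example.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_example2", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def lookup_by_comment_id(raw, comment_id):
    """Parse a raw LLM response array and return the record for one comment ID."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

record = lookup_by_comment_id(raw_response, "ytc_example1")
print(record["emotion"])  # mixed
```

Returning `None` for an unknown ID (rather than raising) matches how a lookup box would behave: an ID with no match simply shows no result.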