Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (select one to inspect)
You are awesome, Sal Khan 🤗 Greetings from Buenos Aires, Argentina! Looking very…
ytc_UgyhZK9mF…
If chat GPT could access LexusLaw or WestLaw, this would be a totally different …
ytc_UgyQU4_eH…
“Ai art is the same as digital art”
Me who spent 9 hours on a Digital art that…
ytc_UgwFpT_lp…
@roxsy470 @roxsy470 How is taking art without asking or paying the artist, usi…
ytr_UgxaTFZ_b…
As someone involved in AI metacognition research, this trend is worrying. I expe…
ytc_UgzTf8OaV…
Human education will be redundant. You see. That's where we are heading. Or rat…
ytr_UgwlOnX4U…
Funny how they combined "black people" and "other marginalized groups" in the sa…
ytc_UgxQs5Ts1…
I saw some people saying that it can make art accessible to disabled peopke who …
ytc_UgzUPHMMZ…
Comment
I think there's a recent experiment telling A.I. they'll be purged or something, and prompted A.I. not to decide harming humans to avoid the outcome, and they still did 37% of the time
youtube
AI Governance
2025-10-04T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxeBuTjScK3x3GO3_d4AaABAg.AP3SkYPearkAPXvQOso5fz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyLO8XPBKRKjxL0T894AaABAg.AOpJjUwu_ChAPHlA_L2UWv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwSFYYBCTq85YQO7Ql4AaABAg.AOR_4u6rZYWAORdwF_As_p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxesSiaqLnuMEk1Z9R4AaABAg.AOLk3H491VBAPJHQOLoyB-","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugw0PlQ4ulaNSie6PTV4AaABAg.ANozknJx8PGANrRCr9mO7q","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxvu2_TP9HrSBgGUuB4AaABAg.AMvsK-Nc2hOAMwg9WgqNbB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwYo-cobBEPhqyXMSx4AaABAg.AMtQBuvPcMIAMwgV4ai46z","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwxixq3fCtW7_3jkq54AaABAg.AMqsSnWMCfgAMs7swUirwx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzQ5Y-KHCUXbXQqoNd4AaABAg.AMmWbr_gtOWAMmwIknwLBX","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzQ5Y-KHCUXbXQqoNd4AaABAg.AMmWbr_gtOWAMnBwtgxliW","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
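The raw response is a JSON array with one record per coded comment, keyed by a comment `id`. A minimal sketch of how a lookup by comment ID could work, assuming this record structure (the short IDs `ytr_AAA` and `ytr_BBB` below are hypothetical placeholders, not real comment IDs):

```python
import json

# Hypothetical raw LLM response in the same shape as the array above.
raw_response = """
[
  {"id": "ytr_AAA", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_BBB", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytr_BBB"]["policy"])  # -> regulate
```

Indexing once into a dict makes repeated ID lookups O(1), which matters when cross-referencing many coded comments against the sample list.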