Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Well we can all learn from the Anime. Sword Art Online: Alicization. If we design an A.I that learn from the ground and up. Maybe it will understand us Humans better. And not destroy us. Hopefully. But instead help us

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-07-07T02:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzsvT0R8QaLUyNUIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxtb04KfZMnmnwKn4x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx5zeaDZkjk9MLbG_54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugykacu35_BkzEzD8hN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyEH-u-qtlab019hkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxl48isDCChCc0PY594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzKB-h2aVWk7JxR03x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzR0GPh_1T1t2ExF0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzX0cewoR8nHZWY4Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
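Parsing a raw response like the one above into a per-comment lookup can be sketched as below. This is a minimal illustration, not the tool's actual implementation: the allowed-value sets are inferred from the sample output (the real codebook may contain categories not seen here), and the function name `index_codings` is hypothetical.

```python
import json

# A raw model output: a JSON array of per-comment codings. Field names
# match the example response above; the content here is a one-entry sample.
raw_response = """
[
  {"id": "ytc_Ugxl48isDCChCc0PY594AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]
"""

# Allowed values inferred from the sample output above -- illustrative,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting any entry whose dimension value falls outside ALLOWED."""
    codings = {}
    for entry in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry['id']}: unexpected {dim}={entry.get(dim)!r}")
        codings[entry["id"]] = entry
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_Ugxl48isDCChCc0PY594AaABAg"]["policy"])  # industry_self
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: the raw array is parsed once, and each subsequent lookup is a dictionary access.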