Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a response directly by comment ID, or click one of the random samples below to inspect it.
- "most of the issues of self driving cars are the surrounding bad car drivers anyw…" (ytr_UgxYi8GCw…)
- "My employer is trying to employ AI to manage a small fraction of my role. As a r…" (ytc_UgxZWtEmv…)
- "I disagree on one point with Bernie - Losing jobs to automation will be humanizi…" (ytc_Ugw7G0l-k…)
- "This is because it's an AI made by white folks living in a racist society. There…" (ytc_UgzbDZnxM…)
- "To be honest, given how f'up things are in the world right now, this may be the …" (ytc_Ugz8W0XDD…)
- "Nope. It is not terminator. There are lot of unresolv problems with copyright. C…" (ytc_Ugw1ZCHLB…)
- "So we have autonomous cars that kill people by accident, now we have autonomous …" (ytc_Ugzj0utiZ…)
- "These guys😂😂😂Your youtube channel will be replaced by a better AI channel you ha…" (ytc_Ugxgkf5Hq…)
Comment (youtube · AI Governance · 2025-06-22T23:1…)

> No they literally planned on stealing everything. It's all manipulation and these idiots are now saying things like this so they can avoid any accountability..... you're literally seeing the reason for ALL robot films play out....they pushed it all for decades to attempt to get as many believers as possible. Let's see what happens.....I don't and will never listen to someone who got paid to talk....😂
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzKoLv-PzAm-LhV8ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxp_U1q07iztPHcr6p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWH4ietbUL3-tPdr94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_HSTyv6MB8755cot4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-NJ61zfFBcEpRWhV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCDEWaCDp0nwGnJHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxzl-hiOiJlUG7zbk94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyzkNhlU9uhlBJ95xd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyngWy6jd1UnwCXstx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzuPrOorFSI5DwYgRZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```