Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I wish they can make a AI that figure out how to draw by not stealing that would…" (ytc_UgwdB_yCg…)
- "@ezzielunchdawg AI only knows what it was trained to… what if AI learned, these …" (ytr_UgwanggmJ…)
- "Perhaps people being lazy should actually drive the car at night. This leaves AI…" (ytc_Ugz1fsgbu…)
- "When the front steer tire blows at 65 and the trailer flips what they gonna do c…" (ytc_Ugz6BvSZZ…)
- "We make ai and the first thing we do is automate the fun stuff :/…" (ytc_UgwI8BedN…)
- "This whole thing is horrible, but youd better believe that theres a segment of t…" (ytc_UgwHDtBbw…)
- "AI would spare 5 psychopaths than to save the person that accidentally tripped a…" (ytr_Ugzb0q3Q_…)
- "I would have expected a more tougher and interesting questioning by the StarTalk…" (ytc_Ugx9TJkHw…)
Comment
so the pursuit of Ai is for money and control and power..at the risk of world termination and the end of human civilization..wow so good..pathetic...trying to turn human into Robot/Ai....killing human consciousness,the beauty of being human...only a handful is makng decision...we did not approve..the govt accepted $40bill...and 6 people decides everything with what their investors are interested in.sounds pathetic..

Platform: youtube
Topic: AI Governance
Posted: 2025-12-06T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxezesmSMPcQF3FEht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7qlOKQCaVE-O_jd94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxlIUmop_aySlwFA714AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwK4qt3eHf9h_-nklp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMXd1RTA_vVqKkJ7Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxwbLVs8sboawNy1s14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsGik4IBQgvvs8v5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGqcaO5jMkWR52cVd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgymjetgbBorg5MJO_B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
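A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the allowed values for each dimension are exactly those seen in the response and the coding table above; the actual codebook may define additional categories, and the function name `parse_codings` is illustrative, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the visible responses
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "contractualist", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "ban", "none"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records without a comment ID
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxlIUmop_aySlwFA714AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"ban","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings[0]["policy"])  # ban
```

Validating against a fixed value set like this catches the common LLM failure mode of inventing a category label outside the codebook, so malformed records can be dropped or re-queued rather than silently stored.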