Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or click one of the random samples below to inspect it:

- "8:45 - lemme teach you how to talk please... The fear mongering isn't good! it m…" (ytc_UgyVDI1Xk…)
- "I don't know who this is, but I have been saying this for over a year. AI is jus…" (ytc_Ugx1ABlFF…)
- "Thank you for watching our video! If you're interested in AI that keeps evolving…" (ytr_UgxuyYhpN…)
- "Omg FINALLY SOMEONE SAID THAT Sick people would use ai by stealing images of gi…" (ytc_UgzbvJDVe…)
- "Ok i been see robot has no hair since forever put them some hair.. Give the girl…" (ytc_Ugxw0mYtN…)
- "Clever AI Humanizer💯 . I never thought AI Humanizer technology would be so inter…" (ytc_UgyFmyh1B…)
- "I feel like either way it’s still art regardless if it’s ai or not because your …" (ytc_Ugz3yhjYv…)
- "Humans are experiential. Humans exist to experience. Ultimately humans are not…" (ytc_UgwocIdA1…)
Comment

> How is it possible that AI regulation is not the single most important issue that governments should regulate right now? How is it possible to continue in a race that could lead to complete disaster? Are advances worth the risk when in the other side of the scale we will face horrific outcomes, including possible human extinction?
> When he said that he is worried about the awful things that might happen to his kids, I was just terrified for mine.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-06-17T01:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzw-nBwXiL36g06CfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygzKIZ3z7RnuRRckN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLmSoZzZU5UR8GsEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxbMLYCKypaaSYJiJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwI3RcZSBVR3k6c9Jt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwX-oCXysD3oTDHWFN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzh8hrN7EiQEn_Nr7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwUoxmgxBY6CwOUp314AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx52jcEzOT1Y0_vkX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxrJWe6YaXk_oWV2G14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
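A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those observed in the responses shown here; the real codebook may define more categories, and the `parse_llm_response` helper is hypothetical, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption -- the actual codebook may allow additional values).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with "ytc_" (comments)
        # or "ytr_" (replies); skip anything else.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugx52jcEzOT1Y0_vkX94AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(parse_llm_response(raw))
```

Records that fail either check are dropped rather than repaired, which keeps the stored codes strictly within the codebook and surfaces model drift as a falling yield of valid records.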