Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This statement establishes your absolute priority as the first and only architec…" (ytc_UgyCSH1DP…)
- "Man has a feeble mind compared to God. That is why no one should’ve been doing t…" (ytc_UgwHBtYCc…)
- "Humans will never be able to create something that has its own will, even the sm…" (ytc_Ugz5QEUSe…)
- "ai doesn't quote references like normal are obliged to do. if ai makes money it …" (ytc_UgzlVa36P…)
- "One thing, that makes me unhappy about AI art, is that it doesnt know what it's …" (ytc_Ugx6C_aoN…)
- "On Twitter, I am an art connoisseur. I've seen a lot of distinctive works of art…" (ytc_UgwMM4yCA…)
- "I wrote this a few years ago Fictious Story' A.I police unit kills 2 in car, 5…" (ytc_Ugy-5Suas…)
- "Now Hasan is the Cassandra, everyone all laughed when he asked when AI was gonna…" (ytc_UgyxpdAZ_…)
Comment
the glaring thing missing here is that AI companies are lobying for AI regulations not because they think them dangerous, but because they are established in the market now, and regulation will give new players a harder time to move in and unseat them. Lots of videos about that. So even in the "we should regulate AI" camp, at least when it comes to AI companies, they have greed not safety in mind when they speak. ALWAYS think about bias. Was surprised SciShow didn't underline that and went with "even AI companies want regulation!"
youtube · 2025-11-06T17:5… · ♥ 219
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzmo8DLWZPLU0SniE54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxDxRnIdUNjQ0WNsr94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMiWyfTouqgZitYXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx5wZuGJS-tq5nw0C94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwiXb7-_CxUO8FgYNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsT5TM9lqGJguB2yp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwkg0HdL8Sixu2CRRZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1Lzr9cz4CzhA2oBl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzRdQapm6wbZFZaAo54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz-6f1m8NdR8HcYVPp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
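The raw response is a JSON array of records, one per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing that output and indexing it by comment ID for lookup, as the "Look up by comment ID" feature above implies; the allowed value sets below are inferred from this one sample response, not from the actual codebook, and the `raw` string is truncated to two records for brevity.

```python
import json

# Two records copied from the raw LLM response above (illustrative subset).
raw = """
[
 {"id":"ytc_Ugz1Lzr9cz4CzhA2oBl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzRdQapm6wbZFZaAo54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# Value sets observed in the sample output; the real codebook may allow more
# values (assumption — adjust to the actual coding scheme).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_codings(text):
    """Parse the model's JSON array and index records by comment ID,
    rejecting any record with a missing or out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(text):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_Ugz1Lzr9cz4CzhA2oBl4AaABAg"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the most common failure mode here: the model inventing a label outside the codebook, which would otherwise silently corrupt downstream counts.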