Raw LLM Responses
Inspect the exact model output for any coded comment by looking up its comment ID.
Random samples:

- ytc_UgxTBq6PA… — "Yeah but you have to understand the AI has learned bias and we are aware of all …"
- ytc_UgyEVlhpq… — "If jobs are lost by 100s of millions worldwide, there will be a huge recession a…"
- ytc_UgixNw6ZI… — "I think in the long term, autonomous weapons will be better and safer than human…"
- ytr_UgzehaMb8… — "Calling ai art is like getting abs from plastic surgery instead of actually exer…"
- ytc_UgwKD6Wqv… — "Alex sounds like a parent after a parent teacher meeting... I can feel ChatGPTev…"
- ytc_Ugy9EqH0e… — "My job is so boring AI wouldnt want it or just get bored to death and shut itsel…"
- ytc_Ugx9OTapO… — "If those keeps all the drivers who don’t take their breaks or drive too fast whi…"
- ytc_UgyGu5NGr… — "“All hail are Ai overlords. We shall do boring stuff and let ai do the stuff hum…"
Comment
It’s intentional that the development of AI isn’t “safe” mainly with US developers but the EU has made attempts to regulate AI early on. 3:09 This explains while privatization and the increases of monopolies will allow AI to remain less regulated.
youtube · Cross-Cultural · 2025-09-28T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxPqjz-blhDdyZSn1R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyW9jdPaEp21Ml4uIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVlQMcqojc39nvkQJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugysg7XSAWsAvlh10dt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgywjBrcse2ow-ALcWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyf7IRJXxPIVD_nbaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyD8ErZSi5lEGFhlGd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDvmGEWPBYcQN8DJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPLzcb7souZeXFpLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxDyMTboZgBJRy7lP14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
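A raw response like the one above can be checked before its records are written to the results table. The sketch below is a minimal validator in Python; the allowed value sets are inferred only from the sample output shown here, so the real codebook may permit other labels.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records matching the schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec.get("id"), str):
            continue  # every record needs a comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"}]'
print(len(validate(raw)))  # 1
```

Records with an unknown label or a missing ID are dropped rather than coerced, so a malformed model response never reaches the coding table silently.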