Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "There could be depressing reasons as to why most people use AI in general. 1. A…" (ytc_UgxI9Ozsm…)
- "At school we were FORCED to use AI for some projects and I feel sick to my stoma…" (ytc_Ugxh_Sx81…)
- "that's slander; the godfather of AI is papa Jensen Huang 🤑🤑🤑 whom I'll be eterna…" (ytc_UgyQzaVFK…)
- "The single, lonely men in the comments are so pathetic 😂 Guys, if you can only h…" (ytc_UgxPPjIG-…)
- "So mfs didn’t learn from giving apes guns that this is a bad idea? Now we got 20…" (ytc_UgyIVLyJP…)
- "I never thought about how much AI will take over. Thanks Charlie:)❤ We will see …" (ytc_UgxZWofvi…)
- "well its easy to see , all laws have a loophole , all moral compass do too, its …" (ytc_UgxWIohvQ…)
- "1. If social media dies, then good riddance. 2. Everyone is being fooled by fak…" (rdc_le4t9j6)
Comment
In theory, everything sounds great. I've worked with automated systems that control pressure systems and other parameters, and I can tell you that they must be under human supervision for many reasons. They are not very adaptable to the environment like humans are; sometimes these systems tend to be dangerous because contradictions arise in their functional whole. They don't predict collateral damage. Machines with intelligence only know that their parts are replaced. Like an autonomous car—if it crashes, it will request to be replaced.
youtube · Cross-Cultural · 2025-11-27T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwjCcstAOv-jeCVOAt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyoY9NUi7X_m6lpKjd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyGrjX6Dwfjv6Nna5J4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHviOE0e5SIZqPxhB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz74omPj-8N1KzlEBt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz1j7TS_huiKVX24fl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLKD50b8LbBtHvX_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDZW6vNQvZZntbZAx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdceQhNq2dZTC36qN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRj1BMix3c2UkcU2d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
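A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, illustrative example: the allowed category sets are only those *observed* in this sample response, not a definitive codebook, and the function name and structure are assumptions, not part of the pipeline shown here.

```python
import json

# Category values observed in the sample response above; the actual
# codebook may define more -- treat these sets as illustrative only.
OBSERVED = {
    "responsibility": {"developer", "company", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject unexpected values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical one-row response for demonstration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
      '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = parse_codings(raw)
print(rows[0]["policy"])  # -> regulate
```

Validating against a closed value set at parse time catches the common failure mode where the model invents a label outside the codebook, before bad rows reach the coded dataset.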