Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples — click to inspect:

- "This person is right because not everyone can make atomic bomb but understanding…" (`ytc_UgyO4WlcP…`)
- "This is the real point of AI -- it will probably be disasterously bad, it might …" (`ytc_UgwPktm_6…`)
- "It's not self driving... It's lane keeping assist...it's meant to keep you stead…" (`ytc_Ugx57wab9…`)
- "I love Elon Musk because he’s a genius, but he’s too smart for his own good. To …" (`ytc_UgwVaftWS…`)
- "Nope. Doctors are all inherently replacable by AI. In a nutshell, what doctors …" (`ytc_UgxyLKrWP…`)
- ""We want all the positives of a tram but with the negatives of a car" - how i fe…" (`ytr_Ugwu7Bamc…`)
- "Which is a reference to [IBMs CEO working for/with/in partnership to the Nazis v…" (`rdc_g99igj3`)
- "I’d love to hear his thoughts on how ai talks about the return of Jesus Christ a…" (`ytc_UgxaoIiCx…`)
Comment

> If these predictions are accurate, we'll be living in Blade Runner by 2045.
> And that's the best case scenario!
> I reckon the actual reality will be much MUCH WORSE than any of our worst dystopias combined!
> I'm absolutely certain even Night City will look appealing by comparison, and I already don't want to ACTUALLY live in a future like that!
> Fight back against this until your last breath!
> Don't let this be our future...
> PLEASE!!!

youtube · Cross-Cultural · 2026-04-02T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQa9d8X3fOb77Frt14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyyeFqOi6_xdisnnC54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXN_-DVBFZGEOnT1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx2lSMfkFDz6NJz5Kd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzardHN_ljsp91zaeF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzLuczLu1Lw7hxcoYV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxVgdQDdEUjXeKUfy14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzU3RO6YhocD2ptgjJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy8YDmQoJQZPiztVAh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwzl7qj47LO2ROR1xN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
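The raw response is a JSON array of one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch might be parsed, validated, and looked up by comment ID — note that the `ALLOWED` enumerations below are inferred from the sample output on this page, not a documented schema, and `parse_coded_batch` is a hypothetical helper, not part of any real pipeline:

```python
import json

# Allowed values per coding dimension, as observed in this view.
# ASSUMPTION: inferred from sample output, not a documented schema.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID.

    Raises ValueError if any record is missing a dimension or uses a
    value outside the expected enumeration.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = """[
  {"id": "ytc_UgzQa9d8X3fOb77Frt14AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]"""

coded = parse_coded_batch(raw)
print(coded["ytc_UgzQa9d8X3fOb77Frt14AaABAg"]["emotion"])  # mixed
```

Validating at parse time surfaces any off-schema value the model emitted before it reaches the result table, which is useful because LLM output is not guaranteed to stay inside the enumeration.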