Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
No amount of AI is going to save you from impossibly stupid drivers. Or feral ho…
ytc_UgxI0QdUi…
1:05:20 Essentially AI is "imprisoned" to the world as we are, so it has to keep…
ytc_Ugxrditof…
There’s misidentication and mistakes regardless of facial recognition software. …
ytc_Ugyb-rnSl…
So I write a song and the make it with AI suno with my point of view. How is tha…
ytc_UgxGfKqJX…
Interesting to see the conservative point of view on this. I still haven't seen …
ytc_Ugy75rI82…
Could you invite top AI leaders and engineers to openly and respectfully answer …
ytc_UgwlQimy0…
I’m dead serious when I say this and as morbid as this sound, but truly consciou…
ytc_UgyOBVO28…
It's pretty easy to give them long-term goals, though. Just let them write to th…
rdc_jmursi5
Comment
Well, the bubble is inevitably going to burst within 6 months or so. It is uneconomical right now. Market correction will ensure it stops being a feasible enough replacement for humans.
Also, AI hype is failing in complex systems. What happens if a junior employee messes up? He is held accountable. You can't hold AI accountable. There is no threatening a computer program. Also, the output is never 100% perfect. It always needs human review and refinement. And most models are very workflow specific. They fail at solving unpredictable problems or extrapolating beyond their training data. This video is no longer relevant.
youtube
Viral AI Reaction
2025-12-27T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7S_oeFbtFBau782p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHyWueqHNecXtnxbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxySqRTB8ifsCr8zs14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlV5GbbacBhur3hUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn324wIZ0mdukBvu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyC7KgD3o0TR126aNd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyG88oIBtIKpCrd6l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-gEZzwgvoPPDhi_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwDjVDKw6IpNCrRmN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrsBEw73pecdgPQtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
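The "look up by comment ID" view above can be reproduced directly from the raw response: parse the JSON array and index the rows by their `id` field. A minimal sketch, assuming a Python pipeline; the variable names are illustrative, and the two rows are copied from the response above:

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (two entries reproduced from the response above).
raw_response = """[
{"id":"ytc_UgxySqRTB8ifsCr8zs14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyC7KgD3o0TR126aNd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the exact coding for one comment.
code = codes_by_id["ytc_UgxySqRTB8ifsCr8zs14AaABAg"]
print(code["emotion"])  # prints "fear"
```

In practice the raw response would be read from the coding log rather than embedded as a string, but the lookup itself is just a dictionary keyed on `id`.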