Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (click to inspect)

- The danger of AI isn’t the worst thing it’s man using it for “security and safet… (ytc_UgyzNpjhi…)
- Wow... I am blown away at how absolutely useless this video is. If anybody want… (ytc_UgysQM04p…)
- The problem is that no amount of safety will ever be good enough for people but … (ytc_UgyACA_tP…)
- It sounds like you might have some concerns about AI! The dialogue between the p… (ytr_UgytER2Da…)
- You're absolutely right! The balance between efficiency and the human touch is c… (ytr_Ugw78IczR…)
- EXACTLY‼️‼️‼️‼️PEOPLE REALLY THINK AI IS GONNA TAKE OUR JOBS💯🤦🏾♂️BRO AI IS GONN… (ytc_Ugzo2pXkw…)
- Don't these people realize to working with these people realize it's not in the … (ytc_UgzbcJzTG…)
- AI art can be considered art if it’s truly AI. Eventually AI will develop to the… (ytc_UgxjoKqrj…)
Comment

> Very nice video and graphics to quality ❤
> But I think we are far from general intelligence, however it's still dangerous if we have the inteligent + autonomous thing
> Imagine in 20 years we will need no drivers, no gardeners, fastfood workers and other manual labor because of robots
> And 90% less programmers, 50% less doctors, 80% less video editors etc.
> All in all, no general intelligence but intelligent enough in multiple AI products to make unemployment high and salaries low

Source: youtube · Viral AI Reaction · 2025-11-24T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
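The four dimensions above appear to come from a closed coding scheme. A minimal validator sketch for one coded record; the allowed values here are inferred only from the labels visible in this dump, not from a published codebook:

```python
# Hypothetical validator for one coded record. The category sets below are
# inferred from the values observed in this dump; a real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government",
                       "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above validates cleanly:
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(record))  # []
```

A check like this is useful as a guard when the LLM occasionally emits a label outside the scheme.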
Raw LLM Response
```json
[
{"id":"ytc_UgxH73MNIB2ymK0tLhB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8Z7GS2z0yzhz1crp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-QtuMn5SYrnZqv154AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVh7OHncSZo0BY0Zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyJK-fe5yjpW_0cIxN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLETmjHh4sAkio3Kt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwxHdc_m-tSlXOrTJl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyHMRwgEuQIl4zVE-N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsCosPTuzNNiIxz6h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMYyd1y7gtPjC-1XF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
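The raw response is a JSON array of records keyed by comment `id`, so the "look up by comment ID" action can be sketched by indexing the parsed array into a dict. A minimal sketch using only the standard library, with two records copied from the response above:

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgyVh7OHncSZo0BY0Zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLETmjHh4sAkio3Kt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Index the coded records by comment id for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = by_id.get("ytc_UgyVh7OHncSZo0BY0Zl4AaABAg")
print(rec["emotion"])  # fear
```

Building the index once and reusing it avoids rescanning the array for every lookup when many comments are inspected.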