Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "My question is: could intelligence be a property that emerges once you surpass a…" (ytc_UgzY6hTRn…)
- "Let's be real. Future existential risks are going to stay regardless whether we …" (ytc_UgxHsjUOg…)
- "I love your videos on AI because you're so confident and it's super reassuring w…" (ytc_Ugy8vX76B…)
- "the idea of the image was from them, just because they're using AI to make the a…" (ytc_Ugwvq_LAW…)
- "🥸🤔🥺🫣🌐 WORSE CASE SCENERIOS IF GOVT and LOBBYISTS OF AI TECH INDUSTRIES and AI B…" (ytc_UgzwE-I9K…)
- "I don't like automated machines that look like robots. Picture yourself as an im…" (ytc_UgyJGIAg7…)
- "Devs stop using AI. They are unknowingly training AI models to replace themselve…" (ytc_UgyocYkLg…)
- "Then why did you create that shit if it's so dangerous. I more and more think th…" (ytc_UgyydyrWx…)
Comment

> So speculative. Yes a crisis is needed because nobody understands why a non-living "thing" would "want" anything. The threat is unclear. Job loss may be a way to control population. AI weapons are another category not talked about much. They are destructive by design and could be mishandled. Think Terminator, automated warfare not responsive to the needs of the people.

youtube · AI Jobs · 2025-11-04T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzPP67RfXm8D6nNtIt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx60amac2lEMAACmuR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw5KtPOz4Gy1tKSu9R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyTLQ0TqVRSHx5Z27V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyaD5YACAwO-B5SYQZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyJqwfhlZ0r7GXjU1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzbxEJOqYKCxxh3BkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwXJwQ537dfa6nkY6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugykpn_aDVs0GN_9Nmh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgygNAhCuv8AmODy_3x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
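The raw response above is a JSON array of per-comment labels across the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a downstream script might parse and sanity-check such a response — the `parse_codings` helper and the skip-malformed-rows policy are illustrative assumptions, not the tool's actual code:

```python
import json

# Field names match the raw response shown above; the two rows here are
# copied from it. The validation policy (skip malformed entries rather
# than fail the whole batch) is an assumption for illustration.
RAW = '''[
{"id":"ytc_UgzPP67RfXm8D6nNtIt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgygNAhCuv8AmODy_3x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw coding response into {comment_id: {dimension: value}}."""
    out = {}
    for entry in json.loads(raw):
        # Keep only well-formed rows that carry an id and all four dimensions.
        if isinstance(entry, dict) and "id" in entry and all(d in entry for d in DIMENSIONS):
            out[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return out

codings = parse_codings(RAW)
print(codings["ytc_UgzPP67RfXm8D6nNtIt4AaABAg"]["emotion"])  # fear
```

Keying the result by comment ID makes it cheap to join the codings back onto the original comment records, which is how a lookup view like the one above would retrieve a single coded comment.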