Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Feel like the ones defending it the most are the weird coomers that have been pa…" (ytc_Ugy3gAIfX…)
- "its not fair for one single person to sue an ai art company because it mishmashe…" (ytc_UgwLzgIN9…)
- "So why do we think of AI as one thing? AI will mimic our conflicts, there will b…" (ytc_Ugzn5INNq…)
- "AI is our new worker now, so be the person who build and maintain AI. Otherwise,…" (ytc_Ugxzjgqso…)
- "The whole AI is a tool argument(and to a extent it can be) is fundamentally flaw…" (ytc_UgyNkJ2Hs…)
- "The "it's like inspiration" (and also "it's just a new tool") was something I th…" (ytc_UgwSCac2I…)
- "I had to look up the actual clip for context. He was actually warning about A.I.…" (ytc_UgxxRuQub…)
- "REAL EXPLANATION HERE: It's a dentist robot it does not feel pain it's supposed…" (ytc_UgwFNOKZl…)
Comment
One thing AI does not have is the need to survive. We do things (produce, invent, search, gather, create) because we need to survive —that is our innate motivation. AI on the other hand does not have that —yet. In the current state, AI does things under the instruction of a human (because we have a need). Only when AI gets an "innate need" (a need to survive, a need to expand, a need to duplicate, etc) is when AI will be a real threat to us. Right now AI is a threat because the use other humans have or will have of it.
youtube · AI Governance · 2025-06-17T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx6ym0rLrIsqIJVecV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwdVjJNFf_FnODhZDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw5zTStcMe4gMxLDkl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyLn1QPauvboKiGgTV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwbXUJgOhnQ7GqR4fZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy67AQft1ZtCNuG28R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyHBHvNG_MsdLoFdRl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzvO9jJ0bivs85lO654AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy8gs1AhMoGmM44jsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzW7MGZA2OWX49B0OB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
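A raw response like the one above is a JSON array with one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). The sketch below shows one way to parse and sanity-check such a response before tallying codes. It is a minimal example, not the pipeline's actual code: the `RAW` string holds just two records copied from the response above, and the allowed-value sets are inferred from the examples shown here, not from the full codebook.

```python
import json
from collections import Counter

# Two records copied from the raw response above; a real response
# contains one object per sampled comment.
RAW = """[
 {"id":"ytc_Ugx6ym0rLrIsqIJVecV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyHBHvNG_MsdLoFdRl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Allowed values per dimension, inferred only from the values visible
# in this dump -- an assumption, not the complete coding scheme.
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose every
    dimension holds an expected value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items())
    ]

codes = parse_codes(RAW)
emotion_counts = Counter(rec["emotion"] for rec in codes)
print(emotion_counts)  # Counter({'fear': 1, 'outrage': 1})
```

Validating against fixed value sets catches the common failure mode where the model invents an off-codebook label; dropped records can then be re-queued for recoding rather than silently skewing the tallies.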