Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So, my thinking is, what would AI’s motivation be to take over? Jobs are a means to an end for humans. Machines don’t need food, entertainment, creature comforts.
So I understand the argument for humans becoming obsolete, but without humans the systems are not needed. Machines don’t need those systems. They’re not sentient.
I’m less concerned about humans being eradicated vs humans just becoming one dimensional. When no one needs to think for themselves, or innovate…what’s left? More technological advances ironically seem to make us more primitive.
youtube · AI Governance · 2026-02-21T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwQQ85IdbVMgDbKscR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx5jTgxUrRnXkoMuJp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMwMVBFJ3givRsmrd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw_Y0tkM98kcdqXa6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmclQampi7WzF1OLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyAaEHT0SQmSf9Blp14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugztfv-EohV_ls_5TAV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzjIfmRjOmTeageMJF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1NRimn3F7HtOMY_t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2h44AkYL_hcxbi4J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
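Since the model returns one JSON object per coded comment, looking up a comment's coding amounts to parsing the array and indexing by `id`. A minimal sketch, assuming the response is well-formed JSON as shown above (the function and variable names here are illustrative, not part of the tool):

```python
import json

# Raw LLM response, truncated to two entries for brevity; the real
# array contains one object per coded comment.
raw_response = """
[
  {"id": "ytc_UgwQQ85IdbVMgDbKscR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx5jTgxUrRnXkoMuJp4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index the rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwQQ85IdbVMgDbKscR4AaABAg"]["policy"])  # -> ban
```

In practice the parse step would also want to handle malformed model output (e.g. a `json.JSONDecodeError` when the model wraps the array in extra prose), but the happy path is just a dict comprehension over the decoded array.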