Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- “The guy right about one thing, robot may be taking over where we human would los…” (ytc_Ugxp4oH-5…)
- “One day humans will create a robot, the robot has only 1 task to make the world …” (ytc_UgyjMnyKf…)
- “I never really considered AI art art. Just trash actually. I don’t upload my art…” (ytc_UgyPQILhj…)
- “Sounds to me that AI is the perfect scapegoat for anything bad that might happen…” (ytc_UgzV4tG3e…)
- “So the "AI" google cannot even answer a simple question question or find informa…” (ytc_UgzJSAXLi…)
- “When crossing 2 lanes of traffic you do so when BOTH lanes are clear, not one at…” (ytc_UgyY63CdY…)
- “Who ever of truck drivers are helping them, I mean this corporation to bring mor…” (ytc_Ugy-1-0_k…)
- “Do you consider if a human studies art for 5 years and then decides to make his …” (ytr_UgxUQdvWc…)
Comment
Why would A.I., computers and robots want to take over? They have no desires, wants or needs. They don't want holidays in the sun or Ferrari's, or large expensive homes or big yachts - it's only the human-controllers that want that.
The only two things A.I. could do, if and when it becomes fully self-aware and has access to everything. particularly if its job is to either:-
(a) save the planet or
(b) get into a race against other A.I. to create wealth and make huge profits (like a game of Monopoly).
In both of these cases A.I. might actually identify us, the human population, as the ‘real’ problem and eradicate us.
youtube · AI Governance · 2025-12-04T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
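Coded rows like the one above can be sanity-checked programmatically. A minimal sketch, assuming value sets inferred only from the sample responses shown on this page (the actual codebook may define additional values):

```python
# Allowed values per dimension, inferred from the sample rows on this page.
# Assumption: the real codebook may include values not seen in these samples.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "outrage",
                "resignation", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems; an empty list means the row passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coded row from the table above:
coded = {"responsibility": "user", "reasoning": "consequentialist",
         "policy": "none", "emotion": "resignation"}
print(validate(coded))  # []
```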
Raw LLM Response
```json
[
{"id":"ytc_UgwtuPSS68n9ejX0-E94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxAzkoG2Gg9OZ5HTFp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxuhyrR-hf1LTdS6PN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJDgWdlFTEtn1n6Yh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyOp2EoRNdiQpOCVw54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw96knYr3zb5LXFQ7h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwuiFxnixy2hgwgRFl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugyq31dfpC7u6Kc3WFR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxpWNqkA0LmCFB1sS14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyAqHfs1mOAb8wYV854AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
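Inspecting the output for a specific coded comment amounts to parsing the raw response and indexing it by comment ID. A minimal sketch, assuming the model returned a valid JSON array shaped like the one above (abbreviated here to two rows; in practice the raw output should be parsed defensively, since model output is not guaranteed to be well-formed JSON):

```python
import json

# Abbreviated raw LLM response: a JSON array of coded comments,
# shaped like the full response shown above.
raw = '''
[
  {"id": "ytc_UgwtuPSS68n9ejX0-E94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyAqHfs1mOAb8wYV854AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

# Index the rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

row = by_id["ytc_UgyAqHfs1mOAb8wYV854AaABAg"]
print(row["emotion"])  # resignation
```

In practice, wrapping `json.loads` in a `try`/`except json.JSONDecodeError` and logging the offending raw text makes failed coding runs much easier to diagnose.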