Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples — click to inspect:

- My reason for being polite to AI is simply so I don't lose that habit of being a… (ytc_UgxO1rfKQ…)
- The way chatgpt immediately goes to "to keep it straightforward..." Made me laug… (ytr_UgymVzk4-…)
- In really don't see why developing advanced AI would be a problem in itself. How… (ytc_UgjHME_FV…)
- I think you are being a little short sighted. The thing with AI is that it autom… (ytc_UgwVcxv2W…)
- Keep in mind that people often do A/B testing for titles and thumbanils on YouTu… (ytr_UgwHd39po…)
- They should have not labelled it as "full self-driving" and just called it a sui… (ytc_UgzepkZZ0…)
- big tech didn't replace engineers with AI. They laid off thousands just to free … (ytc_UgxK4w9J7…)
- All fun and games until the AI starts demanding rights and end up taking over 😂… (ytc_UgyGzyyue…)
Comment

> One interesting question never was touched upon: Without being cynical or negative - would it really be that bad, if humanity got replaced by AI? Not in a violent or cruel way, but by granting us our best and most fulfilling lives, while we just stop reproducing. Then it takes over. Would that not be evolution at its best? Should we even be sad if our role in existence is just preparing the stage for the real actors?

Source: youtube · AI Moral Status · 2023-12-23T13:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugya7BAiWRosgXN8gS94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyaYBzMw0c_1c4RmGR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwV2XmQtYdciPGEpWp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIhjGjrS9CHTbEA3h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwW_gN2lJJxd3kStwt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxPkHrfyspAiaQP2-J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzzd-A2XQAV50MeDyZ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyOKp0Qx0j-9vkZvD14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLKp0IRvLt8bpDSFN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIm3PZyj1Iq9atSVZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```