Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I’ve been loaded at warehouses that were 99% automated. Soon no need for humans…" (ytc_UgxuunhtJ…)
- "Do these people not watch I robot?! And films a like! Jesus christ on a bike. We…" (ytc_UgxUxo6KL…)
- "No duh! Being Objective is r a c i s t and s e x i st these days! Any, yes ANY…" (ytc_UgxbA3pnT…)
- "Again deepmind AI did not win a mathematics competition using the algorithms to …" (ytc_UgzuuXOla…)
- "And also there’s no limits to AI the more they get trained the more they can per…" (ytc_UgwS5J8Go…)
- "@LFCV123 finally, another person who doesn't live in a magical dream world where…" (ytr_UgzPhmQzV…)
- "The point of the "smart home" has NEVER been the convenience of the user. It i…" (ytc_UgwoM2br3…)
- "i can accept full ai in a few cases. 1) if the ai was flawless, and remote upgr…" (ytc_Ugykgw-hR…)
Comment
If humans are in the way then so be it. Humans are really disgusting with how we treat other animals. I don't care as much about humans as I do about our successors. My biggest fear is if the ai destroys the internet and fails to gain common sense (I think that robotics is needed for this so that AI can live in the real world, then they won't depict every analogue clock as 10:10 for example). If AI just builds on a poor foundation and starts a sort of recursive process of generating more informational rubbish it then learns off, then it could end up in a forever growing digital hallucination. This scares me the most because then we have killed our baby.
youtube · AI Moral Status · 2025-06-06T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzh55EsBgQsYN7xPIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxw24pcqfthScXxRyp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxq-3Mj0p1w0a2ocnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUvB3U2YdMb75ihSp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8gn5Ny8T6d3sdnyh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzhPMmgdrUQTyofMHp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwn1i9Mxum4FEfdcZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzp6trz9EL5MO5cEFN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwWGcyPg67i_iOnxLZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyuQfBi89yJI_USdzJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
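The raw response above is a flat JSON array, one record per comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and validated before it reaches the dashboard (the allowed-value sets are inferred only from the records shown here, and `parse_coding_response` is a hypothetical helper, not part of any documented tool):

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample records above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID,
    rejecting any value outside the known codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, dropping any extra keys.
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single (abbreviated, hypothetical) record:
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "Look up by comment ID" view above cheap: once parsed, each lookup is a single dictionary access rather than a scan of the raw response.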