Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
We must trap AI inside of flesh or a similar concept as our own flesh. We must s…
ytc_Ugy6Ak8Cl…
Our leaders need to understand that it is logical that if AI turns against human…
ytc_UgxKanjqX…
a.i art will not be able to replace human art not in 200 years, its stupid to th…
ytc_UgydyF1yk…
I just realised this channel feeds off people's fear of AI. We have no idea if i…
ytc_UgyJSVyls…
"We have enough cases."
Should be, "we have too many cases."
I love Judge Fleisc…
ytc_UgzGGJsN9…
He says the AI claim is something that needs to be scientists? The claim has not…
ytc_Ugz5zeB28…
The REAL bs is that AI can literally wipe out all upper level white collar jobs …
ytc_UgxFHqWzd…
Their argument doesn't fully grapple with the use-case where the alternative to …
ytc_Ugw5O_Qct…
Comment
So wait we are the most intelligent beings on earth yet we want to make/have made an intelligence greater then our own, why? there's something sinister going on here people and its probably the rich wanting a way to control everyone and develop more technologies to use for control, when you put AI in a robot body you have the perfect soldier.
youtube
AI Moral Status
2025-07-26T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyLDnUZ6EASeji3-8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzq0kRqPb0kSH5HNNJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyLmNsPgfE_92xP_xh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwZ7XsJJJMZcVwxkCJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVZLYnTJZauV-ycxV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxeRMY7tU4miCo-VlJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBmeOPXFsHg05QK9J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwU1NpUwqYges2nZN94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDetTJVW_Qlsal8-V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXKi4E6XTJsxH03Vp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
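The raw response above is a JSON array, one record per comment, with four coding dimensions. A minimal sketch of how such a response could be parsed and validated — the allowed category values are inferred from the samples shown here, so the set may be incomplete, and the `SCHEMA` and `validate_codings` names are illustrative, not part of the actual pipeline:

```python
import json

# Allowed values per coding dimension, inferred from the responses above.
# Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyLDnUZ6EASeji3-8x4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
print(validate_codings(raw)[0]["policy"])  # -> ban
```

Rejecting out-of-schema values early is what makes a batch like the one above safe to load into the coding-result table: a malformed or hallucinated category fails loudly instead of being stored silently.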