# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples

- I swear my mom saw me texting a character and when she saw me texting the ai ask… (ytc_Ugw0MwfAg…)
- Maybe it‘s not as bad as it looks like. When companies are utilising AI in white… (ytc_UgzJF822P…)
- i m a proffesional mobile cleaner. so the ai robot should drive and clean everyw… (ytc_Ugyl0Pmfi…)
- Boycott AI! Starting with the AI search result that is always positioned at the … (ytc_UgxdrUUIf…)
- Nahhh soon there's gonna be No Ai involved some shit. Coz it's too much easy For… (ytc_UgwDXKCLE…)
- Ai has killed people. Israel uses lavender and executes "daddys home" automatica… (ytc_UgzvqmocB…)
- How limited are we? Can we not think of other productive things we can do aside … (ytc_UgzSXt_w-…)
- Not for Northern Ireland, Scotland, and the majority of young voters. *Edit:* … (rdc_fwhst3t)
## Comment

> There's more to consciousness than just intelligence. The whole entire human organism contributes to it's consciousness. Sure, your arm might not have the ability to think for itself, but combined with the brain, it has the ability to feel when it is touching something hot. If you remove your arm, you no longer have the ability to feel heat through the arm, therefore you have lost a small part of your consciousness.
>
> Therefore a computer won't feel any pain as we know it, unless we combine it with biological components that mimic ours. This is probably not going to happen with AI in the near future, but it may happen with biological engineering any time soon.

Source: youtube · AI Moral Status · 2017-02-23T22:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugh2_714Rr7943gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgjLPKcFZRHiiXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UginEqjRd5em13gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggMqTUOENgjRHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgibRYK2TCV8jHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugj7gYHfl-AHEXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UghQcXo2NeEIBXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgivL0GvDTnRGHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Uggjxv2nscYrjngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugh30nYlNuJ7dHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
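The raw response above is a JSON array, one object per coded comment, keyed by the same dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup (the field names come from the response above; the two sample rows here are copied from it, and the required-key check is an assumption, not part of the original pipeline):

```python
import json

# Raw model output, truncated to two rows copied from the response above.
raw_response = """
[
  {"id": "ytc_Ugh2_714Rr7943gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgjLPKcFZRHiiXgCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Field names observed in the raw response; treating them as required
# is an assumption for validation purposes.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and index the codings by comment ID."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')} is missing keys: {missing}")
        # Store every dimension except the ID itself.
        index[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return index

codings = index_codings(raw_response)
print(codings["ytc_Ugh2_714Rr7943gCoAEC"]["emotion"])  # outrage
```

Indexing by ID mirrors the "look up by comment ID" workflow of the page: a malformed or incomplete row fails loudly at parse time rather than surfacing later as a missing dimension.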