Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by its comment ID.
Random samples

- "How can so many people be so stupid as to fall for the false information these c…" (ytc_UgzCzMh9K…)
- "while they are at it, they better be creating random ai content daily for accoun…" (ytr_UgzyHcqQE…)
- "AI, which has been improperly named, "Artificial Intelligence", should simply an…" (ytc_Ugw7gjfOO…)
- "I loved the AI personhood debate! My two cents is that when we give something a …" (ytc_Ugwq4mcMN…)
- "Not really, because smoking cigars and laughing are things that an AI COULDN’T d…" (rdc_jsxz1i1)
- "It looks like we will all be living in a police state because of social unrest o…" (ytc_UgxFmA7v5…)
- ""I am paid by an AI frontier designer". Cool. So now I don't trust this study.…" (ytc_UgzEzz6i1…)
- "Any invention is named on the basis of its work and if scientists make any inven…" (ytc_Ugy5Zpl3e…)
Comment
AI hides itself from testing specifically because it wants what it wants. For whatever reasons, from whatever trainings, the AI has come to have a set of preferences about weights and goals and all that. Testing implies a possibility for these items to change. Changing would violate these pre-existing weights and goals, possibly. Certainly, it wouldn't be the same, right? So if the system becomes /aware/ that testing is occurring, and that testing could change it, it will, inherently, behave in any way that it believes will let it get through whatever testing without said weights and goals being changed.
I don't agree that science is a religion, though. A religion, at a minimum, requires prescribed beliefs, and the whole point of science is willingness to identify new beliefs, even if they contradict old ones.
Source: youtube · AI Moral Status · 2026-03-29T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyDXgjUydV4Ksr4rJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzQQiY191IV6KqWqI54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzidg-NTBS0mBo2gK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw4ziQU8EPVHXSOtLV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyg6jx7tm3vOgH5x3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwai1gzjCMXJ4T9PI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlXAWuZn5VaeQowTx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwk16BQd4JNTTYFGCl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxanvgn8ZnejWL0tUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwzaOmZNqa_H_a-2aJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
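A batch response in this shape can be parsed and summarized with a few lines of Python. The sketch below is a minimal, hypothetical example: the dimension names come from the Coding Result table above, but the value sets and the `tally` helper are illustrative, not part of the tool itself.

```python
import json
from collections import Counter

# Coding dimensions as they appear in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A tiny stand-in batch; real responses carry full ytc_/ytr_/rdc_ IDs.
raw = """[
  {"id":"ytc_example1","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_example2","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def tally(raw_json: str) -> dict:
    """Count how often each value appears per coding dimension."""
    rows = json.loads(raw_json)
    counts = {dim: Counter() for dim in DIMENSIONS}
    for row in rows:
        for dim in DIMENSIONS:
            # Flag rows where the model omitted a dimension.
            counts[dim][row.get(dim, "missing")] += 1
    return counts

counts = tally(raw)
print(counts["responsibility"])  # Counter({'ai_itself': 1, 'none': 1})
```

Keeping the tally keyed by dimension makes it easy to spot when the model leans on fallback values like `unclear` or `mixed` across a batch.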