Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am a person who supports efficiency to make life easier. But that's generally …" (ytc_Ugx7oB51x…)
- "What's sad is that businesses and some AI creators seem to be okay with this. T…" (ytc_UgxEJ7KkM…)
- "Literally any question a human asks to determine if it is "sentient" is a questi…" (ytr_UgzQgPjir…)
- "The recent upbringing in AI art has been terrible and it makes me sad, but it al…" (ytc_UgyGsoDtA…)
- "I can't believe what I'm reading in the comments you people all talking about AI…" (ytc_Ugz7jgKcn…)
- "Banning AI weapons can wait....developing counters for Nuclear weapons is a thre…" (ytc_UgyiCA71F…)
- "Yeah right. AI will consume more energy than the whole of humanity by 2028 if it…" (ytc_UgzDRtwqT…)
- "The main problem I have with AI right now is that it's trying to replace the job…" (ytc_Ugyu208O3…)
Comment
the disturbing thing is that AI could actually keep track of the lies it tells. as an artist i can imagine scenarios and describe them fairly well even if they weren't true, and when i do lie i keep it to a minimum and incorporated into actual events i can recall, i rarely get caught out lying, but AI has a "perfect memory" so, that i find disturbing. of course the real problem is the intention of the "inventor" - if we have mad professors into world domination then we could have a problem on our hands. but i actually think "true" AI will come to the same conclusion humans have, it's more productive to cooperate and win-win than to terminate what you see as opposition.
youtube · AI Moral Status · 2024-07-27T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
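The four coded dimensions above each take a value from a controlled vocabulary. As a minimal sketch, the vocabulary below is inferred only from the values visible on this page (the actual codebook may define more categories), and the `validate` helper is hypothetical, not part of the dashboard:

```python
# Hypothetical controlled vocabulary, inferred from values visible in this
# dashboard; the real codebook may include additional categories.
CODING_SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in CODING_SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes validation.
result = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(result))  # []
```

A record missing a dimension, or using an unlisted value, would be flagged by name, which makes it easy to spot malformed model output before it reaches the database.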
Raw LLM Response
```json
[
{"id":"ytc_UgyGXiWENJkc2MPySuR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx8vUlGDPwckpcGNlh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwWLgH8fH_sEpZ5NN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyRepkSkugrQdZ8k0t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNU5KuAkzpOEUdigZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnhfSDbH1wGQnRZrV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwPOVuQr1_SYqdfAKB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylQDssNvtpjpGm00F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwPu77ZOMypxcpqdZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4sthz049jutHkwUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
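Since the model returns one JSON array per batch, looking up a coded comment by its ID amounts to parsing the response and keying the records. A minimal sketch, using two records from the response above (the `index_by_id` helper is illustrative, not part of the dashboard):

```python
import json

# Abbreviated copy of the raw model output shown above (two of ten records).
RAW_RESPONSE = """[
{"id":"ytc_UgyGXiWENJkc2MPySuR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx4sthz049jutHkwUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model output and key each coded record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_Ugx4sthz049jutHkwUt4AaABAg"]["policy"])  # regulate
```

In practice the parse step would also catch `json.JSONDecodeError`, since a truncated or malformed model response is the most common failure mode here.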