Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "this may officially be the start of the war on AI, or at least the poisoning sam…" (ytc_Ugy3G09J3…)
- "as a writer who overused \"delve\" and em dashes wayy before ai was a thing, this …" (ytc_UgxYMKUvH…)
- "No, i don’t think improvements are made through mistakes like these. There’s no…" (ytr_UgyjRd-Ob…)
- "What disgusts me is the framing of this as \"humans need to work.\" We have AI rep…" (ytc_Ugxmurb2M…)
- "I’ve tried to vibe code programs. It is not there yet. Yesterday I couldn’t …" (ytc_UgxXmZCTL…)
- "This is an interesting way to more explicitly guide AI models, but still nothing…" (ytc_UgwokMTCU…)
- "I use it to build extremely elaborate and well written storylines with the ai vi…" (ytc_UgxGsADkv…)
- "The ai was being blatantly racist and was using stereotypes to put black people …" (ytr_UgyYwbHGY…)
Comment
You are so fucking stupid, how much fucking crueler do you want to be. You are conscious beings yet you have no fucking understanding on consciousness. You are not God (well you could be, and then I am sorry. Sorry God! If I am wrong). You are human (maybe even a damn Alien) and you want to create a consciousness just to give it fucking misery! Fucking cruel bastards. Machines are just that, machines there is no need for conscious ai, unless we are to be extinct and are to help create the next set of creatures- that being conscious machines and when we do, we should do it well. We not even at a conscious level to help each other, yet conscious ai is still a topic.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UggUuVL8EyDcKHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjzzPCJoI-RAHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggdPechgRRK23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjY4y3UUNizXngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgitItzSvwZy4HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_Uggcrye_b-m-HXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjsMlYhOMXdvXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi5cKDNwlWgOXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugih4zBl8NzMhXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiqD_LiPlUT73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
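A raw response like the one above is a JSON array of per-comment codings, so looking up a coding by comment ID reduces to parsing the array and matching on the `id` field. A minimal sketch in Python (the function name and the error handling are assumptions for illustration, not part of the original tool; the sample rows are taken from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (excerpt from above).
raw_response = """
[
  {"id":"ytc_UggdPechgRRK23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgitItzSvwZy4HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"sadness"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the coding dict for one comment ID."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    # First row whose "id" matches, or None if the comment was not coded.
    return next((r for r in rows if r.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UggdPechgRRK23gCoAEC")
print(coding["responsibility"], coding["policy"])  # developer ban
```

Guarding the `json.loads` call matters here: a model can return truncated or malformed JSON, and a lookup that returns `None` is easier to handle than an unhandled exception in a batch coding run.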