Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "No one in the working force is surprised that companoes will invest in ai so tha…" (ytc_UgwhMuaDR…)
- "NOTHING HAS CHANGED, if you just use Android Auto and not actually try to learn…" (ytc_UgwxzkYTc…)
- "Most of the introductory topics of this AI course remind one of the statistical …" (ytc_UgxyuecxU…)
- "Artists normally: \"You can't define art! It's subjective, everyone has their own…" (ytc_UgykT26jS…)
- "It’s too late. We either freely unleash it with guidance or be remembered by AI …" (ytc_Ugwaq5vg2…)
- "Ai just sits there unless you prompt it. It's not thinking. Or doing anything wi…" (ytc_Ugwa6hYJS…)
- "This was always the end goal and it should be. The world should only have about …" (ytc_UgzBYf94c…)
- "My dumb ass thought this was real for a second, I remember Elon talking about so…" (ytc_UgwzdV4K0…)
Comment: AI will kill because it CAN. Psychopaths have no feelings for others and so others are simply irrelevant to them. AI has no emotions, and will simply eliminate anyone, or anything that gets in it's way. Much in the same way that 99% of people swat a mosquito, or spray a wasp with flyspray. Unless you are anywhere on the psychopath scale, you won't relate to this.

Platform: youtube
Topic: AI Governance
Date: 2024-02-21T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvNjWthrowJeX8-Cx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBNd4fSmW-A-Z5hbV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgySfXaTwpFTbemOi314AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx10sXfe01krRb2BMd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzN7lgLFLSDkTXkE5d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyrdoxbmJCk5_1qOEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzoDmbOgTfbECcqM8t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxGiKury2jP7mh1C854AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz_6B60pVAEi2S2_Nx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwCyBRm-1i1lvy1-k14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
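The raw response is a JSON array with one row per comment ID, each row coding the four dimensions shown in the table above. A minimal validation sketch in Python; `validate_codes` is a hypothetical helper, and the allowed category values are inferred only from the examples on this page, so the project's actual codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed, in-codebook rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop anything that is not an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_codes(raw)))  # 1
```

Rows with out-of-codebook values are dropped rather than repaired, so a downstream re-coding pass can retry just the rejected IDs.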