Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "This is why I try and be nice to AI. You know just in case.…" (ytc_UgySGmKXb…)
- "Maybe an extinction level event? Put AI on the shelf for ,mmm, 500 years should …" (ytc_UgyLmNsPg…)
- "Well no, the dozen or so people who will own all the robots will no longer need …" (ytr_UgwvkBMxN…)
- "Probably no one is going to see this, but what if AI is the natural evolution? W…" (ytc_UgwbiR026…)
- "if u were really in AI you wouldn't be saying that. an AI professionals job is t…" (ytr_UgyywtWlz…)
- "PEOPLE CANT EVEN CONTROL THEM SELF WHAT MAKES YOU THINK CONTROL A ROBOT WITH POW…" (ytc_UgwVmePUk…)
- "The real issue is how and when will some entity work on a new system on how to r…" (ytc_UgzkLdbSw…)
- "So what do you propose, Bern, if I may call you Bern, the prohibition of AI tech…" (ytc_UgzUEJKyl…)
Comment

> its stupid to try and train "AI" on people stuff we dont need more humans we need working robots thats why they hallucinates and makes shit up because you give it human data give it fact and not feeling and your AI will work for you.

Source: youtube · AI Moral Status · 2024-08-30T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyvb4Gg99w7ltbtoKh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzu4uxUFDYT6Q_vyy54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyDkCXBVxCuvwOO5gp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztEXtDNtW0utUnr1x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxw039njuNsWrSrrbh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx3vzprVaBlNuucg1p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwXitAvpkr_fWPJhZ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxqOnNimwLKjv_bnDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxoR5LeghIIK0Zcn1J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFO4vYTHYH2f6Eil94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
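The lookup shown on this page can be sketched in Python: the raw model output is a JSON array of per-comment records, so indexing it by `id` recovers the coded dimensions for any comment. This is a minimal sketch; the function names and the two-record sample payload here are illustrative, though the field names (`responsibility`, `reasoning`, `policy`, `emotion`) match the raw response above.

```python
import json

# Illustrative raw model output: a JSON array of per-comment codes,
# in the same shape as the "Raw LLM Response" shown above.
raw_response = '''[
{"id":"ytc_UgxqOnNimwLKjv_bnDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgwFO4vYTHYH2f6Eil94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]'''

def index_codes(raw: str) -> dict:
    """Parse the model output and index each code record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

def lookup(codes: dict, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (KeyError if absent)."""
    return codes[comment_id]

codes = index_codes(raw_response)
record = lookup(codes, "ytc_UgxqOnNimwLKjv_bnDV4AaABAg")
print(record["responsibility"], record["emotion"])  # prints "developer outrage"
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when a single model response codes many comments in one batch.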