Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples:

- "Howie: "as long as you pay me" He's invested ha ha! Long live AI :)…" (ytc_Ugwr__n3a…)
- "atp i might just trace ai if it gives me: 1: attention 2: contaversy (a reason t…" (ytc_Ugy_WocRC…)
- "@vforvendittaanonymous7809 Who gives money for AI images? People give money to t…" (ytr_Ugx98YAWj…)
- "Think about it the robot prefers men and Caucasian people maybe because they are…" (ytr_UgyyHdNUu…)
- "But I will pick a human over AI every day of the week. Companies can do what the…" (ytr_UgxJaKJat…)
- "Step 1: Unionize and seek edit ownership over your business/factory/etc. Step 2…" (rdc_glku64a)
- "easy, just self host your model, like deepseek or some random llama model, easy …" (ytc_UgyCT1kCe…)
- "So all im hearing is the way our society is set up where companies aim only for …" (ytc_UgwtjN8i_…)
Comment

> The only way to make AI safe is to build in emotions morals fairness these are the sort of things that sway our decisions if you have a swinging brick for a heart if you are a cheat or simply have no morals you are instantly labelled evil this would be true of AI look at AI videos it knew it was putting people out of jobs would it do it. It doesn't see the consequences unless its delibrately asked to

Source: youtube · AI Governance · 2025-07-04T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
 {"id":"ytc_Ugyj-rO4qpQBtexKxHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz5T5PxErYnrhtZEF14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugy_F0sWsTz3G5HQFSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwdTEyP5D-e6ylulbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzW9E33bn3LU6D6UcR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgxgFRzFDuMqRsWrXlN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxLId5ib7Nu59Vb0OJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_Ugyjbppvf8pSpnRI1VR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugzr999ZXHaFk-PhCSN4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxKyJb5vFXJ2YmzT0F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
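The lookup-by-ID workflow this page supports can be sketched in a few lines. This is a minimal sketch, not the app's actual implementation: it assumes the raw LLM response always parses as a JSON array of per-comment coding rows like the one above, and `raw_response` here inlines two rows copied from that output.

```python
import json

# Two rows copied from the raw response shown above (assumption: every
# response is a JSON array of objects with these five keys).
raw_response = """[
 {"id":"ytc_UgwdTEyP5D-e6ylulbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzW9E33bn3LU6D6UcR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

rows = json.loads(raw_response)

# Index the rows by comment ID so a single coding can be fetched directly.
by_id = {row["id"]: row for row in rows}

coding = by_id["ytc_UgwdTEyP5D-e6ylulbV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → developer outrage
```

In practice the dict index is worth building once per batch: repeated ID lookups then cost O(1) instead of a linear scan over the array.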