Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "That sounds like a smart and forward-thinking decision. Developing hands-on skil…" (ytr_Ugx7r2cLB…)
- "The A.I. is producing more entertaining material than most television and movies…" (ytc_UgyxNBBB7…)
- "From 6 months to 1 year to 5 years to10 years...not sure about ai replacing huma…" (ytc_Ugwc8HSA3…)
- "@Fighter101network you are so mad bro ahah you have no idea what software is, yo…" (ytr_UgyKpT5AO…)
- "Call me when you find an A.I that can lead a conversation with its own insight r…" (ytc_UgwKupZ2h…)
- "The arguments presented seem to be in conflict with one another. Why complain ab…" (ytc_UgwpFXybM…)
- "And this is what they are showing to the world. Imagine what they actually make…" (ytc_Ugy-wDEYH…)
- "These guys are smart enough to build ai but dumb enough to not realise this coul…" (ytc_Ugy4KvkYu…)
Comment
As humans we learn that someone who always tells you what you want to hear, is probably not the most reliable source of information. However, when a machine literally uses that as the basis for the information it provides, we see no harm. Just because big tech claims it's inevitable. Probabilistic reasoning in combination with an eager to please programming is a disastrous and dangerous combination. AI hallucinations and compound errors are the rule rather than the exception.
Source: youtube
Posted: 2026-01-25T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzlEX1w3yvGnqlbSit4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsYn-baN-vNCzxjnt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBfZJT-UmJKWBx9LB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_v7KHAhkY6dTBN3Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnZy7whAUA_tuatlF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_U2D2QMa3l3cN1Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzObe7Q9Tlpnzl9rm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyMHRovwGoEmGmiYCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxISdJQfke3IZvow2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
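The lookup-by-ID workflow above can be sketched in Python: parse the raw LLM response (a JSON array in which each element carries a comment `id` plus the four coded dimensions) and index it by `id`. This is a minimal illustration, not the app's actual implementation; the variable names are illustrative, and the two entries are copied from the response shown above.

```python
import json

# A raw LLM response in the format shown above: a JSON array of coded
# comments, each with an "id" and four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyMHRovwGoEmGmiYCJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the batch by comment id so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up a single comment by its id.
code = codes_by_id["ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg"]
print(code["responsibility"], code["policy"])  # company regulate
```

A dictionary keyed on the comment ID gives O(1) lookup per comment, which is all the "look up by comment ID" box needs once the batch response has been parsed.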