Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @MattCurry-v6b4z I am personally aware of the absolute lethality of the robotic … (ytr_UgyYsFIkg…)
- I mean the reason why humans aren't that way is because we have morals, ethics a… (ytc_Ugw7zjMlF…)
- Why human art is better than ai's in one sentence "Robots cant make human mistak… (ytc_UgxvIKpMJ…)
- If AI says unaliving people, we shouldn't ban it any more than we should ban car… (ytc_UgwUNpe6v…)
- Is it bad that I focused on the ridiculous walk that woman did when he said the … (ytc_UgxZcKnJi…)
- For legal research, Westlaw or LexisNexis. They are leveraging AI. Robin & Harve… (ytc_Ugz64zyaF…)
- If you grew up 1976 reading the first few years of 2000AD comic, you will see EX… (ytc_Ugyn7pyHO…)
- Yes, lets unemploy every human and replace it with AI. Economics will grow to un… (ytc_Ugwn6aUPC…)
Comment
AI might not be actually sentient, but if acts, talks, and feels to us like it's sentient and when it's loaded into a weapons system, and it operates autonamosly and kills people, it really doesn't matter if it is sentient or not. It's dangerous to humanity if it develops it own self preservation code. Humanity would do well to not completely dispose of the old technology that the current non AI infrastructure is running on. There should be a selector switch on everything for/ hand/ off/ auto/ and AI/.
Source: youtube · 2025-12-02T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxTwC6gxFWolraCJUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyesCM0OGZM-2-UAtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxpgmtJr-FpuIP68ZV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz11OkXymUhV-p8czR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxEFORGe_VM2FzO_6h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxj35kIHBRL8iWqjZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQKm0-3-HRWhQQPy94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxYLJiruJkL9fDcfHl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOmneNPjO5IiMk3F54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
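As an illustration, the lookup-by-comment-ID step can be sketched in Python. This is a minimal sketch assuming the raw model output is a JSON array of records shaped like the response above (the `index_by_id` helper and the two-record sample are hypothetical, not part of the tool):

```python
import json

# Raw model output: a JSON array of coded records, one per comment.
# (Two records excerpted from the response shown above, for illustration.)
raw_response = """
[
  {"id": "ytc_UgxTwC6gxFWolraCJUd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz11OkXymUhV-p8czR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugz11OkXymUhV-p8czR4AaABAg"]["policy"])  # regulate
```

A real pipeline would also want to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`) before trusting the coded values.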