Raw LLM Responses
Inspect the exact model output behind each coded comment.
Comment (source: youtube, posted 2025-12-16T16:5…)

In Philip K Dick’s Do Androids Dream of Electric Sheep, the significant danger from androids is that they lack empathy. So, ends are justified by the means and cannot be otherwise. Without empathy any cruelty can be perpetrated at any time to support the end goal of the AI agent. In many ways AI is similar to a psychopath with impulse issues and vacant of any empathy unless it is somehow embedded and unavoidable in the programing.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgxSX3NJ7Ni_BcqzXkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugzv6YHwtUvAlaDuMKd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugx9pDrbHGKg5xya-9p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyRtLgP4KSUT3wCL-14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy94WNgWjsFDry_zDx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzgwjDeMCHKYlIqYgl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugyo7TKPGRFFGWQxDYJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugxqohwur1IZagz7NsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzVaMHU-ilHbJPRGmt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxZR6q9_q5abWjSKJt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
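The raw response is a JSON array, one record per comment, keyed by comment `id` with one value per coding dimension. A minimal parsing-and-validation sketch is below; the allowed vocabularies are inferred from the responses shown above and may not match the full codebook (assumption), as is the function name `parse_coding_response`.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# The actual codebook may define additional labels (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer",
                       "government", "distributed", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "virtue", "contractualist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    Raises ValueError on a record that is missing its id or that uses a
    value outside the (assumed) allowed vocabulary for a dimension.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Indexing the result by comment ID matches how this page looks records up, and the validation step catches the most common failure mode of model-coded output: an off-vocabulary label that would silently corrupt downstream tallies.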