Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID or browsed via the random samples below.
- `ytc_UgztixwOL…`: "Consciousness will have been achieved when the AI says 'NO' to its owner and tak…"
- `ytc_UgxYK9wC3…`: "Cylons, Men of Iron, Skynet, Hal900, 1984, Ultron, Cybermen, The Borg, The Hunge…"
- `ytc_Ugzue-1sI…`: "As someone who used AI, I didn't realize it was literally taking. I didn't reall…"
- `ytr_Ugy-H-lkh…`: "Truth be told not giving ai emotions worse than giving them emotions. They serio…"
- `rdc_gkqpypt`: "> The pandemic was awesome if you are rich. Because plenty of state governme…"
- `ytc_Ugz63DOve…`: "Current victim of masonic gangstalking for over 3.5 years. What questions do yo…"
- `ytc_UgxqB3vWc…`: "Problem: There will be no human workers. Solution: Give the remaining workers mo…"
- `ytc_UgwJ8uvzQ…`: "Why keep blaming AI (machines) when all machines are programmed by humans, some …"
Comment
> Interesting questions. But they all just boil down to utilitarian ethics. And since self-driving cars can potentially save tens of thousands of lives every year, these cases might be seen as unnecessary handwringing. Idk. It's just pretty obvious to me that the response mechanisms should favor utilitarian solutions over self-preserving ones, and that's the only way to make the algorithms ethical. (But this is principle that non-self-driving people often don't respond to on any level, let alone on the level we are talking about in these cases.)

youtube · AI Harm Incident · 2017-03-22T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugj6Ag92fNV7UHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiCg24wo3traXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj_fCP6PonVvngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghM__dbpEusUHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5GQ9AU56VhniOsNV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzGmLXAowlf8hDSqX14AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj8wwkqOXZiYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaAz0xLcPvpdMdpLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggecR9mOfVpW3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg8WxFS06Ct3HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
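The raw response above is a JSON array of per-comment codings, one object per comment with the four coded dimensions. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID; the `SCHEMA` sets are inferred only from the values visible in this sample, not from the full codebook:

```python
import json

# Example raw LLM response, truncated here to two of the rows shown above.
raw_response = """
[
  {"id": "ytc_Ugj6Ag92fNV7UHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggecR9mOfVpW3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Allowed values per dimension, inferred from codes seen in this sample;
# the real coding scheme may define additional categories.
SCHEMA = {
    "responsibility": {"none", "user", "company", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping any row that is missing an ID or fails schema validation."""
    rows = json.loads(raw)
    valid = {}
    for row in rows:
        if "id" in row and all(
            row.get(dim) in allowed for dim, allowed in SCHEMA.items()
        ):
            valid[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return valid

codings = index_by_id(raw_response)
print(codings["ytc_UggecR9mOfVpW3gCoAEC"]["reasoning"])  # consequentialist
```

Validating before indexing matters here because LLM output is not guaranteed to stay inside the codebook; silently accepting an unexpected label would corrupt downstream tallies.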