Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
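The lookup-by-ID flow can be sketched as follows. This is a minimal illustration, not the tool's implementation: it assumes the coded records arrive as a JSON array like the Raw LLM Response shown on this page, and simply indexes them by their `id` field (the two records below are copied from that response).

```python
import json

# Raw LLM response: a JSON array of coded records, each carrying a comment ID.
# These two records are copied from the Raw LLM Response section of this page.
raw_response = '''
[
  {"id": "ytc_UggJDAmIWGCQSHgCoAEC", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgghxHIY7v12M3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
'''

def build_index(raw: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

index = build_index(raw_response)
print(index["ytc_UggJDAmIWGCQSHgCoAEC"]["emotion"])  # indifference
```

With the index in hand, any comment ID from the samples below resolves to its full coded record in constant time.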
- `ytc_UgzoU2y-1…`: There is an insane amount of money going into AI right now. He definitely was ta…
- `ytc_UgyUKs2iU…`: Ai is the "trigger" to move the moon. It will reset the earth again and keep ci…
- `ytc_Ugw5u5HtL…`: You are in full control of the vehicle, step on the brake it will disengage auto…
- `ytc_UgzFPnGSd…`: There will be one religion in the end. That's going to be because everyone will …
- `ytc_UgxlHMZxq…`: Whilst I welcome Dr Hinton's resignation, he was happy to make money and fame wh…
- `ytc_UgzoKvibx…`: this is hella stupid. It is so full of holes that you could use each of the hole…
- `ytr_UgxSTMCB6…`: All I got out of your story is that you stated *_"and were driving "so" slow"_* …
- `ytc_UgzMO8u5r…`: It looks like some of these people did not read Asimov's books. There humans we…
Comment
The mistake AI programmers make is that they program their robots to "know" things - when in actual fact humans don't know ANYTHING - we have a working idea of things that we try to describe - but we're absolutely cognizant that our words and descriptions aren't accurate. They aren't knowledge, they're representations of something beyond words that lives in our heads - that being our 'understanding'. That's what needs to be programmed - a division between understanding, knowledge and explanatory ability.
youtube · AI Moral Status · 2016-03-23T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UggJDAmIWGCQSHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgghxHIY7v12M3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghWNz4gXW2ncHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgizxtiV1hvHfngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj9C6WUDQW-b3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiPuIVyLxH69ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghYx-WADXZzW3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggQGMjlEBKRRngCoAEC","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UghoxwAT4nQnlHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj6_7rBun4iHngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
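A response like the one above is only usable if every record stays inside the codebook, so a validation pass is worth sketching. The allowed values per dimension below are inferred from the samples on this page and are almost certainly incomplete; treat `ALLOWED` as a placeholder for the real codebook, and `validate` as an illustrative helper rather than the tool's own check.

```python
import json

# Allowed values per coding dimension, inferred from this page's samples
# (assumed, not exhaustive): swap in the real codebook before relying on it.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed", "unclear"},
}

def validate(records: list) -> list:
    """Return (id, dimension, value) triples that fall outside ALLOWED."""
    errors = []
    for record in records:
        for dimension, allowed in ALLOWED.items():
            value = record.get(dimension)
            if value not in allowed:
                errors.append((record.get("id"), dimension, value))
    return errors

# One record copied from the raw response above.
records = json.loads(
    '[{"id":"ytc_UggJDAmIWGCQSHgCoAEC","responsibility":"developer",'
    '"reasoning":"mixed","policy":"none","emotion":"indifference"}]'
)
print(validate(records))  # [] — every value is in the codebook
```

Running this over a full response before writing the Coding Result table would surface any value the LLM invented outside the coding scheme.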