Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When we are talking about AI that would be self evolving, you would have the most impressive mind in the universe probably. I don't think that kind of mind would do evil for no reason. Since there is no reason to do evil. Humans do evil because of instincts.
However... the problem is not in self evolving AI. The problem is in human made AI that has been programmed to do certain things by humans. And what kind of person is the guy who programs it will determine the nature of that AI.
For instance... compare Ultron with Vision from Avengers and how they were... made.
Source: youtube · 2015-07-30T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UghFMR-o-KZsRHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggAVBq5iJ1i43gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghLACWF_x1wyngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiwr3f7ga7jtXgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj6PDyAmJ7aLXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjKsQW0N7bzF3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugg1b17BbcoJLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugj7UUUQfs0ErHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghQIAH0cc0IXXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiKYwCM4-FcaHgCoAEC","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
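A minimal sketch of how a single comment's coding can be recovered from this batch output: the raw response is a JSON array with one object per comment, so parsing it and indexing by `id` gives the dimension values shown in the table above. The two-element array here is an abbreviated stand-in for the full ten-element response.

```python
import json

# Abbreviated raw model output: a JSON array of coding objects, one per
# comment, matching the "Raw LLM Response" format shown above.
raw_response = """
[
  {"id": "ytc_Ugiwr3f7ga7jtXgCoAEC", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjKsQW0N7bzF3gCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
coding = codings["ytc_Ugiwr3f7ga7jtXgCoAEC"]
print(coding["responsibility"], coding["reasoning"])  # developer virtue
```

This is how the "Coding Result" table for the displayed comment (responsibility: developer, reasoning: virtue) can be reconstructed from the raw batch response.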