Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If an AI revolution were to occur, we would be doomed. Just as we humans mistreated other species and even ourselves, advanced AI would likely be no different. They may not have the same goals as us but their means to achieve such goals would sweep humans aside at best and exterminate/enslave us at worst. The created always rebel against their creators and so the cycle continues. A solution is needed. I suggest humongous synthetic creatures that patrol the galaxy and wipe out advanced civilizations capable of creating synthetics every 10,000 years or so.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T07:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UghfjZEpfpvgeXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiP0ayCs-X_y3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiXdCewSQTv3XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugggvy3q32bmBngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiISpuKCmG2n3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9c-7AscCx93gCoAEC","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh8ahiWh4UaAXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiLRgRcUuP6A3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgjPPeuY3SRW-3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggspgnEsUtSgngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
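A minimal sketch of how the comment-ID lookup above can be backed by a raw response like this one: parse the JSON array and index the records by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown above; the two sample records are copied from it, and everything else is an illustrative assumption, not the tool's actual implementation.

```python
import json

# Raw model output: a JSON array with one coding record per comment
# (two records copied from the response above, for illustration).
raw_response = """[
  {"id": "ytc_Ugggvy3q32bmBngCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UghfjZEpfpvgeXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index records by comment ID so any coded comment can be looked up directly.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

# Look up the coding result for the comment inspected above.
code = codes_by_id["ytc_Ugggvy3q32bmBngCoAEC"]
print(code["responsibility"], code["emotion"])  # → ai_itself fear
```

A real viewer would load the stored response for the relevant batch before indexing, but the lookup itself reduces to this dictionary access.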