Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Why not buy a robot and let it go to school rather than their children.. robots …" — ytc_UgyZrzyv8…
- "Ubi needs to be implemented very soon. A new economic model is need. One where h…" — ytc_UgwaLXnn6…
- "Owners of AI companies - getting smarter / Employees of those AI companies - diggi…" — ytr_UgwsRg2-5…
- "Honestly. I kinda see ai art as lazy art now. Like all you did was write a promp…" — ytc_UgxX50G29…
- "just an improper mathematic assumption. if you close the hands half way and the…" — ytc_UgySuqpbV…
- "The Same guy who calls current AI bad tells in the same breath that normal peopl…" — ytc_UgwyV5Q6H…
- "As a professional driver I feel well qualified to comment here. First of all rig…" — ytc_UghaThg13…
- "Two articles: 1 - "Musk’s AI firm forced to delete posts praising Hitler from G…" — rdc_o7d9a5s
Comment

> Don't give robots or machines actual consciousness or the ability to make themselves, just like a Neural Network, they will find a way to progress the best possible, and they will then find organic life to be limiting/hindering their advance and attempt to eliminate all. Maybe some crazy person will try to give them the ability to evolve themselves and thus put everything we know into risk, so be prepared for the worst in the future. From a Scientific point.

Source: youtube · Video: AI Moral Status · Posted: 2017-02-26T20:1… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugi1fh0KAccCtXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjTGWk9PUx-MngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugj9idICdxSLYHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiQsS6t5zTTJHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgimVR2ajymjnngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UggcGZxI7MSntXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjMM-vqWZ_f7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Uggk-pkIsDLMpHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UghzbJDrOAuvNngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghERBSmKswolXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
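Because the raw response is a JSON array of coding records, each carrying an `id` field, looking up the coding for a given comment is a simple parse-and-index step. A minimal sketch of that lookup, assuming only the field names visible in the response above (the `index_by_id` helper and the truncated-to-two-record sample input are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM response, shortened to two records taken from the
# array shown above; the real response contains one record per comment.
raw_response = '''[
  {"id": "ytc_UggcGZxI7MSntXgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Uggk-pkIsDLMpHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
# Fetch the full coded dimensions for one comment ID.
print(codings["ytc_UggcGZxI7MSntXgCoAEC"]["emotion"])  # fear
```

The same index supports the "look up by comment ID" view: a missing ID (e.g. a comment the model skipped) simply raises a `KeyError`, which is worth catching when auditing batches.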