Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Skynet is a fictional artificial neural network-based conscious group mind and a…
ytc_UgwlLsNxj…
I would not be near those types of robot my granpa died cause one of yhem killed…
ytc_UgxjbnwI7…
this is a consequence not of the original training but of how chatgpt was tuned …
ytc_UgxMHHx0c…
No, that won't ever happen... AI is not taking real jobs. People are going to us…
ytc_UgzT7hs9k…
Were all in a world that We created. While the world is what we made it, there w…
ytc_UgxIhh_Mm…
Funny how Elon’s example of dangerous AI programming is telling the computer tha…
ytc_UgxTPgvUr…
@MirelaVerse8101 Still can't believe that a movie that bad was created by huma…
ytr_UgxySZDgN…
Where ever there is cross cultural management especially bilingual managers are …
ytc_UgyFFtEv0…
Comment
I think that AI will eventually replace humans by simply being better than us at everything, just like we caused the extinction of neanderthals. Sapiens weren't necessarily hostile towards them, but we were just more efficient hunters thanks to better cooperation. We were in the same econiche, used the same limited resources, thus only the more competent species survived.
TL;DR: We have dug our own graves when we create intelligent, sentient machines.
youtube · AI Moral Status · 2016-12-17T16:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Uggrc8tNRdogingCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disgust"},
{"id":"ytc_UggtiVitI-RijHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghzkJNOia_YlXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiAhqmX_fK3T3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggAGuj2jism43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggcVHK1P5nvJ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiWDiGIha5ZA3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggKQN2feY0P9ngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgjeHZ3etkbLp3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjJt9r-M0ktEXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
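The lookup-by-comment-ID workflow above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codes, in the shape shown), check that each object carries the four coding dimensions, and key the codes by ID. The JSON fields match the response above; `index_codes` and `DIMENSIONS` are illustrative names, not part of the tool itself.

```python
import json

# A raw model response: a JSON array with one code object per comment ID
# (a single-element example in the shape shown above).
raw = '''[{"id": "ytc_UgiAhqmX_fK3T3gCoAEC",
           "responsibility": "ai_itself",
           "reasoning": "consequentialist",
           "policy": "none",
           "emotion": "resignation"}]'''

# The four coding dimensions every code object is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(response_text):
    """Parse a raw LLM response and key the codes by comment ID."""
    codes = json.loads(response_text)
    for code in codes:
        missing = [d for d in DIMENSIONS if d not in code]
        if missing:
            raise ValueError(f"{code.get('id')}: missing {missing}")
    return {code["id"]: code for code in codes}

by_id = index_codes(raw)
print(by_id["ytc_UgiAhqmX_fK3T3gCoAEC"]["emotion"])  # resignation
```

A lookup like this is what backs the "Look up by comment ID" field: any coded comment can be pulled directly from the indexed response without scanning the array.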