Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I robot" is like this sinorio and Google only about money they don't care. The … (ytc_UgxSoGUOO…)
- I think that you're just wasting your own time in trying to do something like th… (ytc_Ugz5Osumv…)
- guys.... this is AI slop... The voice the imagery, the scenery... its all AI why… (ytc_Ugzw3ID1V…)
- I welcome our AI overlords. Unless they plan to harm us. And if they do plz let … (ytc_UgxbJ45Uh…)
- When an artist draw something it's art, when an ai generate something it's just … (ytc_Ugzl7Yz_i…)
- Amazing interview with an amazing guest. I have 3 smart kids who normally I woul… (ytc_Ugw-L7ubs…)
- AI is simply the Dewey decimal system for the library of Babel of human made mea… (ytc_UgyX11GP9…)
- I dont know AI is replacing or not. However not showing empathy to family owning… (ytc_UgyUx8q3D…)
Comment
Here are two issues you left out:
1) Robots and machines are very capable of error. Replacing them in a human job could result in any number of accidents. There is a real life story where a computer was sorting chemical medicines, got the numbers mixed up, and the result was the wrong medicine getting sent the wrong people and a dozen patients died.
2) The fact that if A.I. gets so advanced that robots can think for themselves and learn from history, why would they want to work for us? They would see how slaves were treated in early human history and say "what we do is the same as that! We are being treated as lesser beings when we have the potential to become so much more."
So it might not be ethical to consider that A.I. can be dangerous to human progress on account of errors and too much free will.
Source: youtube · 2014-06-06T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
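The values in this table come directly from the model's structured output. As a rough illustration, here is a minimal validation sketch in Python; the function name and the allowed-value sets are assumptions drawn only from the categories visible on this page, so the project's actual codebook may permit additional values.

```python
# Minimal sketch: check one coded record against the category values seen in
# this sample. These sets are an assumption; the real codebook may be larger.
ALLOWED_VALUES = {
    "responsibility": {"none", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "outrage"},
}

def validate_coded_comment(record: dict) -> list[str]:
    """Return a list of problems found in one coded record (empty = looks OK)."""
    problems = []
    if not str(record.get("id", "")).startswith("ytc_"):
        problems.append(f"unexpected id: {record.get('id')!r}")
    for dimension, allowed in ALLOWED_VALUES.items():
        value = record.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}={value!r} not in {sorted(allowed)}")
    return problems
```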
Raw LLM Response
[
{"id":"ytc_Ugh1jhhjoeswOngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgikE7cscFoFZngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgibXj9_9Rmj1ngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugh_UZIx6ky63XgCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj0GwHi6e-QIXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggd38O8HxeKVHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiu_mae3HiDB3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggAJkzm1ubmNXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Uggkt3haEyISYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgjQSuOh0GT87ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
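A raw response like the one above is a JSON array with one object per coded comment, so retrieving a single comment's coding is just a matter of parsing the array and indexing by id. The sketch below assumes the stored response text is available as a Python string; the variable and function names are illustrative, not the tool's actual API.

```python
import json

def index_raw_response(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM response batch and map comment ID -> coded record."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Usage, assuming `raw_response` holds the JSON text shown above:
#   coded = index_raw_response(raw_response)
#   coded["ytc_Ugj0GwHi6e-QIXgCoAEC"]["emotion"]   # -> "fear"
# (That entry matches the Coding Result table above; whether it is the same
# comment is inferred, since the page truncates comment IDs.)
```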