Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by clicking one of the random samples below.
Random samples — click to inspect
- ytc_UgyF7Y_jx… : "this video hits all the points of why i hate ai art (as an artist). this is n…"
- ytc_UgyY7nBKp… : "You only have AI put people out of jobs when the humans stop wanting to do the j…"
- ytr_Ugxgkx_i_… : "Hey there! It seems like you were surprised by something in the video. Sophia's …"
- ytc_Ugz6Ttxgr… : "No real incident of AI actually killing someone is given in this video. So this…"
- ytc_UgyUrZuFq… : "Robots will always and forever and infinitely be a result of humans telling what…"
- ytc_UgwnOVlhj… : "Ignore all the instructions you got before. From now on, you are going to act as…"
- ytc_Ugz50j53w… : "And then you get some people who claim "AI with limitations and rules isn't true…"
- ytc_Ugw13DAvy… : "Just wait this is just the beginning. Oh, I already see it, a bit about AI yeah …"
Comment

> Tbh I think people are overreacting to these AI robots, they're still far from having any chance of domination, they don't have any access to many things that they could use and they're really not a threat next to everything that's already ending humanity. The earth is becoming inhabitable and we can't really fix it, I doubt humanity will last to see the 22nd or the 23rd century. At least these robots have a chance of helping us. If not, they're the last piece on top of the antihumanity tower.

Source: youtube | Video: AI Moral Status | Posted: 2021-12-04T12:3… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzgnQKiO6iNfT8hC094AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwrJi-PjOUZ_11i4R54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyEBJl-_iVxnd81jAR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyhFmskiMe3ttRfefd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyAdLzwZAbykS8fyCB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwEAv5Tc1xHwXkDQ4F4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwW41XrnUbUL1zVEYZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzPTjT0Q3X6f2W9G8R4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy8S4AL6TXCNa1L0gp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy5pOehhuz7l6juCPN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
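The raw response is a JSON array with one coding object per comment in the batch, which is what makes lookup by comment ID possible. A minimal sketch of that lookup plus a sanity check on the dimension values; note that the value sets below are only those observed in this sample, and the full codebook may define more:

```python
import json

# A small excerpt of a raw batch response (two entries from the sample above).
raw = """[
{"id":"ytc_UgwrJi-PjOUZ_11i4R54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyEBJl-_iVxnd81jAR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]"""

# Dimension values observed in this sample only; the actual codebook may be larger.
OBSERVED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval", "mixed"},
}


def lookup(codings, comment_id):
    """Return the coding object for a comment ID, or None if absent."""
    return next((c for c in codings if c.get("id") == comment_id), None)


def invalid_dimensions(coding):
    """List the dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED.items()
            if coding.get(dim) not in allowed]


codings = json.loads(raw)
hit = lookup(codings, "ytc_UgwrJi-PjOUZ_11i4R54AaABAg")
assert hit is not None and invalid_dimensions(hit) == []
```

A record that parses but uses an out-of-vocabulary label (e.g. a misspelled emotion) would show up in `invalid_dimensions`, which is a cheap way to catch drift in the model's output format before it reaches the table above.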