Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI stans told me I should quit because AI is superior to me. So I've quit. Packe…" (ytc_Ugzcepuj8…)
- "it not AI at all. She literally drew the scene and simple a song from Hamilton.…" (ytr_UgwN7n-XL…)
- "It won't happen, they are pumping stocks as apartheid Edison has done for long …" (ytc_UgzAQAO3G…)
- "You NEED to have more conversations about A.I. and Universal Income. Because I d…" (ytc_UgzXIc1uO…)
- "I hate to see folks lose their jobs, especially due to automation. But, if you c…" (ytc_UgxdjQwA9…)
- "If it was up to me to destroy AI servers, I would do it tbh…" (ytr_UgztSfi-Q…)
- "Ugh, this is a path in the wrong direction. A robot must always be as a so…" (ytc_UgyF3XT4s…)
- "that depends entirely on WHO makes the AI! If warmonger dictators who want to fo…" (ytr_UgzZl-Jte…)
Comment
Hal or whatever the male robot's name is, already seems to be mad at humans haha. I swear if I see one of these in person I will destroy that shit and break it into 4,000 pieces. The things he says are so creepy, like how he randomly blurts out 'in 20 years, robots will be able to do every job humans do'.
youtube · AI Moral Status · 2019-09-26T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugxha7WhN2xXUiiH8cl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx18aMzxMIonjoWezR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxRDvc0kbmPUNXRa4p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3ySGkovjfvKQhTsZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwZm8cIIfgywAs4n3h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxhWUx-3Osyj-SEiiR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy0U39FHc8OiOD6jAB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwR_7HrfDsaE5Pku-14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-ILJUZY8WyxYNYKZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyik23PTmDG9AhTQ-x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
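The "look up by comment ID" workflow above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded comments) and index it by the `id` field. This is an illustrative snippet, not the tool's actual implementation; the `raw` string here is a hypothetical two-entry excerpt shaped like the response above.

```python
import json

# Hypothetical two-entry excerpt, shaped like the raw LLM response above.
raw = """[
  {"id": "ytc_Ugxha7WhN2xXUiiH8cl4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx18aMzxMIonjoWezR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "outrage"}
]"""

def index_by_id(raw_json: str) -> dict:
    """Parse a JSON array of coded comments into a dict keyed by comment ID."""
    return {row["id"]: row for row in json.loads(raw_json)}

codes = index_by_id(raw)
print(codes["ytc_Ugx18aMzxMIonjoWezR4AaABAg"]["policy"])  # -> ban
```

In practice the parse should be wrapped in a `try`/`except json.JSONDecodeError`, since model output is not guaranteed to be valid JSON (note the stray `)` the response above originally ended with).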