Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- @beforedrrdpr tell me you dont know how AI models work without telling me, it do… (ytr_Ugx_Ga3_H…)
- AI art should only be used for like maybe giving an artist you want to commissio… (ytc_UgwQeQa30…)
- It's rather unfortunate because though it's painful to admit that any form of Ar… (ytc_UgxxR_f44…)
- If AI takes jobs it should use the profit gained to promote higher education or … (ytc_UgwkU9-Qe…)
- I think people recycling the phrase, "...but you could have paid an actual artis… (ytc_UgwmZHCzw…)
- The students should all comme to class next.morning wearing a mask of the teache… (rdc_eerb5lb)
- But they have millions living in poverty and if it’s not some car it’s a robot w… (ytc_UgzIroBqR…)
- What do you mean? It's easy to relate to someone's situation even though it does… (ytr_UgyExFfia…)
Comment
"The probability that we rush ahead" - they were speaking of this as a risk to everyone.
Now - with that said - GREED AND PROFIT runs everything - ( as well as ego - control etc . . ) and multiple AI startups are POURING hundreds of billions into it - and even NATION STATES around the globe as well - and so the pressure to "rush ahead" is probably at the highest point ever regarding any technology in history. And so - WE ABSOLUTELY WILL be rushing ahead. And so the only question left is - what will happen ? Good bad or otherwise ?
Source: youtube · AI Moral Status · 2025-10-30T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzHCH_7D3Io1A9ZfUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydK4YU0WvkkXDhLZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyLW75ItQyohqOU8-x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi3pryPPZ16W5-jrN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAcSPetC-PdFpwvhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyCbY8TYZcio_FCw7B4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxqV2VekvkpMAdPBXd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwR5aqfElxaSpKXGOl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOXNQrSMo9rDaxXcJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz35HnxfBiL56aUr4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
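A raw response like the one above can be parsed and sanity-checked before the codes are stored. This is a minimal sketch, not the tool's actual pipeline; the field names come from the response above, but the sets of allowed values are assumptions inferred only from the labels visible here, not an exhaustive codebook.

```python
import json

# Allowed values per dimension: assumptions inferred from the labels
# visible in the raw response above, not a complete codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries missing a comment ID
        # keep a record only if every dimension carries a known label
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# One record copied from the raw response shown above
raw = ('[{"id":"ytc_UgydK4YU0WvkkXDhLZR4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
```

Filtering rather than raising keeps one malformed record from discarding an entire batch; rejected IDs could instead be logged and re-queued for recoding.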