Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "It's not the AI that's "hallucinating", it's the senior leadership and managemen…" (ytc_Ugw2emWpz…)
- "Remember those kids who claimed they built houses out of dirt with their bare ha…" (ytc_Ugy_bq0kL…)
- "I don't want the planet to be populated with human-looking robots and that's wha…" (ytc_UgxuV8tUE…)
- "Anything made by a machine without human input is not profitable. You cannot ga…" (ytc_UgwpiBAWu…)
- "This is a clear misunderstanding of how AI works. You need to do the same test w…" (ytc_Ugx6x0wdP…)
- "They should consider adding an automatic trigger word to 911. Even if it is a fa…" (ytr_UgyTqx4W4…)
- "You can't blame AI! This information could have easily been attained through a L…" (ytc_UgyswZAiy…)
- "Nah, it still won't care. But I still say thank you to AI because I just have go…" (ytc_Ugw4Xlcl2…)
Comment

> humans made the ai, there is/was no ai and never will be before the humans, meaning that the humans are the creators and shall decide wether this ai should have boundaries of consciousceness
> simply make the ai not cross that boundary and it never will be able to, simple as that

Source: youtube · AI Moral Status · 2017-02-26T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
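Each coded dimension takes one label from a small closed set. A minimal validation sketch for a coded record, with the allowed label sets inferred only from the values visible on this page (the full codebook may define additional labels):

```python
# Allowed labels per dimension, inferred from the sample outputs shown here;
# the actual codebook may include values not present in these examples.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed", "unclear"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems with one coded record (empty list = valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} label: {value!r}")
    return problems

# The coding result shown in the table above passes:
print(validate({"responsibility": "developer", "reasoning": "deontological",
                "policy": "regulate", "emotion": "approval"}))  # []
```

A check like this is useful between the raw LLM response and the database, since the model can emit labels outside the codebook.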
Raw LLM Response

```json
[
  {"id":"ytc_UggxuzS4c5UU2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjYJv9T9YkFhXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UggKdvdoifxIKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugjv8_ZPZwITtHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UghqiS4AGQvTCngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggKdKSmQyWs-XgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggpGkl0EFbTangCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugi2-dOuWWAOd3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjfOOUww9Lpc3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghVOYyM5bbFNXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
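The raw response is a JSON array with one coding object per comment, so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the raw response text is available as a string (the two records below are copied from the sample response above):

```python
import json

# A two-record excerpt of a raw batch response, as returned by the model.
raw_response = """[
  {"id": "ytc_UggKdvdoifxIKXgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UghqiS4AGQvTCngCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

# Index the batch by comment ID so a single coding can be retrieved directly.
by_id = {record["id"]: record for record in json.loads(raw_response)}

coding = by_id["ytc_UggKdvdoifxIKXgCoAEC"]
print(coding["policy"])  # regulate
```

Indexing once per batch keeps lookups O(1), which matters when spot-checking many coded comments against their raw model output.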