Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> @Egg-Thor yeah you're probably right. Still I don't like the idea of humans relying on ai and then people will just start being lazy and start letting ai do everything because that would suck.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2023-04-05T14:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzwTRaYUjVnwDc3lsN4AaABAg.9up5lwPGrjs9vVH-zBG6Ud","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzEFg39EsPysHNc5St4AaABAg.9rAMnTyDjNgA3uKRdIO6eJ","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxQccf-8B4TUrqGyMl4AaABAg.9pWVj0tUQ659q14WXqm5ph","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw-oRlWGSYn-QvXvJd4AaABAg.9pR78enMEBd9pftDObyFG9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyGhxmohp6NLWlX-y14AaABAg.9p2EKfuLRIi9psFg5fP4NB","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwzrK8GJXk6qhNKWaF4AaABAg.9l0fzL_YZ989xPLp1Fc2bg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzuR-nEuxysUKMP7Z54AaABAg.9kcCNArsIT-9keuNEBXZIp","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgyFIflQZQnAm-bPG2R4AaABAg.9jyqnTNMsU_9o7cFF4aHkU","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwwSRK1vI-yuESCN7V4AaABAg.9jlOW_XL-Z49kN3L6nup0c","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzzUkbf2TBm2KpnI9R4AaABAg.9jlDr304qci9o7b8IS77hO","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
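The raw response above is a JSON array of per-comment codings along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed and validated is shown below; the allowed label sets are inferred only from the values visible in this sample, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may include more labels).
ALLOWED = {
    "responsibility": {"user", "developer", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "resignation", "indifference", "outrage", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose labels
    all fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical one-row batch for illustration:
raw = '[{"id":"ytr_example","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}]'
print(len(validate_codings(raw)))  # → 1
```

Rows with unknown labels are dropped rather than corrected, which keeps the validation step simple; a production pipeline might instead log them for re-coding.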