Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I find it funny… they going 🦍 💩 on the so called illegals. Cause they are “supp… (`ytc_UgxtVx2Nn…`)
- Future generations will be like the therians in Argentina and AI plus the high I… (`ytc_UgwiXURgf…`)
- Ai in the US needs better clarity and governance. The Bill includes regulations … (`ytc_UgzqTG3cB…`)
- I think it’s closer to tracing if I was to take an artwork off the Internet, the… (`ytc_UgyvqlLH4…`)
- I think movies where AI is portrayed a threat has made some people paranoid, and… (`ytr_UgxDJr0vN…`)
- If a worker losses 10 hours a week of pay, how is that good if they need that mo… (`ytc_Ugxs_2KNB…`)
- @laurentiuvladutmanea3622 Artificial intelligence. Artificial mind. It doesn't … (`ytr_UgzyhyF9R…`)
- "2 years of experience" should've stopped there. You are not even remotely close… (`rdc_mjv56u7`)
Comment (youtube · AI Moral Status · 2026-03-19T11:4…)

> 17:10 this is what humans do. We learn things that are true at the time, then later on science proves other things to be true, yet we still believe what was taught at first. I'm speaking in generalization here. AI learns off the average of all human interaction correct? (Correct???).
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzM_r-IAAM_ZO8r3E14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxXnolJKKB9Ov6zuqJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzzRoQGwAm3lYyvW-t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwaRVkMByV2LC2Q-P54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzoXjsFWfPiNIShCPB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyNPypgDp0y7R4ByCV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxHiUBn5EHlYiyf0al4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzo4kLJL-RulOwq8GF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwqL3IHkT9t1r-iXpl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyyYFnwqUibgCJfG894AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
```
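Each raw response is a JSON array of per-comment codings, so looking up a coded comment by ID amounts to parsing the array and indexing on `id`. A minimal sketch of that lookup, with light validation; the allowed value sets below are inferred from this one sample output, not the full codebook, so treat them as assumptions:

```python
import json

# Dimension vocabularies inferred from the sample response above (assumption:
# the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"ai_itself", "user", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping rows
    whose values fall outside the known vocabularies."""
    codings = {}
    for row in json.loads(raw):
        values = {dim: row.get(dim, "unclear") for dim in SCHEMA}
        if all(values[dim] in allowed for dim, allowed in SCHEMA.items()):
            codings[row["id"]] = values
    return codings

# Lookup by comment ID (IDs shortened here for illustration):
raw = (
    '[{"id":"ytc_UgzM","responsibility":"ai_itself","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"}]'
)
codings = parse_codings(raw)
print(codings["ytc_UgzM"]["emotion"])  # indifference
```

Keeping invalid rows out of the index (rather than raising) mirrors how a coding dashboard can keep rendering even when the model emits an off-vocabulary label.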