## Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
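A lookup by comment ID can be sketched as follows. This is a minimal in-memory sketch, assuming the coded records arrive in the same JSON batch format as the raw LLM response shown on this page; the `build_index` helper and the sample record are hypothetical, and a real tool might query a database instead.

```python
import json

def build_index(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Hypothetical single-record batch in the same shape as the raw response.
raw = '''[
  {"id": "ytc_abc", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "resignation"}
]'''

index = build_index(raw)
print(index["ytc_abc"]["emotion"])  # -> resignation
```

Indexing once and looking up by key keeps each inspection O(1), which matters when a coding run produces thousands of records.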
### Random samples — click to inspect
- "Good news they found a solution to this problem. The way to fix this is to hire …" (ytr_UgyUWH_sQ…)
- "The one in the middle is a robot. Think about it. They have already mastered thi…" (ytr_UgwCha1qs…)
- "Current AI bots need a commanded to do anything. Humans have hormones which make…" (ytc_UgyNPrvSs…)
- "😂they clearly aren’t using their brain or they’re missing it. There’s certainly …" (ytc_Ugwl20k1x…)
- "I find this so overwhelming. Of course, this Robot is programmed to be friendly …" (ytc_Ugxgg7ymc…)
- "Why are we pretending we don't understand what AI hallucination is? It's a patte…" (ytc_Ugyo2fYeA…)
- "If driverless trucks aren't mind blowing enough for you, wait till pilotless pla…" (ytc_UgjxNK4Ei…)
- "Your second half explains why humans will always have to be in the loop...howeve…" (ytc_Ugxgshrc-…)
### Comment

> Doesn't matter if we are worried. We can't stop making ai because we don't want Russia or China to get it first. Damned if we do damned if we don't

youtube · AI Moral Status · 2025-10-31T03:4…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
{"id":"ytc_UgxulKGJi86wcT0kDzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxx2bURL3blvxVxZQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwR7p97wVP-tPwq17p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpdD3iI1J9uBK_b0B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwk184PxRN3wdcOYzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxir1tqHkzbkDm6jLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNVOH9G5701G7oaQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd0s0yoZMjnfOv6QN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUuGtUdClySVisWrF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzu3eh73nscrGi7bxN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
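A batch response like the one above can be validated before it is stored. The sketch below is an assumption, not the tool's actual schema: the allowed values are inferred only from the codes visible on this page, and the real coding scheme may include more categories.

```python
import json

# Allowed values inferred from the codes visible on this page.
# ASSUMPTION: the real coding schema may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer",
                       "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "outrage",
                "mixed", "resignation"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM batch response, raising on any out-of-schema value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(len(validate(raw)))  # -> 1
```

Rejecting out-of-schema values at ingest time keeps downstream tallies of each dimension trustworthy, since LLM outputs can drift from the requested label set.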