Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click one to inspect):

- "9:15 As someone who works in this space, let's be clear, today's "AI" does not K…" (ytc_UgzEehQVv…)
- "The test of a novel optical illusion (and not Googleable) presented to an AI. Si…" (ytc_UgyXMbSGU…)
- "AI is sycophant in its current chatbot iteration. It agrees with whatever narrat…" (ytc_UgwagPxVb…)
- "It's not going to be awesome.!! Okay once the robot teachers and kids walk the d…" (ytc_Ugy8Yr8UA…)
- "I think it would be good for context if you included some information about how …" (ytc_UgxaHw3hb…)
- "It's very disturbing to hear a supposedly intelligent human being talk about the…" (ytc_Ugi4loXn5…)
- "If I made a statue and it started killing people I'd want it destroyed. Regulati…" (ytc_UgzhL-K3I…)
- "Companies aren't plugging AI into sensitive, confidential, or messy data. They a…" (ytc_Ugwhoizvw…)
Comment
there's something i don't understand about agi vs narrow ai: today's AIs can already collectively do any mental task better than most humans, so what do we need AGI for when we effectively already have the same functionality?
and wouldn't the step to get there be relatively easy? just have an AI decide which AI model to use for any given task, and voila.
another thing: how is ChatGPT not already general when it can handle so many different language-dependent tasks?
Source: youtube · 2026-02-06T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx-hvAecg1N2Ty_l854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUpwiscvkMHSS77rt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugymlz0wtY5lzKgXJjZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTuxEjNG1xLwEpu3Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx5eBNXrenTsifoRCB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0QAAVQ2oodZdV3vN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx0Z5Mx2a1kUUbienF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwYjJ4tgt4WQafuIrN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsFjcjIn4qQsrbXv54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwK0HLfjBRSMnAtC5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
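The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of the lookup-by-ID step in Python (the `lookup` helper and the inline sample data are illustrative, not part of the actual tooling; field names match the response above):

```python
import json

# Two codings copied from the raw LLM response above, inlined as sample data.
raw = """[
  {"id": "ytc_UgwTuxEjNG1xLwEpu3Z4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx0QAAVQ2oodZdV3vN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# Index the parsed codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytc_UgwTuxEjNG1xLwEpu3Z4AaABAg")["emotion"])  # indifference
```

The first sample entry corresponds to the "Coding Result" table shown above (responsibility none, reasoning unclear, policy none, emotion indifference).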