Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- IF FSD was better I'd expect FSD robotaxis to be driving around by now. Like Way… (ytr_UgwWwExMT…)
- Yess I don't understand why would anyone ever want to automatize art. Like I'm n… (ytr_UgwW8s_tP…)
- So the goal of AI is depopulation by destroying the water.. they’ve been complai… (ytc_Ugy6SeXJt…)
- You’re not getting it. Once AI is in control, whether you have the physical abil… (ytr_UgxPh4syL…)
- Totally. When I worked in credit card lending, we *never* refined our models on … (ytr_Ugxn2Y6Km…)
- Waymo charges too much per trip because they very little labor costs. They shoul… (ytc_UgwmSTeK3…)
- “Humans suck at specifying goals and machines are ruthless at following them.” … (ytc_UgxIITIij…)
- It will never cease to amaze me how AI defenders will say “I just don’t have the… (ytc_UgzTcuZyf…)
Comment
> AI wouldn’t last as long as humans have, without humans. They could certainly wipe most of us out, but our extinction would lead to their extinction. At the end of the day, a strong solar flare is all it would take. I have a hard time believing the responses aren’t programmed somehow.

youtube · AI Governance · 2023-07-07T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyft19WDc6KAqUSS2N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6RqEiCc74Qj6as1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgmK2M2qlHpmCFC7J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyqiIT6jkjFKFRvect4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwAxKFL70UfpfjPxrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwu19479bLPHtsb2Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxcS-wuo0aOpYAueVV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzgeTYr6OeFge0_pbF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzlCVMRe2grRMYSVRV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwhS5zWhwpOQQUpOLN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
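A response like the one above has to be parsed before its records can populate the per-comment coding views. The following is a minimal sketch of how that parsing step might look, assuming the model is expected to return a JSON array in which every record carries an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the function name and the fail-loud validation strategy are illustrative, not taken from the tool itself.

```python
import json

# The four coding dimensions plus the comment ID, as seen in the
# sample response above. The full codebook of allowed values is not
# shown on this page, so only key presence is validated here.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if the payload is not a JSON array of objects
    carrying all expected keys, so malformed model output fails loudly
    instead of silently producing half-coded comments.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    coded = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
        # Keep the four dimensions; the ID becomes the lookup key.
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

# Hypothetical single-record payload for illustration:
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(parse_coding_response(raw)["ytc_x"]["emotion"])  # resignation
```

Indexing by comment ID mirrors the page's own lookup behavior: the "Coding Result" table for any comment is just the record stored under that comment's ID.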