Raw LLM Responses
Inspect the exact model output for any coded comment; look up a comment by its ID.
Random samples

- `ytr_UgxUHKdb1…`: "@brennenhrebeniuk9661 nope sorry, that's not good enough. All this AI slop rub…"
- `ytc_UgzT0F6tU…`: "I'm sorry, but if you've lost your job to AI, your job never really mattered tha…"
- `ytc_UgxWFj6PD…`: "This is one of the most well thought out and articulated videos I’ve seen about …"
- `ytc_UgynOuAkr…`: "She forgot to mention The FTX connection. Anthropic story has a chapter that’s w…"
- `ytc_UgzvsKpQe…`: "Absolutely impossible. You see. Your robot is only abel, to be active in what ha…"
- `ytc_UgzQebHBN…`: "FRICK YEAH!! I was feeling all depressed about AI art but this video just made m…"
- `rdc_d7ktd6d`: "England under Cromwell was a bit like a Christian version of Afganistan under th…"
- `ytc_UgxCcOOiz…`: "I have no opinion on this, I don’t care. I just hate the idea of using something…"
Comment

> one issue I've thought about is how we might give conscious AI morality. that would be one way to at least ameliorate the Big Red Button problem. the key issue is humans themselves aren't aligned, so why do we worry about AI being aligned? with whom do we expect them to be aligned with? the West? China?

youtube · AI Governance · 2025-11-14T12:0… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDZS306R-7x8bgz0B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQB5hgHXaRngAftCd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgymwnkHZ84eWhZI-C54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhbugqOMyK9bMXLul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwFarawg-fVJS0QBdZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzk_tMRNMNQuPxqOEt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyzakcORSxZ_C9e7XR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwbpSYZErdxduo4Se14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxbq1iDaUY11XLRwTJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz8s_MX-jTsuGpeZf14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
```