Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> So we need Jarvis before someone inevitably creates Ultron. Then you'd need to give Jarvis all control in order to monitor everyone. Humans would be like pets living in luxury. Birthrates controlled. The only escape will it be that some humans would opt to go into a cryosleep and travel light years away from Earth in colonize another planet without AI.

Source: youtube · Topic: AI Governance · Posted: 2025-09-05T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxbeYJ9Ob-NM4hSrnV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyNRFqDhVAJ1w8fcbN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxzi2gVbiXoVJ3BsEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzcdGmJTomhnbK_G6R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxftS7B4vMhgPRlO7N4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9WwBQ202b_SZUOD14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxp8rba6Hk7aTMgdhR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwJrn-1pdQqosFvf9p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyhx-KMKcPijSS8S8Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzY__ViZ2RoLQgCy6B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
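A raw response like the one above is a JSON array with one object per coded comment. The following is a minimal sketch of how such a batch response could be parsed back into per-comment records, with each dimension checked against an allowed-value set. The `ALLOWED` sets are an assumption inferred from the sample values shown here; the actual codebook may define additional categories, and the function name `parse_raw_response` is hypothetical.

```python
import json

# Assumed codebook, reconstructed from the values visible in the sample
# response; the real coding scheme may include further categories.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "company", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "mixed", "approval", "outrage", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting any value outside the assumed codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        # Keep only the coded dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_X","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_raw_response(raw)["ytc_X"]["emotion"])  # prints: fear
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a label outside the codebook, so bad codings fail loudly instead of silently entering the results table.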