Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by picking one of the random samples below.
Random samples

| Comment | ID |
|---|---|
| "You're beginning to sound like Jordan Petersen ..." "I understand your frustrat… | ytc_Ugxww1BeW… |
| From a technological perspective this test is a little misinformed, in my opinio… | rdc_icigdnu |
| Wow you create robots, give them AI and the first thing you do is stick them wit… | ytc_Ugy19ssD3… |
| Problem with this kind of automation is how hard it may be to recognize or fix m… | ytc_UgynAAxBW… |
| We're asking the wrong question about AI. It's not: "What happens when AI takes… | ytc_Ugzggtr1J… |
| It would be wildly meaningful for me to be able to raise my children without hav… | ytr_UgyYMju6Q… |
| the irony of you taking a photo of a piece of art in a museum and then posting t… | ytc_Ugy7XnE3c… |
| Predictions of AI replacing all human work in 50 or 100 years are ridiculous. Th… | ytc_UgyQunBSn… |
Comment

> Wellp… at the end of the episode… I have no doubt that we're on the verge of societal collapse, so AI being the thing that brings it on, not at all far-fetched to me. Especially considering how arrogant people are around it who want to pretend that either it won't happen, or humans are clever enough to stop it from happening. 😖

youtube · AI Governance · 2024-02-16T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw3IURGz0q8N11igpJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyvH2hYP5hU0798phZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxg574sW8OPVfX4S-F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKIccbvf3zNRZgaPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwb3f0EGyB225ur2GN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZsBEGJ_AmjQNnYIB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6o0MUI_BNdKNtE2Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxc2LfB06odbwf0f2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzTPNv9DCdJkqG2jMN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwo0mxNYvWJknQ62TN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}]