Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples:

- "You hear how afraid they are, our 'betters'? They don't care about famine, or po…" (ytc_UgxHHNOVD…)
- "I don't care about ai being a problem I only care its stealing all my ram and go…" (ytc_UgxpwXYIH…)
- "i left the video here 6:30 i hate AI taking over art, but in this case, its just…" (ytc_UgzGLUtZw…)
- "To anyone that actually believes AI is going to make everything better, all I ca…" (ytc_UgyVg2D1J…)
- "Human society only works because we are interdependent on each other. AI removes…" (ytc_UgyAKryGA…)
- "Hey this is all bullshit. Let’s get back to basics, all you scientists overthink…" (ytc_Ugxz0byrD…)
- "Art is also a reflection of the self, a response to the art before it and the ti…" (ytc_Ugz1-jx_7…)
- "AI controlled robots will most likely serve us, at least at first, then if AI ha…" (ytc_UgxBvvh31…)
Comment

> good interview i wonder if he low key salty that openai making the first move and he focused on tesla and spacex now hes all like it needs to be regulated lol but again he did say he has been an advocate of long time regulation so who knows just found it funny

Platform: youtube · Topic: AI Governance · Posted: 2023-04-19T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUQyP2S_eytOtGfEx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxNuDKt6KIPbxQIS394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyXeBb2Mryy1Fk0eaV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzOH5xZrPAsPTKGEjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyRWBE9VHm-TMNGYeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxp17aeX9xqbdNesSd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8LE9VNV3IbG39zFR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwZQ_vf-nBzEb59izh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzFdbvPdyvzGJjI8Ex4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxMIwYzMkxVi4n-rdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
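The lookup-by-comment-ID step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes only that the raw model output is a JSON array of records shaped like the response above (an `id` plus the four coding dimensions), and the `index_by_id` helper name is hypothetical. The two sample records are taken verbatim from the response shown.

```python
import json

# Raw model output, assumed to be a JSON array of coding records as shown
# above; these two records are copied from the displayed response.
RAW_RESPONSE = """
[
  {"id":"ytc_UgyXeBb2Mryy1Fk0eaV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzOH5xZrPAsPTKGEjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the raw LLM response and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)

# Look up the coding for one comment by its ID.
record = codings["ytc_UgyXeBb2Mryy1Fk0eaV4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # indifference
```

Indexing the whole batch once into a dict makes each subsequent ID lookup O(1), which matters when the same response is inspected for many comments.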