Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
i am wondering
if we assume ai will take over
and we understand that ai can do any job better than we could do it
why would it bother with doing things only we need?
i would think things like farming and plumbing and such tasks will remain important simply because by the time ai can do them better, it will not want to do them
i also think that with human evolution, we have become more humane both to eachotter and to "lower intelegence" as such one can only hope that a even higher intelelegence would be even more humane
youtube · AI Governance · 2025-06-21T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzgMpQuX6_LuAZLmBV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzzAjtTDwmuXgxi5iJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzjMeM33SJcqDjoAqt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3h6hyG7txb_6H2VR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNDly0S5U8S4zCpDZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugyn7cLoBNJbyNrITMl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1hsx5DuMDiPnDeQJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxLIKXmnOz0F05dmYF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy84Oviamhg913WAqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUj7PwIc5RJqYYJVF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```