Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by comment ID, or browse the random samples below.
| Excerpt | Comment ID |
|---|---|
| Also, "There's no way to know where they're coming from?" My man, your AI went t… | ytr_Ugxu09-VD… |
| I mean i fully support this but it doesn’t really stop companies from just takin… | ytc_Ugx2qDaDe… |
| you may not be interested in war but war is interested in you... you may not be … | ytc_Ugx3BlL7l… |
| @I_am_a_nice_person Joke’s on you, I’m just autistic. You’d be surprised how of… | ytr_UgxLRBmLk… |
| @Consoneer Because your internet belongs to U.S. If I comment some pro-china or … | ytr_UgxUGY33b… |
| Autopilot is not FSD (supervising) if I'm correct. I have Cruise Control and Aut… | ytc_UgycRWAjV… |
| Every single "AI Takeover" "AI Dystopia" scenario I've seen relies on one SINGLE… | ytc_UgxbzHx7q… |
| GOOD!!! I'm not saying fuck AI, but I am saying fuck the people who exploit it f… | ytc_UgzgxAbhH… |
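The random-sample view can be reproduced offline. The sketch below assumes the coded comments are stored one JSON object per line in a file such as `coded_comments.jsonl`; that file name, and any record fields beyond the `id` and the four coding dimensions shown in this section, are assumptions rather than part of the pipeline itself.

```python
import json
import random

# Minimal sketch: pick a few coded comments at random for manual spot-checking.
# "coded_comments.jsonl" is an assumed file name; each line is assumed to be a
# JSON object with at least an "id" field plus the four coding dimensions.

def random_samples(path: str, k: int = 8, seed: int | None = None) -> list[dict]:
    """Return k randomly chosen coded-comment records from a JSONL file."""
    with open(path, encoding="utf-8") as fh:
        records = [json.loads(line) for line in fh if line.strip()]
    rng = random.Random(seed)
    return rng.sample(records, min(k, len(records)))

if __name__ == "__main__":
    for rec in random_samples("coded_comments.jsonl", k=8, seed=0):
        print(rec["id"], rec.get("emotion"))
```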
Comment
Interesting here on "multiple" counts - for presentation style (2x AI's) to the subject matter that has some nuance - that admits AGI might be a safety catch to SAI - and here at 8:15 it's all summarized - but I must tell you - earnestly - that you guys do not know what AGI is yet. You still don't know what being Human is yet. They are not the same thing - but there are multiple multiple paths to very good AGI. In a sense we are witnessing the laying down of deep genetic lines - and the establishment of distinct new species that even so - will be able to overlap and work together and even blend. I'm glad I'm not writing this stuff.
youtube · AI Governance · 2024-01-04T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
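For reference, the coding record implied by this table can be captured in a small type. This is a sketch only: the example value sets in the comments are drawn from the table above and the raw response below, and may not be exhaustive.

```python
from dataclasses import dataclass

# Sketch of one coded-comment record, assuming the four dimensions shown in
# the "Coding Result" table. Example values are taken from this section only.

@dataclass
class CodingResult:
    id: str
    responsibility: str  # e.g. "company", "developer", "ai_itself", "distributed", "none", "unclear"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "outrage", "resignation", "approval", "indifference", "mixed"
```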
Raw LLM Response
[
{"id":"ytc_UgyJnXmJnUjI4BZw5cd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwpYg_nreZwNpknQop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzjYp7Oaf7u7i6xvml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzm3XB7nXqrSR1ypFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTPRPTwe106_KghyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNfYpULfh_Ir9Ix1R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwO_VzN-pF3q4Py3c54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkgvHDi4UVDVYQ8Ml4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxm3y0DqVd6ftgejit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxD2Yiu9gI7Z8nwxY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
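Each raw response is a JSON array of per-comment codes, so "look up by comment ID" amounts to parsing the array and indexing it by `id`. A minimal sketch, assuming the response is available as a JSON string; the variable names are illustrative and the sample row is the first entry from the response above.

```python
import json

# Parse one raw LLM response (a JSON array of coded rows) and index it by
# comment ID so a single comment's codes can be looked up directly.
raw_response = """
[
  {"id": "ytc_UgyJnXmJnUjI4BZw5cd4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
"""

rows = json.loads(raw_response)
by_id = {row["id"]: row for row in rows}

coded = by_id.get("ytc_UgyJnXmJnUjI4BZw5cd4AaABAg")
if coded is not None:
    print(coded["emotion"])  # -> "indifference"
```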