Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by choosing one of the random samples below.

Random samples
- "The first thing we give a humanoid robot... A Gun... we're so fucked on so many …" (ytc_UgztWpi4u…)
- "Btw I love Ai i talk and ask with Ai and Ai help me alot than frnds…" (ytc_Ugy9o2zpk…)
- "Can we stop pretending AI can code? Maybe in 10+ years it will. As of right now …" (rdc_n4cow59)
- "I am studying at university to become a teacher. The amount of AI slop they are …" (ytc_UgxdTNTsl…)
- "That's the thing. Once we reach a post scarcity economy capatalism itself will c…" (rdc_cz2xs4n)
- "I feel like he double talks, he (Elon) says its dangerous yet he's making these …" (ytc_Ugxz23WHT…)
- "NONE of these AI programs are usable to me because they are all biased and censo…" (ytc_UgwqaDjn2…)
- "Honestly, automation should be a blessing. But since nearly every society determ…" (ytc_UgyDYrCV3…)
Comment
Looking at Peter's opening word in this video, I really don't understand his confidence, which seems close to certainty, in AI leading to a golden age. It might. Or it might lead to destruction. No one can know. Peter's absolute belief in abundance feels almost like a religious belief. If we are giving birth to a new species, why would we have confidence that this new thing will want us around? I think there are other scenarios too. If this thing is not conscious, but can create crazy weapons or energy sources, then there could be very dangerous competition that develops between those that control them. Obviously China vs the US, but it could shake out in other ways. I wrote a book ten years ago(unpublished) where competing AI's were created to hunt each other down. I can see something like that.
Source: youtube, posted 2026-02-06T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0USz-L7apknavdVR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyn1FGb3LatqP5lyS14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzBHVkbNaX2eraq7s94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwXeDERcxgw1U6bD-Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgygpzwB563bpAYBe-l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzZkDKIq-tp0vmCQQR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyPCkiteQ4UO9vEjqZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyGUTN6c8zD9viaOXl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzMI4x4N5hHqhVifmR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw579cE_BfZBgH8AO54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
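The look-up-by-comment-ID view above presumably parses a raw batch response like this one and indexes it by the `id` field. A minimal sketch of that step, assuming the response is valid JSON with the four coding dimensions shown in the table (the `index_by_id` helper and the sample IDs `ytc_abc` / `rdc_xyz` are illustrative, not the tool's actual code):

```python
import json

# Hypothetical two-record batch response; the field names
# (responsibility, reasoning, policy, emotion) mirror the
# Coding Result table above.
raw_response = """
[
  {"id": "ytc_abc", "responsibility": "distributed", "reasoning": "mixed",
   "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_xyz", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "mixed"}
]
"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and build an id -> coding lookup,
    skipping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {
        rec["id"]: rec
        for rec in records
        if REQUIRED_KEYS <= rec.keys()
    }

codings = index_by_id(raw_response)
print(codings["ytc_abc"]["emotion"])  # -> fear
```

In practice the model output may be truncated or wrapped in extra text, so a real implementation would want to catch `json.JSONDecodeError` and surface malformed responses rather than silently dropping them.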