Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Bill Gates on AI: "Low interest in face" / Bill Gates on population control: Excit… (ytc_UgwAmCCGg…)
- It is not just storing the information. It is a text predictor, not a direct rep… (ytc_UgxgBQWvE…)
- While i agree with you, im afraid this video wont age well in the long run. You… (ytc_Ugyyx5Q8X…)
- AI WILL replace some of the IT guys. Maybe 25-30%. GhatGPT helped me with a scri… (ytc_UgwI-F148…)
- The best way to put this is, why would we want to stop learning and let AI gener… (ytc_Ugx7UWuOO…)
- For the ones who don’t know, she’s a character in a game “Detroit Become Human” … (ytc_UgyFlQyXa…)
- I usually cut straight to the chase and lead with "Advanced AI is going to kill … (ytr_Ugwb0ySrT…)
- I write and use AI as my editor in chief trained in Chicago style ABC all the ba… (ytc_Ugw0X3zl6…)
Comment
The people that wrote Ai 2027 struggle to see or reason past resource domination to humanity surviving and flourishing as a fundamental resource that is rare. There is no super intelligent motivation to wipe out another species if more optimal conditions can be produced for collaborative flourishing.
Think about it like this a well skilled, highly knowledgeable human population can provide resistance to any possible error and or provide maintenance if something were to fail. Better well aligned allies than no allies at all.
youtube | AI Governance | 2025-08-02T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwTXflHuZSW5MciX9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiWHTVM68BSZWlTwd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgydJL7Bi3gIhawbDwp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAxiV8SVb-zCCc1sV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwoqhxOs-dPt5P9lRF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxjIj-tYtNs2VbukWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxSfIfFZWoqa9gTKCl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyKSmER7-VE3C-XANd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxdVqUnosaZ5s0tCqZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxq8SSJXOz2pYsUnf14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
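A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the result table and allowing only the category values that actually appear in this sample response; the real codebook likely defines more categories, so the `ALLOWED` sets here are an illustrative assumption.

```python
import json

# Allowed values per dimension -- only those observed in the sample
# response above; the full codebook may define more (assumption).
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id", "")
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgwoqhxOs-dPt5P9lRF4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"approval"}]')
coded = validate_batch(raw)
print(coded["ytc_UgwoqhxOs-dPt5P9lRF4AaABAg"]["policy"])  # industry_self
```

Indexing by comment ID is what makes the "look up by comment ID" view possible: a record that fails validation is rejected whole rather than stored with an out-of-schema code.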