Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

• "Compliance businesses such as banking, government, insurance etc. may not use AI…" (ytr_Ugyb5c66Z…)
• "None of this is a real issue. AI is extremely over-rated and it’s very stupid. I…" (ytc_Ugwl6t91P…)
• "Tesla fatalities were caused by human error not the driver in supervised FSD T…" (ytc_UgyOX8e2U…)
• "Obviously the solution here is legislation banning Joanna Stern from creating an…" (ytc_UgxxyfXYy…)
• "Not trying to be mean but most Americans who aren’t Black who have dealt with Bl…" (ytc_UgyWizzpD…)
• "Yes, AI is very dangerous. It depends on good and bad. It is not safe.…" (ytc_Ugxw1ARKX…)
• "Dealing with pets is more nuanced though, because an argument can be made that p…" (rdc_mzw4sa8)
• "The next “step” will be integrating super intelligence into human brains. We can…" (ytc_UgzQ1h3W8…)
Comment
Even if AI becomes highly capable, creating a meaningful world is different from creating an efficient one. A machine may optimize food, energy, productivity — but that does not guarantee joy, beauty, music, love, or purpose.
We don’t value life just because it’s organized.
Humans bring:
• emotion
• storytelling
• culture
• spiritual meaning
• irrational beauty
• sacrifice
• play
Without those ingredients, a perfectly optimized world could still feel empty.
youtube · AI Governance · 2025-12-05T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzTDv66lDMgN6EKlYh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHBc4IsOZsWdQX3Pt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyAg5aiBwXB25bgmvl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0Pl9jfHM8_BrGSCR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwmWzb_NEzK_szb9rd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxA9t0nfglOKT3WcV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzceTtb6CGp5AVY_h54AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwJWkU3tluqUavoNw54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYq6kp16BldpMKJwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz9YuH-B1hseR7-qbZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
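The raw response above is a JSON array, one record per coded comment, with one value per coding dimension. A minimal sketch of how such a batch response could be parsed and validated is below. The allowed-value sets are inferred only from the values visible in this sample; the real codebook may permit more labels, and the function name is hypothetical.

```python
import json
from datetime import datetime, timezone

# Dimension vocabularies inferred from the sample response above.
# The actual codebook may allow additional values; treat these as assumptions.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "fear", "approval", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw batch response and validate each coded record.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the (assumed) allowed set; otherwise stamps each
    record with a UTC "coded_at" timestamp, as in the table above.
    """
    records = json.loads(raw)
    coded = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}"
                )
        rec["coded_at"] = datetime.now(timezone.utc).isoformat()
        coded.append(rec)
    return coded
```

Validating before storing means a malformed model output (a misspelled label, a dropped field) fails loudly at ingest rather than silently polluting the coded dataset.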