Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Nice to see, I'm new to the podcast and was super put-off by the casual AI art p…" (ytc_UgxNtxvWM…)
- "The bit you blew past about AI is that it doesn't store facts and just \"knows\" w…" (ytc_Ugylr7j8R…)
- "This is complete nonsense. Look it up, do research. No A.I. has ever blackmailed…" (ytc_UgzgkE06O…)
- "Well at least we now know who’s gonna be safe and who won’t be when AI goes rogu…" (ytc_UgwrvLVLj…)
- "LLMs aren't the route to AGI. I think to achieve AGI we need to make an AI archi…" (ytr_Ugzz5VMBm…)
- "Seems like a double edge sword for the rich. Yea they may get rich all at once, …" (ytc_UgzYqH-DM…)
- "It sounds like you're pointing out that the robot is driven by its programming. …" (ytr_UgzPoDseo…)
- "People just want to find a reason to blame their child having autism So they ar…" (rdc_gvwnlso)
Comment
Boredom brings about violence, among other things. The breakdown of civilization would be swift if AI is allowed to freely improve itself without hard boundaries. Every technology is advancing faster than the "safety rules" are put in place.
youtube
AI Governance
2025-09-08T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwNZcOWd3YQ7cAS2Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9K5DybkKU4akq8iZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8kNRnWSBnvbojN1B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyTOhzqSxGRjlfei5p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwhe90Ce0Or9iooAqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyuJ6o9LMEaShnc-Ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuL9fDcxdtLFQgEt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY_4CNadQ-VrYiF6B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwayJ6dNzSlk3rERyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgylY8k3Kta1enfgb8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
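Turning a raw response like the one above into per-comment codings can be sketched as below. This is a minimal illustration, not the tool's actual parser: the allowed values per dimension are inferred only from the samples shown here (the real codebook may define more categories), and the function name `parse_codings` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "resignation", "fear",
                "outrage", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, validating values."""
    rows = json.loads(raw)
    codings = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Look up one comment by its ID (hypothetical ID for illustration).
raw = ('[{"id":"ytc_abc","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
by_id = parse_codings(raw)
print(by_id["ytc_abc"]["policy"])  # regulate
```

Validating against the codebook at parse time catches off-schema labels (a common LLM failure mode) before they reach the results table.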