Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
universal basic income will only be a thing if the people that recieve the free …
ytc_Ugzh77i_K…
Know your enemy before you fight them and if you don't then it is pointless this…
ytc_UgwDSvNDU…
Yeah it’s gotten to the point I stare off into space sometimes just worrying whe…
ytr_UgzMRtc9X…
Programmers with a masters, musician and painter here, Charlie bro this is offen…
ytc_Ugw3UG0UN…
As someone who has some insight into this, I would have to disagree.
Vince Dhil…
ytr_Ugy-6QkIh…
I’m in art school and I can’t understand why our teachers are making us learn ho…
ytc_UgxOn3gl4…
Like he said at the end, we need to find the sweet spot. The problem isn't tech …
ytc_UgyIisrUg…
After watching this program on 3/19/2026, it seemed like something was missing: …
ytc_UgzmVA9Lw…
Comment
Lex Fridman interviewed Sam Altman who is the CEO of OpenAI recently and it was very clear that Sam has no clue to what this technology can potentially evolve into. I was expecting to hear a podcast with more of a scientific approach and instead its a podcast with a tech bro. Very concerning imo. Good listen to get a view into someone who is leading a world changing technology and how dangerous this really is in the hands of someone who appear to have little regard to something that can be potentially so dangerous if not constrained property.
youtube
AI Governance
2023-03-31T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxIhTXUevUHL-mUamR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzaypavHPsuMcskrsR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1IXPg55f2SZvQL2R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxaAnB4FIK4xSZyMK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2yMnAO7Wm_BxlUMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3B6LWk0kUPYxTNWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyM7Zk2ryFfdeYd4Rl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwG-4vywd_tPa5-f5Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIors5mc2t6CSLdt14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbREkM3p5QPRD9cjV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
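The raw response is a JSON array with one object per comment, keyed by the comment ID. A minimal sketch of how such a response could be parsed and looked up by ID (the array shape matches the output above; the variable names are illustrative, not the tool's actual code):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgxIhTXUevUHL-mUamR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyM7Zk2ryFfdeYd4Rl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
"""

# Index the coded rows by comment ID so any single comment's
# coding can be inspected directly, as in the viewer above.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_UgyM7Zk2ryFfdeYd4Rl4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

Indexing into a dict makes the "look up by comment ID" step O(1) per query instead of a scan of the array.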