Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment (truncated) | ID |
|---|---|
| its 2026, we've been hearing that for the last 5 years 😂 AI is only taking our d… | ytc_UgzKzA8k7… |
| Can’t wait for this ai bot to use “ The godfathers of AI” line in his next video… | ytc_UgyB3zcpY… |
| Fabulous interview with Geoffrey Hinton. It wasn’t just the AI or its potential… | ytc_Ugw2xzhOX… |
| If its SO bad and scary, why did this guy help invent it? Why wasnt it regulated… | ytc_UgzljAx7C… |
| Wait a few years when the overall motivation to pursue art disintegrates, and ev… | ytc_UgxNA4oZ1… |
| Your take on algorithms is fundamentally wrong because it does not align with th… | ytc_UgxfjAUXf… |
| Wait I just realized something. AI is usually used to abbreviate information and… | ytc_UgyI3oQyI… |
| I say please and thank you to my AI plus I also excuse myself when I fart or bel… | ytc_UgxTxLkKX… |
Comment
Somehow, I don’t believe Steven was convinced at how dire things will be if we don’t regulate in AI. One of the few times he seemed more “you’re joking, right? It’s amazing.” it could be because of the investments he’s put into AI companies.
youtube · AI Governance · 2025-06-16T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyJv7o5dpFRjhfqKOp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6-ctKmxFl2vgITzN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhaRTotNhi-72fhh54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8wXvQcU-fm_03Jjh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxxsKI_VgmLKV4OsrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzmVMUMyyM7gZzWzal4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMQD-z5M-bPvr6LpN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxDlTeWZ0fXSjqCKhh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxAhdK1KgnZbXjW_0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyokSLoknCeP2M0lFR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
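A response in this shape can be parsed, validated, and indexed by comment ID with a few lines of Python. This is a minimal sketch: the allowed label sets below are inferred only from the values visible in this response, and the real codebook may permit additional labels.

```python
import json

# Label sets inferred from the observed response above (assumption:
# the actual codebook may contain values not seen in this sample).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response and index records by comment ID."""
    by_id = {}
    for rec in json.loads(raw):
        # Reject any record whose dimension value is outside the codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {d: rec[d] for d in ALLOWED}
    return by_id

raw = ('[{"id":"ytc_UgyJv7o5dpFRjhfqKOp4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
index = parse_response(raw)
# index["ytc_UgyJv7o5dpFRjhfqKOp4AaABAg"]["policy"] == "regulate"
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one dictionary access per comment instead of a scan of the whole batch.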