Raw LLM Responses
Inspect the exact model output for any coded comment; individual comments can be looked up by comment ID.
Random samples:

- "🛑🛑🛑👉This is the small version, wait untill gov want to lock down civilians. Civi…" (ytc_UgzxSkHzo…)
- "I would imagine the people at Boston Dynamics will get it first and the worst wh…" (ytc_UgxLGTTDw…)
- "AI is a yes man. If you ask it a question that leads it on it will always agree …" (rdc_nta4se0)
- "If you want to remove AI, please tell me, I can do with a reasonable price…" (ytc_UgwQ_YwXL…)
- "Actual art professional tm with years of experience working on TV animation here…" (ytc_UgxJwB-rR…)
- "Stay away from this “AI startup” wave. They’re essentially all writing prompts a…" (rdc_mjdnf4s)
- "Oof imagine buying one of the AI art pieces and then seeing this video or inkwel…" (ytc_UgwuvP5j8…)
- "The CEO anology he was talking... he was referencing himself and his students. H…" (ytc_UgzKwfmY1…)
Comment

> AI is moving faster than most leaders can keep up. Imagine if policymakers spent just 1–2 hours a week actually using AI for real-world tasks—not just reading about it. Could that change how they regulate it? Puts the ‘governance in the dark’ problem on the table.

youtube · AI Governance · 2025-09-10T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4HV95CNWeGWBUPIl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbwloABSnB5mOrh954AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxELNiMbVwHljrkBrd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRfuw6E71Pb3Bw3jZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlwNFvpBfA-U-vrGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwr-lCrt-QcNN_dOyd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwPyelEEKWaPAcs0xB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwW3Gd-DBf1flh29dB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDAHcEt9XThuwUUAF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKfTNlDalRcGw048R4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
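The raw response is a JSON array of per-comment records, so looking up the coding for one comment amounts to parsing the array and indexing it by ID. A minimal sketch in Python, assuming only the four dimension names shown in the result table above (it checks that each record carries every dimension, not that the values come from any particular label set):

```python
import json

# Raw model output in the format shown above: a JSON array of coding records.
# Two records are reproduced here for illustration.
raw = """[
  {"id": "ytc_UgxKfTNlDalRcGw048R4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugw4HV95CNWeGWBUPIl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "fear"}
]"""

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

records = json.loads(raw)
for rec in records:
    missing = [d for d in DIMENSIONS if d not in rec]
    if missing:
        raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")

# Index by comment ID so a single coded comment can be fetched directly.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgxKfTNlDalRcGw048R4AaABAg"]["policy"])  # -> regulate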