Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyxfY-wi… — "AI is programmed to do what’s best for humans. How long would it take for them t…"
- ytr_Ugw2GPLxT… — "Exactly. Let’s pretend there’s 3 highly secure career paths. At the explosive ra…"
- ytc_UgwXDUqqc… — "Proposition- if a job is replaced by AI, the company should be required to pay t…"
- ytc_UgzXDqTj-… — "Not gonna lie, I would be tempted if AI offered a device where you could get int…"
- ytc_UgyknhMK6… — "I love how the obsessions of Silicon Valley creators to expand ENGAGEMENT hasn't…"
- ytc_Ugwj3hMKG… — "1:26:00 I'm sorry I'm commenting so much Neil, this is just one of the most fasc…"
- ytc_UgwlsJ5Yl… — "Hello there, been listening to many of your videos for many months now and have …"
- ytc_UgyLme9oX… — "AI art I feel like should ONLY ever be used to privately brainstorm ideas with. …"
Comment

> Sorry to correct the professor, but 5 to 20 years for AI to be smarter than people, but I'm an American, and as of right now, my dog is smarter than most Americans, 🐕🦺

Source: youtube · Project: AI Governance · Posted: 2025-08-14T13:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRtarstrWwRQMu22B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwan1_9DvbcG1rRH_F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyROXPeek-vNeZ3gTR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZqjoPmfec_UbUZyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx0MY7jTLC3DrmuIdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhET_KY1Z7alju2wd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7m3TG2agDZKUskWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySqql1ODiK3_Nl19N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyBVV-hqlslcb-LIk14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0Erj-kHymCzqunUN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
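A response in this shape can be parsed and checked before the codes are stored. The sketch below is a minimal example, assuming the dimension values visible on this page (the full codebook may define additional values); the function name `parse_coding_response` and the `ALLOWED` sets are illustrative, not part of the pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the table and
# responses on this page (assumption: the real codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "industry_self", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-schema values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Usage with one record in the same format as the response above
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently entering the dataset.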