Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Can you create a Michael Jackson robot that can do all the famous things the Kin…" (ytc_UgzwZAP7b…)
- "AI is only as intelligent as the sources its trained with, expose it to biblical…" (ytc_UgyAuE3Wl…)
- "This sort of comes across as a bit vapid to me. Obviously socialization has gott…" (rdc_n7tky4j)
- "I can't help noticing this video has comments from 11 months ago... so all softw…" (ytc_UgzcG4QyH…)
- "@MrDintub of course not, your what they call an NPC so u and the AI are one in t…" (ytr_UgxXf6iLX…)
- "Well. I don’t mind watching YouTube with ads. What if AI can’t differentiate its…" (ytc_UgwcaiVlr…)
- "The Godfather of AI Geoffrey Hinton only wants what is best for Both Humanity & …" (ytc_Ugw48Gk79…)
- "I will never take a car without a driver. I think this is pushing it. Also it's …" (ytc_Ugztc7Ikr…)
Comment

> For what will they AI optimise, if nobody is there to consume? If that is the scenario that would give a crazy look to all the AI creators still having the last jobs and life… technology is there to support the society… that is the primary goal otherwise it makes no sense

youtube · AI Governance · 2025-11-05T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhX78gF58BZZspKAp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgymMYCORwC4A537kgJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz8rLEmdtQxupFxwXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwYs5VITa3jphuU6_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgynhfWLGISTxihPDJt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyARP3w6YywfObTQ0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxl6IGkyNIVXKVbUll4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyIYoMZHmmao5cqRuN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySgH_Z5bo5kjG0MOZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgweoD0Z4QxuoK4kizh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
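A raw response like the one above is a JSON array of records, one per comment ID, with one code per dimension (responsibility, reasoning, policy, emotion). The sketch below shows how such a response could be parsed into a lookup table keyed by comment ID. It is a minimal illustration, not the tool's actual implementation: the function name `parse_raw_response` is hypothetical, and the allowed code values are only those observed in this one sample, so the real codebook may define more.

```python
import json

# Code values observed in the sample response above; the actual codebook
# may define additional values, so treat these sets as illustrative only.
OBSERVED_CODES = {
    "responsibility": {"developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID, flagging unseen codes."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in OBSERVED_CODES}
        for dim, value in codes.items():
            if value not in OBSERVED_CODES[dim]:
                print(f"warning: {rec['id']}: unseen {dim} code {value!r}")
        coded[rec["id"]] = codes
    return coded

# One record taken verbatim from the sample response above.
raw = (
    '[{"id":"ytc_UgyIYoMZHmmao5cqRuN4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
table = parse_raw_response(raw)
print(table["ytc_UgyIYoMZHmmao5cqRuN4AaABAg"]["policy"])  # regulate
```

Keying by comment ID mirrors the "Look up by comment ID" view above: once parsed, the codes that populate the Coding Result table for any comment are a single dictionary lookup.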