Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.

Random samples:
- "I am the government and you committed crimes against me while I’m warning the wo…" (ytr_UgzV0owlo…)
- "The noelle person looks like CGI…. Plot twist this whole episode was deepfaked t…" (ytc_Ugxgo3BJt…)
- "Don’t use Weymo or other driverless cars - 100% of your payment goes to corporat…" (ytc_UgzM02l4U…)
- "He sounded like such an amazing kid. We need those compassionate kids in our lif…" (ytc_UgwucrPkV…)
- "This is a poetic interpretation of AI training for drama. Reality of training Ai…" (ytc_Ugxhn5ZXK…)
- "Another brilliant interview. I do wish you would have followed up with him on h…" (ytc_Ugz1dBoxU…)
- "Yep... AI will always produce bad art. Probably. But children being born now wil…" (ytc_UgxBsZFIm…)
- "Whatever apps you created from AI, most of the consumers are these IT people onl…" (ytc_Ugw5taCiC…)
Comment (source: youtube · AI Governance · 2025-06-18T15:0…)

> 40:00 – this is one bit that really concerns me. You can't become skilled at anything if you don't practice – so where do skilled people come from if computers solve all the mundane stuff?
>
> I feel like there's a real risk that *both* Artificial Intelligence is forever 'less than' the data it is training on – and that its mere existence stunts people's own intelligence by limiting their opportunities to learn.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw10CfQpRRivjC78Ot4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNUKNR5xUKXnJBUuR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"concern"},
  {"id":"ytc_UgwhpWKLprujAP4zqtp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyPRwVyOEoIe065BFR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzf50-SgGmPFgsLdzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy3Ad3UpMSzRjvWqMV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3yMuc2uqMU-XRhOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxFDwYvPXU09X9DLX94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwVq76ygz3zw1n7uhl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSomckjTTGH6siVsR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
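The raw response is a flat JSON array of per-comment codes, so looking up the coding for any comment ID is a parse-and-index operation. A minimal sketch in Python (the string literal inlines two rows copied from the response above for illustration; in practice the full response text would be loaded instead):

```python
import json

# Two rows from the raw LLM response above, inlined for illustration.
raw = """[
  {"id": "ytc_Ugw10CfQpRRivjC78Ot4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwNUKNR5xUKXnJBUuR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "concern"}
]"""

# Parse the JSON array into a list of dicts, one per coded comment.
codes = json.loads(raw)

# Index by comment ID so any comment's coding can be inspected directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgwNUKNR5xUKXnJBUuR4AaABAg"]
print(row["responsibility"], row["policy"])  # distributed liability
```

The same index supports cross-checking the coding table for a sampled comment against the exact model output it came from.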