Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Autonomous trucks do not create jobs they are job killers. Truckdrivers stop at…” (`ytc_UgyHionmh…`)
- “it’s 3:17 am. I started character a.i 3 days ago. I HAVE BEEN GOING TO BED AT 5 …” (`ytc_UgzzP1HF5…`)
- “I’ve been thinking lately to try a more simplified style. I know AI can do most …” (`ytr_UgzqblsX3…`)
- “I dropped Shadiversity years ago because he was clearly disappearing down the te…” (`ytc_Ugz-SFo6l…`)
- “I’m sorry but I’m so confused. He’s warning us about ai… but in the same breath…” (`ytc_UgzwODDE0…`)
- “I thinking to learn programming but bcz of this ai stuff what should i learn ins…” (`ytc_Ugy_BYZP4…`)
- “It's another podcast with two people unfamiliar with computer science discussing…” (`ytc_UgwYuhFUc…`)
- “Anthropic is way better at "safety" than Musk's Grok and no worse than OpenAI. T…” (`ytc_UgysDTfOE…`)
Comment

> The benefits of AI are well worth the risk of AI. If we had AI on our team we could achieve so much then we could. Without it, the next step of evolution is for us to combined with AI combined with technology

youtube · AI Governance · 2025-09-22T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzjfbaw1SI23W9Nsvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxqfRBIrSQDLGO8cxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyPzTuEYwDj2oDV8eZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwI0bS_44y4imRbYtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbHzlCq0Bqv80YnSR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzgGURy9fXjGjiXe2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMZ6vWBZ87V9JYD3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgykprT_ILZKaGzDE8d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgybviMWbVq7ybXWI9p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx5qXIK-_Yn3GIa9Vd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
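Each raw LLM response is expected to parse as a JSON array of coded rows, one per comment, with the four dimensions shown in the coding-result table. A minimal validation sketch in Python, assuming the value vocabularies seen on this page are the full codebook (the real codebook may include more categories):

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw model response and keep only rows whose ID has a
    known prefix and whose every dimension is in-vocabulary."""
    valid = []
    for row in json.loads(raw):
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue  # unknown ID scheme; drop the row
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_example1","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"ytc_example2","responsibility":"none","reasoning":"wrong_value",'
    '"policy":"none","emotion":"mixed"}]'
)
print(len(validate_coded_batch(raw)))  # → 1 (second row fails vocabulary check)
```

Dropping out-of-vocabulary rows rather than repairing them keeps the downstream counts honest; rejected rows can be re-queued for a second coding pass.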