Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgwJ1nYt5…`: "Its going to us being slaves getting goverment compensation..for awhile.......al…"
- `ytc_UggJgy1oU…`: "no I don't want human like robots to exist because one they live forever two wha…"
- `ytc_Ugy8Zh3m6…`: "Well thanks to Saudi Arabia one of its newest citizens is a robot named Sophia a…"
- `ytr_UgxH9BG6y…`: "Oh yes, I often quote what Jeff Goldblum said in the film. Since we were in the …"
- `ytr_UgxCJEYJ-…`: "All artwork is a human growing and learning. AI is just about results with no ex…"
- `ytc_UgwDcSWOw…`: "We will become like in the Matrix, a bunch of biological batteries to power AI.…"
- `ytc_UgxHResd3…`: "Probably scary to know a computer that has access to what's going on with compan…"
- `ytc_UgwVdnj2i…`: "This episode missed the core argument for me. Unfortunately I can see that refel…"
Comment

> Many AI companies are seeing a dip in their revenue becaise humans do not want to interact with a robot. Its the most off putting thing for a customer. Often you realise your not selling a product, you're selling human interaction.

youtube · AI Governance · 2026-01-27T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhwDWIMmdyGHzBSop4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxwPVmz2QV7S0cQh6J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5S_yYxt4Wtq0UD1t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyiHEzfkIC8DXeP8Tt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxoSkaPi0yV3EdyKo94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXw6_J9K8vqammhxJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwG_Ullkiac_PJOfRp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugycwp1Fv0b0VQIXI6p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugymt1TfHlG8ycr4gOt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyRRasdZhXeT7Yft5l4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
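A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example: the function name and the set of allowed values per dimension are assumptions inferred only from the codings visible on this page, not the project's actual codebook.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# ASSUMPTION: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer", "government", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "outrage", "mixed", "approval", "fear", "unclear"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codings by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the (assumed) schema.
    """
    index = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={value!r}")
        # Keep only the coding dimensions, dropping the ID key itself.
        index[comment_id] = {dim: entry[dim] for dim in SCHEMA}
    return index

# Usage with a single (hypothetical) entry:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
coded = parse_coded_batch(raw)
```

After parsing, `coded["ytc_example"]["policy"]` returns `"none"`, which supports the "look up by comment ID" flow described at the top of this view.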