Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Here's the thing, When you program something to be 'Human' It will be as close a…" (ytc_UgzzWweFi…)
- "I have to say not my favorite Ted talk. I disagree with a few things, for exampl…" (ytc_UgwkFwV4m…)
- "I really don't care about AI.. it is the way of the future. Deal with it (also …" (ytc_Ugwqry1-o…)
- "Why so much doom and gloom? You guys realize according to people, the world has …" (ytc_UgwVYZYNJ…)
- "Just type in a math or physics problem and AI is able to solve it in seconds, ev…" (ytc_UgyImfS1u…)
- "STOP THIS AI PLEASE, YOU ARE KILLING HUMANITY AS HOW GOD CREATED THEM FOR !!!…" (ytc_UgymRjHC4…)
- "I refuse to support this so will actively avoid using companies where I have to …" (ytc_Ugw1lFROM…)
- "Thinking in terms of winners and losers in the work arena is surely thinking in …" (ytc_UgwO77Zdc…)
Comment (youtube, 2025-06-02T19:3…)

> I’ve actually seen in another video where someone was debunking Mark Rober’s claim about LiDar when the model three hardware three vehicle smashed through the banner the banner applied enough torque through the wheels to disengage full self driving
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyavX6Egk-rS_jacl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxtV3hpG-wjCaYofSh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyvDgzX8rmfrFKW6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugybfoa_WO7lTOUiWBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxID07JbIE3t7bK5Ql4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyIESFDiXfLMKPRh894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKaGEFLWJUP7DRQkx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzykjV9gVzkg5rGoG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
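The lookup-by-comment-ID view above can be reproduced from a raw response directly. Below is a minimal Python sketch that parses a raw LLM response (a JSON array of per-comment codings like the one shown), validates each dimension against the category values visible in this sample, and indexes the rows by comment ID. The `SCHEMA` sets are inferred from this page only and are an assumption; the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Hypothetical schema: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (JSON array of per-comment codings)
    into a dict keyed by comment ID, validating each dimension."""
    by_id = {}
    for row in json.loads(raw_response):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {row.get(dim)!r} for {dim}")
        by_id[cid] = {dim: row[dim] for dim in SCHEMA}
    return by_id

# Example: index a one-row response and look up a coding by comment ID.
raw = ('[{"id":"ytc_Ugybfoa_WO7lTOUiWBh4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_Ugybfoa_WO7lTOUiWBh4AaABAg"]["emotion"])  # indifference
```

A lookup on an ID not present in the response raises `KeyError`, and a row with an out-of-schema value raises `ValueError`, which makes malformed model output visible at ingest time rather than at analysis time.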