# Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgzDJ4hip…`: "Pretty sure AI would immediately launch the nukes if it could. First strike is c…"
- `ytc_UgxFSk2KC…`: "She speaks so well, I hope she gets support to spread the dangers of chat bots. …"
- `ytc_UgxultLJY…`: "We're told that super intelligent AI might wipe out humanity and that no one can…"
- `ytr_UgwQ3Mz8k…`: "I totally get where you're coming from! The idea of engaging with enhanced human…"
- `ytc_UgxyBtYPD…`: "The hardest thing about art is not technical. The hardest is to think like an a…"
- `ytc_UgyqE1Qn5…`: "If AI replaces all software engineers, we place full trust in AI explaining code…"
- `ytc_UgwG17vDM…`: "Don't know why people are confused as to why he said 100 years, no AI is close t…"
- `ytc_Ugx7aoUwY…`: "AI fridge, AI washer, AI dryer, AI dishwasher, AI air-conditioner, AI vacuum cle…"
## Comment
Neal Shusterman wrote a book series on a world with an AI beneficial to society, and it isn't the Thunderhead(AI) that causes trouble, but the humans. People say that AI will be dangerous but that's based on conspiracy theories, but Shusterman's book, in my interpretation, say that it's humans that have the initial power to create danger
youtube · AI Responsibility · 2023-12-08T05:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
{"id":"ytc_UgxlrxbtViBQci8GkaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzzsC3ZZblbt8Hk1vl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwEu7IGlGfkbGNwgmN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxP98tigJdDgk6n-0F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugywwxxa90S517IbSH54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxLIJxW-pYOEt7X1fF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkHN_0MfBq6a-cMBZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxUeLD2H1qwIbhcaIl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzkl8jzmxokx3KjwOl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzoldkqMJX_-cXTTvB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}
]
```
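A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal validator, assuming the label vocabularies are exactly the values seen in the samples on this page (the full codebook may define more); the function name `validate_codings` is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the sample responses above.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded row.

    Raises ValueError on a missing comment id, a missing dimension,
    or an out-of-vocabulary label, so malformed output fails loudly
    instead of silently entering the dataset.
    """
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id"):
            raise ValueError("coding row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
    return rows

# Example: one row copied from the response above.
raw = (
    '[{"id":"ytc_UgxkHN_0MfBq6a-cMBZ4AaABAg","responsibility":"user",'
    '"reasoning":"virtue","policy":"unclear","emotion":"approval"}]'
)
rows = validate_codings(raw)
```

Validating before storage keeps a single hallucinated label from corrupting downstream tallies; rejected batches can simply be re-requested from the model.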