Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Wheres all this Ai, cuz life still the same . Still see workers, still busting m…" (ytc_Ugyr167AQ…)
- "This Utopia of no one having to work, and Robots doing everything for everyone i…" (ytc_UgyWFS_Wc…)
- "I like Mr. musk but this interview failed to describe why government regulations…" (ytc_UgwchyYLG…)
- "*Human does stupid human things* *Human blames AI* *Humans wonder why Skynet nuk…" (ytc_Ugw8HdDTG…)
- "4:29 it would be wonderful if healthcare was prioritized over warfare…and people…" (ytc_UgyFmlRWR…)
- "AI is exciting if we can get legislation around it to catch (the f*ck) up.…" (ytc_UgzQTwZfG…)
- "@Tifgol12 Fuck the computers. I love my consciousness; it makes me human. You so…" (ytr_UgzfjhL-N…)
- "Sorry, but the fact he thinks 'AGI' will be available by 2027 discredits the res…" (ytc_UgxR-Z9e0…)
Comment

> tbf this is how you solve the issue of alignment with AI which is actually a serious concern in the field, if we treat it as an actual being with respect and rights we could do great things together but knowing our species and leaders they will see it as a tool even if it became self aware, they would use and abuse it for greed and cause our undoing.

youtube · AI Harm Incident · 2025-03-06T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzRCMrqB1qM9OMPLeh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxop5g8XiGOZLdd-g54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyvjjcA6I-bph9tNRx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxaxrMNEpy1b-1NK8B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGUfLQ15vzNj2pcv54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxvZ_01EDvhd1ApxpB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSCKbPY9B622BaA714AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOFMsO0vOWxPNQfR54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPkx9IQ4pmYKuAryh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTWVigZTPzQQHdn-p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
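The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) keyed by comment ID. A minimal sketch of how such a response can be indexed for the "Look up by comment ID" view, assuming only the field names visible above (variable names are hypothetical):

```python
import json

# One row of model output, shaped like the array above. In the real
# pipeline this string would be the full LLM response.
raw_response = """
[
  {"id": "ytc_UgwTWVigZTPzQQHdn-p4AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID so a single comment's result can be
# fetched directly instead of scanning the whole array.
codings = {row["id"]: row for row in json.loads(raw_response)}

result = codings["ytc_UgwTWVigZTPzQQHdn-p4AaABAg"]
print(result["policy"], result["emotion"])  # → regulate fear
```

This matches the Coding Result table above: the last entry in the raw array is the coding stored for the inspected comment.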