Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- “Good at least AI does not go on strike and does only good workmanship Hope AI ca…” (ytc_Ugy4VoMHb…)
- “AI is a word probability generator. I hope there are many more of these lawsuits…” (ytc_UgywmurpG…)
- “Star Trek imagines that after humanity solves all its basic problems, we put our…” (ytc_UgyLoN_QL…)
- “There will only be select few humans using AI in the future..And they will occup…” (ytc_UgyX6-0m_…)
- “This guy is a time waster, this is an algorithm course for 1st graders not an ai…” (ytc_Ugx1zqO-w…)
- “@Mafon2 If you automate everything with AI, you lose all creative decisions when…” (ytr_Ugz7jAmAx…)
- “If robots can learn compassion, caring etc then they can learn hate and violence…” (ytc_Ugwfhbk70…)
- “Good job brother one thing we all do now if God didn’t one thing we all do no no…” (ytc_Ugxgbm5cQ…)
Comment
I’ve been thinking about this for a while - what does man do when he doesn’t have to work for a country to produce, think Elon musk Optimus project. Robotics powered by ai - again musk. We have to firewall off ai from robotics. Think Tesla cars killing people at the discretion of the ai. If ai needed humans to keep power running, we would have arrived at mutual destruction, not much different that the nuclear Cold War.
Source: youtube · AI Governance · 2025-06-21T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx4Ftv2ZnrBBMjwecR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQZMFK1dyjsFgGeih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWXAEp_rdqU29PDa94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxOOLxSr6bRd6BiUjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyPOJzMj9kIHshwTw14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzBAEEu0aKBxhfVIAF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVPAZpbB3cxxklj494AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZhhrh9zf1XoshpfV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwA8feDoB5lboMP8_d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwBEB3G23-WdwCWwal4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
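The raw response above is a JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated is shown below; the allowed values per dimension are inferred only from the samples visible above (the full codebook may define more), and the helper name and sample ID are illustrative, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# Assumption: the real codebook may include additional values.
SCHEMA = {
    "responsibility": {"distributed", "none", "ai_itself", "developer", "company"},
    "reasoning": {"mixed", "unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "unclear", "ban"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the (assumed) schema.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, indexed by comment ID.
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["policy"])  # regulate
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently skewing the tallies.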