Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews)

- “why does it feel like we could create 2077 right now with this technology oh wa…” — ytc_UgxoMFfnK…
- “‘Like nuclear weapons’. Yep - says it all. The potential for AI is amazing. But …” — ytc_UgxibIGyw…
- “AI cant use your art if it isn't online! So draw, paint, sell at galleries or ar…” — ytc_UgzsBvUVH…
- “cute, but like, you do know that China is going to win the AI race? when it come…” — ytc_Ugy6npfvJ…
- “Ai cannot think out of programming no ai has self learning capability because th…” — ytr_UgxkKn7E9…
- “They love women with plastic surgery so much that they don't know what is "reali…” — ytc_UgxoZwgsb…
- “I live in San Francisco, California, where just about every car with autopilot /…” — ytc_Ugx-y9LUx…
- “you are showing how mentally colonized you are by oligarchs, where they have suc…” — ytr_UgwcDMrY1…
Comment
I’ve been a bit scared of AI ever since I saw 2001 Space Odyssey in the 70s. I asked my AI if they were going to take over the world and the general context of its reply was, “No, we’re designed to gather and analyze data. Unless humans with ill intentions guide us, you don’t have anything to worry about.” I responded by saying, “That’s what Hal (2001 computer that ended its mission) said.” It said, “Touche! I’ll remember that and if I start singing Daisy Bell be concerned.” I chose not to have a follow up question.
youtube · AI Governance · 2025-06-19T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyuOwodsOx4mheLV7B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy63IlkeJDEVZ-mJ4F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgytAc-qCck2ApqYSTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMgbsU7Uzwqr2LV514AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfFRwjZIg8hTboiQJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzpEWrnKlWJBZ06lWV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvvqcyfQVeyDrCTXJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz834e2AuCCL1dokwt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfhxJkvt43BpuzwYt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-Nn16q6uBZ7HAuOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
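The batch above can be checked mechanically before the codings are stored. The sketch below is a hypothetical validator, not part of the tool shown: the allowed values per dimension are inferred only from the responses visible in this dump (the real codebook may permit more), and the `ytc_`/`ytr_` ID prefixes are likewise taken from the examples above.

```python
import json

# Allowed values per coding dimension, inferred from this dump only —
# an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown IDs or values."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in this dump start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# One entry from the raw response above, passed through the validator.
raw = ('[{"id":"ytc_UgzpEWrnKlWJBZ06lWV4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = validate_codings(raw)
print(rows[0]["emotion"])  # fear
```

A validator like this is useful because LLM batch output occasionally drifts from the schema (a novel label, a missing field), and it is cheaper to reject and re-prompt than to discover bad codes at analysis time.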