Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Is consent the same as being ethical? or promising the right impact? What is the…
ytc_Ugx5zjd_5…
This generation is getting more futuristic and scary I swear we found out more s…
ytc_UghoM3zHe…
Nobody cares. Take care of the rapest priest, corrupt cops helping inmates smugg…
ytc_UgzCLSZbg…
Kurt Vonnegut's first book is probably the most prescient piece of science fict…
ytc_UgyeU3K2g…
The fact that we are allowing police to monitor civilians and “predict” crimes w…
ytc_UgyoP9YFQ…
He made AI now he‘s saying „we have to do something“. Not pretty smart if you as…
ytc_Ugw1futhj…
What Insurance Company will want to accept the liability for a 'driverless' big …
ytc_UgwymEr9S…
If these AI robots are emulating humans....we are all in trouble. Humanity has l…
ytc_Ugw-Bv_Gs…
Comment
Elon is butt hurt he couldn't control open AI. Now he knows his AI company wont catch up. So he's stifling the advancement of commercial AI to try and level the playing field. Although I highly doubt it will stop the AI boom. Its like trying to regulate the internet when it was invented.
Platform: youtube · Topic: AI Governance · Posted: 2023-04-18T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzNfc8FgULtK3IC_JB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznKqfaRqggtc_5NLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGr4l4w63E2SpIbVx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx8BR6Bq6WvrW-kh614AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugw5h4VwLnzy6zkaoQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyyujRqnQ37qXjAmQ14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz__ZGf47NvUi0TWYZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxqBOLbwR2ZqJlXHWB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzkcW5OVE6GQrV9rfZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx4YUg7NS1IVIn2iYV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"resignation"}
]
```
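Before a raw response like the one above is stored as coding results, it is worth parsing it and checking every record against the codebook. The sketch below does this in Python; `validate_batch` is a hypothetical helper, and the allowed value sets are only the values visible in the response above — the actual codebook may permit more.

```python
import json

# Allowed values per coding dimension. These are taken from the raw
# response shown above; the real codebook may define additional values.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability",
               "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval",
                "resignation", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema records."""
    records = json.loads(raw)  # raises json.JSONDecodeError on bad JSON
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# First record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgzNfc8FgULtK3IC_JB4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"indifference"}]')
coded = validate_batch(raw)
print(coded[0]["reasoning"])  # virtue
```

Validating at ingest keeps a single hallucinated label (e.g. an emotion outside the codebook) from silently skewing later dimension counts.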