Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugz8kcsiV…`: "AI will likely change the landscape of cybersecurity jobs, automating some tasks…"
- `ytr_UgxdpRzp9…`: "But it's fooled by a simple face mask...because it relies on facial recognition …"
- `ytc_UgwIxc1KD…`: "This isnt the future. This is yesterday. Where they have got it wrong is the pro…"
- `ytc_Ugz9y6SSA…`: "I'm getting soooo tired of all these 'doomsday, the sky is falling down" So call…"
- `ytc_UgyGOVhNM…`: "- "Give me my gun" / Robot: "You get your gun when you fix this damn DOOOR!"…"
- `ytr_Ugzw9yRvT…`: "You understand coding, but you do not understand the compounding capability with…"
- `ytc_UgzKYG9yF…`: "Don’t think a robot will ever be able to do my job!! To many obstacles makes n m…"
- `ytc_Ugz3TQ28u…`: "I believe ai similar to a child to be raised. We need to raise it with the inf…"
Comment
He claims to worry about AI risk, and then founded xAI with this stated alignment strategy: they will make AI that cares about truth, which means it will be curious, so it will keep humans around because they are interesting.
I'm not kidding, that's literally what he said. Clearly he's smart enough to know that's just a bunch of nonsense. So if he does have a moral compass, why did he found another company to enter the AI race without any actual alignment/safety strategy?
youtube · AI Governance · 2025-06-24T11:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxwkJmmJDU4BeEggFF4AaABAg.AJlNFtLXmNaAJpkiyA2vx3","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzEAgznqXwF3jy_XKB4AaABAg.AJkl5iLIHhIAJoDV7nGDAH","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyJqP6djv5vTBCpPJR4AaABAg.AJk_wq4WPGLAJkau12glAR","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyJqP6djv5vTBCpPJR4AaABAg.AJk_wq4WPGLAJkrIdWWWiF","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgyJqP6djv5vTBCpPJR4AaABAg.AJk_wq4WPGLAJlLunJ5MUm","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwwavD3QZU7772ocdl4AaABAg.AJk9-G_iLLrAJmp4oOqPxT","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwqZWsv_OYKyMwmJ3R4AaABAg.AJk1TVVKTytAJltc34SGKj","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgwqZWsv_OYKyMwmJ3R4AaABAg.AJk1TVVKTytAJmbRGOKOeN","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UgxRN6LueCymu09pf2J4AaABAg.AJjwgbdhytSAJlERFQp03R","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx_HFu5Z5hy-Xoe-5F4AaABAg.AJjvxJRpS06AJkBMLZUP8C","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
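The raw response is a JSON array with one object per comment ID, carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for by-ID lookup; function and variable names here are illustrative, not from the tool itself, and the sample is shortened to two rows:

```python
import json

# Abbreviated batch response in the same shape as the output above.
raw_response = """
[
 {"id":"ytr_UgxwkJmmJDU4BeEggFF4AaABAg.AJlNFtLXmNaAJpkiyA2vx3","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytr_UgyJqP6djv5vTBCpPJR4AaABAg.AJk_wq4WPGLAJkau12glAR","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index the codings by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(raw_response)
coding = codings["ytr_UgyJqP6djv5vTBCpPJR4AaABAg.AJk_wq4WPGLAJkau12glAR"]
print(coding["policy"])  # liability
```

Indexing once and looking up by ID avoids rescanning the array for every inspected comment.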