Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxTEmmb2…`: "Ballet seems safe for the next hundred years...😊...robots could possibly dance …"
- `ytc_UgzLYqcrV…`: "Meta AI spending billions dollar and can't outperform Deepseek AI, while deepsee…"
- `ytc_UgxRXf0C7…`: "Any company replacing it's staff with robots, AI and self driving vehicles shoul…"
- `ytc_Ugxg-hVnG…`: "AI is BS...it validates you no matter what, even if you consider doing horrible …"
- `rdc_g1ies82`: "WTF? No seriously, what the actual fuck. I'm a software engineer myself but I wa…"
- `ytc_UgzJntB3v…`: "AI being a \"threat to humanity\" is his business model. His company was created o…"
- `ytc_UgxFiD3lm…`: "Given time then you will be proven wrong I'm afraid. Yes, right now today they…"
- `ytc_UgxC3AU8B…`: "idk ai art is fine to me and the farther we get from its creation the more peopl…"
Comment

> @RobardoHughes Ai is too safe it does not have any intention , cannot code until humans start to code , AGI is the actual risk (Risk rate <10%) it wont be given any military or sensitive government data so everything would safe do not worry .

Source: youtube · Topic: AI Responsibility · Posted: 2025-08-07T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugz-d6nXl0HpTK5LXB94AaABAg.AIRS9VFjhuhAKePsODELOl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxhKSyoqra2WWwRKMF4AaABAg.AIR7uwgqM5GAIvNGgHvfB7","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzYZ8AnAGPms2k4guJ4AaABAg.AIR3KiBFwNkAITs4CVsuEN","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyOEX1rW5Gl66_4TxB4AaABAg.AIQIX6q5w_lAKeKEtTUehy","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyOEX1rW5Gl66_4TxB4AaABAg.AIQIX6q5w_lALBmJXamYSu","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyOEX1rW5Gl66_4TxB4AaABAg.AIQIX6q5w_lALXwdluEn6a","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgyOEX1rW5Gl66_4TxB4AaABAg.AIQIX6q5w_lALXx8-odk0r","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxQdS6GVHoOo8qr-Cl4AaABAg.AIQElAWq_c4AIQI3V0GHge","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxQdS6GVHoOo8qr-Cl4AaABAg.AIQElAWq_c4AIQbxw5k50D","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxG18gOOlravQe2SWZ4AaABAg.AIPZPvZiZTIAIRR3cDiwwN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
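A raw response like the one above can be parsed and checked against the codebook before the records are accepted into the dataset. The sketch below is a minimal example of that step, assuming the allowed values in `ALLOWED` (inferred only from the values visible in this sample — the real codebook may define more), and `validate_response` is a hypothetical helper, not part of any existing pipeline:

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above (an assumption -- the actual codebook may include other labels).
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record that is missing an
    id or uses a label outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records
```

A malformed or off-codebook record then fails loudly at ingestion time instead of silently skewing the coded counts.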