# Raw LLM Responses
Inspect the exact model output for any coded comment. Individual comments can also be looked up by comment ID.
## Random samples
- "Meanwhile every single person online should be furious that they're not getting …" (`rdc_ljpw00e`)
- "I think AI is cool and should not be discouraged it will not take away from huma…" (`ytc_UgxeWWT8Z…`)
- "Plateau of productivity. Cheers dude!! Thanks for showing us you re not AI, very…" (`ytc_Ugx6eGZmE…`)
- "Couple of days ago, a waymo with a passenger suddenly parked on a light rail tra…" (`ytc_UgxZ--oI9…`)
- "Ai will not be smarter than human even after 1 Million years , Because The Ai ma…" (`ytc_Ugwe-FC1i…`)
- "I mean... Wouldn't it be funny if just emailed my Principal a AI generated vid f…" (`ytc_UgyrFJE7d…`)
- "@dumant7975 Those laws don't include Ai, which means if Ai or anyone using it ca…" (`ytr_Ugy20ETCQ…`)
- "Mother Nature will destroy at some point all insane AI world. I will not see it…" (`ytc_UgyihYAON…`)
## Comment

> This just goes to say that the Horizon Zero Dawn and Forbidden West story of how the world ended due to A.I. around 2040-2055, is very possible or rather, inevitably

youtube · AI Governance · 2024-02-04T14:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
[
{"id":"ytc_Ugw7XYbYsW4TNIl7gHZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzP2To8mQirHuh-PnR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxpUEjNpzz5K8kNiXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyGlX62x7H-TV3_6Zd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyR2oMGcyOI_Utc-v54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxM0B2al3sC4VGlNwd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzj0aCwxmpLPeGN_ml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwQ7gz0xp2w8f3UBaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8XGHhLdVx4X8iYL14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz4RSAgJaoCXQMKxJx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
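The raw response above is a JSON array with one coding object per comment ID, which is what makes the per-comment lookup on this page possible. A minimal parsing sketch, assuming the field names shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function and variable names here are illustrative, not taken from the actual pipeline:

```python
import json

# A truncated stand-in for a raw LLM response: a JSON array of
# per-comment codings (schema as shown in the example above).
RAW_RESPONSE = """[
  {"id": "ytc_Ugw7XYbYsW4TNIl7gHZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz4RSAgJaoCXQMKxJx4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

# Every coding object must carry these keys to be usable.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into an id -> coding-dict lookup,
    silently skipping entries that are missing required fields."""
    lookup = {}
    for entry in json.loads(raw):
        if REQUIRED_FIELDS.issubset(entry):
            lookup[entry["id"]] = {k: entry[k] for k in REQUIRED_FIELDS - {"id"}}
    return lookup

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_Ugz4RSAgJaoCXQMKxJx4AaABAg"]["policy"])  # regulate
```

Keying the lookup by comment ID means a "look up by comment ID" query is a single dictionary access, and skipping malformed entries keeps one bad object in the model output from invalidating the whole batch.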