## Raw LLM Responses

Inspect the exact model output for any coded comment.
### Random samples — click to inspect

- "You see, AI prompters are SALTY because they have to pay $/monthly to get enough…" (ytc_UgwML7axC…)
- "don’t forget the one that bullies the ai so hard that they end up unaliving them…" (ytc_Ugy3cPlRi…)
- "Either hes holding back, or not enlightened on military development of engineer…" (ytc_Ugyawd_QO…)
- "the current scaling of spicy auto-complete is terrible because it's being used b…" (ytc_UgxeGHvAF…)
- "I have a problem with all you companies. Musk google ..... I understand though x…" (ytr_UgxujXeS3…)
- "All these arguments in the comments about people that barely know what AI art is…" (ytc_UgyFM4no7…)
- "An AI won't need a human slave encampment. It has no uses for us, unless of cour…" (ytc_Ugxtleeg-…)
- "Do Teslas have a facecam inside the car so it detects if the driver stops blinki…" (ytc_Ugz5e8LV9…)
### Comment

> The REAL AI revolution will come when 'it' (AI) tells us (humans) how WE are aligned. Because currently...there does not exist a formal understanding of who, and what, we are. What happens when AI tells us the answers to the (alignment) questions we do not currently know the answers to...?!?!?!? 'It' essentially becomes the voice of 'God'...! Is that too weird? God creates us (this is indisputable...since none of us are remotely smart enough to do it)...therefore God also creates AI (it's a bit complicated but this is also indisputable). Hocus pocus time!
youtube · AI Governance · 2025-07-24T03:5…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgzUsvsPP8ngdFlARsR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxYGqB5a6K4c8dYSvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxowGGL9KETcwvMEH54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzbqfjgyXoCZG8j-P94AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3h1ywG0970hemdN14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxY6vgV-dVxYaSVACJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugy0ZvL5eOgDbXVldBl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwrb_cHMcKboZxXHYx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugxi1PH-pEA9omRxEsd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyguEoBXsDH26UFXtV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
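The raw response above is a JSON array of one record per comment, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated before display is shown below. The `ALLOWED` sets are inferred only from the values visible in this sample; the real codebook may define more categories, and the function name `parse_coding_response` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"unclear", "regulate", "liability", "none", "industry_self", "ban"},
    "emotion": {"mixed", "approval", "fear", "outrage", "resignation",
                "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    lookup keyed by comment ID, dropping records with missing IDs or
    out-of-codebook dimension values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an ID
        values = {dim: rec.get(dim) for dim in ALLOWED}
        if all(values[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = values
    return coded

# Usage with a one-record response (hypothetical ID for illustration):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"ban","emotion":"outrage"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # → ban
```

Validating against the codebook before display means a malformed or hallucinated category in the model output is silently excluded rather than rendered in the dashboard table.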