Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Idiot read Qur'an you no need to ask AI because you will get your all answers in… (`ytc_Ugxg7eb3T…`)
- Wait to see next generation(s) that WILL also be trained on AI generated images.… (`ytc_Ugz8lov1G…`)
- All these people in the comments are so attached to the stupid “AI Ends the Worl… (`ytc_UgxbzFzUc…`)
- Tesla autopilot is great, but it makes people loose concentration on the road as… (`ytc_Ugy7L8wVg…`)
- will ai take over d world same rules at 6;am-pm11:67 rule 1 use 1 word rule 2 si… (`ytc_UgycG7NEO…`)
- Yes, but also no. That will be the case very soon, but it wont stay that way for… (`ytr_UgzQARDam…`)
- ❌It’s not 2027 It’s 2030-2035 Getting serious around 2030 (begin) Finalizing aro… (`ytc_Ugztiy5ti…`)
- ik this is a reach but i feel like the presenter at the beginning for at least t… (`ytc_UgyszDCzs…`)
Comment

> Musk is naive. Regulate AI, all that happens is that other countries take you over within a month or two. You yield your nations chances the moment you do that.
> The only thing that may regulate AI is Singularity type AI itself. Up to that point, you either yield your relevance or or do what you can to remove any regulations

youtube · AI Governance · 2025-11-23T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
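The coded dimensions lend themselves to mechanical validation. The sketch below checks one coded row against the category values that appear in the sample output on this page; the vocabularies are assumptions drawn only from the visible data, and the actual codebook may define additional values.

```python
# Hypothetical validator for one coded row. The allowed values are only
# those observed in this page's sample output; the real codebook may be
# larger, so treat these sets as illustrative.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "virtue", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def validate(row: dict) -> list:
    """Return a list of problems; an empty list means the row is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

row = {"responsibility": "government", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
print(validate(row))  # []
```

Running the validator over a whole batch makes it easy to flag rows where the model drifted outside the expected categories.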
Raw LLM Response

```json
[
  {"id":"ytc_UgzU9ZyB_5I47ZeP9rV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz03-sHj1IGtIbgl8l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwupokL-dLY7CPFPPx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwXn1GExJCqciweKct4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxuczIPHB4_qcwa6Ax4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx1nSmp0efNMrkMz4J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzzMYwBSBOBZ99iULJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz0FDksuWxyXI7a51t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwj8q4IaFUEd_-5GB54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzZ7eTfPr8r6CJknx14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
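Because the model returns one JSON array per batch, keyed by comment ID, the "look up by comment ID" view can be backed by a simple index over the parsed response. The sketch below is a minimal illustration using two rows from the response above; the variable and function names are assumptions, not the dashboard's actual implementation.

```python
import json

# Two rows excerpted from the raw batch response shown above.
raw_response = '''[
{"id":"ytc_UgzU9ZyB_5I47ZeP9rV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz03-sHj1IGtIbgl8l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (KeyError if absent)."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugz03-sHj1IGtIbgl8l4AaABAg")["responsibility"])  # government
```

One practical caveat: the model occasionally returns malformed JSON or drops IDs from a batch, so a production pipeline would wrap `json.loads` in error handling and reconcile the returned IDs against the IDs it sent.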