# Raw LLM Responses

Inspect the exact model output for any coded comment. Comments can also be looked up by comment ID; a set of random samples is listed below.

## Random samples
- "AI art is also missing one thing: being bad / AI will always draw incredible anime…" (`ytc_UgzmoUn2D…`)
- "It does try stuff based on probabilities. You have never worked in AI in your en…" (`ytr_Ugw6TaxP4…`)
- "What if AGI is not possible and youre just mudding the water just so we could sa…" (`ytc_Ugxs3dUB5…`)
- "correction: google has a whole lot of compute than that. They have thier own cus…" (`ytc_UgyFFVb0Q…`)
- "Your quote in the description tells me everything about your ideology that I nee…" (`ytc_UgzJZpP7c…`)
- "I'm really thankful to Eureka for making me understand basics of AI. Can I pleas…" (`ytc_UgwP7gCiV…`)
- "Once ai is able to run society without human work, there will be global income, …" (`ytc_UgxUR8Rh5…`)
- "I asked both chatGPT and Gemini to tell me which country from a list of countrie…" (`ytc_UgwL2-Nkl…`)
## Comment

> We're overshooting by concerning ourselves with sentience and AI taking agency against us. AI will reach great power before it reaches sentience (if it ever does). We need to first worry about the dangers of AI in the wrong hands.

youtube · AI Governance · 2023-04-20T02:4… · ♥ 16
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_UgxDs_G6yMuMR3rbUBp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSJE0OobKT6yTGKcB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwfiqdkDG_KRiluIth4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz8cUoXObSig4XdPu14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyV9t7DmDGWWEHnhAl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgykD5GjhgaE6QT38i14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_Ugwb5caVnaJyQP5nCj14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHxEgJ913rqAYzpLh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUrs36AIgKToIgcZZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxFtKeubJNiA859-Bl4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
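A raw response in this shape can be machine-checked before the codes are ingested. Below is a minimal validation sketch; the allowed values per dimension are inferred from the samples shown above, and the real codebook may define additional categories:

```python
import json

# Allowed values per coding dimension, inferred from the sample output above
# (assumption: the actual codebook may include more categories than these).
SCHEMA = {
    "responsibility": {"user", "developer", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found (empty if clean)."""
    problems = []
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]
    for i, row in enumerate(rows):
        if "id" not in row:
            problems.append(f"row {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"row {i} ({row.get('id', '?')}): bad {dim}={value!r}")
    return problems

raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"fear"}]'
print(validate_batch(raw))  # → []
```

Rejecting a malformed batch up front is cheaper than discovering an out-of-schema label later during analysis; a batch with any problem can simply be re-requested from the model.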