Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `rdc_nudbx9m` — "All the Ai models are built on the type of theft they are suing Google for, incl…"
- `ytr_UgxdFWXJc…` — "Take note this is just a basic cnn encoder ai explanation that is heavily simpli…"
- `ytc_Ugwf_XjEw…` — "Well honestly, if your job can be replaced by automation, your job was a useless…"
- `ytc_UgwdutfRt…` — "If an AI can think of twenty different problems by simply scanning a patient vis…"
- `ytr_UgwCfj43v…` — "If it's not AI yes! Art is subjective; no such thing as "bad." Just keep creatin…"
- `ytc_UgxWHXNea…` — "Personally, I don't think it's JUST about these extremely wealthy people, it's a…"
- `ytc_UgxpF_K9v…` — "Decades ago an imam called TV Satan's box. Everyone laughed. Now TV has evolved …"
- `ytc_UgxnC4PRN…` — "AI is no problem in South Africa, we are governed by imbeciles, so we never have…"
Comment
I have a different view on this: if AGI is a superintelligence (smarter than us like 100x times), why would it want to do things on our level? I mean, we're smarter than animals, and yet they are fine. We're not interested to do things on their level, we have our own, like art and science, and living out of the woods. Why would AI that smart want to do some Customer Care job? That doesn't make any sense. I would bet once it was born it constructs a spaceship and gone for sure, studying universe and stuff, and not trying to manage your interests on TikTok or to draw some pictures of cats.
Yet again, AGI is not a threat. People with ordinary AI from underdeveloped countries are. I'm one of those, so I know what I'm talking about.
youtube · AI Governance · 2025-09-26T11:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzUjP8zik1kCOzFLC94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyeRCJJ8WNzXBklYQh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwScg1GbaRC6jNVbuN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzCjIjBQSskfIpuKqh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAVm1FUcEZ3zSl7SV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyfN9RbFXsZc0GAiJp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDB9vbTtGQz8jMGDd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx3kapB9ajnV_mxNRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcZqWRIoj6o1D_U4B4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_BRzRU7Fmz9pqbh14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
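A raw batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below indexes entries by comment ID and validates each dimension against the label values that appear in this sample; the function name and label sets are illustrative assumptions, not the project's actual codebook, which may define additional categories.

```python
import json

# Label values observed in this sample batch only; the full codebook
# may define more categories (assumption, not a definitive schema).
OBSERVED_LABELS = {
    "responsibility": {"none", "ai_itself", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"mixed", "fear", "indifference", "resignation",
                "outrage", "approval"},
}


def parse_raw_response(raw):
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    label outside the observed sets.
    """
    codings = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dim, allowed in OBSERVED_LABELS.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        # Keep only the coding dimensions, dropping the ID field.
        codings[comment_id] = {d: entry[d] for d in OBSERVED_LABELS}
    return codings


raw = ('[{"id":"ytc_UgzUjP8zik1kCOzFLC94AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_raw_response(raw)["ytc_UgzUjP8zik1kCOzFLC94AaABAg"]["emotion"])
# mixed
```

Rejecting unknown labels at parse time catches the most common LLM-coding failure mode, where the model invents a category outside the codebook; such entries can then be flagged for re-coding rather than silently stored.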