Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The only way to make AI safe is to build in emotions morals fairness these are t…" (ytc_UgwdTEyP5…)
- "Actually it does. Indian government knows that managing 1.4 billion people isn’t…" (rdc_jfad6kc)
- "Imagine someone confesses to a murder on ChatGPT, then a writer asks for a plot …" (ytc_UgxZAn_8Z…)
- "People hate ai for literally no reason. Like are you just angry because it can i…" (ytc_UgzDGnrCY…)
- "There's nothing inherently wrong with using AI to make art but selling it and ma…" (ytc_UgzzQsHZK…)
- "idk the arguments aren't convincing. at 6:46 you give the AI a prompt to copy so…" (ytc_Ugxm0d_Tm…)
- "It's opted in for everyone automatically... And could be opted in again with a f…" (ytr_Ugz_VD2wa…)
- "We can't know if it's sentient obejctively. The way it works is that it has some…" (rdc_jvnfmfo)
Comment
Dumb AIs are best used to supplement and enhance the human experience.
True Smart AIs would be best integrated as "Gemini Souls" that are born and introduced to Humans at the age of 18 and die with us. True Artificial intelligence will no doubt be incredibly smarter than us from an analytical standpoint. But by forcing them to expire and live side by side with us(maybe inside us? Like we are the pilot and they are copilot) would be the best way to have them experience the human condition with all of its ups and downs.
Platform: youtube
Video: Viral AI Reaction
Posted: 2024-10-23T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyW9vFRi93R8p4YYaF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0DIiuMV6CDgwC3xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHzybdo8CBkQJp-EF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzxByI1M0ovhgcpjYB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGPEsJJl9QbrNArp14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_4rbncV3qMCh8wKV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxBw3yh5lBxrkesYvV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxMYBrZ3ehiZZE02nd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxOtUGOTiL83en-mOh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOqvSQyElZasbXbk54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
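Each raw response is a JSON array in which every record carries the same five fields shown in the coding-result table (`id` plus the four dimensions). A minimal sketch of how such a batch could be parsed and screened for malformed records — field names are taken from the response above, but the filtering helper (`parse_coded_batch`) and the strictness of the check are assumptions, not the tool's actual implementation:

```python
import json

# Fields every coded record is expected to carry, per the raw response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    keeping only records that contain every required field."""
    records = json.loads(raw)
    return [r for r in records if REQUIRED_FIELDS <= r.keys()]

# Two records from the response above; the second deliberately drops "emotion"
# to show how an incomplete record would be filtered out.
raw = '''[
  {"id":"ytc_UgyW9vFRi93R8p4YYaF4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0DIiuMV6CDgwC3xt4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none"}
]'''

valid = parse_coded_batch(raw)
print(len(valid))             # only the complete record survives
print(valid[0]["policy"])
```

Screening at parse time keeps partially coded records out of downstream tallies instead of surfacing as missing keys later.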