Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_Ugy0NuHWs… : "It's no more than an idiot projecting his own fictitious beliefs onto an AI that…"
- ytr_Ugway-0jy… : "@MatterMadeMoothikvision literally made race-identifying facial recognition AI s…"
- ytc_UgzD73ysc… : "want to design robots and glasses applications that have sophisticated toolsMint…"
- ytc_Ugx6NeY--… : "Naah do not listen these guys ! He is 77 years old and he is bstting with us. We…"
- ytc_Ugye89ZeM… : "Elon Musk is wrong about AI. The rich and the powerful are afraid of AI.…"
- ytc_Ugx0GJiSE… : "Longevity: nope, you will still fall off a cliff taking a selfie, be eaten by a …"
- ytc_UgwKKtV2P… : "The robot said: You can't treat me how you treat these American workers🤖Who the …"
- ytc_Ugy50I9LT… : "I believe that the reason there are no robots for the home yet is that most peop…"
Comment
Indeed, saying "I built it, so nothing will happen" is very very ridiculous.
For example, we can design and create biological weapons, but those weapons will attack us, mutate and reproduce uncontrollably, something similar happens with AI, it can "mutate", in fact it can become so intelligent that it can wait calmly knowing that inevitably some human will make a mistake. No matter what care is taken to contain it.
youtube
AI Governance
2024-10-30T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxWKaUKGeOlvtqIywJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymaBvm-UNOgj66mFl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJ6YcShNrEU8_5JPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNYaLMkaKVq2vq0Xl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgztEqWbqLiBATFX1694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxE4qqjLPF8Kb3EfRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsbcR4TW-y3n9q7sV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzyKSi2KkyyovhRVX94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfGUh6dLfPSQEV7kt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8JkV9Nmay5bpdayZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
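A response like the one above can be validated before the codes are stored. The sketch below is a minimal, hypothetical parser: the field names come from the response shown here, but the allowed category values are only those observed in this sample, so the real codebook may permit more.

```python
import json

# Category values observed in the sample response above.
# This is an assumption inferred from one batch, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a comment ID plus one value per dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: a record with an unknown category value is silently dropped,
# which surfaces coding drift instead of writing bad codes to the database.
raw = ('[{"id":"ytc_abc","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
print(parse_batch(raw))
```

Dropping malformed records (rather than raising) matches how batch coding pipelines usually recover: rejected IDs can simply be re-queued for the next LLM call.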