Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
People who think generative AI is at all similar to digital art need their heads…
ytc_UgzFBskEF…
do you think Hasan knew that he was torturing his dog? what was the incentive st…
ytc_Ugw-jipTT…
They haven't seen Black Mirror 🥹
The hairdresser who's in for some surprises before l…
ytc_Ugxe6QDlz…
Yes but ai is no longer llms it become physical with physical ai in 2026…
ytr_UgzBEHqQb…
Policy enforced without our knowledge? Exqmple, religious bais meticulously craf…
ytc_UgxzDDBet…
I think we’ll reach very advanced AI, but not full AGI, because keeping humans i…
ytc_UgxaNXwJP…
So when AI takes over and unemployment increases...whos going to buy the goods a…
ytc_UgwmF_dQt…
To be clear, I don't have a stance totally on one side or the other. Right now, …
ytr_UgziCg8uV…
Comment
Wow... this group sounds like every group against AI in all movies.
“AI will end all human existence for the benefit of humans” “we want only humans and not androids” “kill all synths”.
If you truly believe your own press... you need to stop all AI now, no autonomous cars, stop buying home assistance equipment and stop SpaceX from filling lower orbit with thousands of satellites. But as we know, this will never happen which will lead to bad people developing bad ways to kill people. Just like guns, they were not designed to murder innocent people, but bad people use them for that reason.
youtube
2020-08-23T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgykXJi7c7AN8cUbi8d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwtb3Jh3cSghuFdLTR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymhoqP--BX1I1fFC14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxyK1aoX-SKHONRkMF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgweuTuNXwOK_OY5g2N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhDHNodVyntAZ9gV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw335AbTJ1twtWt_rR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3yUqrGkn-ZPgL9ZF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKqG29lnVKV5hKCUp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4RFKv5Q3-cwEVvN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
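The raw response above is a JSON array with one record per comment, coded along four dimensions. A minimal sketch of how such a response could be parsed and validated downstream; the allowed value sets here are inferred from this one sample, and the real codebook may define additional categories:

```python
import json

# Value sets inferred from the sample response above (assumption: the
# actual codebook may include categories not seen in this batch).
DIMENSIONS = {
    "responsibility": {"none", "company", "unclear", "ai_itself",
                       "government", "distributed", "developer"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "unclear", "indifference",
                "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record.

    Raises ValueError on an unexpected comment-id prefix or an
    out-of-vocabulary dimension value, so bad batches fail loudly
    instead of silently polluting the coded dataset.
    """
    records = json.loads(raw)
    for rec in records:
        # Comment ids in this tool start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

A validator like this is useful because LLM coders occasionally emit labels outside the codebook; rejecting the whole batch makes re-prompting straightforward.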