Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I disagree, I don't think it's the same as human creativity or inspiration. basi…" (ytc_Ugxu7Al6_…)
- "1:16 this dude forgot that ai can make mistakes, never give a machine full t…" (ytc_UgwMS9lzJ…)
- "Huh...he's wrong about scifi. Australian author Joel Shepherd wrote a fun space …" (ytc_UgxH1Ba9u…)
- "It doesn't matter if any of this is true. AI, at least, provides cover for the…" (ytc_UgwQu5fOr…)
- "You don't have to know anything about AI itself to know that "hallucinating" is …" (ytc_Ugyl_hzvD…)
- "My brother in Christ they scraped the internet for all art pieces people have po…" (ytr_UgydRm8Da…)
- "i think its funny that nano says "saying AI is bad is classist!" when to me, its…" (ytc_Ugzh2MriY…)
- "AI hate is hurting real artists. That’s the irony of it all … it’s not that seri…" (ytc_UgzRkW4ue…)
Comment
What I am scared of is a group dropping a 100 or more robot that shoots everything that move in an area. Maybe have crude recognize system that will not shoots at something with the right collor on it.
Put a hundred or more in a area you are going to attack. Anyone trying to defend has the risk of being shot. The the aggressors will not usually be shot. But it is worth the risk because the moving things without the right colors will be shot.
I am not the smartest brick in the pile. If I can think of this, I know other people have thought of this.😢
youtube · 2026-03-11T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwxjFYESy6tomjwN4d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhSkA-8HdrOjj_7dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy0gQLgLOo6OkcUUVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwaPyWNZQ6ha3J2eg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgymZ5qYdd5I8ml6IAh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxTT4_0jF2ORMzHTDR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTiWKgZdR_o3XQXdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy72D3wTYuNj6RjDVt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgynN6nUrxQ7PRQKWVd4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugza79EGRXrOH4F_VZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
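A response like the one above has to be parsed and checked before the codes land in the table, since the model can emit malformed records. Below is a minimal sketch of such a validation step, assuming the allowed value sets inferred from the codes visible in this panel (no published codebook is given, so `ALLOWED` and the `validate_batch` helper are illustrative, not the project's actual implementation):

```python
import json

# Assumed value sets per dimension, inferred from codes seen in this panel.
ALLOWED = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with "ytc_" or "ytr_".
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwaPyWNZQ6ha3J2eg94AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Records that fail validation are simply dropped here; a production pipeline would more likely log them and re-prompt the model for the affected comment IDs.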