Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples — click to inspect:

- "I hope they improve the Ai though, it's still the beginning. Maybe it will actu…" (`ytc_Ugz4NvWkv…`)
- "makes me even more impressed of the actually working driverless taxis in Shenzhe…" (`ytc_UgxYeccnB…`)
- "No, there isn't, stop rushing to make defenses on things you don't now about. Ju…" (`rdc_grqukz6`)
- "Now why tf are we giving robots guns? Has Elon not watched Terminator or anyone …" (`ytc_UgxjqS1r_…`)
- "“The kind of intelligence we're developing is very different than the intelligen…" (`ytc_UgzZxzSGV…`)
- "AI isn't tracing and you don't understand how the algorithms work at all, so why…" (`ytc_UgzdSxKYz…`)
- "My thoughts about automated AI improvements: the automation only works if, and o…" (`ytr_UgysbT0gJ…`)
- "If all the machines and computers are going to produce everything, who is going …" (`ytc_UgwJzEC1K…`)
Comment
Why are AI kill drones morally different from unguided artillery?
Humans aim the artillery. If they mess up and fire at civilians, they can be held responsible.
Look how America behaves with their human-controlled drones. They used to decide whom to drone strike in Afghanistan based on if they pee sitting or standing.
We need to stop treating war like a video game.
Platform: youtube
Posted: 2024-09-05T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGqsXImgtbHXTdhwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgykfZqCW5c0VRloPk14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8ZQwnYfDKvLg3NW14AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgybJENTIwuzPe8GIQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzT1PTKwhq5TZ8yUTt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgywU1EfqmsUF135r-V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzfD9Qbs8VfrGbSyO94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzVtmv0p5eFwr8g7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyU8FhQ2-sAAIC4mJl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxnjqaLeipnDy5bIJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
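The "look up by comment ID" step above can be sketched as follows: parse the model's JSON array once and key each coded record by its `id`. This is an illustrative sketch, not the tool's actual implementation; the `index_by_id` helper is hypothetical, and the two sample records are taken from the raw response shown above.

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugx8ZQwnYfDKvLg3NW14AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgybJENTIwuzPe8GIQJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coded record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugx8ZQwnYfDKvLg3NW14AaABAg"]["emotion"])  # outrage
```

With the records indexed this way, pulling up the coding result for a single comment (as the detail view above does) is a single dictionary lookup rather than a scan of the whole response.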