Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The AI companies are pushing for more and more AI use and consumes immense natur… (`ytr_UgydJbMoT…`)
- humans can train on copyrighted works and so can ai. this is a silly lawsuit.… (`ytc_Ugwd5nIMW…`)
- A robot may not injure a human being or, through inaction, allow a human being t… (`ytc_UgyQMOghO…`)
- Chemical reactions are electrical as is what holds the universe and my fingernai… (`ytc_UgwxoX6CE…`)
- In 20 years we’ll hear a repeat of the “COBOL Story”, where the now-retiring poo… (`rdc_oaf07vz`)
- I have tested the limits of the AI camera :D & gladly never got caught. Come ove… (`ytc_UgwuxL8EY…`)
- If we don't put the brakes on AI immediately , AI will very soon be putting the … (`ytc_UgxGKvkG5…`)
- “Messenger of GOD” “Performs and guides miracles in GOD’S name” and plus did Cha… (`ytc_UgwBWB2n3…`)
Comment

> The bigger ethical dilemma of self driving cars is what happens to the economy when driving jobs are given over to self driving cars. Over 60% of Americans put some kind of driving job on the census, this includes truck drivers, delivery people, cab drivers. What happens when over half of the work force is without a job?

youtube · AI Harm Incident · 2017-06-28T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggQprZepafZ1HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghRftAajpYgC3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugis3gL-vgXrpHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj3h2tFVAqPSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj5A0pJm2zcoXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggIGhHRenxDK3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjS60trIUKAvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggCEcSJA552hHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgitMxhB_OZFhXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
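A raw LLM response like the one above can be parsed and checked before the per-comment codes are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those seen in this sample batch (the real codebook may define more); `validate_batch` and `ALLOWED` are hypothetical names, not part of the tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the sample batch above
# (an assumption -- the actual codebook may include additional values).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an "id" and one allowed
        # value for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print([r["id"] for r in validate_batch(raw)])  # ['ytc_x']
```

Records with an out-of-vocabulary code (here the made-up `"alien"`) are dropped rather than coerced, so a malformed model output never silently enters the coded dataset.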