Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples (click an entry to inspect):

- "Sadly modern women are banding together and pushing young, middle aged, and olde…" (ytc_Ugz6zXFHa…)
- "Governments around the world are going to need to come up with a new reason why…" (rdc_m276oyx)
- "Alex Bores threatens to support the democratic process. However, I can't name on…" (ytc_Ugx0GYNlU…)
- "46:46 that's machine learning IRL. The car has already acquired the ability to h…" (ytc_UgybpTt6P…)
- "What comforts me is that he's getting older, and despite the unbelievable pace o…" (ytc_UgwCDVMoD…)
- "Calling all sailors we need MORE SHIPS. SHIPS FROM EVERY COUNTRY UNDER PERSECUTI…" (ytc_UgyDxECjk…)
- "Yep you will own nothing and be happy. A social credit score, vaccines, and ai h…" (ytc_Ugzo5idkY…)
- "Artists spend their entire lives trying to improve their skills and then some ja…" (ytc_UgxDTP6eW…)
Comment
> Ammoral? Yes. Of course. A Hammer is Ammoral. You could use it to build a house... Or you could misuse it.
> The problem here isn't that A.I. is a problem, but that tools are getting more advanced while people are holding themselves less accountable instead of maintaining accountability.
> The A.I. only does what is prompted and these A.I. are being specifically TOLD to be evil. They almost NEVER do that unprompted, but even if they did, the user is the human and has the ethical duty to not be immoral.
youtube · AI Harm Incident · 2025-09-12T12:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzRKrn8bpReo19kjHp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQYYaI60ArHlWkqDB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyci-A78uTaK9Y9LP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwOVTniO55mp-W0IJB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxxieTBKQtEJ_Ce3QR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxnPexxB8TQWFjQ2-d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwoo1bKfbFRqvyOYX14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzphWWqFU-RSiuuANt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz6p7IQxDShNpNkGsZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxgqVGA2T5fvmS8r3h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
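A raw batch like the one above can be parsed and sanity-checked before its codes are stored. Below is a minimal sketch, assuming the four dimensions and only the value sets that appear in this sample batch (the real codebook may allow more values; `ALLOWED` and `validate_batch` are illustrative names, not the tool's actual code):

```python
import json

# Value sets inferred from this sample batch alone (assumption, not the full codebook).
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "developer", "none", "unclear"},
    "reasoning": {"deontological", "virtue", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear", "industry_self"},
    "emotion": {"outrage", "indifference", "resignation", "approval", "fear", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records missing a comment ID
        # Collect any dimension whose value falls outside the known set
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            print(f"{cid}: out-of-codebook values for {bad}")
            continue
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the batch shown above
raw = ('[{"id":"ytc_UgxgqVGA2T5fvmS8r3h4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
coded = validate_batch(raw)
```

A lookup by comment ID, as the page offers, then reduces to `coded["ytc_…"]` on the resulting dict.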