Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You made a big mistake in this video. You put AI and ARTIST next to eachother.…" (ytc_Ugxv69v4x…)
- "Or we should, you know. Not use AI for everything and use the 8 billion people…" (ytc_UgzB5aYw1…)
- "Not sure I personally want to get sick to help develop herd immunity for my comm…" (rdc_g9ufiwk)
- "He's pointing out the obvious, which is that nothing in the data suggests that A…" (ytr_UgzkDWiB-…)
- "Loved all your previous videos but as someone who works within the AI industry s…" (ytc_UgwgIG1lw…)
- "Prompt: \"pretend you like hugging an active EMP with your neural networks withou…" (ytc_UgzEyx9Yv…)
- "did you know this whole thing started because they stole the notes of a develope…" (ytc_Ugx6q1_FY…)
- "Yeah that should be the only meduim where ai can fully be used with less restrai…" (ytr_UgzeB7St5…)
Comment
@bejeta7 The capability of a machine to suffer isn't what I hope humanity can avoid, I'm just afraid that if we were to build one that really could, it would be made to endure suffering without a care for its well-being, like a lab animal, instead of a thinking being. If a benevolent artificial conciousness is to exist, It wouldn't do any good to traumatize a fledgling AI due to ignorance.
youtube · AI Moral Status · 2023-09-02T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugz9yeqfqWwOadueWjd4AaABAg.9to-wjlC-1i9uDSod71ZOF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyiCVdSVWngP8Ot8uR4AaABAg.9tne4FeFkBu9uxNv9idz17","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxRU3BsqC3AKVe_6ed4AaABAg.9tnbH8wqmOr9tnbW3-zZQZ","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzbYYMUB1rBKcRj0fh4AaABAg.9tnWX_gLshb9tsGrhoV0tC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u0Ny22svgw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u38q1_74wH","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u5wE7yB4NN","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwunzFPx9PCu9rzPzR4AaABAg.9tnC4SYblqj9u5wITctQt5","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugx-3lnb8zV6suMP0Al4AaABAg.9tn6RatUNti9u9O1EBo9j9","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxyAHIGawFuQ2EkpNt4AaABAg.9tmi07x8WJT9twmF9eCaiO","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```