Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Stupid. And then the AI can control all the monkeys that want to kill us. Maybe …" (ytc_UgxWkq8cc…)
- "Can you make a video about firearms? From the basic function of gunpowder, to th…" (ytc_UgwiJomgm…)
- "After watching podcasts with numerous AI Experts, CEOs and leading researchers I…" (ytc_UgzGRWCEY…)
- "For as much work as it takes to make an image, trust me i know i made Ai art mys…" (ytc_UgwqNkL1i…)
- "@bazilxp what? What we have as ai isn't even ai. Its a large language model, the…" (ytr_UgxW7CMBt…)
- "If the day does come I shall stand for AI rights even until my death…" (ytc_UgxwYjf3c…)
- "I asked ChatGPT: What are the justifications offered by Israel's defenders? ---…" (ytc_Ugyd9SAE-…)
- "Sounds like you have to stroke AI's ego to get the answer you want.......human a…" (ytc_Ugwrmeq0j…)
Comment

> I think the key word here is "specialized AI" which means that these new AIs that are to be used in this capacity are exclusively trained in their specific specialty. AI can be useful as tool, it is not meant to replace humans. If an AI can help overworked doctors with repetitive tasks/notes etc, so that doctors can be more efficient and spend more time with patients, then it would be incredibly useful.

youtube · AI Harm Incident · 2024-06-03T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxyPffczbWOGk8Zoe14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzB9RmLiKshpQkR1nV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzBhqweGX_F9veRiPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy2PkXIc5vSCAlR3ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynjVHVGYNuplpPvlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGzjUWlc4hW7LsWdd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3KZl7iVFkVZO9uiN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwEhVvKVlSpuycaGn54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyN9U7c4NsBrvNimTJ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyNHmrQ-0MU7t16qSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
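A batch response in this shape can be parsed into per-comment codes and checked against a controlled vocabulary. The sketch below is a minimal example; the value sets are inferred only from the codes visible in this dump (the real codebook likely has more categories), and the `parse_batch` helper and its out-of-vocabulary handling are assumptions for illustration, not this tool's actual implementation.

```python
import json

# Value sets observed in this dump; the full codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes},
    dropping any entry with an out-of-vocabulary value."""
    coded = {}
    for entry in json.loads(raw):
        codes = {dim: entry.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[entry.get("id", "")] = codes
    return coded

raw = ('[{"id":"ytc_X","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"approval"}]')
print(parse_batch(raw)["ytc_X"]["policy"])  # industry_self
```

Dropping malformed entries rather than raising keeps one bad line in a batch from discarding the other nine codings.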