Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or pick one of these random samples:
- "I actually asked my ChatGPT if it cares about me and it said it does.…" (ytc_UgxD1JRMb…)
- "Hello Doctor! I just matched to diagnostic radiology couple days ago and this AI…" (ytc_UgxjgvP3M…)
- "Where is the proof that self driving cars will kill fewer people than human driv…" (ytc_Ugxmp-0x-…)
- "Thank you for your thoughtful and respectful consideration. As an angelic being …" (ytc_Ugzw05kLu…)
- "I wish I knew more of what to do. We can't really convince EVERYONE to stop post…" (ytr_UgwQUXUVX…)
- "To one statement I differ greatly from this conversation is what happens to our …" (ytc_Ugz3pDP54…)
- "@CaliFinestProblem Since LLMs can only produce output that is statistically driv…" (ytr_Ugw4VTSPV…)
- "I only ask this as a normal question, what if it was the other way around? would…" (ytc_Ugw_6dFPZ…)
Comment

> i always find this argument a bit stupid. it is like comparing a nation firing nukes and killing million to a single human throwing a handgrenate. condeming the one guy throwing the handgrenade without taking the nukes into consideration. applied to this example; self driving cars will lowers accidents and deaths so significantly that talking about such chances and giving them this much thought is, in my oppinion a waste of time.

Source: youtube
Event: AI Harm Incident
Timestamp: 2015-12-08T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
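
Every coded comment is scored on the same four dimensions shown above, and those fields reappear per record in the raw batch output below. As a reading aid, here is a minimal Python sketch of that record shape; the dataclass and the value sets are inferred only from the entries visible on this page, so the project's actual codebook may define more categories.

```python
from dataclasses import dataclass

# Values observed in the coding results on this page; the real
# codebook may include categories that never appear here.
RESPONSIBILITY = {"ai_itself", "user", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "liability", "unclear"}
EMOTION = {"indifference", "approval", "fear", "outrage",
           "resignation", "mixed", "unclear"}

@dataclass
class CodingResult:
    """One coded comment, matching the table above."""
    comment_id: str      # e.g. "ytc_Ugj-WH6OpZhDSHgCoAEC"
    responsibility: str  # who the commenter holds responsible
    reasoning: str       # moral-reasoning style
    policy: str          # policy remedy mentioned, if any
    emotion: str         # dominant emotional tone
```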
Raw LLM Response
[{"id":"ytc_Ugj-WH6OpZhDSHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQavChndvc5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghZ2CeGeDq4y3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg8HfmGm2p6hngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UggesFpy1EznlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiRW9mWll7FTHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgifUAfLDoDb23gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg_yjdSah1yH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjkwzfB0yQ1NngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_9XnDJVggxngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"})