Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The way the friend is so cagey/takes forever to actually come clean in the messa…
ytc_UgxunN1SH…
If automation takes peoples means of making a living, then guaranteed income can…
ytc_UgzsYtIYS…
Beautifully said, thanks for putting into such thoughtful words what I think abo…
ytr_Ugw3eiBoP…
> *Common people don't really need AI, at least in terms of the technologies dev…
ytr_Ugwr95Kw2…
Thank you for appreciating the animation and AI interaction in the video! If you…
ytr_UgzoRUKIM…
@you-share trying to compare words, prompts, styles and themes in ai to an ch…
ytr_UgygFb_-5…
Istg, 8 months later and it’s still the same. Had a second semester project wher…
ytc_Ugz_XfCVW…
@dastardlyfool putting your art on the internet means it is inherently public. T…
ytr_Ugx5fFddk…
Comment
Autopilot didn’t kill anyone. The Tesla driver did. Failure to monitor is inexcusable. Charge driver. Don’t blame Tesla. Tesla clearly warns (as does GMs super cruise) that it’s drivers responsibility to pay attention while on auto pilot. Autopilot clearly augments, not replaces drivers attention and control.
I do agree not to call it autopilot nor full self driving capability. That’s an accident waiting to happen.
youtube
AI Harm Incident
2022-09-03T15:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwXq3HjQzB2quHhhkt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwvHBsrIpyOZYvPAyh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy67ZbxeNoCBQfq5-54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUfRHoPubgJTkiHkp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH5fJDcTsxiGO-UDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzz84wo-J31GrMQ03x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw45s8wYvl_u4RiUt14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzvT7QXBAILjw5TX1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1Zc3DzI9F4YNWIT54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwOouGOdK2zx3_88lp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
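A minimal sketch of how the "look up by comment ID" step could work, assuming the raw model response is a JSON array like the one above, where each element carries the comment ID plus the four coding dimensions. Only two of the ten real records are inlined here for brevity.

```python
import json

# Two records copied verbatim from the raw LLM response above
# (the full batch contains ten).
raw_response = """[
  {"id":"ytc_Ugy67ZbxeNoCBQfq5-54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXq3HjQzB2quHhhkt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Index the parsed batch by comment ID so any coded comment can be
# retrieved in O(1), as the dashboard's lookup does.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_Ugy67ZbxeNoCBQfq5-54AaABAg"]
print(record["responsibility"], record["emotion"])  # user indifference
```

The first record is the one rendered in the Coding Result table above (responsibility `user`, reasoning `deontological`, policy `none`, emotion `indifference`).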