Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Easy to pick on Tesla but how many non-Tesla cars are driven into emergency vehicles? The data needs to be normalised to see if this really is a Tesla Copilot software issue. Does this happen if it’s a queue of stationary traffic with hazard flashers in the same weather conditions? I suspect that overall the Tesla probably saves more drivers from incidents compared to cars that rely entirely on humans alone. I do not own, nor have I ever driven, a Tesla, so I have no bias.
Source: youtube
Category: AI Harm Incident
Published: 2025-01-21T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyAFQBeSTNdDFCNLCd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzFJUGkjhTnVZaO2cd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz8Kq0kkDafXWeuKAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxyJw0lBpwlB8t0uM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6BdvFC32STL9goAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyVwdsHIRRfPHIeXlh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwisFB6Iwt5bwYPyDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwFsr2csxlbD3r8UiN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgypHeJg7YS7YKruSSd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxh50gLsl7n_pjy-594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
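A raw response like the one above can be parsed and sanity-checked before it is stored. The sketch below is a minimal validator, assuming the dimension vocabularies are exactly the values that appear on this page (the real coding scheme may define more); the `validate` helper and `ALLOWED` table are illustrative names, not part of the tool.

```python
import json

# Dimension vocabularies as observed in this page's output.
# Assumption: the actual coding scheme may permit additional values.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(records):
    """Return (comment_id, dimension, value) triples whose value falls
    outside the expected vocabulary; an empty list means all clean."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

# One record from the raw response above, round-tripped through JSON.
raw = ('[{"id":"ytc_UgyAFQBeSTNdDFCNLCd4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(validate(json.loads(raw)))  # prints [] for in-vocabulary records
```

Checking every record against a closed vocabulary catches the most common failure mode of JSON-coded LLM output: a syntactically valid response containing an off-schema label.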