Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by ID, or chosen from the random samples listed below.
- ytc_UgzqEy8Jy…: @12:04 -- _"Sam Altman originally started OpenAI as a non-profit..."_ Hm... no…
- ytc_UgzhQE-Ah…: A 'Godlike' App? Software? The human brain is not even understood yet properly,…
- ytc_Ugz1HKWG2…: If AI continues to be developed without restraints, the only place people will b…
- ytc_UgwLiJfeB…: I’ve been using Codoki for a while now, and it's a relief knowing it catches iss…
- rdc_d7kphps: Yeah but for the purposes of the calculation in this article, they'd effectively…
- ytr_UgxxQ-Z-q…: @Toothles_fur I told myself i sucked, and i believed myself, i hated my art and…
- ytc_UgiGv6YyL…: I think that a lot of the people that are like "Nah lol, that won't happen." hav…
- ytc_UgwqrK-eZ…: Shit give a robot any weapon watch it go for the Nuke then we go all 🦖 style. T…
Comment
> This is a level two autonomous driving system. Calling it autopilot was a stupid decision, but it does not take away from the fact that the car makes lots of effort to make sure you understand that you must be ready to take over driving at any moment. I don't really want to defend Tesla, but the truth is, all these accidents would have been preventable if the people behind the wheel had had their hands on it. Also, to be clear, when a pilot engages autopilot on a plane, they don't stop paying attention, and they certainly don't film Instagram stories or play video games.

Source: youtube | AI Harm Incident | 2024-12-14T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
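
For reference, a minimal sketch of how one coded comment could be represented in Python. The field names mirror the keys in the raw response below; the example values in the comments are only those that appear in this sample, and `CodedComment` is a hypothetical name rather than part of the actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One comment's coding across the four dimensions shown in the table above."""
    id: str              # comment ID, e.g. "ytc_UgykhBnK03A1cnzoMK14AaABAg"
    responsibility: str  # values seen in this sample: "user", "company", "none"
    reasoning: str       # "consequentialist", "deontological", "mixed"
    policy: str          # "none", "liability", "regulate"
    emotion: str         # "indifference", "outrage", "approval", "resignation", "fear"
```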
Raw LLM Response
[
{"id":"ytc_UgykhBnK03A1cnzoMK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_0-FCKcGUog-xMyR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyB0K3BpkDGlIv0-0d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfqxdC6amIdtgpkxN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugzr2jhU0SC7IdM0fVB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1VuB22UrMVB9ECqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydaUlBFkjLkGIJRpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNsYdmvLWHKfavuTx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzoM0nD_LE-irFS2tx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxT0dy2tMWQ7ZEFb9N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
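
A hedged sketch of how a raw response like the one above might be parsed and a single comment looked up by ID, building on the `CodedComment` sketch from the previous section. It assumes the model output is stored as JSON text; `parse_raw_response` and `lookup` are illustrative names, not part of any existing tool.

```python
import json
from typing import Optional

def parse_raw_response(raw: str) -> dict[str, CodedComment]:
    """Parse a raw LLM response (a JSON array of coded comments, as shown
    above) into a dict keyed by comment ID for constant-time lookup."""
    coded: dict[str, CodedComment] = {}
    for row in json.loads(raw):
        coded[row["id"]] = CodedComment(
            id=row["id"],
            responsibility=row["responsibility"],
            reasoning=row["reasoning"],
            policy=row["policy"],
            emotion=row["emotion"],
        )
    return coded

def lookup(coded: dict[str, CodedComment], comment_id: str) -> Optional[CodedComment]:
    """Return the coding for one comment ID, or None if it was not coded."""
    return coded.get(comment_id)
```

With the response above, `lookup(coded, "ytc_UgykhBnK03A1cnzoMK14AaABAg")` would return the first row: responsibility "user", reasoning "consequentialist", policy "none", emotion "indifference", which matches the Coding Result table for this comment.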