Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID.
Random samples:

- `rdc_czladxq` — "They haven't really gone in the red, that's the point of this. If you remove the…"
- `ytc_Ugy_D1pBd…` — "Hilarious how people think trusting the tesla advertised "full self driving" mak…"
- `ytr_Ugzz5VMBm…` — "LLMs aren't the route to AGI. I think to achieve AGI we need to make an AI archi…"
- `ytc_Ugx0O5wJN…` — "It's simpler than that. AI can/will release highly contagious "Covid 25" (alread…"
- `ytc_UgxlDoQyq…` — "You said that AI wrote this episode, did you instruct it to write a fiction stor…"
- `ytc_UgxtcsOkp…` — "We need the world to be sustainable. And AI ain't sustainable. It also comsumes …"
- `rdc_ogyh99e` — "***But billionaires create jobs*** Amazon jobs = pissing in bottles. Apple ph…"
- `ytc_UgwLYqz1H…` — "Everything he calmly described is reason to collectively agree to stop developme…"
Comment
> Tesla: Here is your self driving car, it can handle complex roadways with zero input from humans, as the car is self driving, they are just there for legal reasons, just look at our video proof of it doing so.
> Person in car that doesn't need human input: Yeah, years ago I was told i would have self driving in several months. So clearly enough time has passed that their claim of full driving makes sense.
> Tesla: The crash is not our fault. They should have known not to let the self driving car, self drive.
youtube · AI Harm Incident · 2025-08-23T21:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
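Each dimension in the table takes one value from a small categorical vocabulary. As a minimal sketch, a coded row can be checked against those vocabularies; note the value sets below are only the ones observed in this page's example output, not a documented schema.

```python
# Hedged assumption: these value sets are inferred from the example
# codings on this page, not from a published codebook.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the names of coded fields whose value falls outside the observed vocabulary."""
    return [k for k, allowed in ALLOWED.items() if row.get(k) not in allowed]

row = {"id": "ytc_Ugw03S5SnYHMdOC_K6p4AaABAg", "responsibility": "company",
       "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
print(invalid_fields(row))  # prints: []
```

A row with an out-of-vocabulary value (e.g. `"emotion": "joy"`) would come back as `["emotion"]`, which is a cheap way to catch model drift in the coding output.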
Raw LLM Response
[
{"id":"ytc_UgyR1EF9kaZOvBEaDMh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx4IbGWKhGsZ364hsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw03S5SnYHMdOC_K6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwT8E8Mealb8fVM3qJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxvHLHN9LUFdEsPAHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwJAChWjkhUdf-t914AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYHM_C2ZmNvoVhPut4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxt4P0B-5Qbr55IEBd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyfAohMOjjIenXqiOp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyv0R5RkG5T2ZRRLux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]