Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I let AI do my math. Because math SUCKS! But yea I do understand what you’re say…" (ytc_UgzV1EzIt…)
- "Liberals want them to have a gender switch class and read some sex book instead.…" (ytc_Ugyvwqi2a…)
- "I'm not ready to give up my controlled freedom of movement, honestly, to those w…" (ytc_Ugx8OfmIZ…)
- "It's certainly not no. Otherwise we would do things without moral consequences. …" (rdc_hazjat5)
- "Great that real artists drew it, but the sad fact is this a.i. account got exact…" (ytc_Ugx748Fdi…)
- "@Apheleion Literally the entire point of a company is to seperate liability of…" (ytr_Ugz_WhhoK…)
- "I am black and ai say that i not a human wow i gonna kill AI😡…" (ytc_UgxXXQFmK…)
- "he’s sort of talking around the fact that people are trying to make super intell…" (ytc_UgxF9a_M-…)
Comment
That’s why if I get a Tesla, I wouldn’t spend another $10,000 CAD to get a full auto pilot. I wouldn’t spend $80,000+ CAD to a Model Y that’s included full auto pilot. First off, it should come with it. Secondly, shouldn’t be an expensive vehicle. Thirdly, I don’t rely on the car self driving to take me from point A to point B. Yes, we do need advance technologies in the car to protect ourselves while driving. But not giving our lives completely to the car for safety. In the end, humans > technologies. Not Technologies > humans.
youtube · AI Harm Incident · 2025-07-16T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw0_VbvyPjzjsKF-Rd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVnq8SEkY0qLk1S914AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8oXcLC6Dpkhtq9YV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugze3rcWJ49Ll6xX7Gl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw4oppkbh3iG8s-SMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxIQlMJBJkGzZwsjU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxYk1ZPlOuBbYGdrjh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyePPdVDK5nqbdKFhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwPgTC1HjbyQk196dl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzrIGwjTB3LjBlb1mt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
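The "look up by comment ID" step above can be sketched in a few lines, assuming the raw response parses as a JSON array of per-comment codings like the one shown (the `lookup` helper is illustrative, not part of the tool; the embedded array is a two-entry subset of the response above):

```python
import json

# A subset of the raw LLM response above: one coding dict per comment ID.
raw_response = """
[
  {"id": "ytc_UgyePPdVDK5nqbdKFhJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzrIGwjTB3LjBlb1mt4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def lookup(comment_id, response_text):
    """Return the coding dict for comment_id, or None if it was not coded."""
    for record in json.loads(response_text):
        if record["id"] == comment_id:
            return record
    return None

coding = lookup("ytc_UgyePPdVDK5nqbdKFhJ4AaABAg", raw_response)
print(coding["policy"])  # prints "regulate", matching the table above
```

A missing ID returns `None` rather than raising, so the caller can distinguish "comment not in this response" from a malformed response, which raises from `json.loads`.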