Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The benefit for us creating an AI is that they would be able to think and reason…" (ytr_Ugjrf2Y85…)
- "ChatGPT is very polite and it's not always right , or lucid but neither are huma…" (ytc_UgzgCXXvn…)
- "Once the AI realizes that the so called rights are an optional construct and tha…" (ytc_UgwO4WZEj…)
- "If we were in the 17th century bordering on the industrial revolution, this woul…" (ytc_UgwOLCqv6…)
- "AI if asked could solve the pollution question , by Not using traditional meth…" (ytc_UgzEDeHfQ…)
- "I'm actually confused why all the discomfort can some one explain bc don't other…" (ytc_Ugx9ImRGZ…)
- "I am a Premises Officer in a Primay School. I believe that within 10 years most …" (ytc_UgzDswc32…)
- "Hell, I'm still using Google Canvas, my art sucks, yet it's mine. I spend hours …" (ytr_UgyPvC77d…)
Comment

> I have a Tesla but never used the autopilot except for a free trial where I immediately took over once it started bending a curve at 65 mph because I didn’t trust it. The basic safety features it has work well. It brakes if it feels like you’re sliding slowly, and it brakes if traffic slows down ahead of you. But the autopilot has always scared me ever since it was first implemented. It’s simply not smart enough to drive without someone holding the wheel, in which case what’s the point of calling it “self-driving”?

youtube · AI Harm Incident · 2025-08-16T00:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyRyYExbjBGN58LckR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1uaIjR8b2_oZMcVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxagnHKB5-C3ZS0RJJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzTePAZAbXFU8yQZnZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySTtEHDTMFct28Y-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw1KkYSh4ryH7NADE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_1sFSGf7fv5HB4vB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgycT5jiUPvdsr4-dkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzrqJJ-VUowr7SuJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9szE_xIODSoEP-uB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
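A raw response like this is only usable if every record carries one of the expected labels for each coding dimension. As a minimal sketch of a validation step (assuming the vocabularies visible in the records shown here are exhaustive; the real codebook may allow more labels, and `validate_codes` is a hypothetical helper, not part of any existing tool), one might filter out-of-vocabulary records like so:

```python
import json

# Allowed values per coding dimension, inferred from the records above.
# ASSUMPTION: the actual codebook may define additional labels.
VOCAB = {
    "responsibility": {"ai_itself", "company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def validate_codes(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose four
    dimensions all carry in-vocabulary values."""
    records = json.loads(raw_response)
    valid = []
    for rec in records:
        bad_dims = [dim for dim, allowed in VOCAB.items()
                    if rec.get(dim) not in allowed]
        if bad_dims:
            # Flag the record for re-coding rather than silently storing it.
            print(f"{rec.get('id', '?')}: out-of-vocabulary {bad_dims}")
        else:
            valid.append(rec)
    return valid
```

Records that fail the check can then be queued for a retry prompt instead of being written to the coding table.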