Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgxEyt-UO…`: You think like an eight-year-old. AI was not built by one homogeneous, coherent,…
- `ytc_UgxV_e6qU…`: Just accepted a github copilot commit suggestion. A miss spelt variable. The age…
- `ytr_Ugzbu7AsN…`: We've seen exactly this happened with the ubiquitous and completely useless cust…
- `ytr_UgxAVRmht…`: @UCantAlwaysGetWhatUWant While the M-series CPUs are useful for AI processing, t…
- `ytc_UgxK58Xdi…`: Its good if AI gets smarter then human cos it will discover all kinds of drug, i…
- `ytc_UgwzfQcZE…`: IM A DISABLED ARTIST: I legit cut apart photos to make images for reference. Pl…
- `ytc_Ugzitia4O…`: One of the other thing which is not talked about is the ai art wouldn't be copyr…
- `ytc_Ugz6K_BiC…`: People are freaking out because AI is "malfunctioning"... It's NOT broken, it's …
Comment
You miss the point: self-driving cars are "sold" to us as being BETTER than humans. They have all sorts of sensors & radars that - should - enable them to have a better vision/awareness of their environment than humans. And, one day, they will be superior to humans. But right now, they aren't!
This accident is a nasty example of the serious limitations of the self-driving cars and a sad reminder that car companies LIE about their cars performances.
Platform: youtube | Topic: AI Harm Incident | Posted: 2018-03-22T13:2… | ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgzQOZ0xeR_MvbxaTPl4AaABAg.8e4YBJWuzkA8e5gQOfUZmh","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugz7AxvoeP0cNLIk5-p4AaABAg.8e4WM0WYx8O8e6NBFp_Rvl","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz7AxvoeP0cNLIk5-p4AaABAg.8e4WM0WYx8O8e6SYyLtDOe","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzfgGIMf0YW9DpIzxV4AaABAg.8e4W678ktNY8e54DRFXGnj","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzEnXkqTqC0O-XZLOV4AaABAg.8e4VnUp5aqP8e5xQyebu9V","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgyU2UngCDg136H8NUx4AaABAg.8e4GR1FHZx08e5dkLBbx8h","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzF7MnIXWPYaBn_UUV4AaABAg.8e4FfPJnYWM8e4PBgB0RqO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgyV019tyG9R0g7rw7F4AaABAg.8e4EMi_4IbY8e6ooGklDfK","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyiyW43qT6sIpx0xW14AaABAg.8e4E7Ts8Gs58e4l71ZHvLA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyiJmvjw7Yhygp30C14AaABAg.8e4D0yxQ5F_8e68DNiMAmd","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
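The raw response is a JSON array with one object per comment: an `"id"` plus the four coding dimensions (responsibility, reasoning, policy, emotion) shown in the table above. A minimal sketch of the ID lookup, assuming the model output is valid JSON (the comment IDs and values below are placeholders, not real records):

```python
import json

# Hypothetical raw LLM response in the same shape as the array above:
# one object per coded comment, keyed by a "ytr_"/"ytc_" comment ID.
raw_response = '''[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "user",
   "reasoning": "virtue", "policy": "unclear", "emotion": "fear"}
]'''

# Index every coded row by its comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes_by_id[comment_id]

print(lookup("ytr_example1")["policy"])  # prints "regulate"
```

In practice the response may also need a validation pass (e.g. rejecting rows whose dimension values fall outside the codebook) before indexing, since the model is not guaranteed to emit well-formed output.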