Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The plan is going on schedule. This is the part they tell us AI actually is just…" (ytc_UgyB7Ix2K…)
- "ai is a fine tool to get an idea of what you want to make or find inspiiration o…" (ytc_Ugz2IA3nS…)
- "It's unique seeing how AI has changed from funny images that looked weird, too s…" (ytc_UgwEcGgLj…)
- "im suprised they didn’t embrace ai with open arms given this current administrat…" (ytc_Ugy92LMG3…)
- "And yet ironically, none of those pieces would exist without ai art. A whole tr…" (ytc_UgxyL9Ccy…)
- "Give em hell! And may the rest of us learn from your courage and unity.…" (rdc_dy8t0wd)
- "He has started believing his own bullshit. If AI starts to take over just switch…" (ytc_UgwjcNN_w…)
- "What if the robot is programmed by a black woman. Can it still be racist and s…" (ytc_UgyFdBT5g…)
Comment
> Frankly I see these so-called scenario as a joke. We are placing moral judgement of right and wrong, good and bad as the responsible party, the machine. HELLO, HELLO Has Anyone Ever Heard of Human Drivers Behind The Wheel of The Car? Some of Us Don't Need Hazardous Condition For A Life and Death Situation. Well Has Anyone Heard of ROAD RAGE? Are There Humans That Just Love To Tailgate Other Humans? Are There Humans That If You Kiss Them Off In Traffic They will violate all kinds of laws chasing someone down the road ways. Has anyone ever heard of human drivers drunk behind the wheel of a car?It Amazes me that humans are SUCH A HUGE RISK WHEN IT COMES TO DRIVING AND WE'VE GOT THE NEVER TO SAY I WONDER IF SELF DRIVING CARS WILL PROVE TO BE BETTER DRIVERS-WELL COMPARED TO MANY OF US THEY SURE AS HECK CAN'T BE ANY WORSE!
Platform: youtube
Topic: AI Harm Incident
Posted: 2017-03-07T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
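A coded record can be sanity-checked against the category sets observed in this dump. The allowed values below are only those that appear in the raw responses on this page; the full codebook may define additional categories, and the `validate_coding` helper is an illustrative sketch, not part of the tool:

```python
# Category sets per coding dimension, as observed in this dump
# (assumption: the real codebook may include more values).
DIMENSIONS = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability", "unclear", "industry_self", "regulate"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_coding(coding):
    """Return the dimensions whose value is missing or not a known category."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if coding.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
example = {"responsibility": "none", "reasoning": "deontological",
           "policy": "none", "emotion": "outrage"}
print(validate_coding(example))  # []
```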
Raw LLM Response
```json
[
{"id":"ytc_UggPlXqhTyqn-HgCoAEC","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgijP7n1AYDAFHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjSelYS_yNxMXgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughv4M1zM_ZhFHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UggWc282B73l5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugir1uoAgHGQ63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghiAb5OOQ50H3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjB5UYNyemZAngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
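The batch response above is a JSON array with one record per coded comment, so the comment-ID lookup can be sketched as parsing the array and keying each record by its `id` field. The two records below are copied from the response; `index_by_comment_id` is an illustrative helper, not the tool's own implementation:

```python
import json

# Two records copied verbatim from the raw batch response above.
raw_response = '''
[
 {"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none",
  "reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

def index_by_comment_id(response_text):
    """Parse the model's JSON array and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UghtfnAXloUXangCoAEC"]["emotion"])  # outrage
```

This mirrors the lookup-by-comment-ID view: the dictionary maps each ID to its four coding dimensions.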