Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I do have older devices that will not take AI so I have computing peace. Not onl…" (`ytc_Ugzl1ZmRr…`)
- "Thanks to all the generations who came before us who left us with so many opport…" (`ytc_Ugz_QBwdG…`)
- "Luddites when someone posts AI art and acknowledges it's made with AI: \"NO BAD D…" (`ytc_UgxQDaWG4…`)
- "@hanako-kun2039it isn't. YOU do your actual research. AI doesn't use any part o…" (`ytr_Ugy0XbwJh…`)
- "I'm amazed that a man this clever leans into the idea that AI, in any form, EXPE…" (`ytc_UgxCn_NH8…`)
- "If AI takes over a large number of jobs, say 80 %, then it's the end of Capitali…" (`ytc_UgyCiXBnK…`)
- "You guys remember 3D TVs? That's what this phase will be remembered as. Accordin…" (`ytc_UgwQ6XyFg…`)
- "i was dreaming to become digital artists but when i heard about ai art i lose ho…" (`ytc_UgxJ1tLOt…`)
Comment
Legal A.I. makes no sense. You'd have to program it to be immoral and train it how to make unscrupulous arguments such as, "The Constitution doesn't apply to the President." Otherwise, how do you expect to get the obviously guilty person suffering from Affluenza off on their murder charge?
Tell me, do you want A.I. controlling our nuclear power plants? What if they malfunction, because we all know computers never fail, get hacked, or make mistakes. What if only two real workers are supervising machines at a Nuclear Power Plant......who they have enough man power to prevent a meltdown if one was on the verge of happening and they needed to manually override?
Self-driving cars currently have a higher accident rate per mile than human-driven vehicles, with 9.1 accidents per million miles compared to 4.1 for human-driven vehicles. So, you're rear-ended and get out to exchange insurance information only to find the car that hit you is driverless. OK, now what?
youtube · AI Harm Incident · 2025-05-17T18:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz5aUozeWMwc_pky_F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy9XbYChKH_RijRLzJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyXsbLrn3S-Y64VWUJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzr_idjCqqYfP6hcWd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzbCtVDkCfy2RTFRLl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzkZRuFggtb7mFnawx4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwWSNQVpzEw0QCv3AB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzKzgCbygVLpQLMUnV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw2l3jinQiaO8l0HFF4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwPB7OE26Yw6xDPstV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```