Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "7:30, you can easily ask a chatbot to describe Caravaggio's style, then go to a …" (`ytc_Ugy0rQrpA…`)
- "History will soon view author George Orwell as a prophet. His dystopian novel "…" (`ytc_UgyfKe1yQ…`)
- "I commented it on where i got the information... i thought this was gonna happen…" (`ytc_UgykmGZvk…`)
- "I bet there is someone that is actually typing into a program remotely and it's …" (`ytc_UgyDfnfE3…`)
- "If every worker loses their job to a robot or AI, who will be left to be able to…" (`ytc_UgwPPIH8V…`)
- "Very impressive overview. I think the message I get is that logic rules? With a …" (`ytc_Ugybd0ERn…`)
- "Capable of human emotions? Even Data in a Universe far away is still learning, d…" (`ytc_UgwkKCaXa…`)
- "My major concern is the merging of organoids with humanoids and computers and wh…" (`ytc_UgxBYNr4r…`)
Comment
Ryan as always perfect! Thank you for talking about this problem. I wonder, if some day we will judge programmers behind the autopilots for their mistakes and human lives. Or company managers, NOT with money, but with jail. There is always a MAN behind every computer and there is always a MAN making decision. Take responsibility. And don't call it autopilot, giving the TOY to the dumb people and then killing some innocent and sending to jail your poor customers....
And we didn't even started to talk about future AI problems, when AI will decide who will die in the accident, and who will survive.... according to algorithms.... We will need strict rules for this, or closed roads for AI (for trucks, buses etc.) so people can AVOID them on the road.
sorry for bad english.
Source: youtube · Dataset: AI Harm Incident · Posted: 2022-09-03T19:2… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwpnoCLAao9Wg8qJat4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx6jLbUlRnvCYxwAFd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwYcy5jEGjYbZ4EIE54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwKovd3cuzR-B4BXqx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJIWeXj94MSc_NAqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw-VeJiSSBXNnT-LnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxNf1hwXOiok8_s_a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy4RsqwjHZjIjdr0UN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyj7J2UQOZfQSmnnNN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx3xHhheB1MwNJgIYh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
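A batch response like the one above can be parsed into per-comment codings with a few lines of Python. Below is a minimal sketch: the `ALLOWED` sets are inferred from the values visible in this page's table and JSON, and the real codebook may define more categories than shown here. The function name and the drop-on-unknown-value policy are illustrative choices, not the tool's actual implementation.

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the actual codebook may contain additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "company", "government", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response (JSON array of records) into
    {comment_id: coding}, dropping records with unknown dimension values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an id
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(parse_llm_response(raw))
# {'ytc_x': {'responsibility': 'developer', 'reasoning': 'deontological',
#            'policy': 'liability', 'emotion': 'outrage'}}
```

Validating against a closed set of labels catches the common failure mode where the model invents a category outside the codebook; a stricter variant could log such records for manual review instead of silently dropping them.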