Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It’s almost like cars are so inherently unsafe that they can’t be made to play nice with other modes of transportation, and the only logical route is to incentivize the other modes of transportation by designing our cities to disincentivize taking your death machine everywhere you go.
(At least until self driving is so good that it’s basically something you’d see in a utopian sci-fi film)
Platform: youtube · Video: AI Harm Incident · Posted: 2022-09-03T19:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwpnoCLAao9Wg8qJat4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6jLbUlRnvCYxwAFd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwYcy5jEGjYbZ4EIE54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwKovd3cuzR-B4BXqx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJIWeXj94MSc_NAqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw-VeJiSSBXNnT-LnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNf1hwXOiok8_s_a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4RsqwjHZjIjdr0UN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyj7J2UQOZfQSmnnNN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3xHhheB1MwNJgIYh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
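The raw response is a JSON array of coding records, one per comment ID, with one value per dimension. A minimal sketch of how such output might be parsed and validated — the allowed value sets are inferred from the records shown here, and the function name is hypothetical, not part of the tool:

```python
import json

# Allowed values per coding dimension, inferred from the responses above
# (an assumption, not an exhaustive codebook).
ALLOWED = {
    "responsibility": {"developer", "user", "company", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that isn't a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
print(len(validate_codings(raw)))  # → 1
```

Checking values against a fixed set like this catches the common failure mode where the model invents an off-schema label (e.g. "anger" instead of "outrage") rather than silently storing it.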