Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated comment previews with their IDs):
- ytr_Ugz_R-emB…: "Yeah but that’s going to end the same way factories and machines versus actual h…"
- ytc_Ugyh8ALMF…: "Of all the interviews with AI experts I've seen on this channel, this was the mo…"
- ytc_UgzKrRoxZ…: "On the Gartner Cycle; I think it’s still on the downward to the Trough. Everyon…"
- ytc_UgwyXClla…: "We had to do facial recognition for unemployment, our taxes if they were not pro…"
- ytc_UgxOZsxRX…: "They'll only give out 500 a month per person the ai is a stupid idea why would y…"
- ytc_Ugy99uqM0…: "So... people don't comment on why he was on that path ? ChatGPT in that answer d…"
- ytr_UgyNIaDCU…: "@Tijaxtolan IMO, that's wherew this opinion belongs - just because you can't see…"
- ytc_UgxDOtSaV…: "I completely agree. I personally try my hardest not to use ai for anything becau…"
Comment (quoted verbatim):

> Love how all the evil people of the world are using Ai as a scapegoat when every aspect of the Ai was intentionally built by a human. There is human(s) responsible for all parts of it, everything down to its behaviors intentional or not had intentional code that led to that result.
> If we took care of evil people, Ai wouldnt be a problem. Ai is a tool but, unlike most tools its an automated tool. Once an evil person completes it and sends it out, we have to defeat it. But, if you defeat Ai and leave the evil people, it will come back.
Source: youtube · AI Harm Incident · 2025-10-03T21:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
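Before analysis, each coded row can be checked against the codebook. A minimal validation sketch; the allowed values here are only those observed in the raw responses below, and the actual codebook may define more categories:

```python
# Allowed values per dimension. Assumption: derived from values observed in
# this sample of raw responses, not necessarily the complete codebook.
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems; an empty list means the row is valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        if row.get(dim) not in allowed:
            problems.append(f"{dim}={row.get(dim)!r} not in codebook")
    return problems

# The coding result shown in the table above.
row = {"responsibility": "developer", "reasoning": "virtue",
       "policy": "liability", "emotion": "outrage"}
print(validate(row))  # → []
```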
Raw LLM Response
```json
[
{"id":"ytc_UgzZTNZMiBdcSg5S1vp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyE_ai7gWd08gYO7wp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-qGMF3eI_zdHiPj94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxD-M2w4WduMuZC6cl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrBmAPVp8Jo8MzXBl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwSFENnZ0mJu_5UBvB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGIfU_-logwXTNGt54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzWweFirXQn3tHAjp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxhn0jB68hJDcQ9tHZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz0PeYRcB5qKKvxIaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
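A response like the one above can be parsed and indexed by comment ID, which is what the lookup feature does. A minimal sketch, reproducing two rows from the response for illustration:

```python
import json

# Two rows copied from the raw model output above; the real response is a
# longer JSON array in the same shape.
raw_response = '''[
{"id":"ytc_Ugz0PeYRcB5qKKvxIaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyE_ai7gWd08gYO7wp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the coded rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugz0PeYRcB5qKKvxIaF4AaABAg"]
print(code["responsibility"], code["emotion"])  # → developer outrage
```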