Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "When I think of a futuristic Utopia that AI could help us get to would be simila…" (ytc_UgxA9t0nf…)
- "The key to Ai safety is to begin training with the principle of non harm. Sanct…" (ytc_UgwgFaBZD…)
- "Idk, I think the guy is right on the border between delusional and having a poin…" (ytc_Ugw0pmE2l…)
- "Why can't we turn it off? I guess no one is going to adhere to a moritorium in A…" (ytc_UgyyqDp3_…)
- "Yeah but it feels so good to tear someone down than to do that hard work to lift…" (rdc_deuis28)
- "Yup, I’ve been drawing for about 7-8 years, finally started to get comfortable p…" (ytc_UgxpWfV9g…)
- "What the F…k, is IA robots now being programmed with the memory and physical rea…" (ytc_UgyU5V3cI…)
- "I called this months ago and now it is happening and said it propagate but start…" (ytc_Ugxe6srPU…)
Comment
And yet the Australian authorities allow it to be used. Oh the irony! 60 minutes digs up a case of Autopilot (not Full Self Driving!) from 2019 and stacks it against FSD now, 2025? To set the tone? 6 years later? No statistics quoted. What are the accident rates compared to non self driving cars? How about also reporting on the benefits of full self driving? We are dealing with an evolving technology which is destined to improve. MSM at its best.
Can we please have a 60 minutes on how unsafe the Ford Model T was, because it had no airbags and ABS.
youtube · AI Harm Incident · 2025-10-19T23:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_Ugz0BLs5H935O6Vn_ep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFTB0LsQe9ItjzBWx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBG8TjqdSEYBxUHt54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxoi6aOC-zZ3g_mE9d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyP6JJXFrGmeP0FHYh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwO1iT7fyKwjccuNN94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx2kAi0RT35u5atxA14AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyWKpStpllznFf1Ncd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDvWK8lAmES1H5qR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyexCoJp99nKlvQoDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
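The raw response is a JSON array in which each record carries the comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how the "look up by comment ID" view could work, assuming only that shape (the `index_by_id` helper and the two-record sample below are illustrative, not part of the tool):

```python
import json

# Illustrative two-record excerpt of a raw batch response from the model.
raw = '''[
{"id":"ytc_Ugz0BLs5H935O6Vn_ep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFTB0LsQe9ItjzBWx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions plus the comment ID every record should carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse the batch response and index codings by comment ID,
    silently skipping records that are missing expected keys."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_by_id(raw)
print(codings["ytc_Ugz0BLs5H935O6Vn_ep4AaABAg"]["emotion"])  # outrage
```

Indexing by `id` up front makes each lookup a constant-time dictionary access, which matters when scanning thousands of coded comments.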