Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> A Tesla hit piece. Other car manufacturers have this problem. Also Tesla Autopilot requires supervision. Its no different to someone crashing a car while using cruise control. Also lidar based systems dont guarantee no crashes. There has been many instances of lidar based self driving vehicles crashing. WSJ fails to provide evidence that Lidar based systems are safer. Also at the end of the day it is user error. The Tesla FSD system has several safe guards to notify the driver to pay attention to the road and keep the hands on the wheel.

Platform: youtube · Topic: AI Harm Incident · Timestamp: 2024-12-16T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5u5HtLY08NTVVfgt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMaurJKyDiIbNascp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQtIE6uIkffGcwFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-0uxc3Wu-BwH7Fed4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzUe291MA0JNbhSmqx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyk1BNOnsfe05bx2dp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzzlB6E3CAoi4migsJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyqkZj5-iP5QoGVEK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwb12XEGOL00GaWhCR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxCAXJSWP6FM0ttC294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
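The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response could be indexed for the per-ID lookup described at the top of this page (the `lookup` helper and the two-row sample payload are illustrative assumptions, not part of the actual tool):

```python
import json

# Hypothetical two-row excerpt mirroring the raw response format above;
# field names (id, responsibility, reasoning, policy, emotion) match it.
raw_response = """[
  {"id": "ytc_Ugw5u5HtLY08NTVVfgt4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMaurJKyDiIbNascp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index the parsed rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgyMaurJKyDiIbNascp4AaABAg")["responsibility"])  # distributed
```

In practice the full array (ten rows in the response shown here) would be parsed the same way, so any comment surfaced in the viewer can be resolved to its coded dimensions by ID.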