Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We need laws that force self driving companies to forfeit their business to accident victims at the first injury accident. This would insure better safety rather than the company weighing safety costs against damage liability. Anything less will produce accident victims as companies exercise cost avoidance when safety costs more. This is why Tesla has had so many self driving accidents.
Source: youtube · AI Jobs · 2025-09-11T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugxu6OJgVQ9MRi95P3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwtOn2TbJf7pUZXJUB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFPhfHLRLhHFVFS8d4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx4AfEiuw9tR3-cADp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwgCtgFyHv_K_HvJN14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxJIkyjI4rUpORif9x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpLmLhwHL5Mcf6PAR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7Km5WTnVAsf9s0jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0noKyPsRMb_WgDf54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
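A response like the one above can be turned into per-comment codes programmatically. The sketch below (Python; the function name and the value vocabularies are assumptions inferred from this single sample, not a documented codebook) parses the JSON array and validates each row against the four dimensions shown in the Coding Result table:

```python
import json

# Value vocabularies observed in the sample response above; the real
# codebook may permit additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting rows with missing dimensions or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}  # KeyError if a dimension is absent
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        coded[cid] = codes
    return coded
```

With the response above, `parse_coding_response(raw)["ytc_UgxJIkyjI4rUpORif9x4AaABAg"]` would yield the same deontological/liability/outrage codes shown in the Coding Result table.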