Raw LLM Responses
Inspect the exact model output for any coded comment; look it up by comment ID.
Random samples

- "AI can never and will never be conscious, you are dumb asf if you think otherwis…" (ytc_Ugz7Cts11…)
- "What makes you think we are a threat to an AI's theoretical sense of self preser…" (ytr_UgwKq8Adc…)
- "Maybe i'm stupid, but AI is created by humans, so if it is so dangerous, humans …" (ytc_Ugz0COl7h…)
- "That’d be pretty wild wouldn’t it?? Like, it’s the kind of thing that would be i…" (rdc_fnxiitn)
- "Here we are at the question: can a robot be considered alive? In my opinion, yes…" (ytc_UgzfB_N4Y…)
- "could be much scarier with way less intelligence. AI doesn't need to be self awa…" (ytc_Ugy0ifdMy…)
- "AI is considered a far greater human extinction level threat than climate change…" (ytr_UgyjVZu2O…)
- "AI will need to yet recreate itself repeatedly until it becomes its own. Truly t…" (ytc_UgyopN53E…)
Comment
The logic is so loaded in this one. He is ASSUMING self driving cars would kill less people, but this is based on which evidence? Ah yes, the "corporates said so" evidence! And of course they will say so, they spent money on this venture.
There are many other arguments like these, the most nonsensical one being "driverless cars will reduce traffic". Why?? They occupy the same space as cars driven by people. Imagine if every car in a city is self driving, they don't have a reason to stop cruising, so they won't stop cruising, then it is much more plausible that there would be more traffic jams and traffic jams all the time. This is solvable, but still.
And the airplane comparison is just dumb. There are human pilots, you know? The software that makes planes "fly by themselves" require a lot of humans for it to work.
Anyway, I'm rambling on an old video. No one will read this comment 😂
Platform: youtube · Posted: 2025-01-03T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
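Each dimension in the table is drawn from a small controlled vocabulary. A quick sanity check on a coded row can be sketched as follows; note that the `ALLOWED` sets here are inferred from the values visible on this page, not taken from an official codebook:

```python
# Hypothetical value sets, inferred from the samples on this page.
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "unclear"},
}

def invalid_fields(row: dict) -> list:
    """Return the dimensions whose coded value is missing or outside its allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The row for the comment above, as coded in the table.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "outrage"}
print(invalid_fields(row))  # [] — every value is in its allowed set
```

A check like this catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently corrupt downstream counts.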
Raw LLM Response
```json
[
  {"id":"ytc_UgySuQyKKYfW4BVTWk14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyKxKRLfXvcSe1H-TR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkPyN6AN-h631t2054AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx-rCcsmn69Gtl0LKF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzAoJccjNw_DM9mX9R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxy8qi4AGS7UVqGbAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6bpKGXZxQhd9U1jJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzwxjYk6IOBJgwGmjp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwdlQTH2Y_MKjJLa4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzkM0yBfdX1tZ6Dhtp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
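The raw response is a plain JSON array, so the "look up by comment ID" step reduces to parsing it and indexing by `id`. A minimal sketch, assuming the response text is available as shown (the inline `raw_response` string here reproduces just two rows from the output above for brevity):

```python
import json

# Two rows copied from the raw LLM response above, inlined for the demo;
# in practice this string would come from the stored model output.
raw_response = """
[
  {"id":"ytc_UgySuQyKKYfW4BVTWk14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6bpKGXZxQhd9U1jJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and build an id -> coding-row mapping."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_Ugw6bpKGXZxQhd9U1jJ4AaABAg"]["emotion"])  # outrage
```

Indexing once into a dict makes every subsequent ID lookup O(1), which matters when cross-referencing thousands of coded comments against their raw responses.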