Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Look, full self driving is not something we should try to achieve except for those who are handicapped. What should be attempted is the car takes over and the event the driver has failed to recognize a life-threatening event. Why create a series of cars with FSD that now remove any skill you might have learned rendering you utterly helpless in the event FSD goes down. A much better approach would be for it to take over when you’re about to crash.
Source: youtube · Posted: 2025-11-26T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzFpLDVB1cpNavKLz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvQTI-CllD4WMUzvJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXv79hkiNPKergW2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYLFpxg7CmAyvAWKV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5eBqpoaD5kber8OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVs97RCpw9RjWhBxx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwz-_EWCT_vfJoZRLt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRsy_sMbzzxrEz61d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkipaSXGRIlwqdg6Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3J-dOjBT46x7JfNB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}]