Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If someone has to use a robot to speak to me, it means they’d rather pay a robot…" (rdc_n710t56)
- "Artificial Intelligence (AI) is indirectly and detrimentally impacting polar bea…" (ytc_Ugzr7F8d_…)
- "AI race would end with a big distaster just like the nuclear race did. Some coun…" (ytc_UgxrupSM3…)
- "I read somewhere that people who are CEO, often are void of any empathy or co…" (ytc_Ugw17JXqa…)
- "Thank you, hate when it’s called art they’re AI images derived from chopping up …" (ytr_Ugy299LMb…)
- "@jules9669 Yessir. I am a maintenance tech. My job is safer than Doctors' jobs …" (ytr_Ugz6Y7VxL…)
- "Make o3 and opus , rogue ai learn to be humane, by learning human values....emot…" (ytc_UgxwkSi4q…)
- "The thing is that AI cannot be neutral, because the data isn’t and the makers ar…" (ytc_Ugx3bXljd…)
Comment
As a software developer, there is always one more bug. It will approach zero, but will never hit zero. There is a likelihood of introducing one or more bugs that vary in criticality by altering code to resolve a specific scenario. For this reason there does exist a scenario where self driving cars can become more dangerous during an update.
youtube
2023-08-03T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-hqylfhc7ZoaQqBl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygTOfSZNd_29mNsu14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwMTqi9TLjRTB46STR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwmLab3KIQRIiHmrSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyA5GQ7YfEN8Qr1CPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzn4_ruCYg4FGvuOgF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwywOOVA1I-n34OT0F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz3kRRafQDFP9Pj_dh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz_KsJrWinrz9-BbmB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxd9_TEqH52OzA5kkV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
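The raw model output above is a JSON array of per-comment codings, each carrying the same four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and searched by comment ID (the `lookup_coding` helper and the shortened `raw_response` are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array of
# per-comment codings. The entry shown is the third one from the
# sample response above.
raw_response = """
[
  {"id": "ytc_UgwMTqi9TLjRTB46STR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgwMTqi9TLjRTB46STR4AaABAg")
print(coding["emotion"])  # -> outrage
```

Because every entry shares the same flat schema, the same loop works unchanged on the full ten-entry response.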