Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (truncated previews with comment IDs):

- "Lol, or just have it write your whole case and get disbarred! Haha. That's one o…" (ID: ytr_UgzTRR2VL…)
- "AI makes your customers hate you! We are screaming at the phone - OPERATOR! OPER…" (ID: ytc_UgxcpRvPu…)
- "Because there are a lot of Nazis and racists out there, and they want the AI to …" (ID: rdc_jht4nl0)
- "You can not blame the AI. It doesnt have any feelings and it gives hotline for h…" (ID: ytc_UgyYWV_CK…)
- "AI gets smarter and we are at least some of us getting dumber... because of such…" (ID: ytc_UgzvxcHkj…)
- "It's harder to maintain the north american empire of lies after live streaming G…" (ID: ytc_UgzLswZ7k…)
- "A I is not sentient. That’s stupid satanic propaganda. It’s the doorway to dem…" (ID: ytc_UgxsHX8SI…)
- "The words 'AI' and 'Artist' should never be put together. I suggest 'AI' and 'ma…" (ID: ytc_Ugx9_j0BJ…)
Comment

> On that missile question it got wrong, turns out ChatGPT 3.5 gets it right. That is, if the question is correctly worded. Maybe the AI didn't understand the question from the way it was worded(??).
> Anyway, give ChatGPT 3.5 this: "If 2 missiles are fired directly at each other in level flight, and one missile has a steady speed of 9000 miles per hour, and the other missile has a steady speed of 21000 miles per hour, how far apart are the missiles one minute before they collide?" ---- It gives a well-worded explanation as it solves it correctly !!!!!

Metadata: youtube · AI Governance · 2024-01-07T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzWYBPZ8tykLzg9fsF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzD7oxc3cDIuuh50D94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxUveA6fuMSQVTyuPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyus3WzvrM1oTenk0N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlLjMHPdKKteqAVWJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLEUEqVP_VhrERlUt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9mYyfFgTl1WPwPcF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnS4ds3hpzGoCaT_F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwh-nqgAGFcJaU3lOR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzimA_w3y-rjVINn854AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
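Responses like the one above can be checked before they are stored: each record should carry an `id` plus one label per coding dimension. Below is a minimal validation sketch. The allowed label sets are inferred from the sample records and the Coding Result table shown here, not from the actual codebook, and `validate_coding` is a hypothetical helper name.

```python
import json

# Label vocabularies inferred from the sample output above;
# the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "user", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs an id and one known label per dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record response: the second has an unknown label.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
    '"policy":"ban","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_coding(raw)))  # → 1 (the malformed record is dropped)
```

Dropping malformed records rather than raising keeps a batch run going when the model occasionally emits an off-vocabulary label; a stricter pipeline could log or re-prompt instead.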