Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyXcGe1u…`: "Interesting (as usual in spite of host's question. Sorry, but.... and 10m subs, …"
- `ytr_UgydbbllI…`: "@Squad wipes™ Didn't Will Smith do a robot movie kind of like that ? "I Robot" c…"
- `ytc_UgyUcPJ5E…`: "Since when ChatGPT mistypes or utilize broken English such as, (... i'm honored …"
- `ytr_Ugwk4ukLJ…`: "Except this conversation will not have a significant weight in any AI database. …"
- `ytc_UgxmX9zeM…`: "*I'm talking to Perplexity, now; Quantum Computers, what effect will They have o…"
- `ytc_UgwGEfNia…`: "Yeah the turing test hasnt been considered to be an accurate indicator of "true"…"
- `ytc_Ugy5YD1uk…`: "Yeah? Weird how even without any of the additional sensors a Tesla on Autopilot …"
- `ytc_UgyWgQJV-…`: "give me a break. don't use chat bots AT ALL. problem solved. they are ruining th…"
Comment

> It is sad that you lost a family but the person who is driving is fully aware of Kate Bailey is a full self driving. Before you start auto pilot has a disclosure you have to agree. It tells you you must take over immediately and always be aware of your situation. This is not a full self driving. The person you should be suing is the driver. Not Tesla.

youtube · AI Harm Incident · 2026-02-07T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzG6qV7OsGfYouFiIJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1O6ZvQBizd5oFkPl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwH_5oQPoNoGyzaJSx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyVVMDLsrSANhM_QSZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKHEyBfgBz4kKsAl94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8bDzDiZdaSWUIEex4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwQuL3kY_eyaEMD4FN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKwyjy0Lv9K6Zt9Sx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy900VElDf46i9y99V4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwN32k9hLtOGwy6Ep14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
```
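Raw responses like the one above can be checked against the coding schema before the labels are stored. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the records shown on this page, so the real codebook may permit additional values, and the validator itself is an illustration rather than part of the tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the full codebook may define values not seen here.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(
                    f"record {i} ({rec.get('id', '?')}): "
                    f"{dim}={value!r} not in codebook"
                )
    return problems

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"indifference"}]')
print(validate(raw))  # → [] (no problems)
```

An empty return list means every record parsed and every dimension carried a known label; anything else names the offending record and field, which makes it easy to route bad batches back for re-coding.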