Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> If we are going into the future at least have logical process for these things before you unveil the actual products. Because I think this is something you allready went through before you started this self driving cars. And he is right your the operator and there has to be someone who has the rights to stop the car from going in circles. Like test after test of this example of errors. Trial and errors before it sent off to be use. Because everyone thinks they have it under control.
Platform: youtube · Scenario: AI Harm Incident · Posted: 2025-01-13T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw4rFK8IUrLf_3BH814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOuoFxb4otbOtGwFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxg0oRMp1IMYGcq3WN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzJm9d2eEcVsxo5lvh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYAWszfNv_Ghvz-ZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzA6vasg--n7_XBjt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyV8J7jRzkTAb7mvtJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7C_si6lLwYP4a-B54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGv9Uu7pXTGzqEpQ14AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyOksWG-2y9v3nZCCx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]