Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
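For scripted lookups, here is a minimal sketch of the same operation, assuming the coded responses have been exported as a JSON Lines file; the filename `raw_llm_responses.jsonl` and the export format are assumptions, not part of this tool:

```python
import json

def load_responses(path: str) -> dict[str, dict]:
    """Index coded-comment records by comment ID.

    Assumes one JSON object per line, each carrying an "id" field
    such as the ytc_/ytr_/rdc_ IDs shown on this page (format assumed).
    """
    index: dict[str, dict] = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            index[record["id"]] = record
    return index

# Hypothetical usage: look up one coded comment by its full ID.
responses = load_responses("raw_llm_responses.jsonl")
print(responses["ytr_UgyKib66b8s5YkeJGnR4AaABAg.A4Cnn1DJ6g4A4cWbquRXTr"])
```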
Random samples
- "I *highly* recommend playing through (or watching a playthrough) of Alan Wake 2'…" (ytc_UgzDe6CBO…)
- "@derrekingledue3138 if you cant open you eye .. i dont know with version you sa…" (ytr_UgxypvE1i…)
- "Why does no one mention that a key reason they need all of those centers is to c…" (ytc_UgwB9XmKf…)
- "All BS. If AI cause unemployment then who will buy products. People will not hav…" (ytc_UgzJKzTqQ…)
- "You’re gonna get downvoted here but it’s the truth at my big tech too (not G). I…" (rdc_ohsn8tx)
- "The fucked up thing is that this has happened before to graduates in 2008. Ask m…" (rdc_gkrfxqm)
- "Lots of negativity in the comments. Most jobs are bullshit jobs anyway, the worl…" (ytc_UgxhhqMR-…)
- "Sure, let the AI take over all the jobs. In exchange, every product and service …" (ytc_UgzD6iwJ-…)
Comment
This is the same argument that keeps coming up against self-driving cars. The reliability of these systems has skyrocketed over the last two years, with the scaling law showing no sign of slowing down. That means that either the next frontier model or the one after that should be able to outperform humans on reliability as well. At that point it actually becomes morally problematic not to have human experts checked by AI.
I do agree that the systems as they are right now require oversight. But given the rate of progress, I think saying "never" is outdated and shows a fear of change more than anything.
youtube · AI Harm Incident · 2024-06-02T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
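For downstream analysis it can help to validate each coded row against the codebook. Below is a sketch using only the values that appear in this page's sample; the actual codebook may define additional values, so the sets here are assumptions:

```python
from dataclasses import dataclass

# Value sets observed in this sample only; the full codebook may be larger.
CODEBOOK = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation"},
}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Fail loudly on any value outside the (assumed) codebook.
        for dimension, allowed in CODEBOOK.items():
            value = getattr(self, dimension)
            if value not in allowed:
                raise ValueError(f"{dimension}={value!r} not in codebook")
```

A record from the raw response below can then be instantiated as `CodedComment(**record)`, which raises if the model drifted outside the expected value sets.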
Raw LLM Response
[
{"id":"ytr_UgyKib66b8s5YkeJGnR4AaABAg.A4Cnn1DJ6g4A4cWbquRXTr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA5uoYd_7TjJ","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA6ZHFOwhYwr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwn1haP5cFdZs_AVdN4AaABAg.A4AnJcNnh1cA4CDMBnl1P-","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwn1haP5cFdZs_AVdN4AaABAg.A4AnJcNnh1c4CIFXhmBh3","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyrXvIHrsU1GKX3MLp4AaABAg.A48XmVFEoKtA7Mx2e42YYt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyXpXXhU6y1SMq-8pt4AaABAg.A48VmxuDnpJA48WdkgKczf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugydt8PlJDYYAwWn-Ch4AaABAg.A48HSbRsNp1A48UXZ0LsU9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugydt8PlJDYYAwWn-Ch4AaABAg.A48HSbRsNp1A4CFrEdOApF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy778tA6QSF-ISWArl4AaABAg.A482dS1MC8UA48Y2pekjVg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
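Since the model returns one JSON array per batch, parsing is the obvious failure point. Here is a sketch of a defensive parse step, assuming the raw text is exactly an array like the one above; the function name is illustrative:

```python
import json

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw_text: str) -> list[dict]:
    """Parse one raw LLM batch response and check its shape.

    Rejects anything that is not a JSON array of records with exactly
    the five expected keys, so malformed model output fails loudly
    instead of silently producing bad codes.
    """
    records = json.loads(raw_text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for record in records:
        if not isinstance(record, dict) or set(record) != EXPECTED_KEYS:
            raise ValueError(f"malformed record: {record!r}")
    return records
```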