Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyvMTiDa…: "2:37 Additionally, if it wasn’t about the method, people wouldn’t be so impresse…"
- ytc_UgytcUbyX…: "If he stops posting we all know why / And if I go missing the ai had taken me I’m…"
- ytc_UgywnjGaz…: "Incorrect on a few levels. Firstly, no AI has the entirety of whatever has been …"
- ytr_Ugz5MkIg6…: "exactly,and ai bros try so hard to pretend they aren't thieves,so not only are t…"
- ytc_UgxylFAa5…: "I used the dream up AI mostly out of curiosity. It was just kinda weird like yo…"
- ytc_Ugz0sNq6a…: "I do not know how people screw up with si that much / Whenever i used chatgpt in …"
- ytr_UgwsFIISZ…: "@MrGolibroda inspiration means you have tried to copy someone's work and tried t…"
- ytc_Ugx0yH380…: "The people who believe in this crap and push this intelligence transfer have no …"
Comment
The way I see it, developing self-driving A.I. now will probably get some people killed that wouldn't have died if the A.I. wasn't driving, but by the end of it we get an A.I. that'll save countless more lives in the future thanks to being a much better driver.
If we try and save those lives now by preventing what it takes to develop a self-driving A.I, then we're dooming all those people in the future to die because we didn't invest in the option that'll be safer when THEY need it most.
And are the lives of those future people less valuable just because we haven't met them yet?
youtube · AI Harm Incident · 2022-10-31T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzGo3z6MP79gJYTQn94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwulkImfvFbxqcvrRR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugycg_xd886XfDIBGON4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwCsgo7htB91K5BYNF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6FLpb1AB41IVfDXR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyrn4nXCH49-NXpa6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw16LvDUAm3Dpsi14l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy11gDgZFw7J1pnXEl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugydjn5VCqbAWHUExr14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzxXEYGHJQ_d-uqqvN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
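The "look up by comment ID" view can be sketched in a few lines: parse the model's raw response as a JSON array and index the coded rows by their `id` field. This is a minimal illustration of that lookup, not the tool's actual implementation; the two rows below are excerpted from the raw response shown above (the first one matches the Coding Result table for this comment).

```python
import json

# Excerpt of a raw LLM batch response: a JSON array of per-comment codes.
raw_response = """
[
  {"id": "ytc_Ugyrn4nXCH49-NXpa6t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzGo3z6MP79gJYTQn94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse one raw model response and index its coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
row = codes["ytc_Ugyrn4nXCH49-NXpa6t4AaABAg"]
print(row["responsibility"], row["emotion"])  # → ai_itself resignation
```

In practice the raw output would also need validation (unknown IDs, malformed JSON, codes outside the schema) before being written back to the coding table.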