Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
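As a rough sketch of what the lookup does, the snippet below builds an index from comment ID to coded record. The file name raw_llm_responses.jsonl, the JSONL layout, and the helper function are assumptions for illustration; only the field names (id, responsibility, reasoning, policy, emotion) come from the raw responses shown on this page.

```python
import json

def load_coded_records(path: str) -> dict:
    """Index coded comments by ID from a JSON-lines file.

    Assumption: one JSON object per line with the fields seen in the
    raw responses on this page (id, responsibility, reasoning, policy,
    emotion). The real store behind this page may be organised differently.
    """
    records = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                record = json.loads(line)
                records[record["id"]] = record
    return records

# Hypothetical usage: fetch the record for one comment ID.
coded = load_coded_records("raw_llm_responses.jsonl")  # assumed file name
print(coded.get("ytc_UghO2h5e1TxTNXgCoAEC"))
```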
Random samples — click to inspect
| Comment preview | Comment ID |
|---|---|
| If your job is the first few being replaced by ai then your job didnt provide hu… | ytc_UgwyZYAu3… |
| Robot "I don't have time for modesty, I want to create the singularity tomorrow"… | ytc_Ugy0QMiEE… |
| I don’t know about there being no more jobs or power prices increasing. Haven’t … | ytc_Ugwxz1Xos… |
| t least try to make your unique style art not to be better than it people cant u… | ytc_UgyYCtfyz… |
| I wish they would just enforce the laws on the books. Almost every jurisdiction … | rdc_lvac9s1 |
| Humans: our greatest goal is too survive by any means / The AI humans created: our… | ytc_UgxT0ZLF2… |
| @cosmicreaverkassadin5948 "A banana taped to a wall? Really? Thats supposed to b… | ytr_Ugz_V8yUx… |
| Ahhhh, well not really. I got a good amount of knowledge about llms (ais) and Op… | ytc_UgwiRW-to… |
Comment
Mmmhh, since the vehicles will probably know everything about the occupants of everyone in the vehicles, it might "rate" the occupants lives on how much they make, how many "likes" or friends they have on facebook, how important they are, if they are critical of the government, if they have no children, etc. and "off" the less important person. Maybe even based on age, race, how much they make, how new they are to the country, etc. Facebook can already tell if you are gay, white or black, or are a republican -- Just through it's privacy-invading algorithms.
I'm sure Google is even worse. Basically, these corporations will decide through covert algorithms who lives.
youtube · AI Harm Incident · 2016-06-22T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
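The dimensions in this table correspond to the fields in the raw response below. A minimal Python sketch of that record structure could look like this; the dataclass itself is an illustrative assumption the project may not use, while the dimension names and example values are taken from this page.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions shown in the table above."""
    comment_id: str      # e.g. "ytc_UghO2h5e1TxTNXgCoAEC"
    responsibility: str  # e.g. "ai_itself", "developer", "company", "user", "none"
    reasoning: str       # e.g. "deontological", "consequentialist"
    policy: str          # e.g. "regulate", "none"
    emotion: str         # e.g. "fear", "outrage", "approval", "indifference", "mixed"
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"
```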
Raw LLM Response
[{"id":"ytc_Ugi0oNCeHP92AHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQqqQ8pvsVC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UggUueruHXVu1ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgijXoYPKjY_1HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghGnVVF0cNqSHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfmpuz0HRxeHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggRgo7ALDJJCHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghT-lpLHZCE-HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UghO2h5e1TxTNXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghNM3jgeKUHEngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]