Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I was glad that I could ask AI about the congressman and what his views were. I …
ytc_UgzbdLrJz…
The "AI jaugernaut ".....is here on our door step. A small number of phycopaths …
ytc_Ugxl8UYif…
I have conflicted feelings because on one hand I totally agree, but on the other…
ytc_Ugx-3930h…
@eindrake8418but as of right now, the ai is not the burglar, but the lockpick.…
ytr_UgyFN9_io…
I mean, as an artist, i use AI to get a reference of something that doesn’t exis…
ytc_UgxSCoKhP…
What's worth more. Auto insurance revenue or revenue from automated cars in an …
rdc_dmokuny
Current Ai only predicts how the answer should look based on it's training data.…
ytc_UgycTA1mm…
Delightfully devilish, Seymour... eh, I mean, LavenderTowne!
But seriously, tha…
ytc_Ugy0T0zUj…
Comment
Self-driving cars should be functioning off of software that allows the cars to inter communicate. The same situation happens, if all the cars are self-driving then the cars can move away from each other in unison. With the motorcyclist there is a serious problem, which is that if we find ourselves in a society with self driving cars then suddenly motorcycles and non self driving cars are like smokers in public areas, their decision is no longer theirs alone, they are putting everyone else at risk.
| Field | Value |
|---|---|
| Platform | youtube |
| Incident | AI Harm Incident |
| Posted at | 2016-07-09T09:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugi0oNCeHP92AHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjQqqQ8pvsVC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UggUueruHXVu1ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgijXoYPKjY_1HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UghGnVVF0cNqSHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Uggfmpuz0HRxeHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UggRgo7ALDJJCHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghT-lpLHZCE-HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UghO2h5e1TxTNXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UghNM3jgeKUHEngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
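A minimal sketch of how a raw batch response like the one above could be indexed for the "Look up by comment ID" view. It assumes only what the raw response shows: a JSON array of objects, each with an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The `index_codes` helper and the two sample IDs reused here are taken from the response above; the function name itself is hypothetical, not part of the tool.

```python
import json

# Raw batch response, as returned by the model: a JSON array of
# per-comment codes (two rows copied from the response shown above).
RAW_RESPONSE = """[
 {"id":"ytc_UggUueruHXVu1ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghGnVVF0cNqSHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]"""

def index_codes(raw: str) -> dict:
    """Map comment ID -> coding dict, dropping the redundant 'id' key,
    so one comment's coding can be looked up like the table above."""
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in json.loads(raw)
    }

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UggUueruHXVu1ngCoAEC"]["responsibility"])  # ai_itself
print(codes["ytc_UghGnVVF0cNqSHgCoAEC"]["emotion"])         # outrage
```

Keying the lookup table by comment ID makes each coded row retrievable in constant time, which matches how the inspection view resolves a pasted comment ID to a single coding result.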