Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@quetevalgavergaaa Laion's database uses art from everywhere. its not good but i…
ytr_UgwZTgwfR…
I've had almost this exact same discussion with chatgpt several times. It always…
ytc_UgwBQrarO…
From this video I understood the only way to stop ai from taking over the world …
ytc_Ugxmm2tK_…
Me: “Once I saw a man get run over but he kept walking in injured “
Person : “…
ytc_Ugy4QjdLW…
they are paying less, there is less jobs, things cost more and there is more com…
rdc_gkr6yyp
@Sparkynerd-24That’s kinda two different points here. The jobs immigrants take a…
ytr_UgxN1_Wjr…
A video Narrated by an AGI character about how AI will take your Job is DIABOLIC…
ytc_UgzJg4sTw…
Please keep them on human input programs and don’t ever give them ai to adapt an…
ytc_UgzpGnAa4…
Comment
Here's my take on this problem. True self driving cars would leave enough room or go at a proper speed so that it always has time to stop by altering one of the two variables.
youtube
AI Harm Incident
2015-12-08T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
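The "look up by comment ID" operation above amounts to parsing the raw batch response and filtering on the `id` field. A minimal sketch in Python, assuming the model returns a well-formed JSON array like the one shown; `lookup_coding` is a hypothetical helper, and `raw_response` is trimmed to two entries from the batch above for brevity:

```python
import json

# Two entries copied from the raw LLM response above (illustrative subset).
raw_response = """
[
  {"id": "ytc_UghSiRcVXA-3FHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggKQCpjXBYZKXgCoAEC", "responsibility": "developer",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the ID is not present in this batch."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UggKQCpjXBYZKXgCoAEC")
print(coding["responsibility"], coding["emotion"])  # developer approval
```

Since each batch is small, a linear scan is fine; for a full corpus one would build a `{id: coding}` index once instead of re-parsing per lookup.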