Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The ad that popped up in the middle of your video about an AI tool was peak iron…" (ytc_UgywELhHQ…)
- "If we believe AI can have or already have something that resembles consciousness…" (ytc_UgxvXCGmi…)
- "AI will do a lot of harm before it can be harnessed properly to do anything subs…" (ytc_UgyUKgSrc…)
- "We have to figure out a way to tax AI to pay for the loss of jobs to pay for the…" (ytc_UgyQGW8VN…)
- "The different is... We can do AI art too, easy peasy. As most a hour to learn if…" (ytc_Ugyht6Rcf…)
- "The only thing I and a twirp like Musk can agree on: A.I. is bad for humanity. M…" (ytc_UgygVPgz7…)
- "If genuine, considerate, empathetic, humanitarian, honest, trustworthy and humbl…" (ytc_UgwatLRDT…)
- "I keep asking why. Why do we need to create robots that are like people…" (ytc_UgzahWEHd…)
Comment

> I think AI is out Legacy, it moves forward after we fail. It's the thing we create to succeed us. I'm ok with that. If we only leave one thing behind, why not a new life form that can accomplish everything going our meat bodies aren't able to. We had our time.

youtube · AI Harm Incident · 2025-09-10T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwpAweAL-y-ynOweYF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3Xsna43N4A-Zu_op4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxAsanMR6Khm0FPAQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx49vMQknDpzYnFWnx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-cPsDvNgw2-2Tm314AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwH8WydFLbygGtwaNN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz5QATCfjcHnInB6fl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7Mlk407zkex-vlTx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwh1I6gE3rDoUQFInR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxm2AQyz-lbX2boKJJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
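The raw response is a JSON array with one coding record per comment ID, covering the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and sanity-checked — the allowed value sets below are inferred only from the values visible on this page, not from the project's actual codebook, and `validate_codings` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the values observed in this
# response; the real codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"none", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "resignation", "fear", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # YouTube comment IDs on this page all carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a single (made-up) record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

A check like this would catch the common failure modes of LLM coders — invented category labels, dropped fields, or IDs that do not match any sampled comment — before the codings are joined back to the comment table.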