Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- rdc_nu7g5dt: "The thing is the AI will kill all free advices, free instructions, free articles…"
- ytc_UgzWBKSM0…: "I briefly had chat gpt on my last android phone but hardly used it because I swi…"
- ytc_UgzKqfvOG…: "There's an additional perspective to this emerging conversation that I think mor…"
- rdc_mtbb7kk: "I think I would short most companies if they tried. Eventually there will be iss…"
- ytc_UgwXOWHrE…: "Everybody living in 1st or 2nd world country wants a high paying Ai and recessio…"
- ytc_UgyNSovBU…: "Y'all enjoy dystopia. I'll be moving to north africa when the sh1t hits the fan,…"
- ytc_Ugxn13qOa…: "the biggest takeaway is the eyes and lips. The eyes always look super crisp when…"
- ytc_UgxYLDmuB…: "I watch all of this content and I am so bored about the never ending mystificati…"
Comment

> Some of you argue they shouldn't get money, I will argue that there is no proper disclaimer or training to use the product properly. If you can sue for any product not having proper warnings, then you can sue over this. The problem with this AI race is the NEGLIGENT behavior of many companies. ALL well-meaning engineers who know the flaws of the system would NOT blindly push this on people like is happening. It is corporate greed treating this powerful thing as a toy. It is NOT a toy. It is a tool. And like any tool, a person must be taught how to use it to avoid hurting themselves or others. Look up every instance of when greed caused people to ignore the caring engineer. People died! As is also the case here.

Source: youtube · Topic: AI Harm Incident · Posted: 2025-08-28T04:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
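The four coded dimensions above can be checked against a small schema. A minimal sketch in Python, assuming the allowed value sets are exactly those observed in this page's codes (the real codebook may define more categories):

```python
# Value sets inferred from the codes visible on this page; the actual
# codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"company", "user", "none", "distributed", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def validate_code(code: dict) -> list[str]:
    """Return a list of problems; an empty list means the coded row is valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = code.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result shown in the table above passes validation.
row = {"responsibility": "company", "reasoning": "deontological",
       "policy": "liability", "emotion": "outrage"}
print(validate_code(row))  # → []
```

A check like this is useful because free-form LLM output occasionally drifts outside the codebook, and out-of-vocabulary values should be flagged rather than silently stored.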
Raw LLM Response
```json
[
{"id":"ytc_UgwtNKg5Qm0_GQ6PIbJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyJk9ALl7xh1jfTGYR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEIc7DQc3QUgLCBx14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw2iJ7GZQt2LU0gYaZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgwJYj0AVoN6jNQHdIl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrgcNF3a8rISszLe54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgysMDJzYQJgVNS_eKZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzO3OV2i-1ng10HTOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgziuLnqlspkdbmbBzx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwKmmN0hyDBc6MizZJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"indifference"}
]
```
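Because the raw model output is a JSON array of objects keyed by comment ID, the "look up by comment ID" view can be implemented by parsing the array into a dictionary. A minimal sketch, assuming the response is well-formed JSON as shown above (real responses may need code-fence stripping or error handling); the two rows are copied verbatim from the response above:

```python
import json

# Two rows taken verbatim from the raw LLM response above.
RAW = '''[
{"id":"ytc_UgyJk9ALl7xh1jfTGYR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEIc7DQc3QUgLCBx14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(RAW)
print(codes["ytc_UgwEIc7DQc3QUgLCBx14AaABAg"]["policy"])  # → liability
```

Indexing once and looking up by ID keeps each lookup O(1), which matters when a batch response covers many coded comments.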