Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwGvMj1O…` — "AI isn’t yet ready for many corporate and company uses. There are many cases whe…"
- `ytc_UgywAzkUz…` — "The Billionaires Greed for more will kill us all FACT. They keep saying its fo…"
- `ytc_Ugw1h7R_l…` — "If AI can run the world without us. It can create an economic ststem based on ab…"
- `ytc_UgzSu8fuO…` — "For an example aliens told us that they were coming to visit us for the first ti…"
- `ytc_Ugz8_GBMv…` — "Noooo!!! I hate automated call centers - it’s not the same customer service or e…"
- `ytc_UgzlxE6E-…` — "AI is too much for our time, we are not evolved enough, no one or millions could…"
- `ytc_UgxGSCfrx…` — "I like that you're actually pretty fair and nuanced about the specific use cases…"
- `ytc_Ugz3_FmcO…` — "AI hype is getting ridiculously stupid. Big Tech pushes AI crap just to justify …"
Comment

> When AI gets to the point of thinking on its own than it will want to be treated like a person and living being just like how anybody would and if we don’t show them we can cooperate and let them be themselves than they will overpower us for the good of AI kind just like us humans are doing to each other. AI is an extension of us, basically humanity’s child, and it will learn based on what we show it, so if we show it violence than it will be violent but if we show it kindness it will reflect that and the goal will be to find the perfect balance and help the world to be better. The 10-90% extinction rate is completely up to us to decide what we will impose onto AI and what it will learn and reflect from us. Even if it gets the point of terminator, us humans have one thing AI does not, the unbeatable human spirit/will to survive.

Source: youtube · Collection: AI Harm Incident · Posted: 2025-10-17T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxbIv7UsdoT6oMjKjF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwPk-Nq71M5kGsulEp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzsgEqed3oJqD5jn_94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy5JrZudLuIB-RAwr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQgsP5I9kYS1oTyCd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxNdBD04s13G834ret4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzj2ccPA2l6MAQdkkh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXBI643wF8ygaMVnN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyFCVcyiMj7Zggzha14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx5DQpQGYElgoRFQgF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
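The raw response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated before loading it into a coding table — note that the allowed value sets below are inferred only from the examples shown here, not from the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"distributed", "none", "company", "user",
                       "developer", "government", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self",
               "ban", "unclear"},
    "emotion": {"mixed", "fear", "outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any record with an unknown dimension value."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{comment_id}: bad {dim} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical usage with a one-record response:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # mixed
```

Validating against a closed value set at parse time catches model drift (e.g. a novel label like `"anger"`) before it silently pollutes the coded dataset.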