Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "My mind moves faster than my hands, so any 2D digital or otherwise art I produce…" — ytc_UgwezUolR…
- "Techbros have made the term AI so worthless that we now have to call actual AI "…" — ytc_UgySpcrzM…
- "Wow! I had no idea what deeply moral, socially and environmentally responsible p…" — ytc_Ugx43kDpw…
- "The issue is that AI is still in private interests. Dude used Uber as an example…" — ytc_UgxDJg9G0…
- "Passion is not the main issue, its talentless people generating AI images and ma…" — ytr_UgwNecYzy…
- "We've had Multiple Films,Books,and games Showcasing that AI Is a Bad influence a…" — ytc_Ugw3RrAYa…
- "What is most worrisome is that all the experts, creators, even the godfathers of…" — ytc_UgyMueX4N…
- "Stay away from this “AI startup” wave. They’re essentially all writing prompts a…" — rdc_mjdnf4s
Comment

> Given the certainty that there will be another Carrington level event in our future, where the Sun throws massive amounts of charged particles towards the Earth (a planet with a weakening magnetic shield), I'd say AI itself is at enormous risk. Along with the rest of our technology.

youtube · AI Governance · 2025-11-27T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwB4HphivkiO5zOKrp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy8hylunaTYKqWFvDN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwdy-N1tOFiQXpFDnN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP7IDyX-8CwdCl8oh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyyk1aP0fM8N39Npb94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8jWaaRMQ2k27LfUt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwqx-TXWkYif1N5MnB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwX4vMgJrPCtsJ4yiF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugylk8oftbe_sMUmdFJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzGF9I54V-YRIP17AZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
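The model returns one JSON object per comment, each carrying the comment ID and the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a batch into a per-comment lookup table, with a basic schema check, might look like the following. The names `parse_coded_batch` and `REQUIRED_KEYS` are illustrative, not part of the tool, and the two-record input is an abbreviation of the batch above.

```python
import json

# Abbreviated raw model output: two records from the batch shown above.
raw_response = """
[
  {"id": "ytc_UgwP7IDyX-8CwdCl8oh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugylk8oftbe_sMUmdFJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Every record must carry its comment ID plus the four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(text: str) -> dict:
    """Parse a raw LLM response into a lookup table keyed by comment ID."""
    coded = {}
    for rec in json.loads(text):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return coded

coded = parse_coded_batch(raw_response)
print(coded["ytc_UgwP7IDyX-8CwdCl8oh4AaABAg"]["emotion"])  # fear
```

Keying by comment ID makes it straightforward to join a coded record back to its source comment, as the "Coding Result" panel above does for the selected YouTube comment.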