Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm not even kidding I get a genuine chill whenever I see AI Art, when you showe…" (ytc_Ugx0V4irI…)
- "Does this mean that if I got a self driving car I'd be able to cancel my auto in…" (rdc_czxh8ql)
- "I'm not good at drawing, but one thing i have over ai is my art at least has sou…" (ytc_UgzB1YQxj…)
- "Eh. .. you are right, it would do the most logical, efficient thing and that is…" (ytr_UgyU3G5Ow…)
- "One of the greatest dangers of AI is that a single person could effectively cont…" (ytc_UgwQ-BL01…)
- "Wow, they legit just said “you should not be angry, disappointed, etc. for someo…" (ytc_UgzjaVIrl…)
- "@3:58 - \"... because you can only get it if you buy the Full Self-Driving packag…" (ytc_Ugz3iyDQt…)
- "AI art, AI music, AI TikTok, AI YouTube, AI movies, AI news articles, AI friends…" (ytc_UgwkUxTtg…)
Comment
> IF A.G.I is achieved then "super-intelligence" would be an inevitable next step and that is terrifying for us. HOWEVER I do think this "exponential growth" of A.I systems is being massively exaggerated by A.I companies and CEOS. Maybe in our lifetime we will see the beginnings of A.G.I but not in the next 10-15 years.

youtube · AI Governance · 2025-11-28T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzoXSOiGj_svfIF5zh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrNFGPf1tPAFj8hO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzehKwMwr_mcn9qpVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwFM6KkEf5JLWdXrIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLPjLlI3C9k1TPgCF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhceQGT6eqA76Ccqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw3bGMVJJeRrez229J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy7MFe_IuMD96hI9zR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw52duTbZyWjalnWsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
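The lookup-by-comment-ID view above can be reproduced from a raw response like this with a few lines of Python. This is a minimal sketch, not the tool's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown above, while the variable names and the two-row sample payload are illustrative.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the
# same shape as the response shown above (sample rows only).
raw_response = '''[
  {"id": "ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzoXSOiGj_svfIF5zh4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index the codings by comment ID so the exact model output
# for any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugyv8wUqdZU3Bvdn0pB4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

In practice the raw response string would come from the model-call log rather than a literal, but the indexing step is the same.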