Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- rdc_n5glbp2: "The biggest issue in the job market right now is an abundance of job postings fo…"
- ytc_UgzQJu5vk…: "This AI moment feels like when a child sees a new, gifted cousin getting all the…"
- ytc_UgyEv3xqf…: "I'll elaborate in a reply to the comment if anyone cares to read it from the per…"
- ytc_Ugx2sBiRs…: "If AI basic data can be influenced and enhanced be more accurate and closer to r…"
- ytc_UgyMhYfBz…: "Unfortunately just like the nuclear race. Nations are not going to stop. Is like…"
- ytc_UgwVGgZ4l…: "Social media hawent been regulate properly in 15 years in the USA and therefore …"
- ytc_UgwDOWsgQ…: "he does not understand tech and first of all I really dont like the way he talks…"
- ytc_UgwwhQ6KR…: "The most human emotion is vengance so this a step in the right direction to make…"
Comment

> It's scary, but there's nothing you can do... If you don't develop AI, someone else will—and they might use it to destroy you. Humanity is trapped in an endless marathon of competition, unless one nation with one leader unites us all. After so many years of history, we still haven't learned anything from war—because human nature is fundamentally flawed....

youtube · AI Governance · 2025-06-16T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyN21og4E1Vp25P4Xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgycNhsUV7-IrRsLxI94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyRz6ya75jreLcsNeZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw6vJahek872ciF8tZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzMWVw8-G1qgyU1nIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwLUNT5bvPHt7KxCYR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwllG9Jc0rmyH97EFJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxE6hY69raIcmiIf9x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz74FuoXxBSMON24Bp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyH6iLOZRXjO4JY-TN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
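The raw model output is a JSON array of coding records, one per comment, each keyed by a comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by ID (the field names come from the coding table above; the function name and sample records here are illustrative, not part of the actual pipeline):

```python
import json

# Two sample records in the same shape as the raw response shown above.
raw_response = """[
  {"id": "ytc_UgyN21og4E1Vp25P4Xd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwLUNT5bvPHt7KxCYR4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index the coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgyN21og4E1Vp25P4Xd4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the per-comment lookup shown in this view a single dictionary access rather than a scan of the whole response.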