Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- The robots can be the next explorer for our galaxy they wil be capable of find n… (ytc_UgyJzYMNu…)
- This “blue blood” commenter exactly shows why as a society we need to stop using… (ytc_Ugz3WzUBJ…)
- I don't believe this story. I believe the young man took his own life, tragicall… (ytc_UgzqYxh3m…)
- If everyone is unemployed and without money, who will consume the beautiful, shi… (ytc_UgwozGxAZ…)
- Youl never get it to stop while governments are always in competition with each … (ytc_Ugzw7Pw2P…)
- "Blinker to human needs" might be a typo or autocorrect error. Based on the cont… (ytr_UgynpElRe…)
- “These systems have nothing to do with AI historically.” Uhhhh machine learning… (ytc_Ugwu1_nfA…)
- I agree. I keep seeing companies saying they need programmers for AI training. A… (rdc_n619fi7)
Comment

> The world leaders in AI need to answer these moral and ethical questions - the obvious players, of course ( Hey, Sam Altman, give Steve Bartlett a bell for everyone's sake!), plus it would be nice to hear from Musk, Bezos, whoever the hell runs the Google version etc.
> Any genuine expert with knowledge of what state entities like USA, China, UK, EU etc would be great, too

youtube · AI Governance · 2025-10-08T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyN6-m7qNBZDAPITqt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwSAy23d4wGs0kI-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGu6jZqTvzq0rW9oV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9gFKKRuIsvcqskmB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgweqliGXRUDmL2pLZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyab3TbemzfgrndCbJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQDs8ZIbcX8HUovph4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxy43k-PNQcUCqrRft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIoRezXEaFEGVi3ld4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy8bDdDZjJWXDbGTl54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
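The look-up-by-comment-ID behaviour above can be sketched in a few lines: parse the raw LLM response as a JSON array of coded records and index the records by their `id` field. This is a minimal sketch, assuming the response is valid JSON with the field names shown in the sample; the two records here are copied from the array above, and `by_id` is a hypothetical name for the index.

```python
import json

# A minimal sketch: the raw LLM response is assumed to be a JSON array of
# coded records with the field names shown in the sample above.
raw_response = '''[
  {"id": "ytc_UgyN6-m7qNBZDAPITqt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgweqliGXRUDmL2pLZp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]'''

records = json.loads(raw_response)

# Index records by comment ID for constant-time look-up.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgweqliGXRUDmL2pLZp4AaABAg"]
print(rec["responsibility"], rec["policy"])  # → company liability
```

In practice the index would be built once per batch of responses, so that inspecting any coded comment by its ID does not require rescanning the whole array.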