Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Insightful conversation. Thank you so much. I am hearing that we should acknowledge the benefits of AI BUT we should not let the developers of these Tools create them in isolation with governments. Governments should put up regulations in place and ensure that the majority (humans) are protected, and the risks are reduced. The developers are still in the laboratory, but the sample size is the global population, and this approach should be questioned, their experiments must be tested in a small sample size. The developers are learning hence the various versions, we should not let the benefits outshine the potential risks that might come with exponential AI.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-12-04T10:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzAhMrhUc_auGh4TTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzuw-2iuq23MEl313x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwwxmib6fBINJO4lRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzCLafFK5kLCkt7oN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwgk0jMLcalxQ_6LC94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxe5qXrCSSmD1A6b4V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy6q0Jl6o_33ZOp_Lt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzlj4NF2U5a3jMqLFx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynMNdnRu4xTzvs7Q94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw7E1lqGesi2nP1WOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
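Since the model returns one JSON object per comment with four coded dimensions, a downstream step would typically parse the raw response and reject any row whose labels fall outside the codebook. The sketch below assumes the label sets visible in the response above (`developer`, `company`, `ai_itself`, `distributed`, `none`, and so on); the actual codebook may include other values.

```python
import json

# Allowed labels per coding dimension. These are inferred from the
# raw response shown above and are an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID.

    Raises ValueError on any missing or out-of-codebook label, so a
    malformed model output fails loudly instead of polluting the dataset.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

raw = """[
  {"id": "ytc_example", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]"""
coded = validate_codings(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Indexing by comment ID mirrors the page's "inspect by ID" workflow: each coded record can be looked up directly from the parsed response.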