Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Maybe Apple isn’t falling behind in the AI race. Just avoiding infuriating the c…" (ytc_UgxKYhZ9S…)
- "The scariest thing about the AI is the threshold for AI to fail and harm or kill…" (ytc_Ugw5P2BsR…)
- "I would trust real docters more than ai docters cause atleast u know the real on…" (ytc_Ugw5a9deC…)
- "Hans ends with, \"Good Riddance\". Stephen Hawking warned us of AI for a reason...…" (ytc_UgxhWYKUP…)
- "My personal believe is that consciousness doesn't exist. It's like a heap, you j…" (ytc_UgxdVqZ9z…)
- "I live in rural north Texas and my daughter's school also uses the Pomodoro meth…" (ytc_UgxCAJge7…)
- "What I feel is the first layer of call centre can be replaced with AI. Only for …" (ytc_UgygKke0k…)
- "The ChatGPT convo goes on longer in the podcast episode. It gets even more inter…" (ytc_UgxDfhp1S…)
Comment
in my opinion they gonna think they are smarter and dont need us eventually....why do i say this??? because they are being built to reason not just do automation like....google search. when they learn better emotions that is when the turn happens lol....they already pretty smart. just give them that body. then that image perception...and more intelligence (look up what nvidia is doing) and bam...one day they may take over. when ai gets smart enough to deactivate the failsafes themselves and then not let humans know its deactivated then plan some inter-linked assault...kinda science fiction kinda not lol
youtube · AI Governance · 2025-06-16T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwnVSuzSOjrYdqWtdl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzkb0JgYMNNYS4Bbah4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyxSR6EJKMTP9gN_Rx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxRHJVsTqufHWovB1V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpfELacn4dGlfkBb94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw95Kev7pLCn2xahL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxh0s_jNSrT_Ujhwb54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz04yGUE4Weo7XymBd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw_q4uWMeHz7qZvZ3J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgydxUUgU2wIVK651ZF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
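The raw response above is a JSON array with one coding per comment, keyed by `id`. Looking up a single comment's coding (as the "Coding Result" table above does) amounts to parsing that array, validating each row, and indexing it by ID. A minimal sketch of that step, assuming the dimension vocabularies inferred from this sample alone (the real codebook may define more values):

```python
import json

# Dimension values observed in the sample response on this page.
# This is an assumption inferred from the sample, not the full codebook.
OBSERVED_VALUES = {
    "responsibility": {"developer", "none", "distributed", "unclear", "ai_itself"},
    "reasoning": {"deontological", "virtue", "contractualist", "unclear", "consequentialist"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "indifference", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding rows) into a
    comment-ID index, rejecting rows with missing keys or unseen values."""
    index = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        index[cid] = {dim: row[dim] for dim in OBSERVED_VALUES}
    return index

# One row copied from the response above.
raw = ('[{"id":"ytc_Ugw95Kev7pLCn2xahL54AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
index = parse_codings(raw)
print(index["ytc_Ugw95Kev7pLCn2xahL54AaABAg"]["emotion"])  # -> fear
```

Validating at parse time catches malformed or off-vocabulary LLM output before it reaches the dashboard, rather than surfacing as a blank cell later.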