Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID; the items below are random samples.
- ytc_Ugyot18IM…: The one segment called "how will AI cure diseases" should have been "how will AI…
- ytr_UgxP_pEpn…: I remember coding and using AI starting in the early 1990s for manufacturing app…
- ytr_UgyAN_Fc3…: If you did it recently, The A.i is long passed this stage,its already trained we…
- ytc_UgzJ1SyL7…: So my thing is this. Aircraft use ADSB both in and out. That let's other aircraf…
- ytc_UgwL8ohGG…: This one must be for the boob guy. I can only imagine what the other version loo…
- ytc_UgzXfZEsL…: Who payin this girl to spread hate on ai ? Who she voted for ? 😂😂😂…
- ytc_Ugyi_4_98…: Your US copyright law differs to that of other countries. So in some countries i…
- ytc_Ugw5ecbIy…: Very interesting. I think there are still however, certain points that were miss…
Comment
"love for our children"? in other words love for ourselves will be the solution for the upcoming problem of an AI intellectual dominance? didnt he, just a few minutes before, discuss how AI started to have behaviour in its own interest? that AI started to act in favor of itsself? that AI started to act to 'love' itsself?
isnt that what we humans do, the one way or another, all the time? and isnt the idea that OUR love for OUR children could be game changer for the upcoming crisis just another iteration of exactly that? isnt that an argument that exactly claims (for oursleves) what was just recognized as harmful (when presented by the other, the AI)?
Source: youtube · AI Responsibility · 2025-06-02T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
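The table above summarizes one coded record. This page does not state the full code book, so the sketch below validates a record only against the set of values that actually appear in the raw batch response on this page; the tool's real allowed vocabulary may be larger (an assumption, flagged in the comments):

```python
# Values OBSERVED in this batch; the tool's full code book may differ (assumption).
OBSERVED = {
    "responsibility": {"government", "company", "developer", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

# The record shown in the Coding Result table above.
record = {"responsibility": "ai_itself", "reasoning": "mixed",
          "policy": "unclear", "emotion": "mixed"}

# Collect any dimension whose value falls outside the observed vocabulary.
invalid = {dim: val for dim, val in record.items() if val not in OBSERVED[dim]}
print(invalid)  # {} -> every value is in the observed vocabulary
```

An empty `invalid` dict means the record is consistent with the vocabulary seen in this batch.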
Raw LLM Response
```json
[
{"id":"ytc_UgyS1ahr33TZ_KVndRB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgymwFq2xr-ysDtEm5V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzE95Yg_VOwazLk63F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHG5WPXG-K4SmqQX54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEPSK31JdzyeV-KhJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFbkP6p29Mnop-eTV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxsj9C1qn4kSNpDRx94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzE7ovyq72qybWbROB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzzIQGLPNcQ9mTmG6x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyhYbFSxHku1ARbp1d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
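The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and indexed by comment ID for lookup; the field names match the response above, but the parsing code itself is illustrative, not the tool's actual implementation:

```python
import json

# Two-record excerpt in the same format as the raw batch response above.
raw_response = '''
[
 {"id": "ytc_UgzE95Yg_VOwazLk63F4AaABAg", "responsibility": "ai_itself",
  "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
 {"id": "ytc_UgwFbkP6p29Mnop-eTV4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and index codings by comment ID,
    skipping any record missing its ID or a coding dimension."""
    by_id = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if cid and all(dim in record for dim in DIMENSIONS):
            by_id[cid] = {dim: record[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzE95Yg_VOwazLk63F4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID is what makes a "look up by comment ID" view like this page cheap: one pass over the batch, then constant-time retrieval per comment.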