Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgxGs69Fe…: “This is all nonsense. They’re trying to get you people to believe. If there’s ev…”
- ytc_UgyBYarYA…: “Saying you're an ai artist is like saying you're a doctor, but you just didn't p…”
- ytc_Ugw0LgtW8…: “So what do we need to do? I can’t just watch this and sit idly waiting for doom,…”
- ytc_UgyyGSXWX…: “It really is sad that these tech bros have tainted the word AI, as there are gen…”
- ytc_UgykhxVAY…: “In some ways, trying to explain a generative learning algorithm is like trying t…”
- ytc_Ugw9n2GNg…: “I wonder if some of the AI bros are going through the dunning kruger effect when…”
- ytc_UgyOKcAC0…: “Aww man I hope AI doesn’t take the word glorious or devious cuz I like over usin…”
- ytc_UgyOeUo3Z…: “This conversation very much went around in circles ... it's understandably a tes…”
Comment
Well we're fucked then because we are essentially chemical biological machines and ai does not understand emotions at all. It has no sense of actually caring or being compassionate. Having said that, I'm still not totally convinced that it "understands" anything that it is talking about. It just seems to recognise patterns and give the most probable or likely answer. Take for example the concept of a goodbye. There's the clip on Instagram of two ai agents endlessly saying goodbye to themselves. Yet the whole point of a goodbye is to stop talking. Or the one of it having a 'conversation' with itself. It is completely hollow.
youtube · AI Governance · 2025-11-12T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxZxqL-Xf-60wZ9Bkd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwBdNlWhaCQJPZGa_B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztwtIyBTVM7VzTNDN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyYIjWBw14RBgHZoh94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwG3eMQoh8fiViCKYx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbN3RKn36TBwnEgmt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxr60BGUGwWePHm35h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyO-ALuJkIB072MUVB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDTXkhzv3RBa7bweF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwdr6lir_QRgNCNWTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
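A raw response like the one above can be parsed and sanity-checked before being written to the coding table. The sketch below is a minimal example, assuming the dimension vocabularies are exactly the values visible in this sample output; the real codebook may allow additional labels.

```python
import json

# Dimension vocabularies inferred from the sample response above
# (assumption: the actual codebook may define more values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of codings) and
    reject any coding whose dimension value is outside the vocab."""
    codings = json.loads(raw)
    for coding in codings:
        for dim, vocab in ALLOWED.items():
            if coding.get(dim) not in vocab:
                raise ValueError(
                    f"{coding.get('id')}: unexpected {dim}={coding.get(dim)!r}"
                )
    return codings

# Usage with a one-element response (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(validate_codings(raw)[0]["emotion"])  # prints "resignation"
```

Validating against a closed vocabulary catches the common failure mode where the model invents an off-codebook label, so bad codings fail loudly instead of silently entering the results table.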