Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgwYQ_cdu…`: "For anyone who actually try to integrate AI in their work this is not surprising…"
- `ytc_Ugxn76Pwh…`: "These AI-generators are really going to be bad for all artists. This is happenin…"
- `ytc_UgwvvIy5x…`: "And just remember military is always like 25 years ahead of the general public w…"
- `ytc_Ugw38s4qr…`: "2:00 oh ok, when they bring robots into an automotive plant, that is not taking …"
- `ytc_UgzcnhuM6…`: "The guy robot said robots goals are to take over the world so why are they going…"
- `rdc_mo9tw0u`: "It is 2030. Companies from now own need you to connect your brain through Elmo …"
- `rdc_jv66sea`: "Where are you getting how highly accurate it is? Most discussions I have seen ab…"
- `ytc_UgwpGyOeV…`: "Meanwhile the 1 detail that threw me off was the one hair strand on the right T…"
Comment

> This is hilarious. It's just total rubbish. Around 2:47 the narrator says (paraphrasing) that "agent 4 doesn't seem to care about morals and ethics, unlike its predecessors". AI is not a moral entity, never has been and never will be, nor is it conscious, so it will never be 'immoral' as it doesn't suffer from human frailties (pride) or dark ambition (I want to rule the world). This whole thesis (that AI might take over the world) is based on the premise that consciousness comes out of 'materiality' (well it has to if there is no god). Well, here is the good news.....you're wrong!

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-08-02T08:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTXflHuZSW5MciX9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiWHTVM68BSZWlTwd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgydJL7Bi3gIhawbDwp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyAxiV8SVb-zCCc1sV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwoqhxOs-dPt5P9lRF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxjIj-tYtNs2VbukWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxSfIfFZWoqa9gTKCl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyKSmER7-VE3C-XANd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdVqUnosaZ5s0tCqZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxq8SSJXOz2pYsUnf14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
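The raw response is a JSON array with one record per comment, each carrying the four coding dimensions from the table above. A minimal sketch of how such a response might be parsed and indexed to support look-up by comment ID; the allowed values per dimension are inferred from the records shown here, not from the actual codebook, so treat them as an assumption:

```python
import json

# Allowed values per coding dimension, inferred from the sample records
# above (an assumption -- the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}


def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output is caught early.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        by_id[cid] = rec
    return by_id


# Usage with a hypothetical single-record response:
raw = '[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
coded = parse_llm_response(raw)
print(coded["ytc_abc"]["emotion"])  # fear
```

Validating against a closed value set before indexing means a hallucinated category fails loudly at ingest time rather than silently skewing downstream counts.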