Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by their comment ID or drawn as random samples; the excerpts below are truncated previews.
- "I like ve how clippy is the bad guy in this but is the good guy in another ai sk…" (ytc_UgzdGwJxu…)
- "I work with CSVs a lot, best way of that is to have ClaudeCode (i use Antigravit…" (rdc_oi2hr1r)
- "It's just a matter of time before the Will Smith movie "I, Robot" becomes a fuck…" (ytc_UgyaUAX2Y…)
- "I personally am tired of this mob mentality. They cause a crime, to then promise…" (ytc_UgwT7yL6u…)
- "Can tell that it is not a human it it is a robot and a creepy wallet that…" (ytc_Ugx6BrDZQ…)
- "Indeed computer programming teaching will turn into orchestration or architectur…" (ytc_UgxqX1QVr…)
- "A dialog from a chat between me (a human being) and an AI : [me] You talk like …" (ytc_Ugw3vSZvC…)
- "This guy just spouts a bunch of his opinions as if they are all facts. He say…" (ytc_Ugwq0ELkX…)
Comment

> Does this guy not know about the existence of DARPA and how much they know about AI already? I mean DARPA does advise the government so this corporate Ahole was so full of it it was pouring out his mouth. Self regulation never ever ever ever works out except for special cases like the Bar, and that is a totally different beast of self-regulation general business does not.

youtube · AI Responsibility · 2023-05-15T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwRpfDpwDxHu4lz1hB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwIRNiIRN7KhpQypJF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwZwhkCV6Jrq9dPIax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXEO6YeSIDVwd7iEt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxCB4IJYYJhvSt6QfZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxnys6xP1U8bc5klix4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx8cfN_kyu8Mc19MDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHIFtq6gaIaxHIlO94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3TUb41pgpWCB4GBx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8p3dY0jXARwCRPwR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
```
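A record for a specific comment can be recovered from a raw response like the one above by parsing the JSON and indexing on the `id` field. A minimal sketch (the embedded response text and the `index_by_id` helper name are illustrative stand-ins, not the tool's actual API; only two records from the response above are reproduced):

```python
import json

# Shortened stand-in for a raw LLM response (same shape as the array above).
raw_response = """
[
  {"id": "ytc_UgwRpfDpwDxHu4lz1hB4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzXEO6YeSIDVwd7iEt4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and build a comment-ID -> coded-record lookup."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgzXEO6YeSIDVwd7iEt4AaABAg"]["policy"])  # regulate
```

This is the same lookup the "Look up by comment ID" view performs: one parse of the raw response, then constant-time access per comment ID.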