Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "An AI isn’t actually more creative or original. What it’s saying are usually phr…" (ytc_UgwSds3iS…)
- "I think people equate autopilot on a car with autopilot on an aircraft. If there…" (ytc_UgwZ0QMTS…)
- "They wont because companies WONT own the code. Openai ext will own the code so i…" (ytc_UgwrfU78o…)
- "Okay…but the problem isn’t moral dilemma, it’s a technical one. We do not know…" (ytr_UgxkP3JTD…)
- "So, we've to just prepare the Flowchart and then use AI to code in parts & put t…" (ytc_Ugxv9yIL5…)
- "Maybe AI can create its own rules, maybe AI can be god may be AI can create its …" (ytc_UgxeJDgt2…)
- "In what reality does ‘health care’ mean more people? Your logic is flawed. We al…" (ytc_Ugwk5tgtC…)
- "Who’s pushing this AI, what’s the agenda of AI, why isn’t the public informed ab…" (ytc_UgwYUbK2N…)
Comment

> Nuclear weapons are a threat weighing international decision down with MAD. AI is invasion into the fabric of capitalist society that had never happened before in human history. This is not automated looms and the luddites, but a complete upheaval within maybe 5 years. Not the same.

Source: youtube · Topic: AI Responsibility · Posted: 2025-07-11T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw46072NtsdwGiVyhx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1y9ImLPAece5APgF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyJNvOmjb5kklSaZVx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzmhfLrczppt4lqBvJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw0Cf9C57QjMV1gJxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyevB8WVGCGitPN2-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyGAXJlXPHs5Ju-SQR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy9GVhIftYQFxaZHQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyEkeqmtKkYbJnVgtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwJN7uzdHMsLsFFdMZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
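The raw response is a JSON array of per-comment codes across the four dimensions shown in the coding-result table. A minimal sketch of how such a batch might be parsed and validated before loading: the allowed value sets below are inferred only from the examples on this page, and the real codebook may define additional labels, so treat `SCHEMA` as an assumption.

```python
import json

# Allowed values per dimension, inferred from the examples above (assumption:
# the actual codebook may permit more labels than appear here).
SCHEMA = {
    "responsibility": {"none", "company", "government", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows whose codes are
    in-schema and whose id looks like a comment/reply id (ytc_/ytr_)."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        codes_ok = all(row.get(dim) in allowed
                       for dim, allowed in SCHEMA.items())
        id_ok = str(row.get("id", "")).startswith(("ytc_", "ytr_"))
        if codes_ok and id_ok:
            valid.append(row)
    return valid
```

Rejecting out-of-schema rows (rather than coercing them to `unclear`) makes model drift visible: a batch that suddenly loses rows signals the prompt or model changed, which is usually worth inspecting here rather than papering over.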