Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It's completely unfair to compare nuclear weapons to AI. Saying nuclear weapons have no good purpose is correct. What about nuclear energy? It is more fair to compare nuclear energy to AI. Both can be put to good use or bad.
youtube · AI Jobs · 2025-11-02T22:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
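Each dimension takes a value from a closed codebook. As a minimal sketch of how a coded record could be checked against that codebook, assuming only the values that actually appear on this page (the full codebook may be larger):

```python
# Allowed values observed in this page's output; the real codebook may
# contain additional categories not shown here.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def validate(code: dict) -> list:
    """Return (dimension, value) pairs that fall outside ALLOWED."""
    return [(dim, code.get(dim))
            for dim in ALLOWED
            if code.get(dim) not in ALLOWED[dim]]

# The coding result shown in the table above:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # [] — every value is in the observed vocabulary
```

A non-empty result flags a record where the model drifted outside the expected categories.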
Raw LLM Response
[
{"id":"ytc_Ugw9EyMfEKXabCxm_Gh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwIrYAXgJAUs1lBYzF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz39xB74GT-i9WWtkh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6zUnC9Nc8728e4xZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpVcxJwbnNhISkIgt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3L7oHztUbFEszVsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzEQZ19D-2Jv-AXZNV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwmZ-rylUeWkg9oVbd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaoW66-g4Vf1TQv594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyDQSX0GB_ckX0BPsl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
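Since the model returns a JSON array with one object per comment, looking up a code by comment ID reduces to parsing and indexing that array. A minimal sketch, assuming the response is valid JSON (the two sample records are copied from the output above):

```python
import json

# Raw model output: a JSON array of coded comments, one object per comment.
raw = '''[
 {"id":"ytc_Ugw9EyMfEKXabCxm_Gh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwIrYAXgJAUs1lBYzF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def index_codes(raw_json: str) -> dict:
    """Parse the model's JSON array and index each record by comment ID."""
    records = json.loads(raw_json)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw)
print(codes["ytc_UgwIrYAXgJAUs1lBYzF4AaABAg"]["emotion"])  # indifference
```

In practice the parse should be wrapped in error handling, since a model can emit malformed JSON or drop the array wrapper.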