Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgzFmZcK0…`: I agree with Charlie, but I disagree with the whole "AI art bad" take. I just t…
- `ytc_UgxmYuc9s…`: Even I feel that AI is taking our critical thinking away I use AI for almost eve…
- `ytc_Ugw8mgrnL…`: Why is chatGPT failing and deepseak maybe those people want to make robots human…
- `ytr_UgxDecRDE…`: As an artist this comment is rude, I’m not a professional artist at all, but I w…
- `ytr_UgzXjTQaR…`: @TheReal_birbwizard I agree. if Github Copilot was a real coworker of mine I wou…
- `ytr_Ugy8Lc_6N…`: AI emergence at the same time as unrestricted drone warfare is about to turn Sky…
- `ytc_Ugx_lHbpf…`: Automation is never an issue. As long as there is stuff to get done, which there…
- `ytc_UgxQdZ-Wi…`: 11:12 whaaat? Anyways, excluding Gemini, ChatGPT and Grok are just breaking the …
Comment
Robots could replace humans in a lot of jobs, but they won’t take away all the jobs and it will be gradual. There will always be a market for “organic” service.
I do believe they will possibly exterminate most of us at some point, but it will be on the orders of the elites, who will blame it on an algorithm; because they will not want to share the earth’s resources with a bunch of useless crumb crunchers with nothing better to do than wait around for a government check. There will never be a utopian society in which everyone has all they need for nothing.
Source: youtube · Posted: 2019-06-06T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQw7YfMcOhg6zyCbR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZKKVQOOweXnuzyGR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgxMvnr5ixkjJGjQTgp4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1DPqIDMxmnKwbdc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzQi1gAtvrINJhUugx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzA4QfdzSS2WxK1u6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3r5ONYiHca-oWIdd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UghmHBsOLD4fY3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
  {"id":"ytc_UghA24C7Vxvn43gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjnnBXlqmRuLHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
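Since the raw response is a plain JSON array of rows with one label per dimension, a small validation pass can catch malformed rows before they are written into the coding table. A minimal sketch in Python, assuming only the label sets visible on this page (the real codebook may allow additional values, and `validate_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define more values than appear here.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "elites", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse one raw LLM response and reject rows with unknown labels or IDs."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical example row (the ID is made up, not one of the truncated IDs above).
raw = ('[{"id":"ytc_EXAMPLE_ID","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"approval"}]')
rows = validate_batch(raw)
```

Rows that fail validation can then be re-queued for another model pass instead of silently entering the table with an out-of-vocabulary label.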