Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What about lawyers? I disagree.. my friend is general council at a corporate. Th…
ytc_UgxZ7scld…
This is not about artificial intelligence! It is about the vageries of the human…
ytc_Ugz2Wyk8H…
Stop dramatising nonsense. All AI does is predict the most likely response to so…
ytc_UgzkYH88P…
I've seen too much dashcam footage to have any faith in human drivers. People do…
ytc_UgwENmpqh…
What a lot of commenters are missing is the co-botting part. So the AI alone wo…
rdc_m83imdj
AI is just a computer Hallucination ,AI in details is a Triangle covenant of the…
ytc_UgyrZpHIq…
At least it's not created yet. Right now it's only as dangerous as the person us…
ytr_Ugxw4gvlA…
I don’t like Tesla, but I believe they have the most advanced autonomous driving…
ytc_UgwumwH9J…
Comment
If ai takes all jobs there will be no one left to buy the rich peoples products anymore, ie the death of the consumers, therefore their companies would collapse and their money useless. So I'm not sure why they're racing to replace all jobs with cheaper ai when they rely on those workers and their wages to buy their products and services.
Platform: youtube
Topic: AI Governance
Posted: 2025-09-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx_bV1jwLAjuNilkOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxeb0e3BsIISpa6Qr54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"confusion"},
{"id":"ytc_UgwTDdEgXsZ7_fOv1OV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgztFGr4QwQqe2QA7kR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfMH21s_XWjLyY2Sx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWStSA1qosnBpGQvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjkcyiXxGHu13gCAt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwx79llVT16gbB0P6Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxsGLDsfs5jZMktyDh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJXvUbV2lGnUpDP-B4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
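The raw LLM response is a JSON array of per-comment coding rows, each carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) keyed by comment `id`. A minimal sketch of looking up one comment's coding in such a batch response; the `lookup` helper name is an assumption, not part of the tool, and the sample data below is a two-row excerpt:

```python
import json

# Two rows excerpted from a batch coding response (same schema as above).
raw = """[
  {"id": "ytc_Ugx_bV1jwLAjuNilkOl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyjkcyiXxGHu13gCAt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def lookup(raw_response: str, comment_id: str):
    """Return the coding row for one comment ID, or None if absent.

    Hypothetical helper: parses the JSON array and scans for the ID.
    """
    rows = json.loads(raw_response)
    return next((r for r in rows if r["id"] == comment_id), None)

row = lookup(raw, "ytc_Ugx_bV1jwLAjuNilkOl4AaABAg")
print(row["emotion"])  # -> fear
```

An index built once (e.g. `{r["id"]: r for r in rows}`) would be the better choice when many IDs are inspected against the same response.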