Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I always need to refactor and debug AI code. Looks like I need to take a course …
ytc_UgzmYXWmT…
This guy is under the impression that AI means “Afternoon Intake,” and we both k…
ytc_UgyxEY01k…
the coming of AI generated image is interesting sometimes. because it beg the qu…
ytc_Ugw4YkyDo…
Some companies understand it wrong, you were supposted add AI to existing progra…
ytc_UgxIH0t8x…
Yes, let's stand by each other, only together are we stronger! Like you lazy one…
ytc_UgwbA__CZ…
@Shaun_The_Pawn I guess we will see. I don't see much regulation so far and in…
ytr_UgzkxEXCV…
I really wish AI companies would start focusing on making me more efficient as a…
ytc_UgzX-vBE7…
You know, I actually find Charlie's art of his father to be appealing.
In terms…
ytc_Ugx9EktbW…
Comment
AI and AGI are a continuation of Automation and Process Control that has raised factory and office output. It has all been about productivity = more output per man hour. The acid test of business decisions is how much capital cost to save the company a salary/wage. You will buy an excavator if it saves 20 monthly salaries for pick and shovel guys, but will you pay twice as much for it to get rid of the sole remaining human: the excavator operator? No. Jobs are safe!
youtube
AI Moral Status
2025-11-12T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
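The four coded dimensions above can be sanity-checked programmatically. The sketch below validates a coded record against allowed value sets; note these sets are only inferred from the values observed in this batch, and the actual codebook may define additional categories.

```python
# Minimal validation sketch. ALLOWED is inferred from the values
# observed in this batch, not from the tool's actual codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "none", "industry_self", "liability"},
    "emotion": {"resignation", "indifference", "approval", "fear",
                "mixed", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded record from the table above passes cleanly:
record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "indifference"}
print(validate(record))  # → []
```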
Raw LLM Response
[
{"id":"ytc_Ugw3rw6_lLhOwXK4B9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxuAskHLSP92pCFrS54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyyeKd58GGPmClx3bR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzFj_JfsDPLl4pi2Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOnxJYAgZ835BW1w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-NDgtC4ptswW6FDF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZqMtdvVoClufwnsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyymsPfdyYj6LuxL6x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx_VDbNZWpD1h8Ypf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgySs4RPmQrUs-nyGHt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
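The "look up by comment ID" view above can be reproduced from a raw response like this one. The sketch below (illustrative only, not the tool's actual API) parses the JSON array and keys each record by its `id` field; `RAW_RESPONSE` is a two-record excerpt of the batch shown here.

```python
import json

# Two-record excerpt of the raw LLM response shown above.
RAW_RESPONSE = """
[
 {"id":"ytc_UgzFj_JfsDPLl4pi2Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyymsPfdyYj6LuxL6x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

coded = index_by_comment_id(RAW_RESPONSE)
print(coded["ytc_UgzFj_JfsDPLl4pi2Wx4AaABAg"]["policy"])  # → regulate
```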