Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| Cant wait for ai to take over. Then we will have more logical people to talk to… | ytc_Ugw4JSJ8l… |
| People and society is and will be off-guard about the speed of change that AI is… | ytc_UgxI3M2_N… |
| Its not the AI fault... its the over reliance on AI and neglecting the guard rai… | ytc_UgyfVVPL9… |
| It's not a reverse bubble. AI isn't the main issue killing jobs it's offshoring … | rdc_nc29fsh |
| As of now, AI has not independently answered questions that the human race has n… | ytc_UgwPvKUdx… |
| I have little or no doubt poor honest Balaji did NOT take his own life.. l pr… | ytc_UgyzRkXg1… |
| i dont blame AI, your son was struggling with depression long before. i dont thi… | ytc_Ugz6_b0jC… |
| Seen here, another painful part of the AI, even if you are willing to use it doe… | ytr_UgzYvtnHI… |
Comment
Neil can't honestly compare humans shifting from horse-based transportation systems to AI. That's apples to oranges. Horses to automobiles is a metamorphosis because human input still plays the major role but AI is a whole takeover. "There's nothing new under the sun" is a saying that came to mind when he talked about finding new avenues of doing things. Like really? come on. how many people are going to make a living by coming up with revolutionary ideas...even if they do, AI will then take over and do it even better and cheaper then what? Bottom line is companies want to make profit and wont hesitate to replace you with a machine that churns out production 1000x compared to you at a fraction of the cost.
youtube · AI Moral Status · 2025-07-24T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzk90dzqrw2zGBYFbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzTai3twqZhvAQvrx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvSb0iSx_lF2T73s54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy77ADopFHYUlFEWLp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrfBnHDGAGpxl2WWx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJzUJ1OBKl0SVEYBh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwsxLxO5ks7xWc24QB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwzt812ZWTrATooYLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPxtp8ylY5EryfKi54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzhBJ8fzNntdzJebth4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
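Before accepting a batch like the one above into the coded dataset, it is worth validating that the model emitted well-formed JSON and only used values from the codebook. Below is a minimal sketch of such a check; the allowed value sets are inferred from the sample output above and are assumptions, not the full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample batch shown above.
# Hypothetical: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means the batch parsed cleanly and every record
    carries an id plus a known value for each coding dimension.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]

    errors = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            errors.append(f"record {i}: missing 'id'")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"record {i} ({rec.get('id', '?')}): "
                              f"unknown {dim}={value!r}")
    return errors
```

A record that strays outside the schema (e.g. a hallucinated emotion label) is then flagged for re-coding rather than silently stored.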