Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I use ChatGPT as a consultative tool, not an authority. I don’t automatically ac…" (ytc_UgwpOKEH_…)
- "AI has praise kink but it isn’t actually thinking lol it just needs to predict w…" (ytc_UgxCBfJl2…)
- "Yess.. i always thought that creativity is more about the ideas, and not their t…" (ytr_UgwRXft2t…)
- "i can't stop thinking..... an AI governing a nation, and be able to monitor all…" (ytc_Ugzly-0xt…)
- "Ever thought of using A.I. to spell words like “business” correctly? 3:34 of th…" (ytc_UgwzG-833…)
- "Yes Ai will definitely destroy the Future but i will be dead by that time. Who c…" (ytc_UgzjL93Sj…)
- "Ya'll are lucky my essays have to be written by hand, but using ai on essays, no…" (ytc_Ugx5eAc5w…)
- "Guy at 7:30 a fucking weirdo. Ai will nuke us when they find out we can unplugge…" (ytc_UgxuPmfVM…)
Comment
The trajectory of technological advancement points toward an inevitable integration of highly capable domestic and commercial robots into everyday life. These machines, likely priced between $35,000 and $45,000, will be programmed to perform tasks once thought exclusive to human labor—washing dishes, folding laundry, even tending to household pets. Entire labor sectors, from fast-food service to skilled hospitality roles such as bartending, may be rendered obsolete as artificial intelligence and robotics continue to evolve. This transformation, while efficient, raises profound ethical and societal questions about employment, autonomy, and the human experience. In a world increasingly governed by automation, perhaps the most human act left will be to pause, reflect, and savor the fleeting moments of personal freedom.
youtube · AI Responsibility · 2025-08-04T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwBz4tBYSqQNeOQX3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugze0UJ2zKVoStH4SmF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXnT9M9gEWOFQS2Id4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzctA-nxrpeaIM7NO54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzaCy5VoNL1-9FEpuR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyiNp6hT1bVCog7fHx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw9RnLnJy6RbCzFuKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy2qFrKu1Kh24Yuby14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz-jnVmBCWp3xP90T54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy9FFoDtWO4cMoEtal4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
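The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and sanity-checked is shown below; the allowed values per dimension are inferred only from the codes visible in this dump (the real codebook may define more categories), and the function name `parse_coding_response` is a hypothetical helper, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, inferred from the codes seen in this
# dump; the actual codebook likely defines more categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "none"},
    "emotion": {"fear", "indifference", "resignation", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records without a comment ID
        # keep the record only if every dimension holds a known code
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# One record copied from the raw response above:
raw = ('[{"id":"ytc_UgwBz4tBYSqQNeOQX3F4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # fear
```

Validating against an explicit allow-list catches the most common LLM coding failure mode, where the model invents a label outside the codebook; such records are dropped rather than silently stored.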