Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwlWuDM8…: "Wrote an essay by myself. But curiosity got the better of me so I placed it in a…"
- ytc_UgxgRfQ5T…: "If Ai can experience artificial analog of pain, then it can experience artificia…"
- ytr_UgxZFDRmN…: "I consulted a public Discord chat bot on how to respond to your comment, it said…"
- ytc_UgyZNIoL1…: "This individual needs to be jailed for life what he is doing to humanity. Disgus…"
- ytc_UgzA79StF…: "I'm a lucky man. I'm pizzachef and AI will never be able to manage and reconnize…"
- ytc_Ugx5y6cfr…: "As regards the KitKat incident, this is a fundamental difference at the moment b…"
- ytc_UgwcpcTEZ…: "After watching this TED Talk, I realized I hadn’t thought much about AI’s real-w…"
- ytr_UgzSNgA22…: "That's an intriguing perspective! The dialogue highlights the balance between AI…"
Comment
There were talks that if robotics takes over our jobs, that universal income would be given out, to compensate the loss. Robotics keeps human beings safe from dangerous jobs that we shouldn't be doing in the first place. Because we as human beings don't have replacement limbs or body parts once we lose them. Dangerous jobs even takes precious lives. Which is no loss consequence to job employers who can easily hire replacement workers. No matter what jobs AI does there will always be need of workers to over see such operations and make needed repairs. Cartoon George Jetson working at some monitor control station over robotic operations could very well become a reality. I see no serious problem with that. What you and others should be more concern about is an AI president to ever happen.
youtube · AI Jobs · 2025-10-14T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxLzBFN_XYAqiJ6dSt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzx1fTgdJE_tcA632Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgybDHnQwIBTOOH9faN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyaG03Lq302KcWYrZR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzn77sSh0Ndi0JQza54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFc7AE9kzE6Nq10054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylcgElRAOtnP42PhZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw6L5zzHnx4oG7Qvf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxi7HHdaXxNyOFewZp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwE7sCL2Xr5d0kLXfZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
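The raw response above is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal validation sketch for such a response is below; the allowed value sets are assumptions inferred only from the labels visible on this page, and the real codebook may contain more categories:

```python
import json

# Allowed labels per dimension. These sets are inferred from the
# values appearing in the response above (assumption, not the
# authoritative codebook).
ALLOWED = {
    "responsibility": {"government", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in this dump start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} = {row.get(dim)!r}")
    return rows

# Example input shaped like the response above (hypothetical id).
raw = ('[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"approval"}]')
rows = validate_codes(raw)
print(len(rows))  # → 1
```

Failing fast on unknown labels keeps hallucinated categories out of downstream counts; rows that fail could instead be queued for re-coding rather than raising.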