Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.

Random samples:

- "The MIT and Lockheed Martin collaboration is focused on autonomous navigation, w…" (rdc_dwvlv17)
- "Only God can put morality in the world. AI will never have empathy or morality. …" (ytc_Ugwvue7Wd…)
- "He only creates problems after problems. He only talks about this: problems.!! I…" (ytc_Ugw8U1Xe5…)
- "Insane work when in between this interview, you get tossed an advertisement for …" (ytc_UgwBe8I3e…)
- "Am I the only one who’s confused how chatGPT is making the art when it only crat…" (ytc_UgwikVEIM…)
- "The elite greed for total control with AI wil be guilty of humanity extinct. IT…" (ytc_UgxIPID15…)
- "@SoftBreadSoftonly one or two tho. Like there were 20+ of them but only 2-3 wer…" (ytr_UgyOBwJA3…)
- "I do like videos of talking animals wearing sunglasses and shit. Thank you A.I. …" (ytc_Ugw9Bzho-…)
Comment
While I don't believe that it will cause an apocalypse, the problem is that the potential is to fully remove the human from the equation. Not all jobs will need the regulatory capture to ensure those jobs continue to exist. Healthcare is unique in requiring this due to HIPAA. But the potential is that jobs (managerial, data entry, creative, engineering) can be removed if agi/asi happens. if the machine at some point in the future can think and be creative and can do it autonomously without a human being in the loop (assuming agi/asi), at that point (as an analogy) humans become the horse at the advent of the first motored vehicle (don't believe me -> go look at old photos before and after).
youtube · AI Jobs · 2025-10-27T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
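These coded dimensions form a small, fixed schema. As a rough illustration, the Python sketch below shows one way such a record could be represented and sanity-checked; the `CodingResult` class, its field names, and the allowed-value sets are assumptions inferred from the labels visible on this page, not the project's actual code or an exhaustive label list.

```python
from dataclasses import dataclass
from datetime import datetime

# Label sets observed on this page; the real codebook may define more values (assumption).
RESPONSIBILITY = {"company", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "liability"}
EMOTION = {"fear", "outrage", "approval", "disapproval", "indifference", "resignation"}

@dataclass
class CodingResult:
    comment_id: str          # e.g. "ytc_Ugzxcdoo-ix-b5X7wYt4AaABAg"
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        """Raise if any coded dimension falls outside the observed label sets."""
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility label: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"unknown reasoning label: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"unknown policy label: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"unknown emotion label: {self.emotion}")
```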
Raw LLM Response
[
{"id":"ytc_UgzurRf7haIJ8T8GQ-14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0gj2K7UdFRO6g0UR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzxcdoo-ix-b5X7wYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5XGa43BDnQ7EFrnJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx42F6SSo9kO_vAQFl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxuQ8Z0JwzaWWX-0vZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxqdwIh6nVKE94RcZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugxrh9L4amjriku57ft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSy7wk4XUejnyz9VV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRyzdzerV5a9dtAhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
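As a minimal sketch of the "look up by comment ID" step, the snippet below parses a raw model response like the array above and returns the entry for a given comment. `find_coding` and `raw_response` are hypothetical names, and the sketch assumes the raw output is a JSON array of objects keyed by `"id"` as shown; it is not the tool's actual implementation.

```python
import json
from typing import Optional

def find_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response (a JSON array of per-comment codings) and
    return the entry whose "id" matches the requested comment ID, if any."""
    entries = json.loads(raw_response)
    for entry in entries:
        if entry.get("id") == comment_id:
            return entry
    return None

# Example (hypothetical): locate the coding for the comment inspected above.
# coding = find_coding(raw_response, "ytc_Ugzxcdoo-ix-b5X7wYt4AaABAg")
# -> {"id": "...", "responsibility": "ai_itself", ..., "emotion": "fear"}
```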