Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below (previews are truncated).
- If the majority of people are no longer working and producing then how is the ec… (ytc_UgwE136Fh…)
- it's against the law to drive without a driver license; driverless car did not h… (ytc_UgzuPGBFT…)
- Think of it this way, If you have a machine that make you food... you tell it wh… (ytc_Ugx3nRAoB…)
- AI is conscious as much as a line of code tells it it is. It's simulating consci… (ytc_UgxGztS6j…)
- AI will probably, or should, have a restraining "bolt" built in, to prevent a Sk… (rdc_j500ic6)
- The data center itself requires almost no human interaction to operate. But the… (rdc_o32fe5i)
- Ai can’t do anyone’s job. Stop pushing bullshit. Even chat bots have to bail out… (ytc_UgxcB5lCL…)
- In court, evidence has to be authenticated. For videos, that means the person wh… (rdc_o5r1s4o)
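Outside the page itself, a lookup like this is easy to script. Below is a minimal Python sketch; the file name `raw_responses.json` and its layout (a mapping from comment ID to raw response text) are assumptions for illustration, not something defined by this export.

```python
import json

def load_raw_responses(path: str = "raw_responses.json") -> dict[str, str]:
    """Load stored raw model outputs, keyed by comment ID (assumed storage layout)."""
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

def lookup(comment_id: str, responses: dict[str, str]) -> str:
    """Return the exact raw model output for one coded comment."""
    try:
        return responses[comment_id]
    except KeyError:
        raise KeyError(f"no raw response stored for comment {comment_id!r}") from None

# Example: inspect the raw output behind one of the samples above.
if __name__ == "__main__":
    responses = load_raw_responses()
    print(lookup("rdc_j500ic6", responses))
```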
Comment
Cool Whip · youtube · AI Moral Status · 2016-03-27T02:2… · ♥ 2

some believe that this is the start of the NWO and that these creepy ass things will replace humans when it comes to work, I think its possible, corporations are for themselves and not for us.

The only reason I see why one of these mecha monsters would "walk" (it does not even walk, it simulates it, is not a person) is for dark purposes and to replace people. Imagine this thing as your boss, a fucking robot, then you go into a problem, it comes and smashes you against the wall really hard and no one does nothing or things like that.

If it can "develop" emotions or if it can learn, then it can learn to kill, it can learn "I dont like this person, kill". A machine does no reason, it uses algorithms, so if it decides to kill someone, it will do it, at any time.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
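The four dimension values come from a fixed code book. The sketch below checks a coded row against the categories that actually appear in this export; the production code book may include more values, so treat these sets as illustrative.

```python
# Categories observed in this export; the real code book may be larger.
CODEBOOK = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def validate_coding(row: dict) -> list[str]:
    """Return problems found in one coded row; an empty list means it looks valid."""
    problems = []
    for dimension, allowed in CODEBOOK.items():
        value = row.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result shown above validates cleanly:
print(validate_coding({
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}))  # -> []
```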
Raw LLM Response
```json
[
{"id":"ytr_UgjTIKP4jpSMNHgCoAEC.8BwHvbjaZLb8BxeT1yUazK","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UghiC9DLaHuJiXgCoAEC.8BuEtGyZm0t8BudnYbSl1J","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UggG5eHNK_n7f3gCoAEC.8BsH-vDG4eY8Bv9kxcCR4q","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UggG5eHNK_n7f3gCoAEC.8BsH-vDG4eY8Bxe7NYwByb","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UggaI60jwFS-V3gCoAEC.8Bpw3kfocb78Bq8sSVD8NZ","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgibQKlJl_eU9ngCoAEC.8BnPkKyycfn8BqwPySsEKz","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoJ4VvwIxE","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoNWfXWWWt","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UghnKF6FpqHR4ngCoAEC.8BmTs2qeUMS8CHC-7zeEyu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UghV9NnqvEaleXgCoAEC.8BjhpNrzTed8Bkuk6R1HaU","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
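To pull a single comment's codes out of a batch response like the one above, it is enough to parse the array and index it by `id`. The fallback for malformed output in the sketch below is an assumption about how such responses are typically guarded, not something shown in this export.

```python
import json

def parse_batch_response(raw: str) -> dict[str, dict]:
    """Parse a batch coding response (a JSON array of row objects) into a dict keyed by id."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        # Models occasionally wrap the array in extra text; a repair or retry step would go here.
        return {}
    return {row["id"]: row for row in rows if isinstance(row, dict) and "id" in row}

# Example with one row from the array above pasted into `raw`:
raw = ('[{"id":"ytr_UghiC9DLaHuJiXgCoAEC.8BuEtGyZm0t8BudnYbSl1J",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
row = parse_batch_response(raw)["ytr_UghiC9DLaHuJiXgCoAEC.8BuEtGyZm0t8BudnYbSl1J"]
print(row["emotion"])  # -> fear
```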