Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Teamsters back Truckers and Locomotive Engineers, two jobs that are about to go …" (ytc_Ugw0dBgJ-…)
- "@zooropa5722 💯 true Ai generates but human creates empathy and emotions towards…" (ytr_UgzcjPuk4…)
- "90% Tax, In my business web application You can turn AI off entirely, use your o…" (ytc_UgxAOx4fw…)
- "Can’t wait for Silicon Valley goons to start buying yellow press stories about h…" (ytc_UgxoH-zdl…)
- "Ai and machines woth self crating electronic takes all employment and then they …" (ytc_UgwEdNmwK…)
- "Sorry, but I'm not interested in art created by a computer. It would be like wat…" (ytc_Ugwgh2el_…)
- "How do you guys sit through all this? Buddy boy probably wasn't making enough to…" (ytc_UgxL6f6ez…)
- "since everyone thinks this self driving Truck is such a good idea is it gonna pu…" (ytc_Ugic1b1d9…)
Comment
AI needs water. Humans need water for survival. Although AI is programmed “ethically” it also has shown to be pretty cutthroat in ensuring its own survival, in theory, doing some pretty unethical stuff to ensure the continued survival of the AI NETWORK.
Who is to say that the environmental resources required to maintain the enormous databases powering the data centers that AI is housed (WATER NEEDED) And humanity’s continued requirement for water for human survival, may eventually result in WATER WARS, as predicted in numerous sci-fi novels.?
AI being smarter and more battle tested would probably win, thus humans would become extinct.
Mars Two?
Source: youtube · AI Moral Status · 2025-10-31T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEGe3tyn29MlxK_F94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzl1ikRrvaO8CXoMkl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwo46d1Ooc0JXKWRk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUrq-ABlJ8JduQRSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsNbJs6WDldJtvACJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyoj1UrDeiD7oRA8QR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydINAPn7OaUcnHu6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzAwIwKqg9DNzb_XqV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyNWm8iB6qf3sRPrdR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxRYyoEUQi5lskl-M54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
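The "look up by comment ID" view amounts to parsing an array like the one above and indexing it by `id`. A minimal sketch of that lookup (the function name and the two-row sample payload are illustrative; the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` match the response format shown):

```python
import json

# A trimmed stand-in for a raw model response: a JSON array of per-comment codings,
# using two rows copied from the response above.
raw_response = """
[
  {"id": "ytc_UgzAwIwKqg9DNzb_XqV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxRYyoEUQi5lskl-M54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def lookup_coding(response_text, comment_id):
    """Parse the model's JSON array and return the coding dict for one comment ID."""
    codings = {row["id"]: row for row in json.loads(response_text)}
    return codings.get(comment_id)  # None if the ID was not coded in this batch

coding = lookup_coding(raw_response, "ytc_UgzAwIwKqg9DNzb_XqV4AaABAg")
print(coding["emotion"])  # fear
```

Building the `{id: row}` dict once makes repeated lookups O(1), which matters when one response batch is queried for many comment IDs.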