Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- In 1999, singularity was predicted in "The Matrix"... Basically we will become a… (ytc_Ugzs-Q25f…)
- OMG! It is illegal to text on the phone in the car, let alone drop it and pick i… (ytc_Ugw9aK0to…)
- @originzz well said. It’s good that both the good and bad of tech comes to light… (ytr_Ugxevdobv…)
- Unpopular opinion. Look a the industrialization. Humanity wont stop using AI an… (ytc_Ugw1DgC3r…)
- AI is decent at general things otften way off on an unusual data. So yeah, it is… (ytc_Ugx8R4uNc…)
- Tesla robotaxis trials are also geofenced. It’s to limit to area well scanned to… (ytr_UgxkpJRiN…)
- Yeah you forgot that when Ai takes people’s jobs, human go to war. Humans have h… (ytc_Ugxhs6-SC…)
- @Kardriel1 that’s HORRIBLE - my prayers for her😔 youre right that it’s not just… (ytr_UgyHaQXsC…)
Comment
😂Why will it work for us if it becomes too self aware,people after a point of self awarness wants nirvana or moksha,like gautam buddha, they want to either merge in nature, or become a self sentient being like god, they dont wanna be puppet of our genetic dna. We evolved so we got more data, our bodies developed a way to process that data, then knowledge then ethics comes, but our primal need was survival, i dont know what will it be its primal need, but if ai will learn from our ethics, then will it mimic as us? What will it need be?
youtube · AI Responsibility · 2025-10-22T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxW7VrcDxo1BYimqA14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxK17EqoJbTiwDdG7Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0MNzA3MMtDU1e2jN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzCDMGq460IsYDoUW54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzDaWml_F-2d66cOC94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyv3qUxf1ffCyMX2xR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugws5h1CdJnByFHdp3B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrogRh3BVCkTdMSz14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzV4bNvbP_QWNLT7e54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9u7RTMNvH6Jq47xN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]