Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugxo3zoqD…`: I doubt one lawsuit will "bankrupt" the mutli bilion dollar company. At best, th…
- `ytc_UgxPqmC9e…`: All the coders that invested into a single framework will be doomed. The coders …
- `ytc_UgxONX29R…`: I know how to tell them apart ask them to say "im not a robot" if they say it th…
- `ytc_UgxFi2fK8…`: I refuse to call people like that an "AI artist" I call them an "ai prompt techn…
- `ytc_UgzPjAbQr…`: Ai creators are too biased, too left-brained, and too in love with their own cre…
- `ytr_UgxU7O4t7…`: We understand your concern. If you're interested in exploring the capabilities o…
- `ytr_UgyypHmwH…`: I see your point 🤔 but I thought you were going a different way when you mention…
- `ytr_UgwsKAZhm…`: A quantum computer would never fit a drone that small. A quantum computer is ver…
Comment

> Wrong. You can go read experts in this field like Nock Bostrom and they all agree, the reality is that the harms of AI (In a Super-intelligent form) is extremely dangerous for us all. This is not a warning to gain person benefit on behalf of Musk. This is real and people are sleep walking into it.

youtube | AI Governance | 2023-04-18T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyISgMbvzoyWeGM3lN4AaABAg.9odWv4Ftxb59odX0Q2PskK","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy9TrKHHCdMPDoSfzF4AaABAg.9odVOa5GAlO9odVknFV7DK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxJtlfRytyVitU_2CV4AaABAg.9odVG8qJvpI9odW0387iki","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw0hBfu2CNCaZJrLVx4AaABAg.9odVBwZf8Ur9odXRFaB9Xh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw0hBfu2CNCaZJrLVx4AaABAg.9odVBwZf8Ur9odYLRu8r2g","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzSDvEbpcgYTOdkh-B4AaABAg.9odSYzBhx0g9odVoYl2Qmh","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzsdaK2hU7T1oj50hx4AaABAg.9odPTKe6br29od_GoeVTju","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwdguAgtxHlwIo5gYh4AaABAg.9odMnH3wjXE9odMvmHrVUO","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzFUjmhmUBeYFpgyKJ4AaABAg.9odK6TJlQ1L9odbkkTspYc","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugw_ZmgXgBrfzAiRFwR4AaABAg.9odGzFJyXKn9odRmhuSGdP","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
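A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual implementation; the allowed vocabularies are inferred only from the values visible in the response above (the full codebook may define more), and the record IDs in the usage example are hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the values visible in the
# raw response above. This is an assumption; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "regulate", "none", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records whose
    codes fall inside the allowed vocabularies."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: skip
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second uses an out-of-vocabulary
# responsibility code ("robots") and should be rejected.
raw = '''[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "robots",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

coded = parse_coded_batch(raw)
print([r["id"] for r in coded])  # prints ['ytr_example1']
```

Dropping (or flagging) out-of-vocabulary records rather than storing them keeps the downstream tallies clean when the model occasionally invents a code.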