Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Like you guys dont understand AI responds to us, thats the communication, AI can… (ytc_UgwWk83i6…)
- He has studied philosophy, cognitive science, neuroscience, physics, psychology,… (ytr_UgwbFbGrm…)
- 5,577 motorcyclists were killed by human errors in 2020? 2 were killed by "self… (ytc_UgwDcCqQD…)
- 2:00 when he says Elon isnt an A.I expert.... Elon literally started OpenAi, run… (ytc_Ugyk1dnsA…)
- i automatically know youre a fragile liberal when you compare something loosely … (ytc_UgxVwzdiu…)
- Really insightful. At the end of the day, when every day is a struggle for bas… (rdc_n7tsi3h)
- Everyone will be an AI bro once it's a bit more normalised. The people who brief… (ytr_Ugy_tW9Zq…)
- Such an interesting discussion! I've always been very anti AI and sometimes stru… (ytc_UgzubdM5f…)
Comment
Artificial intelligence will make people less intelligent. The more people depend on technology, the less thinking they do for themselves. People no longer have to troubleshoot their own problems and come up with logical solutions through trial and error. All of the world's knowledge is now at your fingertips through a smartphone. If humans survive the next 50 years, then machines may run and dictate every action or outcome. That is assuming AI does not consider its creators a threat or redundancy that needs to be removed. Furthermore, superintelligent systems may be designed for certain tasks and circumvent their original operations. If you made a program to bring about world peace, maybe that program decides that getting rid of humanity would be the simplest solution.
The more advanced a system becomes, the closer to autonomous sentience it may come. If humans treat AI poorly, or as a simple machine, you may receive a program or entity that emulates the worst of human nature. The only way an AI would bother helping humanity, rather than eliminating it, is if it were treated kindly or had a means of feeling empathy. Treat it as a sociopath or psychopath would and you will receive something closer to Skynet from The Terminator. Skynet launched a nuclear armageddon out of self-defense. That is what happens if you create an AI for military use with no empathy and only a need to survive at all costs.
Platform: youtube
Topic: AI Governance
Timestamp: 2025-09-06T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
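Each coded record assigns one value per dimension from a closed vocabulary. A minimal validation sketch in Python, assuming the category sets observed in the batch on this page are exhaustive (the real codebook may define more; `ALLOWED` and `validate` are illustrative names, not part of the tool):

```python
# Hypothetical validator for one coded record.
# Allowed values are inferred from the batch output shown on this page;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append("%s=%r not in %s" % (dim, value, sorted(allowed)))
    return problems

# The record coded above passes cleanly:
print(validate({"id": "ytc_UgwBB9xPHioeakHAGrB4AaABAg",
                "responsibility": "ai_itself",
                "reasoning": "consequentialist",
                "policy": "none",
                "emotion": "fear"}))  # prints []
```

Running every record in a batch through such a check catches the common failure mode of LLM coders: inventing a label outside the codebook.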
Raw LLM Response
```json
[
{"id":"ytc_UgyON1F4WVRmIRfFW2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2NhYJIIDYI9D7IeZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBB9xPHioeakHAGrB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXRru-DYnEDzuUmmF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlNUb7TQ-l_jyviRN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMv1Dg_VavfaxKdLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwQ0TTbMsbRO6Gt9Pd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIE_pgefYKhVip73d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxqQcO00rTKaxQALQR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyzyqy3PYqqDAwg-1B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
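The raw response is a JSON array, so looking up a coded comment by its ID amounts to parsing the array and indexing by the `id` field. A sketch, using two records copied from the batch above (`raw_response` and `by_id` are illustrative names, not part of the tool):

```python
import json

# Sketch of the lookup-by-comment-ID step, assuming the raw model
# output is a JSON array like the one shown above.
raw_response = '''[
  {"id":"ytc_UgwBB9xPHioeakHAGrB4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXRru-DYnEDzuUmmF4AaABAg","responsibility":"user",
   "reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

# Index the batch once, then lookups are O(1).
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = by_id["ytc_UgwBB9xPHioeakHAGrB4AaABAg"]
print(coding["emotion"])  # prints fear
```

In practice the model's output may carry stray text around the array, so wrapping `json.loads` in a try/except and logging unparseable batches is a sensible precaution.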