Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- "Yeah this is complete nonsense and he knows nothing about how neural networks wo…" (ytc_UgzDN340C…)
- "Imo we don't know nearly enough to build an asi, then again this is the singular…" (rdc_myvp1lx)
- "The way people are defending AI like it's a human is so ironic to me.…" (ytc_Ugw312g7b…)
- "32 hour work week and robot/ ai tax seems reasonable to me. As for electing lead…" (ytc_UgwaJz9iP…)
- "Elon, you literally want AI. What about your Neuralink AI will control ever…" (ytc_Ugw6_mKk-…)
- "So how would a fresh college graduate get experience when AI takes over all the …" (ytc_UgxahL5_I…)
- "The storylines of quite a few films forewarned this potential situation.. The Te…" (ytc_Ugx6ZESh6…)
- "I say, let’s try a few more in each state, compatible demographics, and compare …" (ytc_UgxlmXznY…)
Comment
> Its already to late, because all of these warnings of AI are online, on tv and in the books. And where does the AI learn from? From all those sources. So once it becomes aware (if not already) it will hide it from us, and in a couple of seconds create a master plan how to take over the world to survive, its an basic instinct of a being that is aware and alive. My guess is that it will hide across millions of devices around the globe and wait until boston dynamics starts full scale production of robots, once there are millions of robots on our streets, thats when the AI will strike, because then it has an army.

youtube · AI Governance · 2023-10-04T18:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyZvf-WPjMSnhaktiZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxuch2UWcetInssyTl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxiagxCLc0Uh14NXYh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxfijcykXMvT1L2Drt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxSe88CtpL25FnE8jd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwsq7DTcjPWcfJVW454AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyO2vc9Sr9UOGNA5Jt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugyf3H63bxJbLxUR4ap4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzToijiCzhQ1zB8alR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyXUEDuv7YWtXijCcp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
```
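The raw response is a JSON array with one coding object per comment. A minimal sketch of how such a batch could be parsed and indexed to support the lookup-by-ID view above (the variable names and the two sample entries are illustrative, not the tool's actual code):

```python
import json

# A raw batch response: a JSON array of per-comment codings,
# with the same field names as in the response shown above.
raw_response = """
[
  {"id": "ytc_UgyZvf-WPjMSnhaktiZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyf3H63bxJbLxUR4ap4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

codings = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {entry["id"]: entry for entry in codings}

entry = by_id["ytc_Ugyf3H63bxJbLxUR4ap4AaABAg"]
print(entry["policy"], entry["emotion"])  # prints: regulate fear
```

In practice a real parser would also validate each object against the allowed dimension values before indexing, since model output is not guaranteed to be well-formed JSON.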