Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The biggest fear I have is that the AI system of a select few companies could take over the entire AI industry. Just like how Apple and Samsung dominate the mobile phone industry.
In that scenario in 10 years 1 or 2 AI systems could control huge swathes of society. That is dangerous.
I feel like a few ground rules need to be established with AI on a global scale, I mean something akin to the Geneva convention.
The number one thing is that AI can never be in control of replicating or spreading itself into other devices. Another one could be a universal kill switch for any one AI system which would destroy all AIs of the same type across all devices.
youtube · AI Governance · 2023-05-02T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
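A coded record like the one above can be sanity-checked against the code book before it is stored. The sketch below is a minimal validator; the allowed value sets are assumptions inferred only from the values visible on this page (the actual code book may define more categories).

```python
# Hypothetical sketch: validate one coded record against the categorical
# values observed in the table and raw responses on this page.
# These ALLOWED sets are assumptions, not the official code book.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(record):
    """Return (dimension, bad_value) pairs for any out-of-range codes."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

record = {"responsibility": "company", "reasoning": "contractualist",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # [] -> record is valid
```

An empty list means every dimension carries a known code; a non-empty list pinpoints which dimension drifted outside the expected vocabulary.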
Raw LLM Response
[
{"id":"ytc_UgzxA5z8K4xN0D3guwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_c_wvOn-FDrLPHKp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwDIKHwwq54eSEYZsl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwACK_4gNoCthCGzwl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxrRpKw3RggZysiygl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyImSY3LcYKoHYpW7B4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxSwUhrHG1Y_gF3_s54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz3Uwuxiua5s2829mR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMSMd3rCTrh4-ui3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPyjS5Er9lWoQyXxZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
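The raw response above is a JSON array keyed by comment ID, so inspecting the model output for one coded comment amounts to indexing that array. A minimal sketch (the schema follows the response shown; the `lookup` helper is illustrative, not part of the tool):

```python
import json

# Hypothetical sketch: index a raw LLM coding response by comment ID.
# Two records are copied from the raw response shown above.
raw_response = '''
[
  {"id": "ytc_UgzxA5z8K4xN0D3guwJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyImSY3LcYKoHYpW7B4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
'''

# Build an ID -> record map for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if absent."""
    return codings.get(comment_id)

print(lookup("ytc_UgyImSY3LcYKoHYpW7B4AaABAg")["emotion"])  # fear
```

The dict comprehension assumes comment IDs are unique within one response; duplicate IDs would silently keep only the last record.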