Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:

- They can strike all they want, I don't care. I think I trust AI more after what… (rdc_ju5wnnl)
- @blueskiesmudpies1061 _"can't answer any question for which it does not already … (ytr_Ugz2lYTZB…)
- @AndrejMejac made a comment saying "Learn about using AI ASAP. Make a company… (ytc_UgwGdhlzi…)
- good lord this is so painful to watch. i feel so bad for the AI… (ytc_Ugx_GaefS…)
- Why the fuck even *have* a constitution? Fuck it. It's like any new fangled tech… (rdc_dw3egn6)
- 39:51 Echo, it is where the call is answered, this is a question of 2015 🤔🧩 what… (ytc_UgzGJ2rzd…)
- Sam Altman collaborated with Mossad to take out Suchir, the whistleblower who wa… (ytc_Ugy9NZvt7…)
- I'm though because it sounded a lot like flu fluoride, if you're curious just wr… (ytc_UgzKACa5t…)
Comment

> If AI were created by man,then that gave us the ability to play God,now with that being said what if AI got tapped into Crispr technology and began tweaking humans,that's especially scary when you think they could literally destroy the human race at a molecular level without raising as much as a red flag cause they could cancel our said red flag and so the beginning of end is already upon us

youtube · AI Governance · 2024-07-29T05:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
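Each coded record fills the four dimensions above from a closed set of values. A minimal validation sketch in Python, assuming the value sets observed in the coding results on this page (the full codebook may contain additional values):

```python
# Hypothetical per-dimension value sets, inferred from the observed codings;
# they may be incomplete relative to the real codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    errors = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding shown above for this comment passes validation.
coded = {"id": "ytc_UgzvOJoIEqb93fHBAAN4AaABAg",
         "responsibility": "ai_itself", "reasoning": "consequentialist",
         "policy": "ban", "emotion": "fear"}
print(validate(coded))  # → []
```

Rejecting out-of-codebook values at ingest time keeps a stray or hallucinated label from silently entering the coded dataset.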
Raw LLM Response
```json
[
{"id":"ytc_UgzEvQtx93GfEcyolkt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzvOJoIEqb93fHBAAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz7q7kbiKnvinQjJh94AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzVm6vB4BRDb-nWrRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJCXOb6gZSW4aRNX94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzqw2NnTB80k_PwvrV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxjYFtE2RVfdT_fjJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDeSbnWBXN54EXzrl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxGKvkG5ZUity6BSZt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnIBw0URM2RXPfkDR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
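Because the model returns one JSON array per batch, resolving a comment ID to its coding amounts to parsing the array and indexing it by `id`. A sketch (the tool's actual lookup code may differ; the two records here are taken from the batch above):

```python
import json

# A truncated copy of the raw batch response shown above, used for illustration.
raw_response = '''[
  {"id":"ytc_UgzEvQtx93GfEcyolkt4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzvOJoIEqb93fHBAAN4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Index the batch by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = by_id["ytc_UgzvOJoIEqb93fHBAAN4AaABAg"]
print(record["policy"], record["emotion"])  # → ban fear
```

Storing the batch keyed by ID also makes it cheap to verify that every submitted comment ID came back exactly once in the model's response.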