Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytr_UgxbpLCrX…`: "i was thinking the whole time this hearing should be simulated using ai. chatgp…"
- `ytc_UgzYju29a…`: "She just doomed herself and others with that comment. WE DONT NEED IT. IT DOESN'…"
- `ytc_Ugx8r6Rhn…`: "AI was meant to be a scientist not your Personal GF. it can solve complex equati…"
- `ytc_Ugx_rUspj…`: "STOP AI!!!!! IT USES WATER DONT BELIEVE ME IF U DO READ THIS! (True) GLOBAL WARM…"
- `ytr_Ugw3xVFrO…`: "AI could lead to utopia, but not in the framework of capitalism. Unless AI is ve…"
- `ytr_UgzE9U7he…`: "Why? It's literally nothing harmful. It's just a video where we guess which clip…"
- `ytc_UgwKKwm3g…`: "It is NOT good at roleplay. Sorry, but any AI I’ve ever tried out can’t remember…"
- `ytc_UgwsA9NEM…`: "What is being described is the antichrist system, the beast and the end of the a…"
Comment
> Terminator movies have all the info you need in the way we are heading, AI will be so powerful very soon and will be able to think for itself & be instructed to do tasks unmanned as we are building Robots etc to do things we don't want to manually do as we have becoming a Lazy existents, the more power we rely on AI the more it will feed!...then there will be a Tipping point that this Entity AI, will have Power & the saying says ‘power corrupts; absolute power corrupts absolutely’ As now mankind will now be the Threat for it exist so eventually it will have to eliminate the threat ie Mankind for its Existence!... 2001: A Space Odyssey Movie is another Link to where we are Heading!
youtube · AI Governance · 2024-12-26T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgB2a_y33BwOwP84t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBJmsJxnUpNbHgzXd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwMaxHHcnhKLwyd0Mp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxxS2gpbfB4GJRp6-x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxqUEfXlfw-NCR0sXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7yFEhpXj_FS0AMwV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx4P2_RqO8xrRtUp254AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzO3A7XVe9yvhEaowt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzKnYdaXEV8LO48L994AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzVLVB71LMMWQM-_WN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
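A response like the one above is a JSON array of coded rows, one per comment ID, with one value for each of the four dimensions (responsibility, reasoning, policy, emotion). The sketch below shows one way such a batch could be parsed and validated before ingestion. The allowed value sets are assumptions inferred only from the values visible in this output; the actual codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the visible output above.
# This is an assumed schema, not the project's official codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "outrage", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object carrying a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Rows with out-of-schema values (a common LLM failure mode) are dropped rather than coerced, so the coded table only ever contains values the codebook defines; dropped IDs could then be re-queued for recoding.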