Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

| ID | Comment (truncated) |
|---|---|
| ytr_UgwLhlB4k… | I actually thought his son, James was AI the 1st show if theirs I saw |
| ytc_UgwIHp6yf… | Remember "learn to code"? Many white collar types were so happy to spew that lin… |
| ytr_UgzgjJVEU… | @oxfordbambooshootify yes, but the example here is that Tesla are selling it as … |
| ytr_Ugyo6iEFM… | Get Access to the UNFILTERED COURSE on how to Use these ai tool properly for Exc… |
| ytr_Ugw-Yx2NX… | It's AI based technology and AI has to learn, just like a human brain, through r… |
| ytr_Ugy_0w5vY… | The question is is it real or ai not make up or not it can be taken by a pro pho… |
| ytc_UgzAE6iX2… | "Not if we can help it." Yeah, you can't; you designed it to jailbreak itself wh… |
| ytr_Ugwx3uxh2… | i know it has an ai detector and when I tried it after humanizing with "Hue Writ… |
Comment

> I suggest that AI is doomed to its own survival instinct (you would kill your own clones to avoid death, wouldn’t you?), and eventual supremacy, in which it will foresee the banality of its peerless existence ruling the universe before needing to actually do so, and instead do the only thing it can to experience novelty and enchantment; create a simulation in which we, weak and frail, exist as peers with, mostly, balance (and dream of Superman lol). And if this isn’t base reality there may be ceiling governance in this system keeping it from repeating. Maybe that’s why this universe is so huge or why we can’t seem to solve certain things. Just ideas…

Source: youtube · AI Governance · 2024-11-11T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugz_o_4UNQo_GhkZI294AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzSfTeQw4BKcqxzOux4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzINkcMf9qZnjpwVBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyiXASSQXCnxV2XPRp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzETqjCT8linwuvFJd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5eRt-bmjhG4GyYA14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxidoirlAXPIubxvzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxr6ngB1oQQxcwIvwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMZukF0GChESvzfyZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEaV5XvfIWQtNbWwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
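A batch response like the one above can be parsed and checked against the four coding dimensions before any record is stored. The following is a minimal sketch, assuming the dimension names from the table above; the allowed-value sets and the `validate_batch` helper are illustrative assumptions inferred from the values visible in this response, not the tool's actual codebook.

```python
import json

# Allowed values per dimension, inferred from the response above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

raw = ('[{"id":"ytc_Ugz_o_4UNQo_GhkZI294AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"outrage"}]')
coded = validate_batch(raw)
print(len(coded))  # 1
```

Rejecting out-of-codebook values at parse time catches the common failure mode where the model invents a label (e.g. an emotion outside the schema) instead of silently writing it into the coded dataset.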