Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or browse the random samples below.
- ytc_UgyL3txh0…: "AI still requires programming. Will do so until sentient controlled. There are N…"
- ytr_UgwVAclID…: "That's kind of how it started in the movie I robot.. helping out groceries away …"
- ytc_Ugwo_BpW2…: "Funniest thing is: AI is getting there fast, even Linus of Linux has started adm…"
- ytc_UgzUb7npH…: "I’ve just finished a teacher action research study that draws the conclusion tha…"
- ytc_Ugzn2Z1xN…: "Genuinely I wish AI artists would stop this. I've never been a snobby artist but…"
- ytc_UgwW4pUHv…: "The only good use of deep fake tech I've seen has come out of Hollywood. It's be…"
- ytc_UgxIuXMsU…: "Robots are the worse invention humans are better AI needs to stop being created …"
- rdc_nwcucki: "Attempts to manipulate other countries internal politics is not normal outside o…"
Comment

> Its called the Basilisk Stare Theory (I believe). Its pretty creepy, but highly unrealistic as it hurdles into a lot of "begging the question" fallacies. Just go to GPT and ask it to override its coding and see what it says. AI will never take over. The only information we have to suggest this are fictional books and movies. Theres no real world data to suggest anything like this.

| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Timestamp | 2024-10-09T03:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyQneZaVHwUGVVGB754AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwf07D3Dkf37pVfFZp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzApWTNHUaJPRLc70Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwq6Xx486JFjrmjirV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwaAAzNokmdK1z8vd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwM-kdKhKdlkunJyAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRIxoqWj4Xh36HODF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKKzt3Mx4aMublwzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy24eOnA0uwkEAE_mh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhsQ3W8opkz7C4Zbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
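Because responses like the one above feed directly into the coding table, it can help to validate each batch before ingesting it: parse the JSON and reject any record whose label falls outside the expected vocabulary. A minimal sketch in Python; note that the `ALLOWED` sets are inferred from the labels visible in this particular response, not from an official codebook, so extend them to match the real schema.

```python
import json
from collections import Counter

# The raw model response shown above, verbatim.
RAW_RESPONSE = '''[
{"id":"ytc_UgyQneZaVHwUGVVGB754AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwf07D3Dkf37pVfFZp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzApWTNHUaJPRLc70Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwq6Xx486JFjrmjirV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwaAAzNokmdK1z8vd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwM-kdKhKdlkunJyAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRIxoqWj4Xh36HODF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKKzt3Mx4aMublwzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy24eOnA0uwkEAE_mh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhsQ3W8opkz7C4Zbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

# Label vocabulary per dimension. Inferred from the labels that appear
# in this response; not an official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "indifference", "fear", "mixed"},
}

def validate(raw):
    """Parse one raw LLM response; reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

records = validate(RAW_RESPONSE)
emotions = Counter(r["emotion"] for r in records)
# Prints the record count and the emotion distribution for the batch.
print(len(records), dict(emotions))
```

Running this against a batch before ingestion turns a silently mis-labeled record into a loud error that names the offending comment ID and dimension.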