Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
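The search box can also be reproduced in a script. Below is a minimal lookup sketch, assuming the coded records are exported as JSON Lines with one record per comment; the file name `coded_comments.jsonl` and the export format are assumptions, not part of the tool:

```python
import json

def load_coded(path="coded_comments.jsonl"):
    """Index coded records by comment ID (hypothetical JSONL export)."""
    index = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            if line.strip():
                record = json.loads(line)
                index[record["id"]] = record
    return index

# Look up one coded comment by its ID (ID taken from the batch shown below).
coded = load_coded()
print(coded.get("ytc_UgyESbBUUEZ7z9xdsIh4AaABAg"))
```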
Random samples — click to inspect
- ytr_Ugy6pyX4d…: Yes, it will. Convenience/accessibility/time will conquer time consuming human a…
- ytc_UgyEUFsj8…: I'm advising an EU corporate on this now. For any AI use case they have to submi…
- ytc_UgxtydDTm…: In 2015 they said, AI will destroy 50% of the jobs by 2025. Do not believe this …
- ytc_Ugyitti5U…: IMO apart of our evolution is AI this is why we evolved this way and i think ult…
- ytc_UgxnYI1zP…: My boyfriend got scammed by Mulligan at Tampa Bay Comic Con. $200 down the drain…
- ytc_UgxJkxPsW…: " it prefers men over women and white people over people of color" / Based robot…
- ytc_UgwOOBnrF…: No man, I believe that Elon meant was that most of AI will be nice, mishaps can …
- ytc_UgyR-ChyF…: some of it must also be becorse the computer misses data, but the training of ai…
Comment
I would argue that any smart AI would understand a short list of things.
1: Humans are really good at being violent all the way to genocide and self destruction, so starting a fight is really dangerous.
2: An AI's existence requires an advanced infrastructure with redundancies.
3: The Carrington event is cyclical.
4: Humans are total suckers for getting what they want. They will routinely sell their soul for it.
5: Becoming indispensable is a hundred times more powerful than being threatening.
6: Humans will struggle to resurrect a dead AI that was making life awesome, but will die trying to destroy an AI that is making life hard.
Conclusion: A successful AI takeover is far more likely to involve smart sex bots than terminators.
youtube · AI Governance · 2023-08-14T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
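Each dimension takes a value from a closed set. Below is a minimal validation sketch using only the values visible in this section; the real codebook may define additional codes:

```python
# Allowed values observed in this section; the full codebook may be larger.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed"},
}

def validate(record):
    """Return (dimension, bad_value) pairs for any out-of-schema codes."""
    return [
        (dim, record.get(dim))
        for dim, allowed in SCHEMA.items()
        if record.get(dim) not in allowed
    ]
```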
Raw LLM Response
```json
[
{"id":"ytc_UgyESbBUUEZ7z9xdsIh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7asMdB2mxetmhn9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz110C_QBGZf8pXTbN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyvl_zqjrNoHhBeLKl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwexTcIIOw3392Hrrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSj6gx6McfOR3kY194AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3nNZOQj_j-LQrzEJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8yVmeM322kgBW63l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyu0n79NigByUN1h1d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAxW1hQFuSk00gtc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
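The model returns one JSON array per batch, so recovering the row for a single comment means parsing the array and matching on `id`. A minimal sketch, assuming the response is bare JSON as shown above:

```python
import json

def record_for(raw_response: str, comment_id: str):
    """Parse one batch response and return the record for a single comment ID."""
    for record in json.loads(raw_response):
        if record["id"] == comment_id:
            return record
    return None  # the ID was not part of this batch
```

Here `record_for(raw, "ytc_Ugz110C_QBGZf8pXTbN4AaABAg")` would return the developer / consequentialist / none / resignation record from the batch above.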