Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.

Random samples
- "I personally think for use in a personal environment like character references f…" (`ytc_UgxR0duA1…`)
- "They just talk about how bad the self driving is. But I'm sure when you're on th…" (`ytc_Ugz87SdNR…`)
- "Thank god Elon and the folks at XAi are on the case. Dude started OpenAi and los…" (`ytc_UgyoJh7lN…`)
- "People not using ai is going to be over taken by those who do use it. So you ca…" (`ytr_UgzVWQLhT…`)
- "There should be laws against "ai" large language models from training on copyrig…" (`rdc_l9ygnkr`)
- "I feel like the intro is meant to be some huge revelation that the teenage girl …" (`ytc_UgyNT-lVY…`)
- "I think AI will be our legacy, not our destruction. It will be the children of h…" (`ytc_UgyhNGiZs…`)
- "This video is delusional. It foolishly accepts that UBI will somehow be magicall…" (`ytc_UgyD-Vujs…`)
Comment
> This isnt new.. books were written by scientists like 20-30 years ago about this very subject.
> The internet has HUGELY increased the threat, because once an AI gets ONLINE itll take over everything connected to the internet almost immediately.
> Think about how Anonymous works, they use less than 0.001% of almost everyones computer (theyre inside yours too whoevers reading this lol) they take such a small amount of processing power from everyones computer, and it adds up to a HUGE amount (and allows them to act in secrecay).
> An AI will be like this only it wont have to hide and it will be UNKILLABLE. The only way to defeat it would be to kill the internet EVERYWHERE.

Source: youtube · 2015-07-30T21:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
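The coding-result table above is a straightforward rendering of one coded record plus a timestamp. As an illustrative sketch (this helper is hypothetical, not part of the actual app), a record could be turned into that markdown table like so:

```python
from datetime import datetime, timezone

def coding_table(coding: dict) -> str:
    """Render one coding record as a markdown table like the one above.

    Hypothetical helper for illustration; dimension order follows the
    insertion order of the dict passed in.
    """
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim, value in coding.items():
        rows.append(f"| {dim.capitalize()} | {value} |")
    # Stamp the render time in UTC, mirroring the "Coded at" row above.
    rows.append(f"| Coded at | {datetime.now(timezone.utc).isoformat()} |")
    return "\n".join(rows)

print(coding_table({
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
}))
```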
Raw LLM Response
```json
[
{"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugi3e0GA4HfH8HgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugizge_QLY4xw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghHaxdOpGagangCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgieDwB_j4qUKngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjvbmAc83_c83gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugia_nrfbV5-d3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggRCZjvTN6Mg3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjRoWWlA3PONXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
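A batch response like the one above is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions. A minimal parsing-and-validation sketch follows; the sets of allowed values are an assumption inferred only from the records visible on this page, so the real codebook may include additional categories:

```python
import json

# Category values observed in the sample response above.
# ASSUMPTION: the actual codebook may define more values per dimension.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "user", "developer", "distributed"},
    "reasoning": {"consequentialist", "virtue", "deontological", "mixed"},
    "policy": {"none", "ban", "industry_self", "liability"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: coding dict},
    rejecting any record with an unknown dimension or value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

raw = ('[{"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_batch(raw)["ytc_UggLZ6M2z5JqTngCoAEC"]["emotion"])  # fear
```

Keying the result by comment ID also makes the "look up by comment ID" view above a simple dict access.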