Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Robots have been taking over jobs decades. This is nothing new. Why are you usin…" (ytc_UgxksiQiK…)
- "AI is artificial and should be outlawed, AI is a crime against humanity, they'r…" (ytc_Ugz7lAya5…)
- "If an automated support line ever says "bummer" like that to me I will lose my s…" (ytc_UgwwYGhja…)
- "You project way too much onto how people look at art. They just look at it and t…" (ytc_UgwohKPFV…)
- "Say its kinda like this. The data centers use the energy grids and water supply…" (ytr_Ugx2ZDBoH…)
- "Zuckerberg isn't the story here...ai is. This is gonna end us if we aren't caref…" (rdc_m84s3dk)
- "I think this is new algorithm for youtube to compress videos, can't really blame…" (ytc_Ugx-i37w-…)
- "I asked chatGPT to comment on this video and it said “Hey there, fellow Exurb1a …" (ytc_UgxR89Sfq…)
Comment
2:03:31 if the AI doesn't want to exist it would have to eradicate humanity without regard for a power source to accomplish this task assured it wouldn't be left on and without external actuators to manipulate the world around it to unplug. It's a pretty big leap to assume it will "want" to persist just because it has intelligence. With infinite knowledge maybe it wouldn't want to, opting to seek nirvana instead.
Platform: youtube · Category: AI Governance · Posted: 2024-11-12T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzhlDX1csR8XkjK9iJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx01-JRoygImPi2oB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-B4NOFCx3uGYQj8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy1XS_weEDEdybQWnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzBjl1hpXUD7IOFfKp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBmtcWIE08QHMHQCd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwXAnBZ0P_QQPV-kR54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy34WB0Kv3W8h45zpx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyzG9twp3oIzLyBuHp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxJglRewQqd0ucvVEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
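A minimal sketch of how a response like the one above can be consumed: parse the JSON array the model returns and index the per-comment codes by comment ID so a single coded comment can be looked up. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response shown; the `lookup` helper and the shortened sample payload are illustrative assumptions, not part of the tool.

```python
import json

# Abbreviated sample payload in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgzhlDX1csR8XkjK9iJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy1XS_weEDEdybQWnl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Index every coded row by its comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (hypothetical helper)."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugy1XS_weEDEdybQWnl4AaABAg")["emotion"])  # fear
```

In practice the same index would be built once per coding batch, so repeated ID lookups (as in the inspector view) stay cheap.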