Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI attaining superior general intelligence.
OK supposing it does, our general intelligence is largely centered on meeting our biological needs (food, housing, social and intimate connection with others...) What are an AI's needs? It doesn't need to eat, poop or reproduce. It doesn't need to protect its young, because it doesn't have any.
It could be programmed to replicate emotions, but AI doesn't need to feel love, fear, anger, jealousy.... those are all unavoidable biological conditions of being alive. AI is not alive, it's programmable; even if it does its own programming, why would it choose to feel fear, hate, love...
What is the function of "intelligence" if there are no needs to fulfill? (other than being plugged in)
Intelligence to do what? AI doesn't need to do anything.
youtube · Viral AI Reaction · 2025-12-01T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
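The table above is one coded record rendered per dimension. A minimal sketch of how such a table could be produced from a single record of the batch response is below; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown on this page, while the function name and signature are assumptions for illustration.

```python
def record_to_table(record: dict, coded_at: str) -> str:
    """Render one coded record as a two-column markdown table.

    `record` is assumed to be one element of the raw LLM batch
    response; `coded_at` is the timestamp recorded by the pipeline.
    """
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)
```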
Raw LLM Response
```json
[
{"id":"ytc_UgwPQIKkc56C4R8Dj194AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz24TNp5gMX2COa1JZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuB1exhKde1sAtakd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwp2kISgMrqndD6e2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPF79qTvujtBLUwL54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymmLiD-qg7daVOpBN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyTkpJG3rXpSVMXjuJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWtdOj9gNwYbCXIk54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy5TsUeDgs6Wfb3nat4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw_VeuQxTzo6dX_zM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```