Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "When he took her face off, it looked just like Robin Williams in Bicentennial Ma…" (ytc_Ugzpviy9a…)
- "👎 This is the worst ML "AI" explanation i ever heard, the ignorance, confusion a…" (ytc_UgxshN67C…)
- "@soo.loafly Really? I felt this version of Chat GPT considered morals/ethics/val…" (ytr_UgxNzIGPO…)
- "yeah turnitin’s stepping up, but GPTHuman AI still gets through makes the text s…" (ytc_UgwZ_KTxW…)
- "I don’t know if anyone noticed but he uses ai to generate texts too. Because his…" (ytc_Ugx8XDli7…)
- "@IvaN-cf7qt that, also in an apocalyptic scenario where humans are fighting for…" (ytr_UgyAljeJ0…)
- "wait, then what happened to "kenwood" on the mixer on the right? maybe just …" (rdc_oi140e5)
- "Absolutely, that's a crucial point! Sophia was indeed created by humans, and her…" (ytr_UgybzBbCa…)
Comment
36:20 This is actually where the original plot of The Matrix becomes super relevant. Before the Wachowskis had humans as batteries, it was actually humans as computers, which makes sense given that the human brain is at least 1000x more energy efficient than a silicon chip (20W, less than a light bulb) while frontier AI systems need exponentially more compute. The natural result is that if human well-being and compute ever come into conflict, ASIs are likely to turn us into compute. Check out Cortical Labs to see the early version of this.
youtube · AI Governance · 2025-11-13T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
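
The four dimensions in this table can be modeled as a small typed record. Below is a minimal sketch in Python; the field names come from the JSON keys in the raw response, but the enumerated value sets are only those observed in this one sample, so the real codebook may define more categories.

```python
from dataclasses import dataclass

# Value sets inferred from the raw response shown below; the actual
# codebook may include categories that do not appear in this sample.
RESPONSIBILITY = {"none", "developer", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue"}
POLICY = {"none", "regulate"}
EMOTION = {"unclear", "indifference", "resignation", "fear", "approval"}


@dataclass
class CodingResult:
    """One coded comment, mirroring the Coding Result table above."""

    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Fail fast on any value outside the observed codebook.
        assert self.responsibility in RESPONSIBILITY
        assert self.reasoning in REASONING
        assert self.policy in POLICY
        assert self.emotion in EMOTION
```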
Raw LLM Response
```json
[
{"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz10SmduLaUTyoC3e94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgztFDp9NaMemoVVsMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwpxWBRwfB3Z6UYtOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlgKEXEjSXguQBcRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsDYGt5pnHuEcwADB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRl-p9vVOUqtFNm554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmROmulnnePKne1L54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEi_7ke6mt_U6kqIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
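
The lookup-by-comment-ID view presumably indexes raw responses like the one above by their `id` field. Here is a minimal sketch of that lookup, assuming each raw response is stored as a JSON array like the one shown; the file name `raw_responses.json`, the function name, and the usage below are hypothetical, not the tool's actual implementation.

```python
import json


def index_raw_responses(path: str) -> dict[str, dict]:
    """Load one raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for exact lookup."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {rec["id"]: rec for rec in records}


# Hypothetical usage: the file name is illustrative; the ID is the
# first one from the raw response above.
if __name__ == "__main__":
    by_id = index_raw_responses("raw_responses.json")
    rec = by_id.get("ytc_Ugxq77RqxhqonCeaATB4AaABAg")
    if rec is not None:
        print(rec["responsibility"], rec["reasoning"], rec["emotion"])
```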