Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Bro Ai is so racist that they have to put black human kids instead of kittens…" (ytc_UgyfQaeLO…)
- "-Men being generally more driven, resourceful, productive of results when pursui…" (ytc_Ugyo3vSY9…)
- "Can we take ai to space to get the work and tasks done that human are unable to …" (ytc_Ugz0xqw30…)
- "I love having access to AI art. About 80 percent of the time I can have an ima…" (ytc_UgzunTfSs…)
- "you got him debate wise with the conscious ai scenario. but gpt ended up taking …" (ytc_Ugy6ubXHy…)
- "I think I would rather stay solo than have this be an option as a life partner….…" (ytc_UgzwLHWmh…)
- "Peter Theil funded Jeffrey Epstein pedo rings and JD Vance VP nom let THAT sink …" (ytc_UgwS2kcBU…)
- "How is this allowed? How in the world does taking an entire industry that employ…" (ytc_UgzjDkNd5…)
Comment
I like the take on this video, but I think you missed an elephant in the room which is human nature and geopolitical interests. Even if we reach AGI we don’t know which country/power will own it first. Also, AGI has to learn how to operate in physical world and fix/feed itself. What about the military powers and thousands of nukes that we own? Will that also be in AI hands?
Even if we reach a point where mainly everything is free/cheap do to AI labor, I don’t think that alone will heal us humans from warmongering. Most of scientific/technological advancements in human history were firstly discovered/adopted by military powers.
Source: youtube
Posted: 2024-12-25T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx9Qba2jIsjS53ko5N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz_ymExDkM6ldx12IJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwPa7u0z7plkOyoO914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyi36BJsah9wPeNmIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgypIKTBbhRW3o-PG2R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzjWIfeRMBXLZi7KMV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-JF_zkcP7hzhUpKd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwia7_tP5RXjKX1LEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxd7O8S-QEwqu33pDR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyxQTLOBW2dIrb_6b54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
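The raw response is a plain JSON array, one object per comment, so looking up a single comment's coding by ID is just a matter of parsing and indexing. A minimal sketch (the `raw` string reproduces two entries from the response above; field names follow the coding schema shown in the table):

```python
import json

# Raw model output: a JSON array of per-comment codings
# (two entries taken from the response above).
raw = """[
  {"id": "ytc_Ugx9Qba2jIsjS53ko5N4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz_ymExDkM6ldx12IJ4AaABAg", "responsibility": "government",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugz_ymExDkM6ldx12IJ4AaABAg"]
print(coding["emotion"])  # fear
```

This is the same lookup the "Look up by comment ID" control performs: the displayed Coding Result table is just the matching array entry rendered dimension by dimension.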