Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.

Random samples
- "ofc, it wants to give you the answer your making its goal to be . it will use al…" (ytc_UgwYFUHR-…)
- "There should be Agencies to control AI! This man has a point we need to listen …" (ytc_UgwslrGnJ…)
- "Artificial consciousness is definitely a good thought experiment, but at the sam…" (ytc_Ugwd0UIta…)
- "Asimov's laws of robotics weren't meant to be examples to follow. His books are …" (ytr_UgwqtNc37…)
- "A 50% efficiency gain is unlikely to happen all at once. It will probably happen…" (ytc_UgyZ_tpXB…)
- "Well it's not that glaring of a physical disability, I do have AUDHD, And despit…" (ytc_UgwLwKWId…)
- "The current Administration in the United States with totally protect and support…" (ytc_UgxVE6Sw_…)
- "Woman: Aye robot follow tyrese to the oak apartments and if he with that hoe aga…" (ytc_UgxoHtJlC…)
Comment
The leap that I think is getting glazed over in a lot of these conversations about AI is quite literally the means of production. In order for AI to physically do all of these things, it has to have a way of physically building its will into the world. For instance at 53:40, Geoffrey says a superintelligence would need people for awhile "to run the power stations until it could design better analog machines to run the power stations". Who/what is building these machines? Where is AI superintelligence suddenly getting this hyperscale manufacturing capability from? Who/what is extracting the raw materials? How are all of these global supply chains being controlled and directed by AI? Am I missing something, or is superintelligence just going to seize the means of production at some point on its march to global domination?
youtube · AI Governance · 2025-07-24T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzJUxU81Ja-D9w3lEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwo0RfXoZYbps5Auf54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgycrlH5rMEKyr7_IvV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxGogALNp2izdZ6xPt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzZhqUKCW_apmNKIbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwd6RHoDBDL9Tab78d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxouKanHmA-idQ6HGF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZumveRvHH7v_fUsl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxa1zLLy9hdAleLOD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLgsxWLgun65udiwN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
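The look-up-by-comment-ID step above can be sketched in a few lines: the raw model response is a JSON array of records keyed by comment ID, so parsing it and indexing by `id` recovers the per-comment codes shown in the table. This is a minimal illustration, not the tool's actual implementation; `index_by_comment_id` and the two-record `raw_response` sample are hypothetical, though the record fields match the response format shown above.

```python
import json

# A raw batch response in the format shown above: a JSON array of records,
# one per comment, keyed by the platform comment ID. (Shortened sample.)
raw_response = """
[
  {"id": "ytc_UgzJUxU81Ja-D9w3lEB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxLgsxWLgun65udiwN4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_comment_id(raw_response)
record = codes["ytc_UgxLgsxWLgun65udiwN4AaABAg"]
print(record["policy"])   # -> regulate
print(record["emotion"])  # -> outrage
```

A real pipeline would also validate each record against the allowed values for every dimension (e.g. rejecting an `emotion` outside the codebook) before writing the codes to storage.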