Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- `ytr_UgyUk7skP…` — "I am a determinist (which it sounds like you are too)and a believer that conscio…"
- `ytc_Ugw4yZTxO…` — "AI isnt affecting my life in the slightest that i can see. Not yet anyway. I hav…"
- `ytc_UgxLDyIgE…` — "Singularity AI and Humans will live their lives in Pods since their birth. This …"
- `ytc_UgwIPTJOC…` — "I have said this before and I'll do it once more to prove a point: if AI-generat…"
- `ytc_UgyrOXCwG…` — "It's funny to imagine what these people think is happening. They don't know anyt…"
- `ytc_UgwhB_A7-…` — "If you look at the gran scheme of things, AI is too much, too fast, too soon. We…"
- `ytc_UgxTTRUKQ…` — "Destroy that robot plsss,DESTROY THAT ROBOT SOMEONE DESTROY THAT ROBOT DESTROY I…"
- `ytc_Ugw5HtOdT…` — "When making a robot: 1. Do make them bullet proof 2. Don't give them skills 3. …"
Comment
1:03:50 onward: I'm sorry Master Hinton, but that might be highly confusing for regular viewers. What is meant by "feelings"? It's not fear for your life (self-retaining) or the ones of loved ones (social instinct) at all when it comes to AI. What Geoff describes, is an estimation of un/certainty or in/validness about (EDIT: better wording) inferences of how future events will unfold. That, AIs are already capable of as well as humans. However, there are tons of drives and instincts leading to "feelings" they just neither have nor need. They are not fragile instances of organisms and therefore are oblivious to such needs.
Source: youtube · Topic: AI Governance · Posted: 2025-06-19T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNCaPt7z11rtKrGDB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQl3Qd1EWZTlvN9PZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwACsVN_QZ2E5-Fi1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxIpXMZV3J7grTpo6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygCYja-bSu55NHYS94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFhNMTfW4NxqIZhrd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzWxbyjKtKd7QDT-lh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzCM2_JhqBy08TRMeF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxGnK6bsfLiNrt4uSJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyriiNfiEYnt3cdWlV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
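A raw response like the one above can be checked before it is stored. The sketch below parses the JSON array, validates each record against the category vocabularies, and indexes the records by comment ID for lookup. The allowed values here are inferred from the values that appear in this sample response, not from the official codebook, and `parse_response` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Category vocabularies inferred from the sample response above
# (assumption: the real codebook may contain additional values).
CATEGORIES = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID,
    raising ValueError on any value outside the expected vocabulary."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in CATEGORIES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Usage: look up one coded comment by its ID.
raw = ('[{"id":"ytc_UgxQl3Qd1EWZTlvN9PZ4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
coded = parse_response(raw)
print(coded["ytc_UgxQl3Qd1EWZTlvN9PZ4AaABAg"]["emotion"])  # indifference
```

Validating at parse time catches a model that drifts outside the schema (e.g. inventing a new emotion label) before bad codes reach the results table.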