Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I’m waiting for someone building a 3d printed mini robot with chat gpt 4 built i…
ytc_Ugw4BmnKh…
I can understand why that might be unsettling! Sophia's ability to learn and ada…
ytr_Ugw-tl9Dn…
Holy shit! Showed my son this interview and he has been talking to AI…Wtf am I s…
ytc_UgyrxN2j4…
We want the Jetsons. AI and robots assist, not take over. And I don't recall the…
ytc_UgwdRr3B2…
You don't have to worry about "AI" taking over or our anything.
Tech giants are …
ytc_Ugwx21RIB…
I can't care that much about using AI for explicit images or video, most people …
ytc_UgwFwLZ61…
This robot will be enemy of human. And i guess this robot should be not exist.
F…
ytc_UgyxbbEP1…
Very interesting conversation; I would give my children the career tip, sch…
ytc_Ugz7t0Vdj…
Comment
The idea that AI may someday outsmart us, become independent, and make us disposable is contained in many sci-fi novels written decades ago: complete, self-sufficient societies of robots. It's also an old pattern that aging scientists become anxious about dangerous applications of their own work; see Oppenheimer.
youtube
AI Governance
2025-06-21T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyTe_CJhzxfI3nRJyB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwcLXypBWVuXEx6PFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5Q8em39kIvOSC9O94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxi8J0f4c-S2AqQnEt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugygxt27of-MXlob-Zd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw4EYwti9HgFLKOYxp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzjKYHZYQqNBeTtPEh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwkwa34_HhbReF6DUB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUdABSqUiPqFP7WUN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYOKGFWQ7_O-v1jfR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
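The raw response above is a JSON array with one coded record per comment ID. A minimal sketch of parsing and sanity-checking such a response before storing it, assuming a codebook inferred from the values visible here (the real allowed-value sets may differ):

```python
import json

# Allowed codes per dimension, inferred from the raw response above
# (hypothetical; the actual codebook may define more values).
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "none", "unclear",
                       "company", "developer", "government", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"mixed", "fear", "resignation", "indifference", "outrage"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM coding response; return records keyed by comment ID.

    Raises ValueError on a missing ID, a missing dimension, or an
    out-of-vocabulary code.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a one-record response in the same shape as the dump above
# (the ID is a placeholder, not a real comment ID):
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(validate_response(raw)["ytc_EXAMPLE"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a code outside the schema, so bad records fail loudly instead of silently entering the table above as "unclear".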