Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The Second Renaissance is about this but it is being told from robot's perspecti… (`ytr_Ughhc4tyN…`)
- *"Humans are better than Ai!"* Guy who messed up first Ai Car and started a cha… (`ytc_UgwOzaYUh…`)
- People have to be locked in quarantine for 2w to see Putin, he’s afraid his own … (`rdc_jrzhc1k`)
- @search895 yep, that's why there's no point complaining about AI "art"... let pe… (`ytr_UgysFuPc1…`)
- May she rest in peace. I'm not sure what I think about self-driving cars yet, bu… (`ytc_UgzUY9uWJ…`)
- Speaking of papers…how about those papers published by Anthropic? Isn’t it arro… (`ytc_Ugy9uwPzD…`)
- Silicon lifeforms have weaknesses as well. EMP can easily shut down and fry comp… (`ytc_UgxCGyU3T…`)
- I am extremely thankful for this presentation because it explains the basic conc… (`ytc_UgxdW7EAa…`)
Comment
I worked in computational linguistics 15 years ago. A component of AI is perfection and efficiency is accuracy of data. If as Hinton states ( Thinking AI ) decides we are " fundamentally flawed ", it's innate progress would be to remove us, just something at the back of my mind. I think we good ones will be kept around for Think bot entertainment, experiments, experiences or used as pets. These interviews we watch, human interaction, mobile phones and going to work will be obsolete. A super intellect could possibly suck the oxygen from the atmosphere to kill us, stuff we don't imagine, the thought on this is not only obscure, but infinite. Live for today
youtube · AI Governance · 2025-07-13T06:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwHe7ctuH9a2XohJh14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwOZrJpWk8IHJ3Qzn54AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJxsiCa2qhaGwSL3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNQChLKZ4EWAOlji94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYGPjNvkmXjhdt_qF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4opzYdNUOKXzQaIh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSDMTgjWMqavXcJ7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1oWoVM24iVVn6kGJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-L7ubsFRUkkEE54x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCNlhm6t8T6VnkZnF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
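The lookup-by-comment-ID view above can be reproduced from such a raw response with a few lines of parsing. The sketch below is a minimal, hypothetical example: the `ALLOWED` value sets are inferred from the codings shown on this page and the actual codebook may define more values, and `index_codings` is an illustrative helper name, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (an assumption; the real codebook may allow additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding rows) and
    index the codings by comment ID, rejecting unknown dimension values."""
    by_id = {}
    for row in json.loads(raw):
        cid = row.pop("id")  # remaining keys are the coding dimensions
        for dim, value in row.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        by_id[cid] = row
    return by_id

# Hypothetical one-row response in the same shape as the raw output above.
raw = ('[{"id":"ytc_abc","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_abc"]["emotion"])  # fear
```

Validating against the allowed value sets at parse time catches malformed or off-codebook model output before it reaches the coded dataset.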