Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI is inherently sociopathic, it has no feelings or empathy, if it concluded we …" (ytc_Ugy_uOoS_…)
- "Peter F Hamilton wrote books that contain an AI. In that series of books the AI …" (ytc_UgyfSQWnZ…)
- "@kaylameisner987 the creator did remove the video tho, but some of her deepfake …" (ytr_Ugxh0jVoi…)
- "You're blown away that a chatbot is accessible easily but not that a literal gun…" (ytr_UgwvW1oub…)
- "Okay.. if 99% people loose their jobs, who will buy the Products that these bill…" (ytc_UgwP1Tj9j…)
- "Nah i dont get why people hate AI so much, it's like back then when people hated…" (ytc_UgxNw0Uww…)
- "Regulation only works for people that follow the rules, that goes for us researc…" (ytc_UgzQTSSJx…)
- "As with everything, seemingly, around AI, the 'threat' of AI seems to disolve on…" (rdc_lubgy7c)
Comment

> In countries race with one another to create the most advanced AI. One country is going to mess up big time and create an AI so advanced that country is going to lose control of it and the AI is going to create a catastrophic event. That will be humanity's wake up call that we must all work together and the top priority must be to tame AI and keep it tame. Humanity has always enjoyed getting into the car accident first before strapping on the seat belt.

youtube · AI Governance · 2025-06-17T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUP118DsDMQIKk0VV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzF_kEPxi2jQJBD7Ql4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVPCC7TA6p9OaGa6l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxGuheFI7Ld0QvIUg54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1pDb1aHGQywwgYSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhS-KVJUFxpQ-xX554AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy96S9jU9VDUuhPMgl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzpYTAD57y547QxFnt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOGYnQ4abRC7q5ru14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz5qJuYEOWyh9PyTOt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
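The lookup-by-comment-ID behavior above can be sketched in a few lines of Python, assuming each raw model response is a JSON array of coding objects like the one shown (the `index_by_id` helper name is illustrative, not part of the tool):

```python
import json

# A raw batch response: one JSON array, one object per coded comment,
# with the four coding dimensions seen in the table above.
raw = """
[
  {"id": "ytc_UgzF_kEPxi2jQJBD7Ql4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyUP118DsDMQIKk0VV4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse one batch response and key each coding result by comment id."""
    rows = json.loads(raw_response)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw)
result = codings["ytc_UgzF_kEPxi2jQJBD7Ql4AaABAg"]
# result["responsibility"] == "government", result["emotion"] == "fear"
```

Keying on `id` is what makes the "Look up by comment ID" view cheap: one parse per batch, then constant-time lookup per comment.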