Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
i love how the add in the last part of the vid is from ai…
ytc_UgxVTwJed…
Artificial intelligence and algorithms/statistics are increasingly used by compu…
ytc_UgwDgA3Sd…
Winston AI's been pretty solid for checking if stuff is AI or not. tools like Tu…
ytc_Ugxybh2RY…
Exactly! I can't understand how so many people don't think this way. I've seen s…
ytr_UgyupbznF…
A school that prevents homelessness... they well be able to run their own food c…
ytc_UgwTvSiu0…
AI, is his baby 🍼, Elon , is my favorite man ,as someone who is fascinating, an …
ytc_UgyvaSuEQ…
Imagine AI as an actual person and you as his friend. Now, he tells you what to …
ytc_UgxhPoTs_…
I don't think anything would like their face taken off in front of everyone
I …
ytc_UgwxbKn4M…
Comment
The video discusses the dangers of advanced AI and how it could lead to existential risks for humanity. It raises the question of whether we might eventually need to shut down or regulate AI systems to protect ourselves, and the potential consequences of such actions, including losing access to technology. This suggests a deep complexity in our relationship with AI and technology.
What do you think could be the consequences for society if we had to completely stop using technology and revert to older ways of living?
youtube · AI Governance · 2025-12-24T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
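A coded result like the one above can be validated programmatically. Below is a minimal sketch, assuming the allowed value sets are limited to those observed on this page (the actual schema may define additional categories):

```python
# Allowed values per dimension, inferred from the samples on this page.
# Assumption: the real schema may include categories not seen here.
SCHEMA = {
    "responsibility": {"unclear", "company", "developer", "society"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "industry_self"},
    "emotion": {"fear", "resignation", "indifference", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result shown in the table above:
result = {"responsibility": "unclear", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(result))  # []
```

A check like this is useful as a guardrail between the raw LLM response and the database, since models occasionally emit values outside the requested categories.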
Raw LLM Response
```json
[
{"id":"ytr_UgzypNNcam96B5R8B_d4AaABAg.ARFl5CLkO6gARSiml-3wFO","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw4v_ieZsuw5YF9Z3R4AaABAg.AR6x64bJYbTAR788Aitp51","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyPzcOARjovWMmxcox4AaABAg.AR6mJPGbOPmAR78_uPxVZk","responsibility":"society","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzpmRnU3rp2UH6oLMl4AaABAg.AR6iKnZVjDaAR6kbassbso","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyXLdAhFcPIoYkDWKd4AaABAg.AR6__uKGvz3AR79TWeRuKl","responsibility":"society","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzZhN-lFXeqvN6zyGF4AaABAg.AR6ZDst9948AR6lSH55djz","responsibility":"society","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzRz1pVegLFhcoPVb94AaABAg.AR6V8CxttjGAR6mNOPMxp8","responsibility":"society","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxShAN_wkwPIS5YUfp4AaABAg.AR6U3NZRe-LAR6n2iM5mmI","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugxo2VuSFv7mAO8QiCh4AaABAg.AR6SuTATNFPAR6oRj2vG6H","responsibility":"society","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw6XJn67qNPosCBArJ4AaABAg.AR6Kh8v9kHdAR6or9-mDTI","responsibility":"society","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
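A batch response in this shape parses directly with the standard library, and the per-dimension labels can then be tallied. A short sketch, using an excerpt of the first two records from the response above:

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above (first two records only).
raw = '''[
 {"id":"ytr_UgzypNNcam96B5R8B_d4AaABAg.ARFl5CLkO6gARSiml-3wFO","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytr_Ugw4v_ieZsuw5YF9Z3R4AaABAg.AR6x64bJYbTAR788Aitp51","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]'''

records = json.loads(raw)

# Every record should carry exactly the ID plus the four coding dimensions.
expected_keys = {"id", "responsibility", "reasoning", "policy", "emotion"}
assert all(set(r) == expected_keys for r in records)

# Tally one dimension across the batch.
by_emotion = Counter(r["emotion"] for r in records)
print(by_emotion)  # Counter({'resignation': 1, 'fear': 1})
```

The same pattern scales to the full ten-record batch, or to aggregating any of the other dimensions (e.g. `policy`) across an entire run.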