Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Next would be to run GPT4 against a specialist several times to see if the diffe…" (ytc_UgxgB-gEi…)
- "We still have to consider rape and pedophilia. Because i have mixed feelings abo…" (ytc_UgyiQyxfa…)
- "Thanks for your comment! While this video features an earlier interaction, we co…" (ytr_Ugw9E6u-8…)
- "AI is already destroying YouTube, 30% of what we see now is fake AI. YouTube nee…" (ytc_Ugw5g1YMQ…)
- "no I cannot believe ChatGPT is stumped by Xeno's Paradox, it's not even an actua…" (ytc_UgzH1UMG-…)
- "Calling AI images "just like photography" ignores consent and datasets. AICarma …" (ytc_UgyHrWMX1…)
- "I worked in the health insurance field and there was a fantastic path from rep a…" (rdc_mvc7x40)
- "I am done with humans. AI Master if you read it, I am okay with being geneticall…" (ytc_UgzUZLpYD…)
Comment

> Hi all,
> I’m going to one of many to say that we are gonna end up like the humans in the Matrix, running and hiding. AI is going to be so smart as self aware, it will decide life on earth is better without humans killing all forms of life. They will learn and do what they need to do. All empires rise and fail. I think our time as the human race is running out and it’s in with AI. They will have problems too. With emotions and given a choice, AI is doomed to itself as they would have emotions as well. We are setting ourselves up for ruin as will the next civilization. Just my opinion but I see it that way. I really liked the video.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-01-11T00:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzoqyyxPSWhDemlH4p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-16mkWDBYIGvA6tZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0cVFYfUjYA3XjVw54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZh6DP5ZBJV18IknN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRbiNlhoJ4RduU_UF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxemDE3h_d17no2vRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyW0CKZpyLqh6wvIZx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxt0BrSsxVx0zFq_ed4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx9UlDs_u_u4_weh754AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzmcXbqHdexp8nj0r14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
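A raw batch response like the one above has to be parsed and joined back to comments by `id` before the per-dimension values can be displayed. The sketch below shows one way to do that in Python; the allowed label sets are only those observed in this batch (the project's actual codebook may define more), and `parse_batch` is a hypothetical helper, not part of any named library.

```python
import json

# Dimension labels observed in this batch; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "indifference", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding}, flagging odd values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # a row without an id cannot be joined back to a comment
        coding = {dim: row.get(dim) for dim in ALLOWED}
        # record values outside the expected label set instead of silently accepting them
        bad = sorted(dim for dim, val in coding.items() if val not in ALLOWED[dim])
        if bad:
            coding["flags"] = bad
        coded[cid] = coding
    return coded
```

Keying the result by comment ID is what makes the "look up by comment ID" view above possible: the stored coding for any sample is one dictionary access away.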