Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "All those who afraid of AI, I have a question. Why would AI want to get rid of h…" (`ytc_UgwFGalfN…`)
- "We need to stop this now. Forget the oil protests. We need to protest stopping A…" (`ytc_UgzWo3HLi…`)
- "This makes no sense to me. Scientists are making a robot that will eventually ma…" (`ytc_UgyhCqb5a…`)
- "AI has certainly belittled many under graduate degrees but be under no illusion,…" (`ytc_UgwOpnLBL…`)
- "Or you should just not use it and do the fucking work yourself instead of wastin…" (`ytc_UgxYaEpCM…`)
- "We have about 10-15 years before AI uses Humans as static furniture. I am middle…" (`ytc_UgwqTmr2J…`)
- "O3 just dropped. It won't replace all humans. Tasks that needed 1 PhD, 2 masters…" (`ytc_UgxPM0lSl…`)
- "AI is getting so advanced that people could be framed easily using AI generated …" (`ytc_UgxUUyB6v…`)
Comment

> we'll find out where humans still add to AI to make AI better. I still dont think Human + AI is less or equal to just AI.. thus, human still will have jobs. This isnt a test of AI vs Human.. this is Human + AI vs AI.
>
> No Silicon most certainly is not more energy effiicent. Computers cannot be as good as humans at intelligence tasks without big energy driven data centres.. currently. A Human brain is pretty good on a few watts

Source: youtube · AI Governance · 2025-09-18T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyH-VfizI5mpKbqTcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw5S8RC11SB_1Ya6lx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgziYKHPOMiItlzSNnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzMtuhANHfKhb9JYqp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwWsKK0xhf019PCAlp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxnmlkZss_kLUjDEft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxIFdtVanpTyYokjbR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzQumSQjzgkh7W8pEJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWEBpvYD7Ax4Okcq54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBRZ636dfBmib2ZA94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
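A raw batch response like the one above can be parsed and indexed so any coded comment is retrievable by its comment ID. The sketch below is a minimal, hypothetical example (the variable names and the assumption that the model output is a plain JSON array are ours, not part of this tool):

```python
import json

# Assumed shape of a raw batch response: a JSON array of per-comment
# records, each with an "id" plus the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgyH-VfizI5mpKbqTcl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzMtuhANHfKhb9JYqp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

records = json.loads(raw_response)

# Build a dict keyed by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzMtuhANHfKhb9JYqp4AaABAg"]
print(rec["policy"], rec["emotion"])  # ban outrage
```

If the model wraps its output in extra prose or markdown fences, the JSON array would need to be extracted first; here we assume the response body is the bare array.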