Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
@delix6457 That’s even worse. People are treating Autopilot, which is essentiall…
ytr_UgzUfRgCP…
Yeah US absolutely cares about its people hence they have excellent police and g…
rdc_o339lgw
22:10 ha ha F that… I’m NOT having I robot anywhere near me… not in my home… all…
ytc_Ugzwp_kCw…
AI art can never replicate real art without any sort of mistakes, or using a uni…
ytc_UgyzHfjiX…
What is scary about AI is the uncertainty it brings for humanity, is like saying…
ytc_Ugy0zdC9e…
And then you have this person who sends a robot programmed to shoot everyone on …
ytc_UghXjqROv…
There are some great books on AI out there, and one that I read recently is "The…
ytc_UgwTmXfls…
A I has been active for some time, look at the events in the last 10 years. How …
ytc_Ugx7L9G0w…
Comment
Not gonna happen! As I watched this I had closed captioned in due to his accent and CC messed up at these 10 times on something this simple. We can’t even solve the easiest problems in life like traffic, cancer, poverty, crime,etc yet they think AI will be controlling the world. It won’t! You will have gangs paid big money to destroy these robots. Also think about how many robots you’ve seen today. You’ve seen not even one. Well actually you may have seen your robot vacuum but we have found out that they are going under financially so that’s failing. Robots are 50-100 years away and you would have to decrease the population by 90%.
youtube
AI Governance
2025-12-18T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz1rCJZ2FGf5qS0cnt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNxqOspJen-RoRibN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgysYh1iYhV10m_5F-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBilyK3FlGmCLnBrN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyL1FqaLeCg119tIHd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxNnrqVvgiJTp0pktl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugytplouw8FoOJJGZul4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzOOLk6XJbCVcB-wut4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwA25GxdlP9fsnuyKh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzr8FgurZfNxLT05394AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
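The raw response above is a JSON array with one object per comment, carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before use; the allowed values below are inferred from the sample entries above, not an official schema, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed entries.

    An entry is kept if it has an "id" and every dimension holds an
    allowed value; anything else is silently dropped.
    """
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid

# Example: one valid entry, one with an out-of-schema value.
raw = (
    '[{"id":"ytc_example1","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_example2","responsibility":"alien"}]'
)
print(len(parse_codes(raw)))  # → 1
```

Dropping malformed entries (rather than raising) matches the fact that the panel only displays codes it could attribute to a comment ID; a stricter pipeline might instead log rejected entries for re-coding.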