Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugw0QS7E7…`: "Its a positive that we are always thinking about what is a present or future thr…"
- `ytc_Ugxlwq12K…`: "I hear they're making a rubber bag to put over the heads for when the face falls…"
- `ytc_UgxVKVfQ-…`: "This is where you are wrong. AI will replace 80 to 85% of all current jobs. Ther…"
- `ytc_UgwYxdpjI…`: "You have to realize LLMs take on a role-actor capability. If you want to play a…"
- `ytc_UgwvNnGAm…`: "AI isn't replacing jobs. it's humans who are choosing to use it in ways that har…"
- `ytc_Ugyo0xyG-…`: "This whole report is probably made my AI. And has like 0 information. Guys don't…"
- `ytc_UgzprZXi9…`: "If you want to participate in the push for regulations on the AI industry, join …"
- `ytc_UgyhMLnF8…`: "Shame on these people 😢 What will the politicians, judges and police will do if…"
Comment

> I believe this guy is so smart, he's an idiot. The moment you remove the "specialness" from humanity, and say that machines can feel emotions and pain, I don't think you're in tune with reality anymore. I understand his POV, but this guy is "atheist" on life, and basically says there's nothing important about us. Never mind the fact if we stopped ALL AI research and development and didn't improve it, it would NEVER exist. Meaning, AI is NOT special, can't "create" itself, and doesn't "experience" being alive like humans do. AI doesn't "understand," sorry.

youtube · AI Governance · 2025-06-26T04:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyaK6i2mig9D76SkZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyE-r9WuXmXmhVQ8tt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw6y9EImXqeK0shKWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzzRAhB-1nBFIQnAN54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwkA1enztjB09Meq_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7GC2ISRNEHBA4orN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmvKB7Kr8xgjidzGl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwLwdXzGzrzV5C5Yd94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzRyJfIlSXPU-B4JJh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxdXe6zsdQJyBpMQOl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
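A raw response like the one above can be parsed and indexed by comment ID to drive the lookup shown in this view. The sketch below assumes only the field names visible in the JSON (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function and variable names are illustrative, not part of the tool.

```python
import json

# Raw LLM coding response: a JSON array of per-comment codings,
# as in the "Raw LLM Response" block above (one entry kept for brevity).
raw_response = """
[
  {"id": "ytc_UgzzRAhB-1nBFIQnAN54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

# Fields every coding entry is expected to carry, per the response schema above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw response and return a mapping of comment ID -> coding dict."""
    by_id = {}
    for entry in json.loads(raw):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing fields: {missing}")
        by_id[entry["id"]] = entry
    return by_id


lookup = index_codings(raw_response)
print(lookup["ytc_UgzzRAhB-1nBFIQnAN54AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes the "look up by comment ID" path a constant-time dictionary access, and the field check surfaces malformed LLM output before it reaches the coding table.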