Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I find it disgusting that people who use AI to make art call themselves artists;…
ytc_Ugw4Vhs9C…
Also AI can filter content too. So obviously the future will be a flood of fake …
ytc_UgxLDcXwR…
4:37 the only ai that actually somewhat learning similar to a human, is Neuro sa…
ytc_Ugy6VuIwl…
If AI had *real* consciousness then it would be *real* intelligence rather than …
ytc_UgwJc9g0l…
Why would we program a sentient AI to do manual labor? If it is a job that requi…
ytc_UgxmtyIG_…
Using AI video to create this report is garbage...it's an experiment is how chea…
ytc_UgzZWInLA…
I do not worry anbout ai anymore since I know now, that we do not have those chi…
ytc_UgxSoDjst…
Our biggest mistake is that we let our aggressive, supersticious, violent person…
ytc_Ugyp0rFae…
Comment
I work in AI dev and I can say that AI is not dangerous. We working alot on security and we can disable ais in less than seconds.
youtube
AI Governance
2024-10-17T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxLvOdd37sERNaWbON4AaABAg.A8CnkLW4eznA9i-vx5W0vk","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgxLvOdd37sERNaWbON4AaABAg.A8CnkLW4eznA9i4W7_Pp0r","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxuE1OTVd_C9CCiIAl4AaABAg.A832na82JIAA9hxgz5xEK0","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugx7vcfCb8mrvZg4Mex4AaABAg.A7nY4o3r0fXA9hv2B4513I","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugx7vcfCb8mrvZg4Mex4AaABAg.A7nY4o3r0fXA9jXJk7rZrc","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJHQgNlXUpkMpm9ex4AaABAg.A7ktQq4wLlrA9i0Kay8_sF","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwaEKnkrJpt8xvzr1B4AaABAg.A7ilikMG3CuA9i-0IGR7-m","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgyzUB1K8wrqnFUnuiV4AaABAg.A7fQaS_KqTRA9i0gNc8E_X","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugzf0HlQQLEcjr6oXCV4AaABAg.A7aX5OlwgtIA9Q9FneMuYU","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzF7hBvTJw1nX82dNp4AaABAg.A7Ho2y-hKlqA7X3HVZZQk-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
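The raw response above is a JSON array in which each entry carries a comment ID plus one value for each of the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before the codes are stored — the allowed value sets are inferred only from the entries shown here, and the function name is illustrative, not part of any real pipeline:

```python
import json

# Allowed values per dimension, inferred from the responses above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping entries with a missing ID or an out-of-vocabulary value."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue
        codes = {dim: entry.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Validating against a fixed vocabulary like this catches the common failure mode of model-based coding, where the model invents a label outside the codebook; rejected entries can then be re-queued for recoding rather than silently stored.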