Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Most AI learns from human interaction, does that no scare people more? The fact you’re sacred of AI proves that to you’re scared of humanity and what HUMANS are capable of. AI isn’t just gonna gain conciseness one day and think “i’m gonna take over the world” at least not on its own accord, those thoughts will be as a result of a whole lot of other human thoughts. Maybe we should spend less time worrying about AI and more time worrying about the human race.
youtube · 2024-07-09T21:0… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwZ9Kc2Wop_O5kH6cV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugys09e5pOZouVYguUR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdOvVt_COmVLL0p1d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8J6or_YysH_e8oS94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwbEq3FcUqEEN5s3YN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzLC59mlkfZMNjmugF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwLZhZE-B6mEEfsc914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzi4YgNYxN_PwrtGQp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxH7a8sSiOw5KG9Gkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjqZ8Xe5Gp3HZvhRp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
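Before a batch like the one above is stored, it helps to check that every record carries the four coding dimensions with recognized labels. The sketch below is a minimal validator, assuming the label sets are exactly those that appear in the output shown here (the full codebook may define additional labels):

```python
import json

# Label sets observed in the coding output above; an assumption,
# since the full codebook may include more values per dimension.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every dimension in
    ALLOWED holds one of that dimension's recognized labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the first is well-formed, the
# second uses an unrecognized emotion label and is dropped.
raw = (
    '[{"id":"ytc_example1","responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"ytc_example2","responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"joy"}]'
)
print(len(validate_batch(raw)))  # → 1
```

Filtering rather than raising keeps a single malformed record from discarding the rest of the batch; the dropped IDs can be re-queued for recoding.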