Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "The idea of the “sweet spot” makes sense. For example, teaching kids how to use…" (ytc_UgwISXdYT…)
- "All I can say is don't do the flying cars until you perfect self driving cars. I…" (ytc_Ugx43RdQe…)
- "Thats not how you would make a robot you would do rails and conveyor belts this …" (ytc_UgzRh-hlu…)
- "This is how I use chatgpt. I always start a session by telling it something like…" (ytc_UgyBxzHlW…)
- "I think this is also a wonderful thing to be misused by time for oppression of h…" (ytc_UgxjkSgSU…)
- "So according to the former Google employee “e colonialism” is much more importan…" (ytc_UgyQTpHE2…)
- "@Kburn1985 I don't know. I've seen a lot of overhyped tech in my life. From wha…" (ytr_Ugx8NhFKR…)
- "I've heard these two arguing a long time ago. The butler-looking robot, Hahn, se…" (ytc_UgyfkVZbl…)
Comment

> Ai algorithm, if people = 10, reduce to value of 2 male & female, and always keep them enslaved and keep the rich stable.
> the number one rule of thumb people needs to realise is that, the people responsible for programing Ai is to reduced the population. Saying that Ai is out of control is just a scapgoat.

Source: youtube | Topic: AI Governance | Timestamp: 2025-09-21T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxonJ0o8-dbrtdkdsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyRsXtfpNqOoyr9oSR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwpmRkDXCb0j1eP_mp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzWHGStW4wN0y5a2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFULFn-tSVAjMqFLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxWlredTPBv8X72vOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyHh_qq7azkqmENeUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFWy-oGH7XMpvXCAt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx6EpyFd3iM5p3bj4V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyJLodASMI7R06sFcx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
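The raw response is a JSON array of coded records, one per comment ID, with one label per coding dimension. A minimal sketch of parsing such a response and looking up a record by comment ID, validating each row against the label sets seen in this document (the full codebook may define additional labels; `parse_codings` and the demo IDs below are hypothetical, not part of the actual pipeline):

```python
import json

# Allowed labels per coding dimension, inferred from the responses shown
# above; an actual codebook may include more values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: labels}, dropping rows
    whose labels fall outside the schema."""
    coded = {}
    for rec in json.loads(raw):
        labels = {k: v for k, v in rec.items() if k != "id"}
        if all(labels.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = labels
    return coded

# Two made-up records: one valid, one with an out-of-schema label.
raw = '''[
  {"id":"ytc_demo_ok","responsibility":"developer","reasoning":"deontological",
   "policy":"liability","emotion":"outrage"},
  {"id":"ytc_demo_bad","responsibility":"robots","reasoning":"deontological",
   "policy":"liability","emotion":"outrage"}
]'''

coded = parse_codings(raw)
print(coded["ytc_demo_ok"]["emotion"])  # prints "outrage"; the invalid row is dropped
```

Filtering rather than raising keeps one malformed row from discarding an entire batch; rejected IDs could instead be collected and re-queued for recoding.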