# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- `ytc_UgyneO_mo…` "Our company trialed a coding assistant AI and it happily spat out the most egreg…"
- `ytc_UgwcGRjUJ…` "I think at one point it won't matter if the emotions it's expressing are simulat…"
- `rdc_ohoqb0f` "It's probably slightly more concerning for gen alpha. Half of Gen Z didn't need …"
- `ytc_UgyMW8DMT…` "You are correct. But you have not understood my point and that might be my fault…"
- `ytc_UgzhAifuE…` "@TopMusicAttorney - Show me how - I am very interested in additional guidance r…"
- `ytc_UgzlxTUFW…` "If AI gets to smart and is a threat, just turn the switch the off position.…"
- `ytr_UgwbJe0LJ…` "One element you've overlooked or forgotten: humans don't automatically go gunnin…"
- `ytc_UgyJ1-QX-…` "Nothing to nervous about, is this video deep fake? 24 billion code is generated …"
## Comment
> When you have intelligence devoid of empathy, you have a monster. Intelligent people can look at something that appeals to them and, in an unbiased way, consider what it offers as well as what it threatens. The two co-exist like poles of a magnet. If the threats outweigh the offerings, this intelligent person will scrap the idea. AI threatens WAYYYYYY more than it offers. Are we collectively able to govern technology to a certain degree and ban certain things as we try to do with nuclear weapons? No. There's too much money to made and too little real intelligence. We will create our demise. The gears have meshed.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T10:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugw4c6EeG43DkA6VbvV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwtojqzCyNwbzHwHXx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwmCb1EKYlnXzxlTLZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMe6lqwRzLf7Bs16R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvxsPe4FXvChBqnrR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKv4Pb04tyiYaSkFp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwcr0fo36v7unjHyMR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1ozxqjC2lCp-R0EV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyztUT7Q2kWMLO-6Od4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzxjvz9DIbggChqeG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
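The coded record in the table above is one entry in this raw array. As a minimal sketch (not the pipeline's actual code), a batch response of this shape could be parsed and indexed by comment ID like so; the field names come from the response above, while the completeness check is an assumption:

```python
import json

# A raw batch response in the shape shown above: a JSON array of objects,
# each carrying the four coding dimensions plus the comment ID.
# (Single illustrative record; the real response holds one per comment.)
raw = '''[
  {"id": "ytc_UgwtojqzCyNwbzHwHXx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# Keys every record must carry, per the response format shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if the payload is not a list of complete records,
    so malformed model output is caught before it reaches the database.
    """
    records = json.loads(text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {sorted(missing)}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgwtojqzCyNwbzHwHXx4AaABAg"]["policy"])  # ban
```

Indexing by ID is what makes the "inspect any coded comment" lookup above cheap: each dimension for a given comment is then a dictionary access rather than a scan of the array.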