Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugzbepk4O…`: "Our minds took 3.8 billion years to develop into a conscious state. Each cell ha…"
- `ytr_UgxqeZDCs…`: "@ChrisWalker-fq7kf Yea, it went downhill for Max. The waterfalls, ship, and mamm…"
- `ytr_UgwMlgqzD…`: "We should never teach AI how to fucking fly drones and kill people :D That could…"
- `rdc_o0sfzxd`: "The fuck are you so outraged about. The point is the right wing in India did suc…"
- `ytc_UgxW95hUy…`: "I hugely respect Tyson. I am more hesitant than he is about usage of AI but NOT …"
- `ytc_UgxS4O6yi…`: "AI IS 100% PARASITIC.......THE CORPORATE LEGAL F I C T I O N \"PERSON\" IS…"
- `ytc_Ugw70kPR1…`: "dear ai \"artists\" did you make the final product like you imagined, no you did n…"
- `ytc_UgxQYHAqD…`: "the guy in the hat looks twenty five but he's been in AI for 30 years. The man r…"
Comment
> Entangling with society...is the threat.
> It's one thing to have Ai, it's another to NEED it.
> Building necessary infustructor on AI...will ensure that we cannot go back without destruction. And when humans no longer can rely on other humans to function, we will have gimped ourself without any ability to repair, or fight back. Reliance on these tools will be the first step to our end.
> I spoke with a 22 year old who had a job and he said...he couldn't keep that job if not for ChatGPT, and I said, why don't they just use ChatGPT instead of him.
> He couldn't answer me, beyond telling me that, they currently didn't know he was using it.
youtube · AI Governance · 2024-01-03T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZos84cuMRqNLNEiZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwtDWz81qyku0R6m714AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzEo_HgFMAuz0ifXSJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLbHA0_IYzEC9QRzt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0a3kVG3bjgZcxWLN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnDcd4i0_WS0bMUw94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgwRz9LB6qeyKQeSUlx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwF0j24b4Ty27Z1TR94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPrWwUwdhbxqAdDhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyc85RMeSAN_iL8fe54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
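A batch like the one above is easiest to trust if it is validated before it reaches the coding table. The sketch below is a minimal Python validator, assuming the allowed values per dimension are exactly those visible in this sample batch; the real codebook may define additional categories, so `CODEBOOK` is an illustrative assumption, not the project's actual schema.

```python
import json

# Allowed code values per dimension. NOTE: these sets are inferred from the
# values visible in the sample batch above and are an assumption, not the
# project's authoritative codebook.
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every record has an id and only known code values.
    """
    errors = []
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    for i, rec in enumerate(records):
        if "id" not in rec:
            errors.append(f"record {i}: missing id")
            continue
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"{rec['id']}: bad {dim} value {value!r}")
    return errors
```

Running the validator on each batch as it arrives catches the two common failure modes separately: malformed JSON surfaces as a parse exception, while an invented category (a value outside the codebook) surfaces as a per-record error message keyed by comment ID.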