Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "machine learning engineering student; i agree, you can ask an ai to cite sources…" (ytr_UgzvEQBFd…)
- "The narritive would assume that the person who created the ai to begin with must…" (ytc_UgySQy0QD…)
- "we took the ethical decision to not clone humans, despite possible - why enginee…" (ytc_Ugxu4HNEf…)
- "Do you reconize this? *SHREK PICTURE MADE OUT OF WORDS* Ai: yez i reconize tha…" (ytc_UgwLXmF0M…)
- "People always say stuff like this, but cleaning my room is a good excuse to put …" (ytr_UgxCr57Yu…)
- "reads more correctly as "boston bans you and I knowing about the government's us…" (rdc_fvz43vy)
- "Transformational growth is exponential. It doesn't matter if AI is only a fracti…" (ytc_UgzA4aRiM…)
- "I am not a techbro by any means, so please don't assume that, but I do want to m…" (ytc_Ugw79vQBV…)
Comment

> What I want to know is who wanted Ai? Where did it come from and why. It's like it was mentioned in Sci fi so people thought they had to make it as like a natural progression but why? It seems so dangerous for not a great benefit. I know I'd be fine without Ai lol

youtube · AI Governance · 2024-01-29T02:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6ssphzJEitjuiBFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzW56zjd5y6OA3UN5J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx9OkS8rZ2mv8S-yWN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy1pwRsb9rnh0A2mg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzgP4j0N3CFdLBeGKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyg487OR5xtPW_sUw54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzKIsX3Lobp9HoZZMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxa40bprO8o4cIn9Hd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw1QhiaVa5u39IymNd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwboQ-s3COo_VlSvjJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
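The raw response is a JSON array of per-comment codes, one object per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of parsing such a batch and indexing it by comment ID (the `raw_response` below is truncated to two entries from the batch above; the set of dimension names is taken from the observed output, not from a confirmed schema):

```python
import json

# Raw model output: a JSON array of coded comments. Only two entries from
# the batch above are reproduced here for brevity.
raw_response = '''
[
  {"id": "ytc_Ugx9OkS8rZ2mv8S-yWN4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxa40bprO8o4cIn9Hd4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# Dimension names as they appear in the observed output.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch response and index the codes by comment ID,
    rejecting records that are missing an ID or a dimension."""
    indexed = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_Ugx9OkS8rZ2mv8S-yWN4AaABAg"]["policy"])  # -> ban
```

The lookup-by-ID view above can be served directly from such an index; a record that fails the dimension check is surfaced as an error rather than silently shown with blank cells.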