Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect:

- "We're turning into a surveillance state. Today it's license plates tomorrow it's…" (ytc_Ugw2AgH2n…)
- "This was fascinating to watch actually. I always thought that the AIs now were j…" (ytc_UgyDCbhAw…)
- "Maybe Sam Altman will eventually be threatened by one of his own AGI's ....maybe…" (ytc_UgyrZd_pO…)
- "So, what skills should one develop with AI takeover if you work in the industry …" (ytc_Ugy9toHis…)
- "It was predicted decades ago by a Buddhist monk that AI will actually advocate f…" (ytc_UgwymHr25…)
- "Don't forget a lot of what we are hearing is peak hypcycle garbage to get more m…" (ytr_UgwvZ_cgW…)
- "I love to know what the code does and why. Helps me without any AI to write code…" (ytc_Ugz26SJvg…)
- "I'll believe it when the robot shoots the human and walks away with his girlfrie…" (ytc_UgyvtEMnT…)
Comment
"Step into the future with OpenAI CEO Sam Altman as he delivers a riveting testimony at the Senate's artificial intelligence hearing! From cutting-edge breakthroughs to ethical considerations, Altman navigates the complexities of AI with clarity and vision. This full video is a must-watch for anyone interested in the intersection of technology and society. Get ready to be informed, inspired, and engaged!"
youtube · AI Governance · 2024-04-09T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXFrnMtpCxaFMxPON4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwCCshNpJK0agtEXaB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQ9YnELoKsKxEKL1J4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxdSCVtRNlGTVgUnSt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwe7AfjAXqN6JIICEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxRhF524jUOeFljqPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzAV6krMvzlu1NfVdp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOLJ4TKkxZl6EmAKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJhU1HCesf9LiPaOt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxC0eL7B0JuJ-WxeWx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
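The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such an output might be parsed and indexed to support the comment-ID lookup shown at the top of the page; the field names come from the response above, while the helper names and error handling are illustrative assumptions, not the tool's actual implementation:

```python
import json

# Abridged raw model output, copied from the response above.
raw = '''[
  {"id":"ytc_UgxXFrnMtpCxaFMxPON4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwe7AfjAXqN6JIICEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# The four dimensions plus the comment ID, as seen in the records above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_json: str) -> dict:
    """Parse the model output and index the records by comment ID,
    rejecting any record that is missing a required field."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing fields: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_Ugwe7AfjAXqN6JIICEt4AaABAg"]["emotion"])  # approval
```

Indexing by ID makes each lookup O(1), which matters when a batch response covers many comments and the UI fetches one record at a time.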