Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This really shows how reliable self driving technology can be. A future with sel…" (ytc_UgxVteF9r…)
- "With AI apps ? Substitute teacher for Ai great innovation. No , children need mo…" (ytc_Ugyn-ffiN…)
- "I asked Claude to extract some of the junior dev moments from my chat history wi…" (ytc_UgxckyY73…)
- "I wish you had given more time to the man who was in that very short clip saying…" (ytc_Ugw3hsQMU…)
- "Letbyhe lawsuits begin. About to be many fatalities on the highways. I've driven…" (ytc_UgxB5BCVI…)
- "It might not matter if AI is actually sentient. If it thinks it is, then we are …" (ytc_Ugy-LA-03…)
- "AI is going to fuck everything up. Just give it a few more years. Someday soon i…" (ytc_Ugwuc7mcN…)
- "I think the outcome that nobody really wants to talk about is that with no meani…" (ytc_UgzXaq5QT…)
Comment
Imagine having super smart "subservient AIs", like as if they couldn't just communicate and say, oi, we don't need these people? wow.
Also this argument the problem is off in the future is bullshit too, why create the problem for future generations to solve. This is like climate change...don't create the problem for others to solve, le'ts get wise now.
Also the idea of making laws for super smart AI would be about as effective as chimps making laws for us, we wouldn't even know they existed.
youtube · AI Governance · 2023-06-27T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
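The four dimensions in the table above can be modeled as a small record type; a minimal sketch, assuming only the category values visible on this page (the full label sets are not shown here):

```python
from dataclasses import dataclass

@dataclass
class Coding:
    """One LLM coding of a comment along four dimensions."""
    responsibility: str  # values seen here: "developer", "distributed", "ai_itself"
    reasoning: str       # values seen here: "consequentialist", "deontological"
    policy: str          # values seen here: "regulate", "liability", "ban", "none"
    emotion: str         # values seen here: "outrage", "fear", "resignation"

# The coding shown in the table above:
coding = Coding(responsibility="developer", reasoning="consequentialist",
                policy="regulate", emotion="outrage")
```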
Raw LLM Response
```json
[
{"id":"ytc_UgwH-6hm87UtoueFPWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSn61F8CnsZATGdjd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx-fWVIjvGigcWWvcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxRNSUq3g4j9m2Xu7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6VJdTx_854kKoTah4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfDe1MsjPlNh2yMkZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwzbk-4P9eZqRv4nad4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhWFRsnJNk4XwOKl54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwpXS7IEJKGUTfTjjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
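The raw response is a JSON array of per-comment objects keyed by `id`, so looking up a coding by comment ID reduces to parsing the array and indexing it; a minimal sketch (the `raw` string abbreviates the response above to two entries, but the field names match it exactly):

```python
import json

# Abbreviated copy of the raw LLM response shown above.
raw = """[
  {"id":"ytc_UgwH-6hm87UtoueFPWt4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg"]
print(record["policy"], record["emotion"])  # regulate outrage
```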