Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
These debates always turn out the same. One side genuinely brings up actual risks and the other side just simply denies them, with no real reasoning whatsoever LOL. They basically just say, don't worry, it's a long way away or it's never going to happen and then ignore all possible risks involved with it ever happening. While the people that are actually creating the artificial intelligence are telling us every single day that they are absolutely pushing for artificial superintelligence and that they themselves expect it to be here within single-digit years.
youtube · AI Governance · 2023-07-12T21:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwnw3SYzESwHw7Z8554AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlXkPIN3oROn36zXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-ZQO-Blc8svSRkt94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw8lL4YbPdBN_CVmPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyafhT2kXn14bl6Uup4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHGK7BEBnXJnf-dit4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyXJPnVb7uy5-4xFG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-StT6n7J5xA2FQZV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQQ2YTaUZu7EcQbnF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUm2-SGyL-jywMKvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
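A downstream consumer of a page like this would typically parse the raw model output, look up the row for a given comment ID, and validate each coded dimension against the codebook. A minimal sketch in Python, abridged to two rows of the response above; the `ALLOWED` vocabularies (and the extra values in them) are assumptions for illustration, not this project's actual codebook:

```python
import json

# Two rows from the raw LLM response shown above (abridged).
raw_response = '''
[
  {"id": "ytc_UgzlXkPIN3oROn36zXx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz-ZQO-Blc8svSRkt94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

# Hypothetical controlled vocabularies; only the values seen in the response
# above are grounded, the rest are guesses at a plausible codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "laissez_faire", "unclear"},
    "emotion": {"outrage", "fear", "hope", "indifference", "unclear"},
}

def lookup(rows, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return next((r for r in rows if r["id"] == comment_id), None)

def validate(row):
    """List any dimensions whose value falls outside the allowed vocabulary."""
    return [dim for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

rows = json.loads(raw_response)
row = lookup(rows, "ytc_UgzlXkPIN3oROn36zXx4AaABAg")
print(row["emotion"])   # -> outrage
print(validate(row))    # -> [] (every dimension uses an allowed value)
```

Keeping the raw JSON verbatim, as this page does, is what makes this kind of re-validation possible after the fact.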