Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Eddy Burbank and Hank both posted 1hr+ AI videos today. Looks like I’m not sleep… (ytc_Ugx51tCux…)
- Would be hilarious if people figure out ways to get the AI to give you refunds… (ytc_UgzHM46Lh…)
- It would be kind of cool if every other country had equivalent technology, so th… (rdc_cq6dm14)
- Ya, but if AI is uploaded unto a robot, that robot must have the 'mucle' memory … (ytc_Ugxjn6_n_…)
- Why are they making so much AI that takes away jobs and our livelihood? Well let… (ytc_Ugzf015oh…)
- An AI as successful as marketed only removes labor scarcity, not resource scarci… (ytc_UgzGEvmHm…)
- The thing I think about is how many aberrations before ai develops separate idio… (ytc_UgxDGPkvp…)
- In this video he’s conscious that AI cannot code 100% and still needs human inte… (ytc_Ugyto_HYu…)
Comment
Wolfram is careless with the concepts and Yudkowsky corrects him. Wolfram talks about what he was working on the previous night and says he has spent 45 years working on things. Yudkowsky listens patiently. Wolfram appears to try to break down and scrutinize Yudkowsky's thoughts but largely fails. Again, the uncomfortable realization is that Yudkowsky, at least in terms of rigorous thinking and speaking, prevails. It is ominous that this is always the case. It suggests that his thoughts about AI are correct. Can someone please shoot down these thoughts of his and give us hope?
youtube
AI Governance
2024-11-13T01:1…
♥ 58
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZjk-dccsmE4r1CbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzN-dfsvH0_3hTj87Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyBrsbkOUjTW8bZHgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy2Qq17d-rNew-K7hJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzmT97vvYHntMl9Y5d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxJEWyj3-VMGPf5UR14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw75_NQVGIiLn5jb9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwIcUBDH-ncdjtaAw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxkP3JTDL_ibbhpF8V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"frustration"},
  {"id":"ytc_Ugz4NAtgI9yTWXsehN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```