Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
as someone who uses her brain to work, one of the core competitive advantage that I have over AI in the future is the cost to make me think. One needs to pay for the energy to power AI. Similarly to human labor vs robotics, the cost. One robot can do 100 people's work, but yet they still employ people en mass in India and the old China, because 100 people are still cheaper than 1 robot at the time.
youtube · Cross-Cultural · 2025-09-30T01:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzxMY6M48pNJixjHj54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPHW-M2T_TmcR2rFV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwap99EzJlL4YiPW5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw08dkSdxBVtqqKajN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_v5qam1FWxKz-4v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxhi-7aWeFzIaWMs514AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxgfpAvwYDldXT4CBV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2k8rQOdrQlXec5AN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgOjkStdRfyfZBJDx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz01eLzyTm6b_FN51h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```