Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I say that I guess I am sort of in the middle
I think AI in the basic is somethi…
ytc_UgxFwpsGM…
I'm not a Tesla fanboy, but i do have a NIO, and our car has Lidar, Radar and c…
ytc_UgzNTQbBP…
If you see colour then you're racist. There is only one race the human race. You…
ytc_Ugy3yskZj…
[Translated from Portuguese] It's time for the church to ascend; soon, very soon it will happen, the machines will be in the pl…
ytc_UgxD2JwqI…
That's an interesting observation! Sophia does have a unique presence, much like…
ytr_UgzB2DG11…
On a lark i turned on the ai notes for phonecalls with customers at work and it …
ytc_UgxrwV3dP…
If a business nowadays becomes 50% more efficient, you, the worker, do not: rece…
ytc_UgxFxB0Mh…
Just drove from city centre to Boryspil, only took 35 mins, little traffic and n…
rdc_cfkw2lu
Comment
Elon Musk can afford to not think about it. He can just give his children a billion each and they won’t have to think about it either. Everyone else is fucked if AI does what these people claim.
The good news is that there’s no real evidence we’re heading towards super intelligence, there’s strong evidence that we’re a bubble, and no replit is not very capable at producing software, the AI software is complete garbage and we’re still a far way away from it working well. However it will probably happen eventually, just unless there’s a breakthrough (which there may well be!) I think 20 years isn’t likely, maybe 75 years
Source: youtube · Cross-Cultural · 2025-10-15T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwkM5UXkl1mTHCruM54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxEvQOs-Pvw_aY7Jl94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy227rKlHEDMCt-bD94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjoZMmGCA6lFT5XaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTUvtjjq8atq5azyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwX9blEvEdPZv3rVD54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzCE7ZsY6CbKGCdkd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKNxj2AjFAI6hjvbN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxu0unhwUk7eLuKXBB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwnw6PdRKr8lg8Up-t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
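The fields in each JSON record mirror the dimensions in the Coding Result table above. A minimal sketch of how a downstream script might parse and validate one raw response — note the allowed value sets here are inferred only from the values visible in this sample and are assumptions; the actual codebook may define additional categories:

```python
import json

# Allowed values per dimension — inferred from this sample only (assumption);
# the full codebook may include categories not shown here.
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"approval", "outrage", "indifference", "fear", "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose every
    dimension value is in the allowed set and that carry an id."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        ok = all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items())
        if ok and "id" in rec:
            valid.append(rec)
    return valid

sample = ('[{"id":"ytc_EXAMPLE","responsibility":"company",'
          '"reasoning":"virtue","policy":"ban","emotion":"outrage"}]')
print(len(validate_coding(sample)))  # 1
```

Records with off-schema values are dropped rather than repaired, so a malformed LLM response surfaces as a shorter valid list instead of silently corrupting the coded dataset.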