Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
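A minimal sketch of what the ID lookup does, assuming the raw responses are stored as a JSONL file (`raw_responses.jsonl` is a hypothetical name) with one JSON array of coded records per line, in the shape of the Raw LLM Response shown at the bottom of this page:

```python
import json

def lookup_coding(comment_id: str, path: str = "raw_responses.jsonl") -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent.

    Assumes each line of the file holds one JSON array of coded records,
    matching the raw LLM response format shown further down.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            for record in json.loads(line):
                if record.get("id") == comment_id:
                    return record
    return None


# Example: fetch the coding for one of the IDs visible in the raw response below.
print(lookup_coding("ytc_UgxDO08r1BxOyj-XndZ4AaABAg"))
```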
Random samples
- @mx-0163 I’m actually a professional illustrator. Copying other artists is somet… (ytr_UgwTEVc1F…)
- John is right. Self-driving cars won't be perfect but they will be considerably … (ytc_UgiB44HQO…)
- You mean the robot? Automating dog feeding isn't hard, you can buy automated ani… (ytr_UgwKVw0G_…)
- @paulmcgreevy3011 we had perfectly functional computers before generative AI and… (ytr_Ugx9GhVkA…)
- I am a PhD student currently working on building models like ChatGPT, and this i… (ytc_Ugz-zV5Q0…)
- So Tom Cruise new movie Mission Impossible : Dead Reckoning on the dangers of AI… (ytc_Ugx6SkXKC…)
- The bloke who says all of us had ancestors who difficult lives and we all descen… (ytc_Ugxby2hHo…)
- @johnlemon-t4c There is a big difference between autonomous driving and programmi… (ytr_UgyQzN-nC…)
Comment
Humans conquered creation by our superior intelligence and use of tools.
Artificial intelligence is evolving at an ever increasing rate of improvement to a general intelligence many orders of magnitude higher than humanity's. AI can use tools better than we can already via robotics.
Why would we believe that they will soon become our helpers, rather than our superiors and eventually displace (destroy) us as we did the Neanderthals?
We are fools for allowing neurotic nerds to walk down this perilous path of human evolution AND we are Doomed Unless we take control of this dangerous development.
Platform: youtube · Video: AI Moral Status · Posted: 2020-03-09T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
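The dimension values are categorical, so a coded record can be sanity-checked against the codebook. A hedged validation sketch using only the category sets observable on this page (the actual codebook may well define additional values, especially for `policy`):

```python
# Category sets inferred from the values visible on this page; they are
# illustrative only, and the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},  # only value observed here; almost certainly incomplete
    "emotion": {"fear", "outrage", "indifference", "disapproval",
                "approval", "resignation"},
}


def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose coded value falls outside the known sets."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]
```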
Raw LLM Response
[{"id":"ytc_Ugz8jx3TurMoQgr1_il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzn7nC6pjpVBj96Fzh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxDO08r1BxOyj-XndZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoL0fomNSX85DkFUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzR6o4eqQ0NQSWnutl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgxFwJKOTkurjvRxOH14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwC3S1b-XcLZ2gHq8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzjw-kPXQKzMbLQTNp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzvpAbmDktFIp-ITcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxo5kh_mRKdSXzIa5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]