Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Why do people act like there can be only one AI? You don't have to pick sides at…
rdc_nsey4g1
AI Scientists earns more money if we think AI is supernatural.... its not..... i…
ytc_UgyGlG-Ic…
New rule of the internet: rule 416 if a artist sees AI art they are legally allo…
ytc_UgwLOnQnI…
"If you can't beat them join them" just lean AI and lean how to interact with ai…
ytr_Ugw7lKd1y…
A quota of 25 deliveries per hour is physically impossible. You couldn't make th…
ytc_Ugx2UVAMw…
You’re like 65 aren’t you. ChatGPT can be used as a cowriter in school not simpl…
ytr_UgzWy0LTI…
I wonder if ai know how many tp sheets are left in the roll behind my back as I …
ytc_UgzufCVMB…
AI can never replace your good ol salesmen- good luck with that. It can generate…
ytc_Ugymn3lzv…
Comment
I'm not scared of AI, they are already used today in the form of Deep Learning and Neural Networks. The smarter ones, such as self-driving cars, use alot of processing power requiring them to use a very high-end CPU/GPU that is alot better than most consumer grade PC's. I don't think its impossible for an AI to decide to take control and not listen to instructions, but there would be a failsafe, and aspects of the AI that arn't directly controllable by the AI, for example the motor control could go through extra programmed code before it reaches the physical motor. AI robots would be expensive, because of their high end components, and also all the hardware required wouldn't also be cheap either, for example for a drone with a gun: the high-end motors to hold the gun, the extra motors to fire the gun, the cameras, all the sensors, the frame, and the processor would be expensive for a terrorist to get their hands on, and while they could afford a few, they couldn't spend thousands of dollars for a swarm of drones. A few drones with guns would be very easy to shoot down and destroy.
youtube
2019-04-22T23:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy3boToPB_xWnwDHgh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyh2gJc47ez_S9dRKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzldFL3hAVhJj1xO9B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXHaMIBlkxJAOtj8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFQY3tdogCJuB7cOR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy5xzYdlGdJWiWktBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwP-7b0tk7S3HzwoAB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyyVZ6vNhke3sRzYqV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
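The raw response above is a JSON array of per-comment code objects, one per coded comment. A minimal sketch of how such a response could be parsed into a per-comment lookup (the field names come from the response shown; the example IDs and the validation logic are illustrative assumptions, not part of the actual pipeline):

```python
import json

# Illustrative raw LLM response mirroring the structure shown above.
# The IDs and values here are made-up examples, not real data.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate",
   "emotion": "fear"}
]
"""

# The four coding dimensions seen in both the table and the JSON.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(text):
    """Parse a raw response into {comment_id: {dimension: value}}."""
    rows = json.loads(text)
    codes = {}
    for row in rows:
        # Every object must carry an id plus all four dimensions.
        missing = [k for k in ("id",) + DIMENSIONS if k not in row]
        if missing:
            raise ValueError(f"row missing fields: {missing}")
        codes[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return codes

codes = parse_codes(raw_response)
print(codes["ytc_example1"]["policy"])  # -> industry_self
```

Keying the result by comment ID makes it straightforward to join a model's codes back onto the original comments, as the inspector view above does.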