Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Neural networks have not three problems, but 19 fundamental ones, which haven't …" (ytc_Ugz4Vv7kp…)
- "Making AI is as much as a pain in the neck as giving animal rights. So we honest…" (ytc_UghulkD-q…)
- "One day there will be a court case to decide if AI should be given the same righ…" (ytc_UgzovrEVD…)
- "AI is functionally a single driver. So even if you passed a stopped school bus, …" (rdc_nsyms9i)
- "After watch Gothamchess videos, yes AI is really danger! Imagine AI summoning m…" (ytc_Ugw5XT4mv…)
- "Totally, and now they are enforcing the use of AI at the University; it's ridicu…" (ytc_Ugx54xFPi…)
- "There is a setting in gpt called 'Improve the model for us' turn it off, problem…" (ytc_Ugzre9r7L…)
- "Anyone doing anything because AI said so is the equivalent of doing something be…" (ytc_UgxzXmMuL…)
Comment

> Its a positive that we are always thinking about what is a present or future thread of our existence, that means where not taking it for granted. What we can do is in my opinion emotional chips. If we could design a programming that makes the robots care for us even on an level that is not logical, but also in a level that is. A ban on SO weapons is the best option, but how do we convince the military superpowers of this? Its a tough nut to crack. Its basic to say we wanna live in this planet in peace, its harder in practice, people need patience to evolve into more caring and enlightened beings. But the best thing er can do is to create a platform to bond with our robot creations. Elon Musk want is to Litterally merge with the machine. Its an option worth considering, but I hope its not nessasary for all of us. Maybe er could name the first android after Dragon Balls android 17: Lapis ☺

Source: youtube
Posted: 2018-04-03T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw0QS7E7tno3PFcub94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_UgyPqjdvlZmjGugvEnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwqmRfpWMiI7S74MDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKiYnjFrFjmgY2BYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhbH5nbdQPIETSJ7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIfVkRJOUFGBPM6K94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx9lQHy65E6ilBa9_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyfaA-ijnkCtVPcqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgykYB7zSFgCalhpSFd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjWdvb4RrIuqg9T1x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
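The raw response is a JSON array of coding records, one per comment, each keyed by a comment ID and carrying the four coded dimensions. A minimal Python sketch (an illustration, not the tool's actual implementation) of how such a response can be parsed and looked up by comment ID:

```python
import json

# Two records copied from the raw response above, abbreviated for the sketch.
raw_response = '''[
  {"id": "ytc_Ugw0QS7E7tno3PFcub94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "approval"},
  {"id": "ytc_UgyPqjdvlZmjGugvEnh4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "outrage"}
]'''

# Parse the model output and index the records by comment ID,
# so any coded comment can be inspected directly.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugw0QS7E7tno3PFcub94AaABAg"]
print(coding["policy"], coding["emotion"])  # -> ban approval
```

In practice the model output may contain malformed JSON, so a production lookup would wrap `json.loads` in error handling and validate that every record carries all four dimensions.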