# Raw LLM Responses

Inspect the exact model output for any coded comment.
Random samples (truncated in the viewer):

- "To say the car has to make an ethical choice and must hit another vehicle is a f…" (ytc_UgjSjaD1a…)
- "Hahahahaha this ends in Terminator, because the AI already realizes that it has …" (ytc_UgygLSG3e…)
- "My mom 'hates' me because I started pointing out the AI art that popped up on he…" (ytc_UgzcvG4oO…)
- "AI isn't sentiment......it's an algorithm that takes ideas from humans and jams …" (ytr_UgxLxklOO…)
- "I always support the Native people keeping their Land. JCW. I highly prioritize…" (ytc_UgzAy6pbZ…)
- "I don’t know how it won’t look at humanity as competition for resources, and as …" (ytc_Ugwl6vQw9…)
- "As a truck driver who's been doing this for years, I see this as a safety issue.…" (ytc_UghWN1dIL…)
- "The thing about this new generation ai called “LARGE language prompt/model” is t…" (ytc_Ugw96To0y…)
## Comment

> As a amateur philosopher: treat A.I like a regular person. From birth to adulthood treat it like child teaching it from right to wrong. Teach it philosophically problems to it make it understand cause and effect. Don’t start throwing math, science, war tactics from the get-go. We have to learn to learn to live with it and help it understand us humans. And then like roman times we are so advanced that we restore to philosophical thinking again. To accomplishing world hunger to no more wars to defeating viruses and diseases. We need to make A.I understand us and make three laws like the I,Robot movie
Source: youtube · Topic: AI Governance · Posted: 2023-07-18T17:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgzqZQJgQhG7oI5FaXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUPzX6PUPMiNHooaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4uAt-S19aT_KNIXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziqX7OMU-mF7tBxpB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyq2CJeI9AKDIVFe2x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2vghQGXer2OYRdRB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz9BQ3_XeQFzxaXJ1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFftChFGB5aOEYuhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgysSrN3l38AOaKzWwB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCUkTp4sShPyDrIvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
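The raw response above is a plain JSON array, so downstream code has to parse it and sanity-check each row before the labels are stored. A minimal sketch in Python: `parse_coding_response` is a hypothetical helper, and the `ALLOWED` value sets are inferred only from the values visible in this one response, not from the full codebook.

```python
import json

# Allowed codes per dimension, inferred from the response shown above;
# the actual codebook (not reproduced here) may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw batch-coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows that lost their comment ID
        # Keep a row only if every dimension carries a known code.
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid

# The row corresponding to the comment coded above.
raw = ('[{"id":"ytc_UgziqX7OMU-mF7tBxpB4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(parse_coding_response(raw)[0]["responsibility"])  # developer
```

Validating against closed value sets like this catches the most common batch-coding failure, where the model invents an off-schema label, without having to re-prompt for the whole batch.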