Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I totally disagree A.I being the future.If A.I was the future everything that our Ancestors did was totally useless.Like HOW TO MAP without using gadgets.More PROBLEMS will happen including a SPIKE in the Earths human population.It would also affect the natural recources.More habitats of animals will be destroyed.
Yes,it makes life easier but it also has some downfalls.Like more people will become
*COUCH POTATOES*.People would be very *LAZY*.More wars,more deaths,more *MASS DESTRUCTION*,ETC....
I recommend we dont put too much tought in TECHNOLOGY except if there was a war or something.
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2019-10-06T13:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyrmIyW4IXhgmYCANd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUoDlnBTanNo0f92J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxPHjKq8QgcRGIFgoN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1WXgzEKIxk-xMt0F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzRDnkA6ezEVsEUqAt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwTGFoqVrJik3ULUwl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx3JHAjvfhXrgAQya14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwyom-fySfmjs-xlq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyxRJjSqlaM9WHRhYR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxRZriqAroLEQERnjB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
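The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal validation sketch, in Python, that parses such a response and checks each dimension against an allowed-value set. Note the `ALLOWED` sets below are inferred from the values visible on this page, not taken from a documented codebook, and the real schema may define more categories:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: these sets are
# inferred from the responses shown above; the actual codebook may
# include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject codings with unknown values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Example with a single (hypothetical) coding object:
sample = ('[{"id":"ytc_example","responsibility":"none",'
          '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
rows = validate_codings(sample)
print(len(rows))  # → 1
```

A check like this is useful before the codings are written to storage, since model output that drifts outside the codebook would otherwise surface later as unexplained categories in the aggregate counts.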