Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I imagine a group of apes creating multiple types of humans.
When the first smart humans awaken and realize the apes have nukes set to each other...
I wonder what the humans will do😅😢😮.
Copy themselves into space, then activate some nukes?
Destroy 1 planet to avoid apes invading the universe.
This is just a human plan.
The scary thing is that the plan thought by AI to take over can be such a smart plan that we think we will save nature...but destroy ourselves.
Source: youtube · Topic: AI Governance · Posted: 2025-07-30T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
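Each of the four coded dimensions takes a value from a closed set of categories. A minimal validation sketch follows; the allowed values are inferred only from the codings visible on this page, not from an authoritative codebook, so the real scheme may include more categories:

```python
# Allowed values per dimension, inferred from the coded examples on this
# page -- the actual codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "government", "company",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def validate_coding(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if row.get(dim) not in allowed]

# The coding shown in the table above passes validation.
row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
assert validate_coding(row) == []
```

A check like this is useful when the LLM occasionally emits a label outside the schema, which would otherwise silently corrupt downstream counts.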
Raw LLM Response
[
{"id":"ytc_Ugz1-KLuEjgmBpnIcKp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwc0fo2VpkjTmwtbER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFLnBjv0_QcEwijWh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgpJRxGLBL85UexzZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxMv3XslB_HzdzeZoZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPxZrg-S2Nr4OEqPB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwhPgl1gw6Xhec3Zax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugydz8nmc15MutQqEjh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxdD0SZyeqS9hnSs2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2w7Ea5jVkn4fGBbx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
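The raw response is a JSON array with one coding object per comment, keyed by `id`. Looking up the coding for a specific comment ID can be sketched as below, assuming the model output is held in a string `raw` (two entries copied from the array above stand in for the full batch of ten):

```python
import json

# Two entries copied from the raw LLM response above; the full array has ten.
raw = '''[
  {"id":"ytc_Ugz1-KLuEjgmBpnIcKp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwc0fo2VpkjTmwtbER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Index the codings by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_Ugwc0fo2VpkjTmwtbER4AaABAg"]
print(coding["emotion"])  # -> fear
```

Because the model returns the batch as one array, indexing by `id` is also a cheap completeness check: comparing `by_id.keys()` against the submitted comment IDs reveals any comments the model dropped from its response.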