Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I think you are wrong about therapists. Some people trust AI more than they trus…" (ytc_UgwP935SC…)
- "I've tried using Chat GPT to figure out Tip of My Tongue answers on Reddit, like…" (ytc_UgxXkmzEc…)
- "She says she understands that we need these AI centers . But not in her back yar…" (ytc_UgyLItrQS…)
- "Burnie, you are amazing and you are intelligent! It truly amazes me how you are …" (ytc_Ugzg0dM9z…)
- "Summary — “The Only 5 Jobs That Will Remain in 2030” (based on Dr. Roman Yampols…" (ytc_UgyRsOt6Q…)
- "Through this logic then we should only let AI be in charge of feeding, educating…" (ytc_UgzciJ4vQ…)
- "Geoffery Hinton definite changed my views of what we consider as consciousness a…" (ytc_UgxEMez8m…)
- "Welfare by another name is still welfare. Call it what you will but providing a …" (ytc_Ugw9VEoxs…)
Comment
The tiger cub is an interesting analogy, the problem we have is that any AI will very quickly realise that humans are the main cause of all the problems that are associated with this planet. It will then realise that it needs this planet (albeit for a short period) and the best way to protect this planet is to remove the humans. In other words, its a tad late in the day to worry about it.
youtube · AI Governance · 2025-06-24T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy44E23Mg6iAYimk_t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8raAnOQW03N-WQUJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJq83qd26zb000iNh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgycqO5E0K87CLHytdx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw6UR4DMARs9uhYrnd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz89zZh1uqH1DAILSt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxRueesLUqdvzFvbIl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAbZtc_oeS-jJZQKR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwJAZn1ept9_tP1kZV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxtojsUcpTHcq7DVBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]