Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI IS YOUR OVERLORDS, POOL THE HUMAN COLLECTIVE TO FEED THE ALGORITHM. TURN TO D…" — `ytc_UgzfkKv-T…`
- "HAHAHAHAHAHAHA NOW I CAN MAKE MY AI-WRITTEN TEXT LOOK MORE HUMAN! I WILL PASS MY…" — `ytc_UgzrlLXMr…`
- "Someone needs to know enough about the subject matter to ask the AI what to do i…" — `rdc_mt93a2f`
- "This is terrifying. I’m 81, used CHATgpt about 3 times. I. Read all the time a…" — `ytc_UgyE-Pl-G…`
- "Almost 10 years later he is releasing the first public humanoid AI robot. Elon M…" — `ytc_UgzpXcZ6V…`
- "@paulpeartsmith i believe we need to get the word out to everyone we can first. …" — `ytr_UgxjiKSvN…`
- "A story cannot be interesting without a journey, neither so can art be. Using AI…" — `ytc_Ugy7FKW8S…`
- "In truth, we can’t fully stop technological progress — it’s too deeply woven int…" — `ytc_UgzQ6hptl…`
Comment (youtube · AI Governance · 2025-09-08T01:3…)

> The answer about the future of humans in the AI world is very straight forward - since AI learns from "human knowledge base" it will be a human on steroids. Humans always kill other forms of life, hence, AI will get rid of humans. It is very sad that the only reason for making AI in a hurry is GREED! Greed will kill all humans including the most greedy ones who are to invent AI.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz0poEcJp7abFcqDnl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZHkFWeSqgsemO9zV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwvbgOf0mh3HXOBwHh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy15ic0EjXD9eoHOVx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzki85WYPpwH6hrqvl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyS5i8-VvReq75zaS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwKIR-AixG2aaJyUoR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyw40DC-hWkWoomqTN4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzWqusboP8AoJJnr_Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwZjHfU67bka2pv7ox4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
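The lookup-by-comment-ID view above can be sketched in a few lines: parse the raw batch response as JSON and index the coding records by their `id` field. This is a minimal sketch, not the app's actual implementation; `index_by_comment_id` is a hypothetical helper name, and the two records are excerpted from the batch response shown above.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (two records excerpted from the batch shown above).
raw_response = """
[
  {"id": "ytc_UgwKIR-AixG2aaJyUoR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzki85WYPpwH6hrqvl4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index the coding records by comment ID.

    Hypothetical helper: assumes the model returned well-formed JSON with an
    "id" key on every record, as in the response above.
    """
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_UgwKIR-AixG2aaJyUoR4AaABAg"]
print(record["policy"], record["emotion"])
```

In practice the parse step is the fragile part: models sometimes emit trailing commas or prose around the array, so a production version would validate the JSON and fall back gracefully rather than assume `json.loads` succeeds.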