Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We are so early in AI that pontificating about bubbles is a fouls journey. Ke…" (ytc_UgwT6N46f…)
- "AI Superintelligence is the Flying Cars of our era. I can already see Nick Fuent…" (ytc_UgyZ3A03B…)
- "I've worked in the chemical manufacturing field for years, now ending up as a wa…" (rdc_nnrrgdu)
- "I try this and chatgpt choose 27 number and then siad see you after 27 days 😂😂😂😅…" (ytc_Ugzjibskf…)
- "Back in the 90's, I remember the government had a big anti-trust trial against M…" (ytc_Ugzc1wwhk…)
- "My previous short was what google ai thinks Skibidi means, now this Google Ai i…" (ytc_UgzZO6RI7…)
- "It already started a while ago. The app is known to be a really bad language lea…" (rdc_mpkwy5v)
- "This man does not know what consciuosness is. So he partecipates ai business. An…" (ytc_UgygA8za1…)
Comment
Whenever I see things like this about AI I’m always reminded of Jeff Goldblum’s line in Jurassic Park “your scientist were so preoccupied with whether they could, they didn’t stop to think if they should”. I feel that applies to a lot of these AI tech companies. They’re not stopping to consider the ethics of what they’re doing. They’re just wanting to beat everyone else and make as much money as they can. If a scenario like this actually happens, it’s humans that are to blame. We’re simply destroying ourselves all for the sake of greed.
youtube · AI Governance · 2025-08-03T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJB9Rwrsz9z4sL6FV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyEavv4zqoAvqo8Lix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgziXELfrM41rqznxQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_dFmtxXsPDxho0ad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyaL_9AL0otQ2rBBl94AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx971WEPruExrKiqEB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzvBcee06za7aPJ03d4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRpwpL_vXULQuduZ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBOiAqjEuX9-iL47Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzaigWlU3jEhZIWyEJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
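The "look up by comment ID" step above amounts to parsing the raw model output and indexing the array by its `id` field. A minimal sketch in Python, assuming the response is a valid JSON array of per-comment coding objects as shown (only two entries from the batch are reproduced here for brevity):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (abridged to two entries from the batch above).
raw_response = '''[
{"id":"ytc_UgzRpwpL_vXULQuduZ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzaigWlU3jEhZIWyEJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Index the parsed array by comment ID for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the coding for a specific comment.
coding = codings["ytc_UgzRpwpL_vXULQuduZ54AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate fear
```

In practice the stored response may need validation (malformed JSON, missing or duplicate IDs) before indexing; the sketch assumes a clean array.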