Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_Ugw4E-9Nb…`: "The thing is if AI gets rid of people in the work field then no one will be able…"
- `ytc_UgyGFnxOS…`: "Look for david Adair on youtube re AI.. he has his own and is nervous about the …"
- `ytr_Ugx9A6ued…`: "The problem is when you don't know you're coming up on a complex situation. The …"
- `ytr_UgxvT-vYS…`: "@QSuperstar888 First of all the "sitting at home", it's not because of me. It's…"
- `rdc_n3x6nra`: "Are you asking if Google still uses searches with a "-AI" parameter to train the…"
- `ytr_UgzEQz7EA…`: "It has manually teached targets the code can be optimized to speed it up its an …"
- `ytc_UgzwP68jA…`: "From the godfather of AI - In hindsight "I wish i had spent more time with my wi…"
- `rdc_o7phhjk`: "Another day, another case of "We tested the ethics and empathy of the big patter…"
Comment
> So, to protect us from AI, we're going to use AI... This is so stupid. All of your reasoning amounts to you giving up and hoping another machine could protect you from a different machine. I don't get advice from scientists without a backbone. The religious view that science will protect us is a crazy mindset. Science gives us the tool in order to understand the world to the best of our abilities to a point and that's it whether we act positive or negative is all on us. Right now we create enough food to feed over 10 billion people and yet more and more people starve to death than the previous year in the past couple decades so I don't have high hopes for this species.

youtube · AI Responsibility · 2025-05-21T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_zIuAwEeIJ9CCZ2t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBbVKGhKeO1wcFEpR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxiImBAjwf60DwG4Rd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw8_xlgX8DoJLZWj-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzhOl3L5wixbhZFVUJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzjWnbfXmRYAuVsuA54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZYWXPgP4EmlDQdqB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBLXCKixhFMMOtZNJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzVyxiK29yzuLE6YSt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw6n-BtEGrWJDbM9np4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```