Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "That's a thoughtful point! The dialogue highlights the balance between AI and hu…" (`ytr_Ugwq2LS4-…`)
- "@thewannabecritic7490 that if people are using AI the wrong way, Because I do…" (`ytr_Ugzgs_fF0…`)
- "you put sm effort into making ur videos like finding certain ai generated emotio…" (`ytc_UgwwA4G4H…`)
- "Good thinking. Let's distract everyone with sex-bots, to distract from all the c…" (`rdc_lzdubxi`)
- "There is no Problem with AI as long as it can't move about eg ROBOT MACHINE don'…" (`ytc_Ugx9yb0vx…`)
- "The great question really would be - “ aligned to whose set of values “ ? Misali…" (`ytc_UgyYbhXni…`)
- "hey just an fyi, youtube is also training ai on the videos, right now it's fairl…" (`ytc_UgxfoGb-F…`)
- "You train an AI on human natural language artifacts and it will act like it has …" (`ytc_Ugx218Xke…`)
Comment

> Another example of people just not understanding what AI is and is not. AI isn't actually intelligent. It just combs through text it is given and parrots back similar results to those text prompts. It doesn't know if what it's saying is factually correct or not, because again, it's not REALLY intelligent. It's essentially just a very fast, very efficient word scrambler.

youtube · AI Responsibility · 2023-06-10T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwlpFnT5aSC9PGKK014AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy90Q7xEliGln9zI4x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxoYm-F6Qz_XGbBbth4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCvEvWEA-cd-VVpOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyV_hd1X-geGsJVh1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz98ufjdbhtvqRoGUF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxhTAUM-dfk8Xa3aFV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyFA6gxHUutcx_0byV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxEq6OPobdTWDJbrYJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw5t1FfdHxIe5Z47m54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
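A response of this shape can be turned back into per-comment codes with a small parser. The sketch below is an assumption about how such a pipeline might consume the output, not the tool's actual code: it assumes the model returns a JSON array of objects, each carrying the comment `id` plus the four dimensions from the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`), and it rejects rows with missing or extra keys. The function and variable names are hypothetical.

```python
import json

# Keys every coded row must carry, per the dimensions shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions, rejecting malformed rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    coded = {}
    for row in rows:
        if set(row) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys in row: {sorted(row)}")
        coded[row["id"]] = {k: row[k] for k in EXPECTED_KEYS - {"id"}}
    return coded


# Usage with a made-up comment ID (not one from the sample above):
raw = (
    '[{"id":"ytc_example1","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)
codes = parse_coding_response(raw)
print(codes["ytc_example1"]["emotion"])  # resignation
```

Indexing by ID makes it cheap to join the codes back onto the original comments and to detect rows the model dropped or duplicated.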