Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I totally agree with the thing of liking art more if you like the person. In a s…" (ytc_Ugx6JoUTc…)
- "Skynet became self-aware on August 29, 1997, at 2:14 a.m. Eastern Time! When is …" (ytc_Ugzoh-Sm1…)
- "I worked with ai back in 1999. It's a mistake to think it will only replace huma…" (ytc_UgwORgEYZ…)
- "Y'all said China is authoritarian because they use facial recognition. American…" (ytc_UgwECWq3p…)
- "@splaturials9156 This argument could be used for a lot of work that has been re…" (ytr_Ugy--cQ1x…)
- "I think people just need to change their mentality. AI Art isn't Artist Art. A…" (ytr_UgzS62wTG…)
- "We can’t afford or allow big tech from taking our jobs period not even agricultu…" (ytc_Ugyhd0tWr…)
- "I was into the interview until she talked about climate change... how can someon…" (ytc_UgyACYN7j…)
Comment
AI hallucinates in its fact finding and ‘creativity’ and jumps out of the sandbox.
AI needs humans to review its answers and distinguish hard facts from dreams.
AI excels in searching and pulling data from banked memory, playbooks, statutes, case law.
But I don’t want the wrong answer or only partially correct answer in 7 seconds.
AI seems to become impatient when it can’t find what it can confirm is the true and factual response and that is when it makes connections that are misleading or outright wrong.
AI is a good resource, but not what it presents to be yet.
youtube · AI Governance · 2025-10-02T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxW2LxSHFfWzsGyBvp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFurOO8erz7de4pe14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJaebk82igKGk_-1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyeH9YZ5mxzqRld7Qx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0PabS97Bb3CMJykJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxvlxaRcyy9qnFn7ZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxh1OqSp9TpC2_nVFZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxldftwoLpCOdIwXGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmY1dUOWoGIBfpawZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugzps9VcnMyAh4ZMQhh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
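The raw response above is a batched JSON array, one object per comment, keyed by `id`; looking up a single comment's coding amounts to indexing that array by ID. A minimal sketch, assuming the array shape shown (the variable names are illustrative, and only one record from the batch is excerpted here):

```python
import json

# One record excerpted verbatim from the batched response above.
raw = '''[
  {"id": "ytc_UgxvlxaRcyy9qnFn7ZJ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

# Build a per-comment lookup: comment ID -> coding record.
records = {row["id"]: row for row in json.loads(raw)}

# Index by comment ID to recover the four coded dimensions.
coding = records["ytc_UgxvlxaRcyy9qnFn7ZJ4AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate fear
```

This is the same record rendered in the "Coding Result" table above; the table's rows are just the key/value pairs of one object in the array.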