Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Everyone needs to know about anti facial recognition glasses. They're spendy but…" (ytc_UgwzbDtPg…)
- "I ride in Uber Tesla's all the time. Never had any problem. And the drivers love…" (ytc_UgwW0SSYh…)
- "I spotted the flaw in auto driving vehicles right away, there's nobody DRIVING! …" (ytc_UgxqOTDgK…)
- "If they maintain that AI could feel love, then it could also surely feel greed, …" (ytc_UgyZw_7wD…)
- "He is going around on every media talking like this so he can promote Anthropic …" (ytc_Ugztz864r…)
- "If I want to cook my brain, I wouldn't ask AI, I'd probably go straight to inhal…" (ytc_UgznaOCku…)
- ":P. im sorry.. ai is not the problem. if you put an emotionally undeveloped bei…" (ytc_UgyZICCQU…)
- "All I saw in that automatic garage were short - low - not tall cars, I suppose t…" (ytc_Ugy4QHBO1…)
Comment
The idea that asking chatGPT why Ukraine should surrender is scary for these folks is exactly why I think they're the last people that need to be regulating it beyond passing laws that the private sector doesn't over regulate it without any risk. The US Gov in general thinks you are too stupid to ask chatGPT why Ukraine should surrender.... Mull that over for a minute.
youtube
AI Governance
2023-06-17T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwKbVr8AD6qurQpRNF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxVUKdtMhN2R1qYJx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy3KoKFTD0cTxMCpdx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyGqwiPod42Cik6WIp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4WA89hTwivJVjrA94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxovHzKmprwR3Zk5S54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzP050ma9nD4O-b9YR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6Qdv7z9kP_UILWjd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVfCehIphFHuJkQuJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxhH5sfezZNmHAf-CJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
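The batch response above can be turned into the "look up by comment ID" view with a small parsing step. Below is a minimal sketch; the function name `index_coded_comments` and the key-validation check are illustrative assumptions, not part of the tool shown.

```python
import json

# Illustrative sketch: parse a batch of coded comments from the raw LLM
# response and index them by comment ID for lookup. The sample entry below
# copies the first object from the response above.
raw_response = """
[
  {"id": "ytc_UgwKbVr8AD6qurQpRNF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the Coding Result table, plus the ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_coded_comments(raw: str) -> dict:
    """Build an id -> codes mapping, skipping malformed entries."""
    coded = {}
    for entry in json.loads(raw):
        if set(entry) == EXPECTED_KEYS:
            coded[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return coded

codes = index_coded_comments(raw_response)
print(codes["ytc_UgwKbVr8AD6qurQpRNF4AaABAg"]["policy"])  # liability
```

Validating the key set before indexing guards against the model occasionally emitting extra or missing fields, which is common when an LLM is asked to return structured JSON.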