Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I think her book would be a good one to read (I haven’t read yet but will)… also… (ytr_UgzC87pa6…)
- New slaves business model on way (if only you knew the earth quarantine of recyc… (ytc_UgzY0Owq2…)
- I’m just saying, in Metal Gear Solid universe they made AI that could do complex… (ytc_UgxMJcv5-…)
- " Up-skilling " , lol.. just wait for self driving cars, no insurance , no cops … (ytc_Ugz9MEYY8…)
- AI already knows how stupid humans are with just plain facts. How stupid we are … (ytc_Ugy0OpRNm…)
- Currently AI could not survive if it wiped out humanity.. quantum computers tak… (ytc_UgxPB0nSB…)
- When you get ChatGPT into situations like this, you actually see an effect of wh… (ytc_UgzBxI_59…)
- What he’s probably referring to is you won’t be able to tell the difference betw… (ytc_Ugwvx75Jx…)
Comment
I remember Legal Eagle covering when lawyers tried to use the Chat GPT on official paperwork in a case. They apparently got into a whole lot of trouble, largely because the AI cited fake cases and they didn't check the AI's work and just assumed it was right. The judge looked up these cases and found they didn't exist and got quite annoyed and that's when it came out what the lawyers had done.
youtube
AI Governance
2023-07-07T03:4…
♥ 102
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyEshr6nLHwYn2q_rV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx9FnmJAZ4sKmTnM4d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHzsfTlYtq1vmqzVt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEDi3JXLXT0h6oRDp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWJS_orKsXcFLxK7t4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqRQXKta-jrkeMIal4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxsqoti4PfNruTI50d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2mN5vrpmkFluHYZF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwEXX0KjRPZQ43gXGp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGffDoPVLIsEDGpuN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
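The raw response above is a JSON array with one coding object per comment ID. A minimal sketch of how such a payload could be parsed and sanity-checked before use — note the codebook values here are inferred only from the labels visible in this sample and are assumptions, not the study's actual schema:

```python
import json

# Hypothetical codebook, inferred from values seen in this sample response;
# the real coding scheme may allow additional categories.
CODEBOOK = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that have an id
    and whose dimension values all appear in the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        has_id = bool(row.get("id"))
        in_codebook = all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items())
        if has_id and in_codebook:
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgyEshr6nLHwYn2q_rV4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1 valid row
```

Dropping malformed rows rather than raising keeps a batch usable when the model occasionally emits an off-schema label; rejected rows could instead be queued for re-coding.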