Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So.
Lawyers which didn't check their work, but others catched it.
We had luck this time.
Now imagine a World, where LLMs run wild, distributing unchecked half truths all over the place, and there is no one to catch the discrepancies, because no one RTFM of those Services.
And imagine this for topics from Gardening (who cares) to Engineers building bridges (LLMs can't do Math and Logic, except when assisted by Plugins).
Thanks Devin for telling this News Story, education in this Field is VERY important.
Source: youtube · Topic: AI Responsibility · Posted: 2023-06-10T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzYXUCtnBWxyVXXCYB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTodglXhUZG5cVVml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_FarJvPBOWX-U1ix4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugypo8vz7kNgNtZO4j14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxx4EBCsMfUgEHrIyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIsJtNXx4aGYxA6MN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWZSwWWB9t156PLyx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzq38C4NnhOCyqL_p14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-6G_qDTNkZYIsbEd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9pZC25Dj-_OJ6kSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
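The raw response above is a JSON array of per-comment records, each coding four dimensions (responsibility, reasoning, policy, emotion) for a comment ID. A minimal sketch of parsing and sanity-checking such a response is shown below; the allowed value sets are assumptions inferred from the values that appear on this page, not a confirmed codebook, and `validate_codes` is an illustrative helper, not part of the actual pipeline.

```python
import json

# Assumed value sets, inferred only from codes visible in this document.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "indifference", "approval", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and drawn from its allowed set.
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

A check like this catches the common failure modes of schema-constrained LLM output: truncated JSON (raised by `json.loads`), missing dimensions, and invented code values, before the records reach the coding table.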