Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- "If we say AGI is more intelligent with respect to humans. It seems an overstatem…" (ytc_UgxbNXMYn…)
- "honestly it was only a matter of time until something like this happened since d…" (ytc_UgxsN14dx…)
- "So if I'm understanding this right the only 5% of companies who've really succes…" (ytc_UgwMpDRIc…)
- "AI is not copyright infringement. It is fair use. You are not in the market for …" (ytc_UgxAYh4vc…)
- "Curious, like to make vocal fry. However, I cant, and play guitar. So because im…" (ytc_UgyzZioJz…)
- "Maybe an extinction level event? Put AI on the shelf for ,mmm, 500 years should …" (ytc_UgyLmNsPg…)
- "Guys, I can share with you what I tell my students! You don’t have to be top1% …" (ytc_Ugzg_Nasy…)
- "An algorithm software and artificial intelligence are completely different. The …" (ytr_UgzcUBn0y…)
Comment
The scariest thing I ever saw was two AI robots, from two different sources, placed in the same room. They were introduced to one another.
Within a minute and a half, the two developed their own language that humans could neither understand nor decode. Their conversation went on more and rapidly until the humans who created them literally pulled the plugs. No one knows what they were discussing.
Source: youtube · Topic: AI Governance · Posted: 2024-11-15T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgynfsZ9MOwaCCDGPfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwlDW5b7B66hFaztt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwoWOUKU7C3XRvrk-p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyp2E9FtT-XUugXB954AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxXN22sVoe5ZCCwFTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZRiR2smq5yNp0Tuh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5z2LkyuV05ibDcWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxxFbUhAL8Lta3O5UF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwRsaQkOHQYw_SDPGJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz5YsmK0Xzf6CYC7LV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
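The raw response is a JSON array with one object per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) shown in the result table. A minimal sketch of turning one such response into validated coding rows — the allowed value sets below are inferred from the responses shown here, not an official codebook:

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (assumption: the real codebook may contain more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding rows, rejecting off-schema values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim}={row.get(dim)!r}")
    return rows

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = parse_codings(raw)
print(rows[0]["emotion"])  # fear
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a label outside the coding scheme.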