Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
bro Nividia replaces you in seconds WSJ. EVERYONE.
No one's regulating AI. Even if an AI company guarantees it, it's not in their interest to power down their AI.
They'll run your life. Either you need a pretty investment portfolio to retire or work in craftmanship that requires manual dexterity (which'll also be replaced.)
Anything you touch, see, hear and smell is easily generated by AI.
I know 2030 is not gonna be a happy place.
The suicide rates are gonna be high.
Nividia uses AI to prompt another AI. Which means a human prompt engineers will no longer be needed.
Let's just wish our kids have an happier life ahead.. hope for the best y'all.
Soon AI cops controlled by government will show 0 emotions to catch, arrest & kill you.
A future abandoning humans over AI.
Everyone thinks today won't be that day. But someday soon.
Source: youtube · Video: "Viral AI Reaction" · Posted: 2023-06-04T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzTxWe3f9_PwfVSkzd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8hgfAAAmzWj4-dGJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugys8WwHYwwgesRu57x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzeJK8D12cjwuvdOKh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgywIwFD-IMSGy5iWFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJxQeBINYI6BJJhK94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxf5aUL_Ry7JVRhKKR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzIpaj8BfNQh6ysEq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzi0g0Lh81A8Q0b-il4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDsbO9DPGliMvkRoF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
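The raw response above is a JSON array of coding records keyed by comment ID. A minimal sketch of how such a payload could be parsed, validated, and indexed for lookup — note that the allowed category sets below are inferred only from the values visible on this page, not from the project's actual codebook:

```python
import json

# Allowed values per coding dimension — an assumption inferred from the
# values seen in this page's output, not the project's real codebook.
SCHEMA = {
    "responsibility": {"none", "government", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

# One record from the raw LLM response above, used as sample input.
raw = '''[
  {"id": "ytc_Ugxf5aUL_Ry7JVRhKKR4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]'''


def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index validated codings by comment ID."""
    codings = {}
    for record in json.loads(raw_json):
        # Reject any value outside the (assumed) category set for its dimension.
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(
                    f"{record['id']}: unexpected {dim} value {record.get(dim)!r}"
                )
        codings[record["id"]] = record
    return codings


coded = index_codings(raw)
print(coded["ytc_Ugxf5aUL_Ry7JVRhKKR4AaABAg"]["emotion"])  # fear
```

Validating each record against a fixed category set at parse time catches the common failure mode of LLM coders drifting off-schema (inventing new labels) before the codes reach analysis.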