Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Altman says there are different views regarding the safety of AI systems. Well, …
ytc_UgwCJ5fxc…
"The Oreos of helping humans." As soon as the example came up that humans need t…
ytc_UgzSI7yNy…
The amount of AI slop, and the numbers of dumb people who think it's real....is …
ytc_Ugx-CcyWb…
If Tesla self driving car cause an accident then who will be responsible for oth…
ytc_UgzsuF1dQ…
I got 3 words for AI artists: “Release your prompts”. And if I see any artists n…
ytc_UgzvMXgwD…
Another day, another threat... what is it today? Ah yeah, flying AI datacentrers…
ytc_Ugw6ugvO3…
This is a life time project. I couldnt live with out it at this point, built fro…
rdc_ohus8ki
People who say they will never be replaced by AI or robots generally have no ide…
ytc_UgzO3ySwn…
Comment
AI will be the Pandora's Box of the millennium.
When AI starts showing signs of learning and demonstrating independent thought, we should start panicking.
youtube
AI Jobs
2023-05-07T16:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
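A common downstream check is validating each coded record against the set of allowed labels per dimension. A minimal sketch, assuming the allowed values are exactly the labels observed on this page (the full codebook may define more), with the helper name `validate_code` being hypothetical:

```python
# Allowed labels per dimension, as observed in the samples on this page;
# this is an assumption -- the actual codebook may include more values.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "fear"},
}

def validate_code(record: dict) -> list:
    """Return the dimension names whose value is not in the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(validate_code(record))  # → []
```

A non-empty return value flags a record whose model output drifted outside the codebook and needs re-coding or manual review.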
Raw LLM Response
```json
[
  {"id":"ytc_UgzIQAoPw_bMr8Tlj1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyORqviZ8OQjlQPljN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw58mzCd8jhG2rmLid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxOhykPPpBvdNobTAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOvn-ceHtoVXEbN_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywYOJL-0l06yz0iSN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyE3ZvWJZIfEifaa7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyqkJbXGLgJTpd7xip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTRoO7YmFg3y7RV6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxytHsR7tnDbbvebKt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
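Since the raw response is a JSON array of objects keyed by comment id, the "look up by comment ID" view above can be reproduced with a simple lookup table. A minimal sketch in Python, using one record from the batch above; the helper name `index_by_comment_id` is hypothetical:

```python
import json

# One record from the raw batch response shown above.
raw_response = '''
[
  {"id": "ytc_UgxTRoO7YmFg3y7RV6N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index the coded records by comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxTRoO7YmFg3y7RV6N4AaABAg"]["emotion"])  # → fear
```

Parsing the whole batch once and indexing by id keeps per-comment inspection O(1), which matters when paging through thousands of coded comments.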