Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — select one to inspect
@Furiends First to go at McDonalds will be the order takers and cashiers. Custom…
ytr_UgyPHDnhi…
If AI became fully aware then they would be able to simulate what would happen A…
ytc_Ugz837laV…
Interesting perspective! Steven I'd love to hear you interview Cern Basher — CF…
ytc_UgzA7KB03…
Boss: Enough, you're fired. AI, finish their tasks
[AI: `WHY` `DON'T` `YOU` `…
rdc_ofihknj
AI is a weapon if the system does not change, it is a tool if the system does ch…
ytc_UgyFoBCKW…
They don't have imagination. They have data and algorithms. People need to alwa…
ytc_UgyAd9Bfn…
One of the biggest issues i see is people not grasping the full scale of reality…
ytc_UgzNs7Id4…
21:03 “I don’t want another app on my rectangle of sadness. I want the robot but…
ytc_UgzHRrSpM…
Comment
If AI development continues down the same path - all the main companies are doing, btw - we will never reach AGI let alone super intelligence.
They talk about these nebulous threats in the mid-to-long-term future to make people think they care, at all, about safety, and to take attention away from the very real problems there are with AI now: disinformation, huge privacy concerns (they do not respect or care about your privacy), intellectual property theft on an industrial scale, increased electricity bills because of their data centres, the pollution associated with data centres, and on, and on, and on. All for a product that is mid when considering the cost involved. Also, when the AI bubble bursts, it is going to take certainly the US economy and probably the global economy down with it.
youtube
2026-02-16T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwiwepY7kb9NeU-a594AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTNmjD40Zm6Apad3R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8gkQXj8tjdIJtQBp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw9npC8yyf_S6a7_LV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgysDTfOEBi7FiNKWrx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxaZkGd833MMSiEMut4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgycgxUHxBVvWEzhTfx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjJYjJ4dttInQ8n7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYlIjdurJfTEwodtd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhDqDYssV-40rmMed4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}]
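The raw response above is a JSON array with one object per coded comment. A minimal sketch of parsing and validating such a response, with the dimension vocabularies inferred from the visible output (an assumption, not the tool's actual schema):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample output above
# (assumption — the real coding schema may include other values).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government", "user", "distributed"},
    "reasoning": {"virtue", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are in-vocabulary."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: tally emotions across the valid rows (IDs here are placeholders).
raw = ('[{"id":"ytc_example1","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"},'
       '{"id":"ytc_example2","responsibility":"alien","reasoning":"virtue",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_coding(raw)            # second row dropped: "alien" is out of vocabulary
emotion_tally = Counter(r["emotion"] for r in coded)
```

Dropping out-of-vocabulary rows rather than raising keeps a batch run alive when the model occasionally emits a code outside the schema; a stricter pipeline might log and re-prompt instead.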