Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Altman is intentionally downplaying the potential of AGI. Comparing gpt-4 to AG…" (ytc_Ugxz9NZDl…)
- "If AI takes over and is benevolent, we will live like our pets do now. We'll go …" (ytc_UgyfarFlV…)
- "Anyone ele already ee this scenario coming : a worker ai is suddenly considered …" (ytc_Ugwyzfc4X…)
- "The trajectory of technological advancement points toward an inevitable integrat…" (ytc_UgwkB5nMU…)
- "Ai can't do anything without command what you give....humans always confuse what…" (ytc_UgzvVHZZO…)
- "So, nationalise all these AI companies? The party of small government wants to h…" (rdc_o792oiw)
- "AI is winning art competitions and helping people create like never before. We a…" (ytr_UgywgP5ar…)
- "Not keeping up with all other countries on the development of AI sets us up to b…" (ytc_UgxBhPJop…)
Comment
I disagree with Neil on the AI job replacement argument. Unlike the transition from horse to automobile, where the loss of jobs was offset by new ones, this is the first major leap in human evolution that is not designed for our convenience or to expand trade networks, but is focused entirely on streamlining and eliminating jobs by mega-corporations. I agree that AI has its place and will be extremely beneficial to society, but to pretend that the jobs lost will be offset by new opportunities, while struggling to provide tangible examples, speaks volumes. Population decline, as a result of economic hardships, may ironically offset some of the impact down the line, but the economic crash is coming faster than anyone cares to admit.
youtube · AI Moral Status · 2025-07-24T11:0… · ♥ 224
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxrWJh1jzZmkAe_BLx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzO4z7a2S9DzFtR3dt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx-RSTR2D19kXjDDIN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwc9PXN4Doaw_3Zo7R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLbO-GBr3uQyy7opd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw4sCR-7hBpryYjGjp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzlVJChXy3nmWCWdwR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwyFM97rq2IynGXyLt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyMRbwjTdEB8aoSu1B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyd2mgM-HDjzuilk194AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```
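The raw response is a JSON array of coded comments, each carrying an `id` plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a lookup-by-comment-ID table, with basic label validation, might look like the following. The function name `parse_llm_codes` and the `ALLOWED` label sets are assumptions inferred from the values visible on this page; the real codebook may include labels not seen here.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# This is an assumption -- the actual codebook may define more values.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_llm_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, dropping records with out-of-set labels."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        dims = {k: rec.get(k) for k in ALLOWED}
        # Keep only records whose every label is in the allowed set.
        if cid and all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_abc","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_llm_codes(raw)
print(codes["ytc_abc"]["policy"])  # regulate
```

With such a table in hand, "look up by comment ID" is a plain dict access, and malformed or off-codebook records from the model are silently filtered rather than crashing the page.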