Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I still believe if AI becomes sentient, we can work together with it rather than…" (ytc_Ugw0lpxTe…)
- "People are falling in love with chatgpt and AI is scrambling stuff on work compu…" (ytc_UgyTgCadv…)
- "So ultimately all that means that in a sense you must learn to share, or money y…" (ytc_Ugzs4sQHs…)
- "This is what happens when we don't appreciate what we have and complain, thinkin…" (ytc_UgwkcxKp2…)
- "A.I. makes mistakes. Humans make mistakes. A.I. does not sleep, does not need a …" (ytc_Ugw722LN_…)
- "Love it ❤❤❤❤❤. Put them chat gpt in the brain and make them full body size. I ne…" (ytc_UgyfChz2v…)
- "I make my own ai poison. I don't always trust the sites that add the layer for m…" (ytc_Ugy4EU7pZ…)
- "A human has to work hours/days maybe weeks to months STUDYING a style to get it …" (ytc_UgxjyEKeH…)
Comment
According to recent interviews with Mo Gawdat, we are more than a few years away from AI being more intelligent than humans. It is more like months away. And once AI replaces millions of jobs, the economy will crash and millions of people will become homeless, possibly myself included. It is a scary thing, but it makes sense why almost every movie that takes place in the 'future' is dystopian, robotic and grim.
youtube
AI Governance
2023-07-13T21:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwz3uHg9a8vJZ7ugs14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyLAAInunAx6kvHPZ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzsclS5QhS8ff4SYZt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzHGJv8RBHcGr2qARh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz3QAVwgCINwJ0Zx8V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzeE4s8kv_mhDbhCWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw6ufeTL0JGZ4RGPVZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwb3xo_Htid5kUTQ0x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxq7QwUCgdfHXFwq0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzIbKLFp4lD13xa1UZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
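The raw response above is a JSON array of per-comment records, each keyed by a `ytc_…` comment ID and carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and looked up by comment ID follows; the `SCHEMA` value sets and the `index_by_id` helper are assumptions inferred from the sample output above, not the tool's actual implementation, and the real codebook may allow other values.

```python
import json

# Hypothetical allowed values per dimension, inferred from the
# sample records above; the real codebook may differ.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

# A two-record excerpt of a raw batch response, for illustration.
raw_response = """[
  {"id":"ytc_Ugwz3uHg9a8vJZ7ugs14AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyLAAInunAx6kvHPZ94AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    dropping any record whose values fall outside SCHEMA."""
    indexed = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            indexed[rec["id"]] = rec
    return indexed

coded = index_by_id(raw_response)
print(coded["ytc_Ugwz3uHg9a8vJZ7ugs14AaABAg"]["policy"])  # regulate
```

Validating against an explicit value set before indexing means a malformed or hallucinated label from the model is skipped rather than silently stored, which is why the "Look up by comment ID" view can trust what it displays.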