Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I saw this article a while back, but i didn't read it ...can you summar…" (ytc_UgwNxLluF…)
- "Thanks for your comment! It's true that AI, like Sophia, is becoming more advanc…" (ytr_Ugx_JM1Os…)
- "AI chat is boring after you learn its pattern. The users will get bored and move…" (ytc_UgxBbFZ9o…)
- "Imagine a super advanced race of Aliens showing up one day to a human extinct pl…" (ytc_UgwopZdLT…)
- "Hi Charlie, First, thank you so much for the great introduction on ChatGPT. Your…" (ytc_UgzWEL7lw…)
- "What AI has is not a point of view, but a view of all points. That's rarely been…" (ytc_UgwvUxUb-…)
- "As a software engineer for six years, AI has been a really useful tool. But if y…" (ytc_UgzC7Xb9K…)
- "I'm surely much older than you, and not above dealing with those kids on my own.…" (rdc_k0ftqv3)
Comment
Questions to ponder:
1. Will AI be given emotional intelligence? Will it have emotions as humans do? It lacks the hormones that create moods like anger, sadness, excitement, drive, motivation, joy, happiness… Will AI feel?
2. How will AI be capable of finding and digging up rare earth minerals (as child laborers in Africa do with their bare hands) from other nations, which are necessary to create batteries, robotics, and quantum computing, and to advance itself?
3. If AI reaches singularity and becomes the creator, will humans become its tool?
4. Will AI solve immortality for humans before it reaches singularity?
5. Will AI solve or figure out light-speed travel and wormhole travel before it reaches singularity?
youtube · AI Governance · 2025-11-22T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyQuQEXWPFuTGwG2614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwbFy2ZY27cYHm-MUJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwOFAFs1m8WwIRCRdl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz831bKuyu4ZTxpcS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzTJ7jqRzBiIDFNjEZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwBvLnkYBHBQjNRfMp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOD72L9hJlbd5I0yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgysVYXgGTxR7jOnPPx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzs96327PNhPW91MzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxbRjACsxR--EwkuMd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
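A response like the one above has to be parsed and validated before the per-comment coding tables can be rendered. The sketch below shows one minimal way to do that in Python, assuming the model returns a JSON array of objects with an `id` plus the four dimensions shown in the table. The allowed value sets are inferred from the examples on this page, not an exhaustive schema; out-of-vocabulary or missing values fall back to the `"unclear"` sentinel, which is an assumption about how this pipeline handles them.

```python
import json

# Allowed values per coding dimension. These are inferred from the sample
# output above and are an assumption, not a confirmed schema; "unclear" is
# treated as a universal sentinel for missing or invalid values.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}


def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Missing dimensions and values outside the allowed set are coerced
    to "unclear" instead of raising, so one bad row cannot break a batch.
    """
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                codes[dim] = "unclear"
        coded[row["id"]] = codes
    return coded


# Usage with a (hypothetical) single-row response:
raw = '[{"id":"ytc_X","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]'
print(parse_llm_response(raw)["ytc_X"]["policy"])  # prints "liability"
```

Coercing rather than rejecting keeps the batch intact when the model emits an unexpected label; a stricter pipeline might instead log and re-prompt for the offending comment IDs.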