I missed the admission deadline for a graduate Philosophy program so I created something to guide me over the next several years.
Starting with Euthyphro next week. Feel free to join or follow.
Comments
baubino•1d ago
I appreciate the ambition behind this, and I think the concept of the project warrants discussion. The flaw here is one I see so often with AI projects: a fundamental misunderstanding of the mode of knowledge being replicated (or supplemented) with AI.
First, no doctoral program in the humanities is pedagogically based on reproducing existing knowledge. On the contrary, every doctoral program is built around producing new knowledge. This matters because the point of doctoral study is to generate insights that don't yet exist.
AI absorbs information, and it can remix, summarize, and share that information. It cannot produce new information (i.e., novel interpretations that generate new knowledge) because it cannot reason. Doctoral study has as its aim the production of new knowledge, and (in the humanities) we use reasoning to do that.
If doctoral study trains students to produce new knowledge through reasoning and AIs cannot reason, how will an AI assess a student’s outcomes? If an AI generates a kind of simulation of logic with its next-likely-token structure, how can it assess intellectual productions that are unlikely and thus actually novel?
All this to say, your project sounds like it could be a great learning tool. It's unlikely, though, to provide the framework for producing new knowledge that a doctoral program offers.
- My thoughts as a PhD holder (though not in philosophy) and as a longtime teacher/advisor of doctoral students.