I'm a Computer Engineering student, and I've been asking around about this because I want to make sure I'm making the right choice before changing majors. To be clear, I don't dislike programming at all, but I've been grappling with a worry that is killing my motivation to keep learning it at a deeper level.
Now, I know my fair share of C/C++ and can handle intermediate concepts like pointers and memory management. However, I no longer have the drive to manually code entire projects from scratch.
Recently, faculty at my school have been discussing how AI is shifting the programmer's role from architect-and-builder to just architect, where the AI becomes the builder. I've already seen people demonstrating this here. For example, someone I know recently built a basic operating system (kernel/userspace separation, a scheduler, POSIX-like syscalls, etc.) by guiding Claude to write the code based on OS theory he had been studying himself. The fact that a student could pull that off with AI assistance is impressive, but it also makes me wonder the following.
What is the point of grinding to learn how to build full-blown programs manually if I can guide an AI to do it for me, provided I know the fundamentals? This has really led me to consider switching to another engineering major that is more "real world" focused, or pursuing a double major in physics/chemistry.
I love building things. The problem is that I don't see the point of teaching myself to code beyond this level if, by 2030, what it means to be a software engineer will have already changed. As far as I can see, it's happening already.
Now, I'm not trying to say that AI will replace developers entirely, or that computer-related majors are dead. But with Meta starting to change its interview process, and other companies following suit, the role these jobs used to be is shifting fast.
What we call "AI" has only been mainstream for about 3 years and is already at this level. By the time I graduate in another 3 years, these tools might handle hallucinations and edge cases much better. AI isn't a thinking thing; in the end it's essentially a predictor, and predictors can keep getting better over time.
Anyway, these are the things on my mind. I'd really appreciate hearing from people who are actually in industry or research about what they think. Thank you.