I love your enthusiasm! But as someone who works in semiconductor development, I feel like it's time to abandon this branch of the technology tree for now. Maybe I'm just disheartened from the PhD stress, but where does it really lead right now? Chasing Moore's Law currently just seems to promise higher efficiency and lower electricity demand, which IMO is mainly a greenwashing attempt: lower-resolution technologies are more energy and resource efficient once you account for resource demand during production, and higher device density usually means more transistors operated in parallel, so while a single FET is more efficient in dynamic operation, the whole chip can have much higher leakage. At the same time, that efficiency is used as justification to simply increase the compute load without asking whether it's useful (e.g. LLMs). Resources might be better allocated to More-than-Moore/architectural approaches, e.g. neuromorphic computing, to actually reduce the immense AI computing load coming up.
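To make the leakage point a bit more concrete, here's a toy back-of-envelope sketch. All numbers are made up for illustration (not from any real process node): the per-FET dynamic switching energy halves at the "new" node, per-FET leakage drops a little, but the transistor count quadruples, so total chip power and total leakage still go up.

```python
def chip_power(n_transistors, e_dyn_per_switch, f_hz, activity, p_leak_per_fet):
    """Total chip power (W) = dynamic switching power + static leakage.

    dynamic = N * activity factor * energy per switch (J) * clock (Hz)
    static  = N * leakage per transistor (W)
    """
    dynamic = n_transistors * activity * e_dyn_per_switch * f_hz
    static = n_transistors * p_leak_per_fet
    return dynamic, static

# "Old" node: fewer, individually less efficient transistors
old_dyn, old_leak = chip_power(
    n_transistors=1e9, e_dyn_per_switch=1e-16,  # 0.1 fJ per switch
    f_hz=2e9, activity=0.1, p_leak_per_fet=1e-9)

# "New" node: per-FET dynamic energy halved, per-FET leakage 20% lower,
# but 4x the transistor count on the same die area
new_dyn, new_leak = chip_power(
    n_transistors=4e9, e_dyn_per_switch=5e-17,  # 0.05 fJ per switch
    f_hz=2e9, activity=0.1, p_leak_per_fet=0.8e-9)

print(f"old chip: {old_dyn + old_leak:.1f} W (leakage {old_leak:.1f} W)")
print(f"new chip: {new_dyn + new_leak:.1f} W (leakage {new_leak:.1f} W)")
```

With these (invented) numbers the single transistor got twice as efficient, yet total chip power roughly doubles and absolute leakage more than triples — which is the whole "per-device efficiency vs. whole-chip power" point.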
Sorry for the rant, I think I gotta quit my job.