About 200 years ago, Alois Senefelder invented lithography as a way to print on paper. So what does that have to do with making computer chips? He used an oily material to protect the areas where he did not want the ink to stick; the oily stuff resists the ink. Computer chips are made using photolithography, which is like lithography except that instead of a sharp point to draw with, you use light.
The first thing we need is a substrate, the material we want to draw on. It is coated with a chemical called a photoresist: something that resists light. Many chemicals can act as photoresists, and most of them are polymers. They change when exposed to light; usually they cross-link, so they don't get washed away during development (see below). Next we need something called a mask, which is like a stencil carrying the pattern we want to draw on the substrate. If we shine light through the mask, the light passes only through the clear parts of the mask and shines on the substrate, so the pattern on the mask is transferred to the substrate. Think about making shadows with your hand. To make a computer chip we need to draw very thin lines; the key is to 'draw' with very fine resolution.
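The mask-stencil idea can be sketched in a few lines of code. This is just a toy illustration of the steps above (expose through a mask, then develop), not a real process simulator; the mask pattern and the characters used are my own invention, and it assumes the cross-linking (negative) resist described in the text.

```python
# Toy sketch of negative-resist photolithography: light passes through
# the clear parts of a mask, cross-links the resist underneath, and
# development washes away everything that was NOT exposed.

# The mask: '#' blocks light, '.' is clear (a made-up example pattern).
mask = [
    "####.####",
    "####.####",
    "#.......#",
    "####.####",
    "####.####",
]

def develop(mask):
    """Return the pattern left on the substrate after exposure and development.

    With a negative resist, exposed (clear-in-mask) areas cross-link and
    survive; unexposed areas wash away, shown here as a space.
    """
    pattern = []
    for row in mask:
        # 'X' marks resist that was exposed and therefore remains.
        pattern.append("".join("X" if c == "." else " " for c in row))
    return pattern

for row in develop(mask):
    print(row)
```

Running this prints an 'X' cross where the mask was clear: the pattern on the mask has been transferred to the substrate, just like a shadow of your hand.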
Resolution is a measurement that tells us how close we can put two things together and still tell that they are not one thing. Alois used a very sharp point to draw his lithographs. Today's computers contain transistors so tiny that about a thousand of them could fit across the tip of Alois' pen. Once we are done shining the light through the mask onto the photoresist, we need to develop it. That involves other chemicals that wash away the photoresist, usually where we did not shine the light. What is left is our pattern...
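To get a feel for the "thousand transistors across a pen tip" comparison, here is a quick back-of-the-envelope calculation. The pen-tip width is my assumption for the sake of the arithmetic, not a figure from the text.

```python
# Rough feature-size estimate from the pen-tip comparison.
pen_tip_m = 0.1e-3          # assume a fine pen tip ~0.1 mm wide (my guess)
transistors_across = 1000   # "about a thousand of them", from the text

feature_m = pen_tip_m / transistors_across
print(f"~{feature_m * 1e9:.0f} nm per transistor")  # prints "~100 nm per transistor"
```

So under that assumption, each transistor would be on the order of 100 nanometers wide, which is why drawing with light (rather than any physical point) is the only way to reach such fine resolution.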