Computer science is a branch of mathematics; none of the mathematics involved would have surprised Euclid or Archimedes.
However, one has to wonder how the Pythagoreans would have reacted to Gödel's incompleteness theorem, given their reaction to the existence of irrational numbers.
Then came the work of Claude Shannon, Alan Turing, Alonzo Church, and Haskell Curry.
And then? What happened to the development of theory?
Circa 2200 years ago: the Antikythera Mechanism, an amazing analog computing device used to derive astronomical information
Early 17th century: Pascal's calculator, precursor to centuries of mechanical calculators.
Early 19th century: Jacquard Loom
Charles Babbage: his Difference Engine and his Analytical Engine — which was the first "Turing Complete" computer design
The Bombe. Specialized device for cryptanalysis.
Konrad Zuse: the Z3 (first programmable and "functionally Turing Complete" computer, but not Turing-complete in a practical sense since it didn't have a conditional branch)
Atanasoff-Berry device: not programmable, but a fully electronic digital device capable of some forms of computation (not Turing-complete)
Colossus: programmable (but by switches, not stored program) electronic computer
The fundamental notions are state and functions that transform state.
That is, computer science boils down to a single problem: that of working with recursive functions of the form $s_{t+1} = m(s_t)$
Or, if you are willing to concede that there is a real world outside your machine, with independent input state, and that it might also be affected by your machine's output, then $(s_{t+1},o_{t+1}) = m(s_t,i_t)$
Since this is the real world, the relationship between $i_{t+1}$ and $o_{t+1}$ is not a given in the general case...
Each world defines its own relationship between the two. So maybe we should create a $w$ function...?
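As a purely illustrative sketch of that formulation (not a canonical one), here is how the pieces might be wired together in Python; the names `machine` and `world`, and the toy numeric state, are hypothetical stand-ins for $m$, $w$, $s_t$, $i_t$, and $o_t$.

```python
# Minimal sketch of the state-machine view: the machine m maps
# (state, input) -> (next state, output), and a hypothetical world
# function w feeds the machine's output back in as the next input.

def machine(state, inp):
    """m: (s_t, i_t) -> (s_{t+1}, o_{t+1}). Here just a toy counter."""
    next_state = state + inp
    output = next_state * 2
    return next_state, output

def world(output):
    """w: o_{t+1} -> i_{t+1}. The 'world' reacting to the machine's output."""
    return 1 if output < 100 else 0

def run(steps=10):
    state, inp = 0, 1                      # s_0 and i_0
    for t in range(steps):
        state, out = machine(state, inp)   # (s_{t+1}, o_{t+1}) = m(s_t, i_t)
        inp = world(out)                   # i_{t+1} = w(o_{t+1})
        print(f"t={t}: s={state}, o={out}, next i={inp}")

if __name__ == "__main__":
    run()
```

The only point of the sketch is that once input and output exist, closing the loop requires some model of how the world turns output back into input, which is exactly the question about a $w$ function above.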
Good article on the philosophy of "intellectual property" here
Some of the odder things that have been patented: Weird Patents
"To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;" (U.S. Constitution, Article I, Section 8)
However, copyright becomes a much more tenuous concept once you start working with bits. Bits carry no natural encoding, and non-computer scientists rarely work with bits directly.
It takes a reasonably sophisticated understanding of computer science and information theory to see that encodings made with bits are completely arbitrary, and that tools that "render" a given encoding are not using some supernatural understanding of the bits to produce that rendering (see the small sketch below).
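To make that arbitrariness concrete, here is a small Python sketch (the byte values are just a made-up example): the very same bit pattern "renders" as a large integer, as text, or as a pair of 16-bit numbers, depending entirely on which interpretation the reader chooses to apply.

```python
# The same bits, "rendered" three different ways. Nothing about the
# bytes themselves prefers one interpretation over another.
import struct

raw = bytes([0x48, 0x69, 0x21, 0x00])   # an arbitrary example bit pattern

as_int    = int.from_bytes(raw, "big")   # one big unsigned integer
as_text   = raw[:3].decode("latin-1")    # "Hi!" under a text encoding
as_shorts = struct.unpack("<HH", raw)    # two little-endian 16-bit integers

print(as_int)     # 1214849280
print(as_text)    # Hi!
print(as_shorts)  # (26952, 33)
```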
Since bits are really, really easy to copy, people who feel that they have a "copyright" on those bits often feel they need to be able to "manage" these digital "rights".
Additional "traditional safety valves" are "fair use" and the idea of "public domain". The Information Commons: A Public Policy Report
Even the word 'patent' is quite interesting
Acceptance of the idea of hardware patents is fairly common globally, though not universal
The concept of a 'software patent' is not globally recognized.