[Candy, thread started around July 5, 1993]
Hello All -
I am a beginning (returning) math student, and am currently enrolled in
"Basic Math." A lot has been added to math since I saw it last.
I would like to know why "0" and "1" are so special. These two
numbers seem to be included in the definition of most "properties"
that we have covered so far, and they just seem to be more "powerful"
than the other numbers. They seem to come up more often in general.
This coupled with the fact that computers run off of 0's and 1's - and
no other numbers - has my curiosity running very high.
Can someone tell me if 0 and 1 really are more significant than the
other numbers? Can it be explained in terms a beginner would
understand? Or am I just reading more into 0 and 1 than there really
is?
Thanks - C.F.
------------------------------------------------
[To Candy]
1. You read more into them than there is. 0 and 1 are like "yes" and
"no". You know those games where, after a small number of yes/no
questions, one can work out someone's profession or the like. Or try
this one: think of a number, then tell a friend "Guess what number I
am thinking of - but you may only ask: is it bigger than ...?" After
very few questions your friend will find the number.
2. As you see, however, there is more in them than it seems. A great
deal of information can be stored in sequences of 0s and 1s. Any word
of our language can be translated into such a sequence: for example,
let 00000 stand for a, 00001 for b, 00010 for c, ..., and you can
write any character (a, ..., z, space) as a sequence of quintuples of
0s and 1s. Hence you can encode any book ever written in this way.
In the genetic code our entire biological program is contained in
sequences of four letters G,A,T,C. Write 00 instead of G, 01 instead
of A, 10 instead of T, 11 instead of C, and you can write the genome
with 0 and 1.
------------------------------------------------
[Candy]
Josef -
Thank you for your explanation. I have printed it (and G.B.'s also)
because I need more time to ponder what you both have told me. The
"ask a friend" game you mentioned helps a lot. Thank you -
C.F.
------------------------------------------------
[Candy]
Hi S. -
I see - computers only want opposites to work with ("yes" or "no,"
"true" or "false," "on" or "off") to reach their goal, like a process
of elimination maybe? Your examples about other numbers being special
(2 always makes even numbers, etc....) will give me LOTS to think
about. BTW, I started math studies out of a desire to learn to
program, but one must complete courses up to and including trigonometry
before one can get into the first programming class. Your electronics
class sounds like something I'd be really interested in.
Regards - C.F.
------------------------------------------------
[Candy]
Hi S. -
Boy, you're not kidding about the Compuserve bills!
Yes, I have the desire to learn the math I need. I finally realized
it runs our world. When I was a kid, I had no idea why we were doing
"arithmetic." I really thought THEY were just running us through our
paces just to make us work, those mean nuns. I'm off to a late
start with math, but it's a much better one than before, for sure.
C. 00111000
P.S. - I saved your name & user I.D. - Thanks - P.P.S. - I know, too,
that math is not needed to learn computers, that one can teach
oneself. But I want a formal education in math anyway because the
options seem to be much greater with this skill.
-C.F.-
------------------------------------------------
[Candy]
Hi T. -
You must have been reading my mind - that was going to be my next
question to S., about which two digits could be used to run a
computer.
I thought I was sneaking up on some great metaphysical truth about 0 &
1. Oh well....back to arithmetic....
C.
------------------------------------------------
[To Candy]
I think you are becoming confused. One has to distinguish between 0/1
= no/yes and the neutral elements of operations (as some other helper
remarked). In arithmetic this second role is more important. For
example, you want that for any two numbers a and b the difference a-b
is also a number. But what if a = b? Then you need 0 (and for b>a you
need negative numbers, another matter). On the other hand, 0 then
allows you to decide whether two numbers are equal: you ask "is a =
b?", and the answer is "yes if and only if a-b = 0!". Similarly for 1:
what is the number which, multiplied by any a, always gives a? It is 1.
------------------------------------------------
[Candy]
You are right, Josef. I may have asked for more than I can handle
here. At least I have learned that there is not a relationship
between the 0 and 1 in mathematics and the 0 and 1 used in computers.
Your latest response suggests that 0 (when it is the minuend)
designates equality between two other numbers, and that 1 multiplied
by any positive whole number (except 0) equals that same number, and
that these arithmetic operations are not considered by a computer
using 0 and 1 for its purposes. I hope I am understanding this
correctly.
C. F.
------------------------------------------------
[To Candy]
Also 1 times 0 gives 0 (sorry). Of course the computer also takes
account of the arithmetic properties of 1 and 0. Your question is very
deep and therefore has many answers. One main point could be the
following, which is also important in computer science: in every
computer language you can immediately do arithmetic operations, such
as addition and multiplication. On the other hand, you may use
impressive software such as big word processors or databases. And one
could believe that this second type of software is much more difficult
to program than mathematical software. But this is only appearance,
because you don't see the work hidden in the arithmetic operations
already embodied in the language. For example, you can usually perform
integer arithmetic on numbers as big as 2 billion, say; but what if
you want to multiply (or divide!) numbers with thousands of digits
each? Then you have to redo everything from scratch, and you will
discover that programs for ordinary arithmetic (even ones that only
emulate what you are accustomed to doing with pencil and paper) are
among the most difficult programs in existence. This means that even
such simple mathematics is very rich in structure, and so contains
many different ramifications of the questions you posed. So 0 and 1
appear in many different guises, and the tool (or glue?) which allows
you at any moment to pass from one interpretation to another is
mathematics! There are no absolute mathematical statements, in some
sense; rather, the content of any mathematical theory is the logical
connections between statements, and by studying these connections you
can go back and forth between the different interpretations of the
"same" thing.
------------------------------------------------
Josef
I greatly appreciate your efforts in helping me understand the many
places that these two numbers can occupy, and the many different ways
they can be interpreted. I knew my question was "deep" but I also
know that my level of understanding right now is quite limited. Your
comments about the big, "impressive software" are very interesting. I
have a renewed respect for my $25.00 pre-algebra tutor. I have
learned that math is not as rigid as I thought it was, but I also
think mathematics is probably one of the only ways that some truths -
whatever they are - can be expressed. Your last response contains a
great deal to think about - I have printed it and put it into my five
pound "Arithmetic" book.
Candy
------------------------------------------------
[Kevin]
Candy,
Please don't "keep your imagination in check." Math is the
product of imagination. Math should be broadening your view, not
narrowing it. If something stimulates your imagination then let it.
You can't just check your imagination before entering math, then get
it back four years later when you get your degree.
One and zero are of special interest. Every number is
divisible by 1. Every number multiplied by zero is zero. Pay
attention to how zero and one are different from other numbers. These
differences will continue to appear in amazing places.
I had a professor who did a study of "random" numbers
generated by various methods. He found that the numbers 0 or 1
occurred more often than other numbers. (Sorry I don't have the
reference)
Kevin
PS, you said you were interested in programming, but didn't have
enough math. Actually, programming is not all that math intensive.
Many programs are more interested in symbolic manipulation than math.
If you have a computer, you might want to get a copy of BASIC, read
the manual and start programming. The best way to start is to copy
some sample programs directly from the manual, get them running, then
see what changes you can make.
------------------------------------------------
[Candy]
Kevin -
I'm really glad you responded by telling me not to "keep my imagination
in check." Frankly, I don't really want to. I like the way you put
it - "You can't just check your imagination before entering math, then
get it back four years later when you get your degree." Your
professor would not have been able to conduct his study without the
means ( or imagination) it took to come up with the "various methods"
used to generate those random numbers. His findings are quite
interesting!
I do have BASIC. And I haven't the slightest idea what to do with it.
I am presently reading a book entitled "Absolute Beginner's Guide to
Programming" written by Greg Perry, which is excellent for beginners.
It helps you prepare to learn to program - it doesn't teach it just
yet. After summer school ends, I should have time to finish the book
and enter in one of the sample programs he has in the book.
Thanks for your input, Kevin -
Candy
------------------------------------------------
[Kevin]
Candy,
I find it helpful to copy programs from the book, compile and
run them. My biggest problem in programming is misspelling the name
of a variable. You might want to try the QBasic that comes with MS
DOS 5.0 and above. I think it is a little more user friendly than the
older compilers. If you have $50, you might look into the
QBasic development package or Visual Basic for Windows.
Again the first step is to copy programs from the book.
Figure out how to compile the program then get it running. Do you
have any friends who program? You might peek over their shoulders.
If they program in C or something, you might see about getting the
same compiler.
kd
------------------------------------------------
Kevin -
0 and 1 again?!?
I've heard the term "Boolean" before, but in reference to a way to
phrase your terms while searching a database, like Knowledge Index.
And yes, it is a COOL name! Thanks for the tip!
Candy
------------------------------------------------
[Carlos]
Candy, BOOLEAN, in programming, simply means that the variable can
have one of two possible values, either TRUE or FALSE. These are
usually represented by 0 and 1. Another interesting thing about 0 and
1: Any number raised to the 1 power will give back the same number
(e.g. 5^1=5) and any number raised to the 0 power is equal to (you
guessed it!) 1 (e.g. 5^0=1). Why this last one holds I've never
understood (or, better said, it has never been explained to me). If
you have BASIC,
by all means do copy the examples from the book. See what they do and
then start changing values to see what happens. Then, add your own
instructions!! This is most fun and enlightening. And let me know if
you need any help... Regards, Carlos G. (YV)
------------------------------------------------
Candy,
I may have some old BASIC programs I've written throughout the years
lying around. Anything from printing cute messages to really neat
gravity simulators. I'll let you know when and if I find them...
Carlos.
------------------------------------------------
Hi Carlos -
Right! I forgot about that one....where anything raised to the 0 power
is equal to 1. This is the strangest property of 0 of all of them.
Thanks for your offer of help with BASIC - I'm sure I'll need it!
Candy F.
------------------------------------------------
Carlos -
.....a "gravity simulator??" I see the announcements talking about
new files and I've heard that term before, but what does a "gravity
simulator" do on a computer? Well, I think I should play with the
cute messages first! Actually, I'd feel quite accomplished if I
learned how to get my computer to display something I wrote in a
program.
Candy
------------------------------------------------
[David]
Carlos: on the question of why anything to the zeroth power = 1: This
is one of those things that had to be "backfilled" to make the rest of
the system work. (In fact, much of what is taught as "foundations of
math" was actually poured in under the house of math after it was
already built and occupied! That goes especially for the whole
emphasis on 1 and 0 that Candy started this topic with. If a teacher
insists that the whole structure is built up from 0 by "successors"
and "incrementing", you should take that with a few pounds of
salt. That "foundation" was poured around 1910, long after most of the
math we normally use was invented.)
But back to zeroth power: I'll try to show it by example.
4 to the 5th power (4^5) means 4 * 4 * 4 * 4 * 4. 4 to the 3rd power
(4^3) means 4 * 4 * 4. If you divide 4^5 / 4^3 you're really saying

    4 * 4 * 4 * 4 * 4
    -----------------
        4 * 4 * 4
When you do the division, you're left with 4 * 4 on top, or 4^2. The
short-cut is to subtract the exponents when dividing the complete
terms, so that (4^5) / (4^3) = 4^ (5-3), or 4^2. If you then try
(4^5) / (4^5), subtracting the exponents gives you 4^ (5-5) = 4^0.
Since the actual fraction (numerator same as denominator) reduces to
1, the system forces us to decide that 4^0 is another way of writing
1.
------------------------------------------------
[David to Candy]
One more hint about learning: You'll probably learn programming
_much_ faster and more solidly if you start right away trying to solve
a problem or make a "software gadget" that you really need. Most
likely you won't finish the project! But when you're trying to bend
the computer to _your_ own purposes you'll get acquainted with the
parts of Basic that are suited to _your_ needs and way of
thinking, rather than just trying to memorize everything. When you
start by weaving a few things into your existing thought mechanisms,
the rest of it will fall into place much more easily.
------------------------------------------------
[William]
Candy-
I didn't see your original note but judging from the replies I have
seen, you are looking at ways to get started in programming.
As is frequently the case, your repliers seem to me to be taking for
granted that "programming" equals cooking up recipes of itty-bitty
instructions for a machine of some sort.
1 fetch something
if you came up empty, go to 2
you did get something, so do something to it
if it worked, go to 1
it didn't work, so do something
2 stop
Etc.
What gets lost is the process of thought that leads up to the
recipe.
Nowhere do you ever actually say what the program is _supposed_ to
do.
That is, _if_ you know what the program is supposed to do and _if_
the program actually does what it is supposed to do, then everything's
fine.
But _given_ such a program, it is not generally possible to figure
out what it is supposed to do and it is therefore not generally
possible to figure out if it actually does that.
To people who start out by assuming that there is no _other_ way to
"program", of course, this is "reality".
But there is an alternative.
To distinguish the two, I call the familiar step-at-a-time notion
"machine-based programming" because that is what it is.
I call the alternative "language-based programming" for the same
reason.
Language-based programming always starts out with a statement about
what you want:
I want to know how to make this gizmo go (or take me) from A to B.
Well, if there is no way in the world to make the gizmo go (or take
you) from A to B, that's the end of that story, but if there is a way
then you have to be able to break the process down into at least two
steps:
get_gizmo_go(A,B,[Action1,Action2,...]) if
makes_gizmo_go(Action1,A,X),
get_gizmo_go(X,B,[Action2,...]).
Plus, of course,
get_gizmo_go(A,A,[]).
(You don't have to do anything to make the gizmo go nowhere.)
Of course, you now have to do the same kind of analysis on the
makes_gizmo_go problem that you just did on the original problem, but
one thing already becomes clear about the difference between the
machine-based approach to programming and this language-based approach.
That is, once you have completed this analysis down to the point
where the only problems left are problems you already know how to
solve, you stuff the analysis into a computer and you pose a question
like
get_gizmo_go(home,office,How)?
and the computer comes back with
How=[left_from_driveway_to_14th,
right_on_14th_to_frisbee,
north_in_frisbee_to_mabels,...
park_the_car]
So a machine-based "program" is always the answer that could have been
derived from an appropriate analysis of the problem the "program" is
supposed to solve, given the appropriate question.
In short, you can derive a machine-based program from a general
description of a problem and a specific question, but you generally
cannot work it the other way around: given only the answer (the
"program"), it is not generally possible to reconstruct either a
description of the problem the program is supposed to solve or the
particular question which gave rise to the program.
Bill
------------------------------------------------
Let me add a more algebraic discussion to David's nice explanation of
a^0 = 1.
Let a^n denote a raised to the n-th power, for example a^3 = a*a*a
(i.e. a times a times a). Then a^(n+m) = (a^n)*(a^m), for example
a^(2+3) = a*a*a*a*a = (a*a)*(a*a*a). This holds for every real number
a and every natural number n (i.e. n=1,2,3,...) greater than zero. Of
course, for n=0, you cannot define a^0 in the same manner.
But what could a^0 be? Call it x, for the moment (and assume a is
different from 0).
Then x*a = (a^0)*(a^1) = a^(0+1) = a^1 = a, so x*a = a, and the only
possibility (if you want the same rules as before to continue to
hold) is x=1. Therefore you must define a^0=1. Now it is natural to
define also 0^0=1.
After this you discover that you can extend the definition to negative
exponents: For example 1= a^0 = a^(7-7) = (a^7)*(a^-7), therefore a
raised to the power -7 has to be the reciprocal of a raised to the
power 7 (this requires that a is different from 0). This is the
reason why in scientific publications you often find negative powers
of 10 when small numbers appear: 10 to the power -2 is 1/100, 10 to
the power -6 is 1/1000000, and so on.
The important thing is that "the mathematics continues", meaning
that the common rules continue to hold with these new
definitions. This is true for the other rules too: in a manner
analogous to the above, starting by observing that ((a^2)^3) =
(a*a)*(a*a)*(a*a) = a^6, and similarly ((a^n)^m) = a^(nm), we try to
obtain ((a^(1/2))^2) = a^1 = a, and this means that, if we put a^(1/2)
= y, then the square of y has to be equal to a, i.e. y is the square
root of a. Similarly a^(1/3) is the third root of a, i.e. that number
z whose third power gives a. Now you may calculate, for example, the
second, third, fourth, and so on, roots of 5, say. You will find that
the numbers you obtain come closer and closer to 1, whilst on the
other hand 1/2, 1/3, 1/4, ... come closer to 0. So you may say that,
as the exponent converges to 0, the powers converge to 1 - another
reason for putting a^0 = 1.
------------------------------------------------
Candy,
Why do you stick with Basic? Still, one can do some real programming
training in Basic:
1. Learn basic input/output instructions.
2. Think in subroutines.
3. Try to imagine what happens in the computer's memory.
And then: Think about going over to C! (Usually one says at this point
"to Pascal or C", but I say "to C"). Of course this depends on the
soft- and hardware you can use, but it doesn't depend on you! It's
more fun and more efficient to know what you are doing.
Josef
------------------------------------------------
[Candy]
Hello Josef -
You wonder why I stick with Basic? Well, according to much of the
advice I've been getting, it is the place for beginners to start
learning how to program. My eventual goal is to learn "C"- this seems
to be a very popular language, and used by many employers. Here at El
Camino College, one must get through Trigonometry before taking the
first course in Pascal, which is required if one wants to get either a
Certificate of Completion or an Associate of Science in computer
science. Also required is one semester in either Fortran or Assembly.
At this time I am involved with a Fortran descendant, a language called
Automatically Programmed Tool (APT) because of my job. But it can
only be used with "machine tools" in the manufacturing industry, which
I am trying to leave.
Candy
------------------------------------------------
[Candy]
William -
The book I am reading at present, "Absolute Beginner's Guide to
Programming" by Greg Perry, addresses exactly that topic in detail.
This is why I think it is such a good book. I agree completely that
one's thoughts must be organized before beginning to write a program,
and Mr. Perry demonstrates this very well using everyday activities
that we take for granted, like making a sandwich, for instance. It's
quite amazing the things we humans take for granted - the ambiguities -
that computers cannot. A sample program shows precisely every
decision and step that must be taken to make a sandwich. It is a
long, involved process!
Candy
------------------------------------------------
Hi David -
That's the part I'm looking forward to - making a "software gadget" I
really need - or just for the sake of doing it. Learning by "rote"
will be a hard habit to break, for sure, but taking math will surely
help with that - I'll HAVE to think!
Candy
------------------------------------------------
Candy,
One noted computer scientist (Dijkstra) once said something like
"those students who learn Basic as their first programming language
will be seriously brain damaged and unable to have a future in
computer science."
Note, however, that Basic has improved quite a bit from what it was
originally, when Dijkstra made his comment (although I don't know if
you're using a "new and improved" Basic or something of a more generic
"garden variety" Basic).
Gary
------------------------------------------------
Gary -
Uh-oh. I think I should find out who Dijkstra is and read more about
him. He may know something about Basic that us common folk do not
know. Hopefully he was not referring to any bad programming habits one
can incur by learning Basic first. Most people I know who use Basic
are not concerned about computer science matters, so they would not be
affected by whatever Dijkstra is referring to.
I have the "compiler" for QBasic.
Thanks for the tip - Candy
------------------------------------------------
[Bob to me]
>> Now it is natural to define also 0^0=1. <<
I disagree. 0^0, like 0/0, is clearly an indeterminate expression. Its
value can be arrived at only by noting the context in which it arises
(as a limit). It's better to avoid trivializing subtle points such as
this.
.... Bob
p.s. For an example of what I mean, try the following: let
f(x)=exp(-1/(x^2)). Let g(x)=x^N, for N>0. Then
lim(f(x),x->0)=lim(g(x),x->0)=0. But (f(x))^g(x) = exp(-x^(N-2)), so
its limit is 0 if N<2, 1/e for N=2, and 1 for N>2. All of these could
reasonably be said to be 0^0.
------------------------------------------------
Candy,
If you can choose between Fortran and assembly, take assembly! It is
not important that you actually end up developing programs in an
assembly language, but it is truly the best way to learn
programming. The reason is that programming is no mysterious priestly
art, but simply knowing what one does. If you like 0s and 1s, you will
like assembly very much. After some experience with assembly, the path
to C is smooth. Obviously! Because C is simply another, bigger and
more complete assembly language with impressive input/output and
file-handling instructions added (I use C on the Macintosh, which has
its own big software built in for all that external stuff; you would
be surprised how small and easy C becomes). When you look at program
listings, you can have the impression that Pascal and C are very much
alike, but the way they work is completely different: C is a
(sophisticated) assembly language, Pascal is a so-called "higher"
programming language (as is Basic), which hides from you what happens
in the computer. When you program in C, you keep thinking at every
moment in terms of the long thread of memory where your data lie, and
you keep seeing those data before your eyes. This prevents you from
making many errors, if you become accustomed to using your
instructions not conventionally, but always imagining what happens
inside. There are a lot of
confusing instructions in C for file and string handling; use only a
few of them - very often you can write (in one line!) your own
replacement. Look at this for example: in C a "text" and a "text
string" are exactly the same thing, simply a series of bytes in
memory. Assume
it begins at a position A. Conventionally, the text ends when a byte
containing the value 0 is first encountered. So
instead of using C's own instruction for calculating the length of the
string, you use this one:
for (X=A,n=0;*X;X++,n++);
Of course you have to declare your variables before coming to this
point: X and A are pointers to a character, n is a variable of integer
type, and the instruction means, translated to a pseudolanguage:
----------------------------------------------------
put X equal to A;
put n equal to 0;
loop: if (*X is equal to zero) goto end;
add 1 to X (i.e. put X equal to X+1);
add 1 to n;
goto loop;
end: COMMENT: now n is the length of the string beginning at A
---------------------------------------------
If X is a pointer (this is the same as an address, only that you know
also about the meaning of the piece of memory it points to), then *X
is the content of the byte indicated by X. For example, assume X has
the value 50000, then *X is the content of byte 50000 on your
computer. Therefore adding 1 to X means going to the next byte (if X
points to bytes; if X instead points to objects which are 4 bytes
long, X+1 points to the next object, i.e. to the byte 4 bytes after
the byte corresponding to X).
What does
for (A,B;C;D) P;
mean, where C is a condition and the others are instructions? It means
do A;
do B;
loop: if (C is false) goto exit;
do P;
do D;
goto loop;
exit:
Of course there is usually no "do" (you simply write A;); note the
difference between "," and ";". In place of A and B there can also be
just one instruction, or none. A and B should not interfere with each
other; although in C the "," operator does execute A before B, keeping
them independent is clearer.
After some training with assembler, you will find C easy and fun.
Exercise (sorry): Try to translate into pseudolanguage the following
program, which copies the string pointed to by A to that pointed to by
B:
for (X=A,Y=B;*X;X++,Y++) *Y=*X; *Y=0;
------------------------------------------------
[To Candy.]
Sometimes I think you are a computer.
J.
[Referring to msg. #101422, for example.]
------------------------------------------------
[Frederick to Candy]
I am familiar with Basic, Fortran, Cobol, C, Pascal, and BAL (Basic
Assembly Language, of which there are several dialects; each different
brand of computer has its own).
My experience has shown that QBasic or Compiled QuickBasic is a
good general choice for beginning or advanced programming.
BAL has proven profoundly helpful to my general knowledge of
computers. C is too casual, and Pascal too formal for me, and both
have cult followings.
Fortran and Cobol I consider archaic. Computer languages are
like different brands of anything else, you find one you like, and go
with it.
I have designed a definition for a new language that
incorporates the good points of C and Pascal into a QBasic-like
environment.
I have decided to call it Altran, short for Algorithm
Translator; maybe I'll even make a compiler/environment for it.
------------------------------------------------
[To Bob]
It is indeed a subtle point. All these rules are not natural laws;
rather, we try to extend ways of operating that we know to be
successful to more general objects, which sometimes a priori appear to
be only of an auxiliary nature, but sometimes give rise to new
powerful realms of mathematics, such as negative or complex numbers.
So in the case 0^0 = 1 we are not completely successful in this
program, for the reasons you explained (one could also consider the
sequence 1/n, which tends to 0; since 0^(1/n) = 0 for every n, one
sees that 0 is indeed a bad point for exponentiation). Mathematicians
are accustomed to removing it from the domain of definition of the
exponential function. In this sense I would agree that 0^0 = 1 is more
or less a convention for school mathematics and scientific formulae.
The case 0/0 is even worse. Assume that 0/0 is a number and that the
usual rules hold. Then
0/0 + 0/0 = (0*0 + 0*0)/(0*0) = 0/0, therefore 0/0 = 0 (as you see
after subtracting on both sides). But now 1 = 0 + 1 = 0/0 + 1/1 =
(0*1 + 0*1)/(0*1) = 0/0 = 0, a contradiction. Therefore when we try
to define 0/0 as a number, we not only lose continuity but find
ourselves violating simple arithmetic rules.
[Here I made, of course, a mistake. 0 belongs to the domain of the
exponential function; I probably meant all of it together: that 0
has no logarithm, that 0^(-1) is not defined, etc.]
-------------------------------------------------------
[To Candy.]
My opinion is, that with Basic you can do many things, but you don't
learn any deeper programming technique. At least, this is a real
danger with Basic. On the other hand, when you think about what you
are doing, then even Basic is good for learning. You will soon find
that it is easy to obtain some action on the screen, and that it is
difficult to program complicated mathematical operations. But the
programmer's participation and autonomous organization of his work is
more important than the language he uses.
------------------------------------------------
[Carlos]
Candy, A "gravity simulator" works this way: You define x number of
planets, giving them mass and velocity. You also define the law of
gravity, using the general equation for gravitational
attraction. Define a starting position for every planet. Then you
apply the gravitational equation to all planets and plot their
positions on the screen. Keep doing this, using the new positions, and
you have a fairly decent model of how planets interact with one
another. For fun, you can model the solar system, and then send a
comet flying by very close to a planet and see what happens! It's fun.
re: printing messages - Here's a very short program that prints a
message over and over:
10 FOR A=1 TO 100
20 PRINT "CANDY CAN PROGRAM!!!"
30 NEXT A
Have fun! Carlos.
------------------------------------------------
Josef-
It is good to know these things, particularly about Basic. Your
opinion coincides with that of another forum member, and both of you
seem to be very much informed on the subject.
"But the programmer's participation and autonomous
organization of
his work is more important than the language he uses."
I will keep this thought in mind as I approach Basic.
Candy
------------------------------------------------
[Candy]
Josef-
It is good to know your opinion about Fortran, Pascal, "C", Assembly,
and so on. It will be some time before I will need to choose the path
I will take with my education, but due to the responses I've been
getting here, the decisions I make will definitely be informed ones.
It looks as if I have my very first lesson in "C," right Josef? I
have printed out your message, and am not going to attempt the little
exercise at the end until I can get a full grasp of the ideas you are
trying to pass on to me. Let me study your example for a little
while, and then see if I can come close to understanding it enough to
try.
Candy
------------------------------------------------
[Candy to me]
Why do you think this?
------------------------------------------------
Frederick -
Thanks for your input. I hope the Basic Assembly Language is the same
as the "assembly" language offered at school. Well, I have plenty of
time to determine that!
You design program languages? I cannot imagine a greater
understanding of what goes on inside a computer than that.
Candy
------------------------------------------------
Tim -
You seem to be in agreement with Josef about this. At this point I
feel that I am ahead of the game because of your (collective)
feedback. Maybe I should look in on the "Religious Wars." Is BIX a
forum? Never heard of this one before.
Good news about Dijkstra - thanks.
Candy
------------------------------------------------
[Kevin]
Frederick,
I would put Visual Basic at the top of the list for new
programmers wanting to program in the Windows environment.
kd
------------------------------------------------
[Kevin]
Candy,
The biggest advantage of PASCAL is in the program structure.
You have to declare your variables at the top of the program, etc..
PASCAL was actually designed with teaching in mind. It was designed
to teach concepts used in programming. If you are planning to take
Pascal, you might purchase a copy of the text early. If you are going
to have to buy a Pascal compiler for your class, you might as well get
it now, and start with PASCAL. One problem with Pascal:
Since PASCAL is used to teach programming, you will find
Pascal used heavily in academic settings, such as in mathematics or
physics.
As for BASIC, it was designed as a language that laymen could
jump in and get simple jobs done. The Early basic compilers didn't
include such necessities as the ability to break a program down into
subroutines. BASIC is the choice of businessmen who want to do the
programming themselves. You will see it used heavily in database
packages, etc.
The QBasic compiler is much better than earlier versions. I
wouldn't hesitate to recommend it to new programmers. I would warn
you away from BasicA or GWBasic (Gee-Wiz Basic - Bill Gates's first
program).
Since the Basic learning curve is smoother, Basic is the
language of choice for many business applications. There are jobs for
Basic programmers in small businesses.
C is a lower level language. It is more powerful, but has a
higher learning curve. It is the application of choice for software
developers. A program developed in C will be smaller and faster than
those developed in BASIC.
The older versions encouraged programmers to develop bad
programming habits. The QBasic compiler is a lot better. One
suggestion is to never use the GOTO statement, always use subroutines.
kd
------------------------------------------------
Kevin -
I've heard that before about Pascal. Actually, the whole reason I
ended up back in school - in the math classes - was because I decided
I wanted to take Pascal. But the prerequisites were math, all the way
through Trig before you can get into Pascal. Knowing my math is weak,
I decided I can't lose by doing this.
I read an article about a teacher at Stanford (I forget his name) who
said he didn't think Pascal was necessary. But if it is a good
learning tool I would want to take it anyway.
Candy
------------------------------------------------
Candy -
BIX is the Byte Information eXchange, a completely different bulletin
board run (or used to be run) by BYTE magazine. You'd have to sign up,
etc. Stay here - it's bigger and easier to use.
-- Tim --
------------------------------------------------
[Tim]
Candy -
>> But the prerequisites were math <<
That's nothing but silly, and it's a carryover from the time that most
programmers were programming in FORTRAN, or came from math
backgrounds.
Requiring trig for Pascal programming is like requiring biology for
English Lit. My son, 12, writes in Visual Basic and TurboPascal - and
certainly hasn't had trig yet!
Pascal was originally a good learning tool, much better than Basic for
teaching structured programming. Borland, with TurboPascal,
popularized it and turned it into a practical language.
-- Tim --
------------------------------------------------
"BOOLEAN" is derived from George Boole, who is
considered the founder of "modern logic". Logic before Boole was
really too complex for anyone to understand -- for a view of
"pre-modern" logic, read Voltaire's Candide or listen to Leonard
Bernstein's musical version.
Margaret
------------------------------------------------
[Tom]
But what trig is good for is logical thinking - the mental gymnastics
necessary to prove a trig identity are very similar to the ones needed
to write a decent algorithm.
------------------------------------------------
[To Candy.]
I cooked a Boole for you:
not (a OR b) is the same thing as (not a) AND (not b),
not (a AND b) is the same thing as (not a) OR (not b),
not (not a) is the same thing as a.
The first two are called De Morgan's laws. George Boole (1815-1864)
and Augustus De Morgan (1806-1871) were friends (I have my books at
hand).
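Since each variable can only be true or false, the three laws can be
checked exhaustively by machine. A small Python illustration of my own
(not from Josef's message):

```python
# Check De Morgan's laws and double negation over every combination
# of truth values. Plain Python booleans; no libraries needed.
for a in (False, True):
    for b in (False, True):
        assert (not (a or b)) == ((not a) and (not b))   # De Morgan 1
        assert (not (a and b)) == ((not a) or (not b))   # De Morgan 2
    assert (not (not a)) == a                            # double negation
print("All three laws hold for every truth value.")
```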
Hi, I swear I did not want to be didactical this time, but thinking
about books it comes to me that the real founder of 0 and 1 was
Leibniz (1646-1716). He discovered that calculating with only two
digits (instead of 10, namely 0,1,...,9) should be much easier, and
this is one of the reasons why computers do use 0 and 1, because it is
easier to control 2 different states instead of 10.
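Leibniz's two-digit arithmetic is easy to watch in practice. In Python
(an illustrative aside, not part of the message above), `bin()` rewrites a
number using only 0 and 1, and `int(..., 2)` converts back:

```python
# Any number written with the ten digits 0..9 can be rewritten with
# just the two digits 0 and 1 (base 2), and recovered again.
for n in (5, 19, 1993):
    print(n, "=", bin(n)[2:], "in base 2")

# Going back is just as mechanical: each 1 contributes a power of 2.
assert int("11111001001", 2) == 1993
```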
------------------------------------------------
[Kevin]
Tom,
Trig is an interesting exercise in logic, but I see no reason
to have someone put off programming till they have a trig class.
Certainly, trigonometry is necessary before the school hands out slide
rules. I've been writing programs for businesses, and have yet to
use the SIN function.
kd
------------------------------------------------
Tom -
Granted, and not to disparage the discipline - I've had calc through
partial differentials, which isn't much compared to most of you math
jocks on here, but is a lot for a computer nerd.
But so many people - especially women - are frightened away by math
prerequisites, when in reality it's the logical thinking that is
required. What _should_ be required is something like Beginning
Boolean - leave the dreaded 'math' label off altogether. A solid
grounding in boolean algebra will go far.
-- Tim --
------------------------------------------------
[Tom]
You may have missed my point - the point is not in knowing what a sine
function is so you can stick it in an accounting program. The point
is that after solving one or two hundred trig identities (sin 54 - sin
18 = 1/2 is still my favorite) that you are able to formulate a
mathematical argument - an algorithm, if you will - with a little more
ease and precision than if you hadn't done those identities.
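For the curious, Tom's favorite identity can be checked numerically in
Python (a quick sanity check of my own, not a proof; the angles are in
degrees):

```python
import math

# sin 54 - sin 18 = 1/2, with the angles understood in degrees.
lhs = math.sin(math.radians(54)) - math.sin(math.radians(18))
assert abs(lhs - 0.5) < 1e-12
print("sin 54 - sin 18 =", lhs)
```

Proving it exactly, without a calculator, is the kind of mental gymnastics
Tom is talking about.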
------------------------------------------------
[Kevin]
Margaret,
<< Logic before Boole was really too complex for anyone to
understand. >>
I wouldn't want to lump all of the pre-modern logical systems in with
Voltaire. Candide, for that matter, was written as a parody of the
"optimistic" philosophy of Alexander Pope and Leibniz*.
It was written to demonstrate the absurdities of the philosophy which
Voltaire held in his youth.
There had been several thousand logical systems put forward
prior to Boole. Some had been models of elegance such as the work of
Euclid. Others such as the work of Hegel are extremely difficult to
decipher. Generally, pre-modern logic depends on prose to communicate
ideas; "post-modern" logic relies on the use of symbols. I think it is
a mistake to say a logical system is better because it uses symbols
instead of prose.
IMHO, many of the post-modern logical systems have achieved
degrees of excess complexity unheard of in pre-modern logic.
kd
*I am stealing from the critique of Candide put forward in "A Survey
of French Literature, Vol 1" by Morris Bishop, Harcourt Brace
Jovanovich, 1965.
------------------------------------------------
[Kevin]
Tom,
No argument from me, students are better off learning trig.
It should be a requirement. I disagree with having students put off
their first programming classes until they have had trigonometry.
Trig should not be a pre-requisite to programming. I think the
earlier we can get student's fingers on computers, the better
programmers they will become. Imagine how much more interesting trig
would be to students who are already computer literate. As they start
proving trig identities, their minds might light up and think, "hey, I
could do this with a computer!"
kd
------------------------------------------------
It is sometimes difficult to think coherently about certain arguments
with the fingers on the keyboard. I myself, for example, have no
difficulty continuing to think rather correctly about things which
may be formulated in words, while looking at a screen and typing, but
mathematical operations are complicated enough that you have to take
paper and pencil and write your calculations down. Therefore I think
one should use the computer for illustration and numerical
investigation, but real mathematical reasoning in my opinion cannot be
learned sitting before the computer. One of the questions in this
thread was: "Is mathematics necessary for programming?" By my
experience it is not necessary. I think mathematics is an important
basis for engineering, physics, chemistry, and for special tasks in
applications (say statistics) and research (say biology). As a
mathematician I would be glad if I could unite mathematics and
programming, but I have always found that they are two different types of
work. Of course one can use mathematics in programming (rather seldom,
more the training or as a source of examples) and one can use
programming in mathematics (with more effect), but they remain distant
disciplines. My answer would therefore, for the moment, be that none
of them is a prerequisite for the other, but that both, may be, are
necessary for being able to understand the evolution of human
knowledge and to take an active part in it.
------------------------------------------------
[David to Candy.]
I haven't looked at Basic Assembly Language (BAL) but have talked to a
friend about it. When I showed him some 8086 assembly code, he said
that it was almost totally different from what he had done in BAL.
------------------------------------------------
[Candy]
Hi Tim -
>>and it's a carryover from the time that most programmers were
programming in FORTRAN...<<
That does make a lot of sense. Also, I was told by a co-worker that
it also is the school's way of weeding people out of the programming
classes. This is probably true, too. But in any case, I can sure use
the math training. At my job, I use a small amount of trig to figure
angles and sides of angles, and I MUST have my trig function book.
I've just taken the last two tests in Basic Math. I must say, it was
NOT an easy class! And it's the last time I'll ever take math during
the summer. No more compressed classes for me!
Candy F.
------------------------------------------------
Tim -
I thought BIX was another forum here in Compuserve. No, I don't plan
on going anywhere else at all - I like it too much here!
Candy :-)
------------------------------------------------
[Candy]
Hi Josef -
It's good to see you smile! I enjoyed your last message.
Are you a math teacher?
------------------------------------------------
[To Candy.]
Yep.
------------------------------------------------
[Kevin]
josef,
I have been wondering about the long term effects of computers
on the discipline of mathematics. You are right, we can't do the same
type of mathematics that we did at the turn of the century on a
keyboard. On the other hand, there are things that I can do on a
computer a lot better. Traditionally, mathematics has matched the
tools available for doing math. For example, turn-of-the-century math
was heavily influenced by the slide rule. Many of the equations and
techniques common in the 50s were designed so that engineers could
reduce large quantities of information down to a few well-known
continuous functions, then look the answer up in a table.
With computers, mathematicians are no longer scared of large
sets of data. Instead of trying to find continuous functions which
approximate the data, people will be happy to work with the data
directly. The next generation will
be much more interested in seeing ideas put forward in computer code
than in symbolic logic.
In a computer age, we will see more people interested in math.
In 1930, people were happy to delegate math to a close-knit band of
human calculators. In the 21st century, we will see a lot more people
doing their own math. The next generation will not have the patience
to delegate math to professors. They will be wanting to jump into
problems with their computers, and to get things going. We will hear
a lot more people in math lectures saying "Can you repeat that? And
in ENGLISH this time!"
kd
------------------------------------------------
Kevin -
True enough. It is a rather sad commentary, isn't it? But just think
- if Pascal can be programmed mathematically (or do we say
"scientifically?") as well as otherwise, and I complete these math
courses, just think how much more use could come of it!
Candy
------------------------------------------------
[To Candy]
What do you mean? Mathematics is the science of the infinite, in a
sense I explain (because of course there is also a mathematics of
finite structures). Mathematical reasoning deals with the consequences
of simple start statements, in situations where those consequences are
not directly perceivable (so this comprises not only infinite
structures, but also finite ones, which are so complex that the full
truth about them is infinitely far away). This is not to begin a
discussion about mathematical epistemology, so I am briefly saying:
There is no mathematics of simple structures. A multiplication table
for numbers between 1 and 20 is not mathematics. Now the tasks of
programming usually are much too simple for being called mathematical
or scientific. In this sense there is no scientific or mathematical
programming. There are mathematicians and logicians who are trying to
develop general methods for program verification for example, but
usually these efforts are distant from what is needed in practical
problems. I'm sorry for this message, I'm only trying to understand in
which direction you are now going.
------------------------------------------------
Kevin,
What I meant in my first reply, was that one has to distinguish
between the "doing mathematics on a computer" and the mathematics
itself. I wanted to emphasize, especially in teaching, that what one
does sitting before the computer screen is not mathematics. It is very
difficult, if not impossible, to develop an algorithm "on the
computer", you have to elaborate the algorithm with pencil and paper
(or taking a walk) and only after this you may run it on the
computer. When you say that computers changed mathematics, this is
certainly true, but one should not forget that many classical
mathematical theories have shown their high potential in the new
applications. Examples are the use of linear algebra, homogeneous
coordinates and differential geometry in computer graphics, or the
theory of fractals, which may be considered simply as a small facet
only of the power and richness in content of complex function theory,
and where two other disciplines of once pure mathematics, namely the
theory of Hausdorff measure and Hausdorff dimension and again linear
algebra (a by no means trivial field of mathematics) emerged to public
significance. The calculus of finite differences was highly developed
already at the beginning of the century. Of course, new disciplines
arose (not always adequately appreciated by traditional
mathematicians), as fuzzy logic, neural nets, genetic algorithms,
while in statistics, for example, many theoretical points should be a
challenge even to the pure mathematician, and in probability theory a
wide gap is still open between the sophisticated techniques in
research and the applications. Turning to the theme of this
discussion, I think that in teaching one has to explain clearly this
distinction, that one has to learn mathematics and to think about
mathematics for its own sake. Mathematics is a very rich discipline
(and it is certainly a serious fault, and in some sense a social
suicide, of many professional mathematicians that they underline much
more willingly the "difficulty" of mathematics than its wealth of
content), and even someone who is only interested in its applications
can make, in my opinion, no better choice than to divide his time
strictly between the classical themes of mathematics and the
fascinating tool the computer certainly is. I much appreciate your
last statement ("repeat it in English!"). Mathematicians have to
rediscover their discipline and cultivate a willingness to communicate.
------------------------------------------------
[Kevin]
josef,
RE: << It is very difficult, if not impossible, to develop an
algorithm "on the computer." >>
This statement has an absurdist element. Programming is the
practice of writing algorithms. If you can program on a computer,
then you can, de facto, write an algorithm on a computer. I have found
that when I start programming, I will start thinking in key strokes
and computer commands. Just as, when I work in mathematics I start
thinking in symbols. I think it will be common for next generation
mathematicians to be thinking in computer languages instead of
symbolic logic when they do math.
The symbolic logic we use today evolved on the two-dimensional
surface of a chalkboard. If you watch a professor writing a proof, he
will use the two-dimensional surface to develop the logical structure
of the proof. The summation symbol and the integral symbols were
designed to be written on a two-dimensional surface. The next
generation mathematician may find it easier to think in terms of
passing parameters to a function.
I concede completely that computers are not an end all to
existence or of mathematical thought. Human thinking happens on a
different level than computers. There are many things which we can do
better with paper than with computers. Computers will not be an end
all to our existence, but they will have a profound effect on the
language and logical systems we use.
kd
------------------------------------------------
[Margaret to Kevin]
Yes, I should have said "father of modern *symbolic* logic". Being
a computer-type person, I tend to think of symbolic logic as the only
kind -- but of course, it is not. The idea of Leibniz was to
characterize all concepts with their appropriate symbols (sort of like
hieroglyphics). With this "universal language" as the basis for
communication, conflicts and misunderstandings would be avoided, and
the whole world would be in harmony.
"Modern" symbolic logic, while using symbols to represent certain
ideas, restricts the field to which these symbols are applied, thus
resulting in a simpler, although less universal, system.
Margaret
------------------------------------------------
[To Kevin]
[if p is a prime number, and i is an integer then mod(i^p,p) =
mod(i,p)]
This is the relation of Fermat and Euler. A transparent proof is the
following:
1. Let d be the greatest common divisor of the integers a and b. Then
the set of all integer linear combinations xa + yb (with x,y integers)
coincides with the set of all integer multiples kd (with k integer) of
d. This is an easy consequence of the Euclidean algorithm.
2. From this it follows that the Diophantine equation ax + by = c has
solutions if and only if d divides c.
3. In particular, if p is a prime number which does not divide a, then
the Diophantine equation ax = 1 + py always has solutions, and this
means that there exists an integer x such that ax - 1 is divisible by
p.
4. This shows that for any prime p the set of nonzero integers modulo
p (that is, the set of numbers 1,2,...,p-1 with multiplication
modulo p) becomes a group, which has, of course, p-1 elements and in
which 1 is the neutral element.
5. In group theory one shows that in every group with m elements the
m-th power of any element is equal to the neutral element. Therefore
for a prime p we obtain that the (p-1)-th power of any integer a,
which is not divisible by p, is congruent to 1 modulo p, i.e., a^(p-1)
- 1 is divisible by p. Multiplying by a we obtain that a^p - a is
divisible by p, and this of course remains true also if a itself is
divisible by p, therefore your equation holds for every integer a.
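The relation itself is also easy to test by machine. A short Python check
of my own (Python's three-argument `pow` computes modular powers
efficiently):

```python
# Check the Fermat-Euler relation mod(i^p, p) == mod(i, p) for several
# primes p and a range of integers i, negative ones included.
for p in (2, 3, 5, 7, 11, 13):
    for i in range(-20, 21):
        assert pow(i, p, p) == i % p
print("i^p is congruent to i modulo p in every tested case.")
```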
-----------------------------------------------------------------
This is the standard, mathematically most appealing proof. A direct proof
is simpler:
Let again p be a prime number and assume a is not divisible by p. Then
one shows quickly that among the numbers a,2a,3a,...,(p-1)a each
nonzero congruence class modulo p appears exactly once. Call m the
product of 1,2,3,...,p-1. Then we obtain that m is congruent to the
product of
that second series of numbers, i.e. to ma^(p-1), modulo p. This means
that there exists an integer k such that m(a^(p-1) - 1) = kp, and, by
a fundamental property of prime numbers, since p doesn't divide m, it
has to divide a^(p-1)-1.
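The key step of this direct proof, that a, 2a, ..., (p-1)a run through
every nonzero residue class exactly once, can be watched in Python (the
particular p and a below are illustrative choices of mine):

```python
# For a prime p and an integer a not divisible by p, the residues of
# a*1, a*2, ..., a*(p-1) modulo p are a permutation of 1, ..., p-1.
p, a = 13, 5
residues = sorted((k * a) % p for k in range(1, p))
assert residues == list(range(1, p))
print("a, 2a, ..., (p-1)a hit each nonzero residue once modulo", p)
```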
-------------------------------------------------------
Unfortunately
the converse is not true: Even if you can show that a number q has the
property that for every integer a we have that a^q is congruent to a
[modulo q], it doesn't follow that q is prime (composite numbers with
this property have, however, some practical importance and are called
Carmichael numbers). In fact, there is no easy criterion for
primality.
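The smallest such number is 561 = 3 * 11 * 17, and a short Python check
(my own example, not from the message above) confirms that it passes the
Fermat test despite being composite:

```python
# 561 is the smallest Carmichael number: a^561 is congruent to a
# modulo 561 for every integer a, yet 561 is not prime.
q = 561
assert q == 3 * 11 * 17                              # q is composite
assert all(pow(a, q, q) == a % q for a in range(q))  # Fermat test passes
print(q, "passes the Fermat test for every residue, yet is composite.")
```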
--------------------------------------------------------
Number theory is one of those big classical and once "pure"
mathematical theories which in the near future could assume an
important role in applied mathematics. In answer to your reply to
my message, I may remark that I referred especially to teaching, where
I find that there is a danger in having students sit before the
computer making experiments instead of thoroughly studying a
mathematical theory in depth. It may really be true, on the other
hand, that the new technologies shall deeply change our working
habits. It is a generation conflict, not between humans, but between
languages in some sense. In the transition time we have perhaps to be
cautious not to lose too much of already achieved knowledge.
------------------------------------------------
[Bill Magaletta to Kevin]
If you mean that the apparent greater ease of thinking with pencil
and paper, instead of with a keyboard, is just that - apparent - I
disagree. I have been programming for more than 25 years, and I am an
expert typist, but there is simply no comparison between typing and
being able to move around the page, scribble, cross out, invent ways
of emphasizing on the fly, and so forth. The only computer input
device that might be better than a pencil would be something that was
controlled directly by the brain.
- Bill
------------------------------------------------
[Kevin]
Bill,
At what age did you first start programming? When did you
first have an opportunity to program on a computer with a video
display? People go through different stages of development. During
my developmental stages, the main tool available for academic work was
the pencil and paper. Imagine kids growing up on video games and
computers. Will the pencil still hold the same allure? Some
psychologists claim that our personalities are pretty much set at age
five. Were you playing with a computer at age five?
The message is about generational changes. How will these
people think? Don't be surprised if it is different. Kids who grew
up after writing was invented had a different view of life than their
ancestors. Kids who grew up after the invention of the printing press
had a different view of the world. Kids who grew up with tv and radio
had a different set of values, and approaches to life than their
parents.
kd
------------------------------------------------
[Bruce F.]
Candy,
my research was in topology > algebraic topology > group actions on
manifolds > 'ultrasemifree actions and unoriented bordism' (in order
of narrowing of the field).
BASIC is not the best first language, Pascal is. and while
programming does not use a lot of typical mathematics, it is very much
in the style of mathematics. math is a study of forms, patterns,
properties, structure. it just so happens that numbers are very
useful for much of that, but math does not mean numbers. there is
much much more to math than numbers. programming relies on using
symbols to represent things that are then manipulated. you need a
good grasp of structure and patterns to do that.
------------------------------------------------
Hi Bruce!
Thanks for your feedback on Basic and Pascal. Actually, I'm glad to
hear it because for some reason I'm anxious to get on with Pascal,
which makes my math a little more motivating. Furthermore, your
description of math - that it is much more than numbers - greatly
intrigues me, and I am all the more motivated. I am discovering a
whole new world, and I really like it.
I just got back from a week's vacation - sorry about the delay!
Candy
------------------------------------------------
Hi Josef -
I just returned from a week's vacation, and just received your last
message. Please don't apologize for your message, because I, too, am
trying to understand in which direction I am now going. I was only
trying to find out what the proper term is for referring to
programming that uses mathematical formulas. It was just a question
of semantics, that's all. Right now, all I know is that I want to
develop skills in math at least as far as trigonometry, and that these
skills can be used in programming. For what purpose, I don't know,
but that's the least of my problems. Please bear with me - I'm sorry
for any confusion my own confusion has caused you!
Candy
------------------------------------------------