[MUD-Dev] Moore's Law sucks (was: 3D graphics)

Brandon J. Rickman ashes at pc4.zennet.com
Fri Feb 13 19:08:13 CET 1998


On Fri, 13 Feb 1998 16:24:56, Mike Sellers <mike at online-alchemy.com> wrote:
>-- Moore's Law still rules. :)  

The tiresome Moore's Law rhetoric.  I made a feint at this topic on 
another thread (actually it might have been this one...) but Mike's
convenient rehash has given me a new opening.

Moore's Law: the computational power of computers doubles every <make
up a number between 12 and 60> months.
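
(Spelled out, the claim is plain exponential growth: power(t) =
power(0) * 2^(t/T), where T is whatever doubling period you picked
above.  Every argument built on the Law is really an argument about T.)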

Problems with Moore's Law:

"computational power" merely refers to a measure of how many operations
a chip can perform in a fixed amount of time.  The higher the MIPS
(Million Instructions Per Second), the "faster" the chip.  Like the
benchmark numbers used in computer ads, this measure is pretty meaningless
without a context.  I could design a chip that returns the value of
sin(PI) faster than every other chip, so if we compare the speed at
which my chip is able to compute a certain operation (in particular,
sin(PI)) my chip looks like a winner.  Unfortunately my chip is awfully
slow at calculating sqrt(4).  The functional power of a machine is
highly subjective and is particularly dependent on software.  You
can do less with more using Windows95, for example.
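
To make the point concrete, here is a throwaway sketch (mine, nobody's
published benchmark) of the kind of timing those ad numbers come from.
The operations and counts are arbitrary, which is exactly the problem:

    import math, time

    def benchmark(fn, arg, n=100000):
        # Time n calls of fn(arg): a "benchmark" in the ad sense.
        t0 = time.time()
        for i in range(n):
            fn(arg)
        return time.time() - t0

    print("sin(PI):  %.3f sec" % benchmark(math.sin, math.pi))
    print("sqrt(4):  %.3f sec" % benchmark(math.sqrt, 4))
    # Whichever line comes out smaller is the one the vendor quotes.
    # Neither number says how fast your mud will actually run.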

Second, the amount of computational power available on the consumer market
is far beyond what anybody could actually use.  The average non-business
computer user does not have and does not need these machines.  In fact,
most businesses don't need these machines either, but buying new 
equipment is always good when you can write off the depreciation on
your taxes (and you can't write off things like employee benefits or
long-term business planning).  The people that are actually using the
fastest available machines are usually closely tied to the computer
chip industry in the first place, like the chip designers using fast
chips to design faster chips.

On the plus side, as big businesses needlessly upgrade their machines,
the "obsolete" machines are falling into the hands of artists,
educators, and non-first world citizens.  This market is not reflected
in Intel's sales reports, and Intel has no idea what people may be
doing with those machines.

Third, designing for non-existent technology is a dumb-assed design 
constraint.

[Aside: there is the old argument that goes:
If I start computing today it will take me three years to finish.
If I start computing a year from now it will only take me one year (two
years total).
Therefore I should wait until next year.

This is a clever bit of rubbish.  If the task will take _at most_ three
years today, then there is a chance it will finish _in less than two
years_, beating the wait-a-year plan outright.]
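
Since the argument lives or dies on made-up numbers, here is a toy
calculation.  The figures are mine and as arbitrary as anyone's: a
task needing three years of computing on today's machine, speed
doubling every eighteen months, and you buy one machine and run it
to completion:

    WORK = 3.0       # years of computing on today's machine
    DOUBLING = 1.5   # years per doubling; pick your own number

    def finish(start):
        # Buy a machine `start` years from now, then run the task on it.
        speed = 2.0 ** (start / DOUBLING)  # relative to today's machine
        return start + WORK / speed

    for start in (0.0, 0.5, 1.0, 1.5, 2.0):
        print("start at %.1f yr, done at %.2f yr" % (start, finish(start)))

    # Fiddle with DOUBLING and the best start date wanders all over the
    # calendar; the argument is only as good as the number you invented.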

Designing for an imaginary machine is a gamble.  Some people can afford
to make that gamble, and some of them might make a lot of money off of
it.  But overall, blindly accepting high-stakes risks is not only
foolhardy, it is bad business practice.

Lurking in all of this is the trendy (since WWII) practice of Designed
Obsolescence.  Large groups of people (artists, educators, and non-first
world citizens) have realized that obsolete technologies aren't.  People
still use Windows 3.1.  The Y2K problem is a problem because systems
put in place twenty years ago are _still working_ (maybe not the
original hardware, but the original architecture).  The problem 
with Designed Obsolescence is that it isn't sustainable; at some point
a product is released that is of superior quality and future demand
drops off.

Moore's Law has been manipulated by an aggressive advertising campaign.
Computers now do less with more.  Productivity has not increased.
(Productivity was hardly even measured before computers entered the
workplace, so the argument is moot.)

This all began with:
>I had a fascinating discussion with a guy from Intel recently.

Hardly an objective source.  I once heard that VRML was the future of
3D, but I think it was Mark Pesce who said it.  

To somehow tie this back to a list-relevant topic: Mike is advocating
that product cycles should be targeted towards cutting-edge machines,
because cutting-edge is cool? important? profitable?  Someone has to
have analyzed this claim with actual numbers by now.  If a product
is delayed by six months/a year (an obvious risk when you are pretending
to program on a machine that you don't have), doesn't that indicate there
needs to be something more to the product than "cutting edge" design?

I'm all for progress in the world of muds, but I think the design
criteria, especially for the upcoming generation of graphical
muds/UOII/whatever, should be focused on the strengths of what is
already successful.

A short list:
- having a large and diverse world to explore that can be affected by
players
- semi-intelligent interaction with non-player creatures
- emphasis on social relationships and actions, in particular:
   - being able to walk around naked/inappropriately dressed
   - tinysex
 
Things I don't buy that have not been proven successful:
- wholesale ecological/economic simulation 
- high-bandwidth/dedicated network solutions

Things I don't know what to think about:
- high turnover rates designed to increase software or subscription
sales (as perfected by America On-Line)

- Brandon Rickman - ashes at zennet.com -
While I have never previously found a need for a .sig, this
may be considered one for the purposes of this list


