[MUD-Dev] Moore's Law sucks (was: 3D graphics)

Adam Wiggins nightfall at user2.inficad.com
Sat Feb 14 02:51:11 CET 1998


[Brandon J. Rickman:]
> On Fri, 13 Feb 1998 16:24:56, Mike Sellers <mike at online-alchemy.com> wrote:
> >-- Moore's Law still rules. :)  
> 
> The tiresome Moore's Law rhetoric.  I made a feint at this topic on 
> another thread (actually it might have been this one...) but Mike's
> convenient rehash has given me a new opening.
> 
> Moore's Law: the computational power of computers doubles every <make
> up a number between 12 and 60> months.

Hrm, I've never heard any number other than 18.  The actual number doesn't
matter as much as the principle.
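
A rough sketch of the arithmetic, assuming the commonly quoted 18-month
doubling period (the exact figure is an assumption here; the point is the
compounding):

/* Back-of-the-envelope Moore's Law growth.  The 18-month doubling
 * period and the time spans below are illustrative assumptions,
 * not measurements. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double doubling_months = 18.0;  /* assumed doubling period */
    double years;

    for (years = 1.0; years <= 6.0; years += 1.0) {
        double factor = pow(2.0, (years * 12.0) / doubling_months);
        printf("%2.0f years -> roughly %4.1fx the computational power\n",
               years, factor);
    }
    return 0;
}

At 18 months that works out to about 4x in three years and 16x in six; pick
12 or 24 months instead and the numbers shift, but the shape of the curve
doesn't.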

> Problems with Moore's Law:
> 
> "computational power" merely refers to a measure of how many operations
> a chip can perform in a fixed amount of time.  The higher the MIPS

It does?  Once again, if anyone ever uses it in this context, I assume
they are taking it too literally.  Computational power refers to the
total capabilities of the machine.  This includes quite a bit more than
the chip.  Most folks know that chips aren't the bottleneck in much of
anything you want to do with computers nowadays.  RAM, disk access speed,
network bandwidth, and bus speed are all significantly lagging behind
processors.

> slow at calculating sqrt(4).  The functional power of a machine is
> highly subjective and is particularly dependent on software.  You
> can do less with more using Windows95, for example.

This is true of any tool.  However, a generalization of the sort Moore's
Law makes, at least in my semi-humble opinion, doesn't refer to any of this.
If someone bought a new sort of screwdriver with a pull-out bar for extra
torque, few would cry false advertising just because they don't work out
as much as they did back when they used the old screwdriver, and so can
actually turn the same amount or less even with the new torque-bar feature.
That doesn't invalidate the generalization that the new screwdriver can
provide you with more torque.

> Second, the amount of computational power available on the consumer market
> is far beyond what anybody could actually use.  The average non-business

Oh?  Explain to me why I spend so much time waiting for my computer at work,
a state-of-the-art Pentium with plenty of RAM and a nice fast hard drive,
to perform the routine tasks I do from day to day?  Why do I wait for
3D Studio to render long animations, or Photoshop to load up, or twenty
minutes for Developer Studio to do a complete rebuild?
Naturally there are plenty of other things to blame here besides
hardware.  I'm one of the first to complain that modern software packages
are bloated and top-heavy and consume far too many resources for what they
do.  Nevertheless, this still falls into the category of 'computational power'.
Far too many times I've heard 'Oh, computers nowadays are so fast there's
no point in even optimizing', which is what the above seems to advise in an
incidental sort of way.  It's *extremely* easy to use all the computational
power available today, and then some.  Whether this 'use' is justifiable
or not is another argument.

> computer user does not have and does not need these machines.  In fact,
> most businesses don't need these machines either, but buying new 
> equipment is always good when you can write off the depreciation on
> your taxes (and you can't write off things like employee benefits or
> long-term business planning).  The people that are actually using the
> fastest available machines are usually closely tied to the computer
> chip industry in the first place, like the chip designers using fast
> chips to design faster chips.

Hrm.  Well, most companies I've worked for do have a real problem with
getting the computing resources to the folks who really need them, but
generally the best rigs go to the 3D artists, who most certainly use all
the power they are given.  Even with a nice GLint card, plenty of RAM,
a multi-processor machine, and all that junk - manipulating a mesh with
upwards of five or ten thousand polygons, especially with detailed textures,
is pig-slow.

> On the plus side, as big business needlessly upgrades their machines the
> "obsolete" machines are falling into the hands of artists, educators, and
> non-first world citizens.  This market is not reflected in Intel's 
> sales reports and Intel has no idea what people may be doing with those
> machines.

This is true, but as usual, this is not your market.  For a long time I
used my 286 as a dumb terminal to log on to a university's computers.
I was *not* a target market for any software.  I couldn't afford a new
computer or new software, nor did I need or want anything that I could
purchase for my machine that I already had.

Your market *is* the folks who run out to buy a top-of-the-line rig
every couple of years.  Actually, this brings up another point - I
think we lost whether or not Mike was talking about client or server
software.  A mud client is more concerned with video acceleration and
internet bandwidth than any sort of processing power.  A server is worried
about speed, RAM, and mass storage.

An example of what I believe Mike was getting at - racking your brain
trying to come up with killer optimizations is occasionally a huge waste.
Orion and I spent the first year of our project obsessing over the amount
of RAM and processor time our mud took.  We spent long stretches of time
trying to squeeze every last bit out of the structures we allocated,
and building extra lists to speed up some of the game loops.  This was because
I thought it would be running on my 486-33 with 4 megs of RAM.  By the
time we were well into the project, we had it running on a Sparc of some
sort at the university sporting a nice big RAID drive and several hundred
megabytes of RAM.  At that point the fact that our base server took up less
RAM and processor time than tcsh was only amusing, and not at all useful.
We ended up going back over and undoing a whole bunch of the optimizations
we had labored so hard over, replacing them with what we really wanted
to do in the first place.

> Third, designing for non-existent technology is a dumb-assed design 
> constraint.

Quit beating around the bush, Brandon.  Tell us what you *really* think! :)
This is an extreme statement.  Designing for non-existent technology is
impossible.  Designing for proposed technology that is currently in
the works isn't much fun and is a gamble (as you say below), but can
sometimes pay off with large rewards.  (Case in point: Microsoft's first product.)
Designing for current technology with an eye on what the future holds
is only logical.

> [Aside: there is the old argument that goes:
> If I start computing today it will take me three years to finish.
> If I start computing a year from now it will only take me one year (two
> years total).
> Therefore I should wait until next year.
> 
> This is a clever bit of rubbish.  If the task will take _at most_ three
> years today, then there is a chance it will finish _in less than two 
> years_.]

I thought the last line was supposed to say, 'Therefore I should not
bother and go write a MUD instead.'

> Designing for an imaginary machine is a gamble.  Some people can afford
> to make that gamble, and some of them might make a lot of money off of
> it.  But overall, blindly accepting high-stake risks is not only
> foolhardy, it is bad business practice.

See above.

> Lurking in all of this is the trendy (since WWII) practice of Designed
> Obsolescence.  Large groups of people (artists, educators, and non-first
> world citizens) have realized that obsolete technologies aren't.  People

Of course.  Largely this is the old peer-pressure routine; someone
tells you your hardware is obsolete, therefore it must be.  Obsolete is
relative, and changes from person to person.  Bill Gates thinks that
UNIX is obsolete.  We could consider him to be one of the most influential
people in the computing industry - does this make his view apply to 
everyone?

> still use Windows 3.1.  The Y2K problem is a problem because systems

People are also still using DOS, System 7, NeXTStep, AmigaDOS, and
so forth.  Most non-engineering universities are still using a version
of UNIX from a decade ago.

> put in place twenty years ago are _still working_ (maybe not the
> original hardware, but the original architecture).  The problem 
> with Designed Obsolescence is that it isn't sustainable; at some point
> a product is released that is of superior quality and future demand
> drops off.

And here's the key.  Obsolete is relative to whatever else you can get.
If you can get something that does the task in a more efficient manner
than what you currently use, you could say that what you currently use is
obsolete.  This does not make it worthless by any means, or even mean
that an upgrade would be worthwhile for you.  My mom is using Microsoft
Word to type up her documents for work, and she complains that she misses
the word processor that we had on our Tandy 1000 a dozen years ago.
That word processor is most definitely obsolete.  However, that doesn't have
any impact on how useful a program it is for her.

> Moore's Law has been manipulated by an aggressive advertising campaign.
> Computers now do less with more.

Yup.  I fondly remember when I had the later versions of DOS booting in
4 seconds flat.  Now I have to wait for several hundred megs of RAM to
scan, IDE controllers to detect drives, CD-ROM drivers to load up,
Windows 98 to grind the disk for several minutes while it loads...
this is progress?

> Productivity has not increased.

I disagree with this quite strongly.  Perhaps people are 'lazier' nowadays
(in the same way that people are lazier since the invention of cars,
or microwaves), but this doesn't mean that you can't do more with less
effort.  I used to spend dozens of hours trying to come up with
very simple character animations viewable from a single angle with DPaint.
Now I can do a good character animation from any angle in a fraction of
that time with a 3D animation package.  Have my skills as an artist
improved?  Hell no.  Even something simple like the wand tool in modern
paint programs reminds me of long, tedious hours spent 'cleaning up'
images by hand, trying to hunt down stray pixels.

This also depends on how you measure productivity.  One might say that the
average artist can create about the same number of art pieces per period
of time N as they could ten years ago.  The difference is that the pieces
are probably higher-resolution, higher color depth, higher framerate, and
more easily modifiable.  Whether you see this as a huge improvement or
not (I don't, really) is subjective, but one cannot deny the truth of the
hard numbers (1280x1024x24 bit vs 320x200x8 bit, for example).
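
A quick sketch of just the raw framebuffer math behind those figures (this
ignores compression, file formats, and everything else; it's only the
uncompressed pixel data):

/* Raw framebuffer sizes for the two resolutions mentioned above.
 * Uncompressed pixel data only -- no file formats, no compression. */
#include <stdio.h>

int main(void)
{
    const double old_bytes = 320.0 * 200.0 * (8.0 / 8.0);     /* 320x200, 8-bit    */
    const double new_bytes = 1280.0 * 1024.0 * (24.0 / 8.0);  /* 1280x1024, 24-bit */

    printf("320x200x8    : %9.0f bytes\n", old_bytes);
    printf("1280x1024x24 : %9.0f bytes\n", new_bytes);
    printf("ratio        : about %.0fx more raw image data\n",
           new_bytes / old_bytes);
    return 0;
}

That's roughly 61 times the raw data per full-screen image, before you even
start talking about framerate or animation length.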

> (Productivity was hardly even measured before computers entered the
> workplace, so the argument is moot.)

Productivity is a fundamental yardstick by which any endeavor is measured.
This is a factor independent of computers, or businesses or humans, for
that matter.  Since computers had such an effect on it (both positive
(good tools) and negative (internet games, *ahem*)), it became popular to
try to measure it 'accurately'.

> To somehow tie this back to a list-relevant topic: Mike is advocating
> that product cycles should be targeted towards cutting-edge machines,
> because cutting-edge is cool? important? profitable?  Someone has to

I don't think that's quite what he said...you probably should have quoted
a bit more.  What I got out of it was, "Don't hold yourself back for
fear of having something completely useless.  As long as you don't go
crazy it's likely you'll have the technology to support it, if not now,
then soon."

> have analyzed this claim with actual numbers by now.  If a product
> is delayed by six months/a year (an obvious risk when you are pretending
> to program on a machine that you don't have) doesn't that indicate there
> needs to be something more to the product than "cutting edge" design?

I'd say the primary problem is not accurately predicting where the
technology will go.  We ran into that heavily with our current project.
We thought that 3D acceleration would start supporting massively higher
polygon counts.  Instead it's focused on more features for better-looking
models.  We've since had to seriously scale down the detail on our models.
On the other hand, we didn't expect any particular gain from the processor
side of things, but the caching on the MMX (and later) chips is so good
that the game runs at nearly 150% of the speed on an otherwise identical
machine (that is, 200 MHz vs 200 MHz MMX).
That's what you get, and yeah, it's a risk.  You do the best you can to
guess.  As with anything like this, taking the safe route and going for
the lowest common denominator might end up with you either wasting time
on pointless optimizations, or ending up with a product that appears
obsolete next to all the others.  Like it or not, computer users aren't
much interested in obsolete programs.  Just try to convince any
hard-core Diablo player that Angband is a much better game along the
same lines and see what response you get.

> I'm all for progress in the world of muds, but I think the design
> criteria, especially for the upcoming generation of graphical
> muds/UOII/whatever, should be focused on the strengths of what is
> already successful.

Well, of course...

> A short list:
> - having a large and diverse world to explore that can be affected by
> players

Implies lots of data transmitted from the server.  Internet connections
are certainly getting vastly better in a hurry, but they are still a huge
bottleneck.  As much as I agree with you, I tend to think that anything
which increases this load is a Bad Idea, at least for right now - especially
if we're designing for current technologies like you suggest.
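
As a very rough illustration of the worry, here's a sketch of per-player
update traffic.  Every constant in it (entities in view, bytes per update,
updates per second, usable modem throughput) is a number made up for the
example, not a measurement from any real server:

/* Back-of-the-envelope estimate of world-update traffic per player.
 * Every constant here is an assumption for illustration only. */
#include <stdio.h>

int main(void)
{
    const double entities_in_view   = 30.0;    /* assumed visible objects/creatures  */
    const double bytes_per_update   = 16.0;    /* assumed position/state record size */
    const double updates_per_second = 4.0;     /* assumed server->client update rate */
    const double modem_bytes_per_s  = 3300.0;  /* ~28.8k modem, rough usable rate    */

    double load = entities_in_view * bytes_per_update * updates_per_second;

    printf("per-player update traffic: %.0f bytes/sec\n", load);
    printf("share of a 28.8k modem:    about %.0f%%\n",
           100.0 * load / modem_bytes_per_s);
    return 0;
}

Even with guesses like those, a modest slice of a dynamic world chews up
more than half of a dial-up link, which is why I'd rather not add to the
load right now.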

> - semi-intelligent interaction with non-player creatures.

Kind of open-ended, so I don't really have any comment.  This is probably
mostly CPU time, so it shouldn't cause any problems unless you get crazy.

> - emphasis on social relationships and actions, in particular:
>    - being able to walk around naked/inappropriately dressed
>    - tinysex

Or, to take this in a broader sense, focusing on more detailed interaction
with the gameworld rather than anything inherently limited by technology.
Most certainly agreed here; this is what most everyone really cares about
anyhow.

> Things I don't buy that have not been proven successful:
> - wholesale ecological/economic simulation 

Needs a lot of testing time, and a change in the attitude of the game players.
I think it will happen; indeed, I'm pretty sure large-scale world simulations -
which is where what we think of as muds are gravitating - won't be able to get
along on hacked-in world anomalies like mud-style shopkeepers forever.
As the system becomes more inclusive, kludges like this will become more
and more obtrusive to the game.

> - high-bandwidth/dedicated network solutions

My main problem is that even though bandwidth is increasing, hiccups are
just as common (if not more common) than ever.  This is what makes online
gaming so frustrating much of the time; several second pauses on an
otherwise clean connection for no apparent reason.  Cable modems certainly
won't solve this.

> Things I don't know what to think about:
> - high turnover rates designed to increase software or subscription
> sales (as perfected by America On-Line)

For a traditional mud, sure.  As online games branch out, we'll get something
partway between Quake and UO - something that you can play casually if you
like, but still represents a fairly detailed world simulation.  I tend to
think something like this could deal with, and maybe even thrive on,
a high turnover rate.  Action games already seem to - fresh meat,
as they say.



