DOS ain't dead

FPC 16-bit (Announce)

posted by Rugxulo, Usono, 15.05.2013, 19:00

> Why do you think that nearly none of the great compilers are also 16-bit
> hosted?

Which ones are "great"? Please list them.

Historically, there were many, many commercial DOS compilers. Quite a few are even still sold! But these days it sells better to target certain systems (Mac, Win, Linux on IA-32 or AMD64) than anything else.

16-bit PowerBASIC for DOS claims to fit compiler and libs in 300 kb. Extended Pascal's first implementation (AFAIK) was for 16-bit DOS.

> While possible (at least for 286pm), it is only possible if you
> craft specially for it, and the market after the main "turbo" lines wasn't
> big enough anymore to warrant that.

You get nothing for free. Any CPU requires hand-holding. It's either done behind the scenes or you have to do it explicitly, but either way it's not there by default.

"Market" is totally relative to popularity (ego, politics) and money, not technical reasoning.

> And even if sb would have tried, all the 16-bit aficionados couldn't be
> pried from previous generation turbo compilers with a crowbar.

More browbeating from fans of newer OSes. Every geek wants to declare some other tech "obsolete" or "dead".

Did you know DOS is dead? OS/2 is dead? Pascal is dead? WinXP is dead? Silverlight is dead? Flash is dead? Blackberry is dead? Nokia is dead? Even C has been declared dead many, many times in favor of C++ (though it kinda lives on through POSIX).

Do you see the pattern? It's all marketing. They want to sell their newer, "improved" version. And if yours works fine, you won't upgrade. So if they can't sell you on features alone, they declare (and coerce) other things to be less useful, deprecated, etc.

And as last resort, if they can't "win" through normal means, they lower the price (for a "limited time" ... sound familiar?). Then they can claim "millions of users" and jack the price up again, esp. with a "newer upgrade". Then it becomes established and you're stuck with it (and all the various changes that happen on a whim).

This is the nightmare that is "modern" technology. Chase the latest, ignore pre-existing. Often newer tech is more tightly nailed down than a fort. So many restrictions, so many rules, so narrow, so expensive, but that's how the "market" wants it. If anything, we only have things like GNU and *BSD as a complete opposite rejection of that attitude, even if they too fall into the trap sometimes.

FreeDOS ... FreePascal ... what a joke, eh? Why are we replicating old, dead, obsolete tech? Why aren't we all using whatever flavor-of-the-month? It's obvious nobody can get "real" work done with inferior tools, right?

> > If you can't write something useful in 500 kb, you can't write anything.
>
> Baseless claim.

The largest 8086 instruction is 6 bytes. The largest 286 instruction is 10 bytes. That is plenty of room for code. You never see an assembly program that even comes close to exceeding 640 kb (the MZ limit), but with HLLs you almost never see anything below 100 kb. (Even .so or .dll doesn't help much anymore.) Remember DOS386's old signature? For some people, even "Hello, world!" can't fit in 100 kb. It's "impossible", they say.
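To put that in numbers (a back-of-the-envelope sketch, nothing more): even assuming every single instruction is worst-case length, the 640 kb ceiling still holds a ridiculous amount of code.

```shell
# Worst-case instruction counts under the 640 kb MZ limit.
# Max lengths as cited above: 6 bytes (8086), 10 bytes (286).
echo $(( 640 * 1024 / 6 ))    # 8086 worst case
echo $(( 640 * 1024 / 10 ))   # 286 worst case
```

Over 109,000 max-size 8086 instructions, or 65,536 max-size 286 instructions, and real code averages far shorter than the maximum.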

> Yes. But it doesn't solve the basic chicken-egg problem in a way that
> matters. It tries to solve it with minimalism and if you try to build on
> it, you fall in the same trap as Unix did.
>
> IOW if you specify bootstrapping in some limited dialect (like C or
> standard pascal), to get proper bootstrapping, all core infrastructure must
> be in that limited dialect.

Unavoidable, unless you find a way to do some kind of meta-programming that targets more than one intermediate representation. It's possible, but very few developers try.

> Even GCC gave that up recently, when they moved to C++......

They didn't give up anything. Some developers forked and cleaned up some bits (with more to come) with a reasonable subset of C++. C++ is well-supported, popular, and standardized, all of which are things GNU cares about.

Re: bootstrapping, I can only assume they expect people to use an older GCC to build an older G++ to build current GCC, if needed (assuming their platform was ever supported and working in the first place). E.g. 2.7.2.3 can build 2.95.3, and other K&R-era compilers can allegedly build 3.4.6.
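Roughly, the stair-step would look like this. This is only a sketch of the idea: the install prefixes, directory names, and exact version pairings here are my assumptions, not an official GCC recipe.

```shell
# Hypothetical stair-step bootstrap; each stage's compiler builds the next.
set -e
# Stage 1: an old K&R-capable system cc builds GCC 2.7.2.3 (plain C source)
(cd gcc-2.7.2.3 && ./configure --prefix=/opt/gcc272 && make CC=cc && make install)
# Stage 2: 2.7.2.3 builds 2.95.3
(cd gcc-2.95.3 && ./configure --prefix=/opt/gcc295 && make CC=/opt/gcc272/bin/gcc && make install)
# Stage 3: 2.95.3 builds 3.4.6; its g++ could then build a C++-era GCC
(cd gcc-3.4.6 && ./configure --prefix=/opt/gcc346 && make CC=/opt/gcc295/bin/gcc && make install)
```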

> You misunderstood my point. It wasn't a dare to come up with even more
> outdated pascal stuff. It was a dare to explain why 32-bit (or 64-bit) is
> so bad.

It must be bad because nobody supports DJGPP (32-bit) anymore. Even working code is thrown away. I just don't understand that. It's not rational, it's apparently all about promoting whatever else they prefer instead.

> > Useless because it lacks a full dialect (e.g. Delphi 7) or because you
> > yourself will never need to use it?
>
> It serves no useful purpose in any way. It is a subset of an obsolete
> toolchain that got 3 newer versions. Even a pm TP7 compiler is a total
> different world featurewise.

So many people brag about C++ or Ada. Yet how many who bragged still use the same dialect they used back in bragging times? (Ada83 anyone? AT&T 2.0 anyone? Python 1.5.2 anyone? Perl 4 anyone? Scheme R4RS anyone? Ruby 1.8.4 anyone?) None. They all "upgrade" so as not to incur the wrath of the deprecation police (a.k.a. embarrassment at using outdated, inferior tech). The only problem is that there's only so much you can add to something before you have to rename it entirely to avoid a stigma (many companies fell into that renaming trap, and it never helps anyways).

> > All such harping like that just reminds me of BWK's
> > "real work, not toys" bullcrap. (Some people just like to exaggerate.)
>
> I think the point of this thread should be more that one era's real work
> are toys in a later era.

They never even had access to the original tools, so they don't even know how it was supposed to work. So they create their own warped variant and call it superior. It's easy to laugh at our "inferior" elders. But tech is very short-lived, so eventually everything gets replaced (usually for no good reason, but anyways ...).

> > Oh, that's right, it's often political bias, not technical, that limits
> > software.
>
> It is technical too.

No, it's not, almost never. If anything, people prefer halfway working implementations to nothing at all. But it's obvious some people don't want to support certain things.

> > You have to stop somewhere. You can't add indefinitely.
>
> Like with everything, such requirements change with the requirements of the
> times. No I don't think there is a hard limit somewhere.

If you need 1000x more processing power and RAM (and libs and tools) than anyone before you, you're either (a) an uber genius or (b) dumb as a brick.

> > Perfectly fine? GCC has undergone ten bazillion changes over the years
> and
> > broken a lot of things. It's a pain to build, it's a pain to install.
> It's
> > too POSIX heavy
>
> Yeah. But while it is not universal, at least it can be used adequately for
> certain tasks.

Anything can be used for "certain" tasks. GCC 1.x was considered decent too. Same as 2.x. Same as 3.x. But nobody uses those anymore; they are "dead" and "inferior". Even if they still work (!), people throw them away. Is 4.x that much better? Only if you think "modern" is irreplaceable.

BTW, in case you haven't noticed, a lot of changes have happened due to indirect influence from Clang (written in C++), which is preferred by Mac OS X, FreeBSD, Minix, and even Embarcadero.

 
