FreeBasic and FreePascal (Developers)

posted by marcov(R), 19.07.2014, 16:42

> > > 1997? You seriously want to pretend that you're using the same (or even
> > > similar) setups as back then??
> >
> > No, but the comparisons were much the same.
>
> Not at all, it was a different world. We never thought we'd ever have (much
> less need) multi-core Gigahertz or Gigabytes of RAM or Terabytes of hard
> disk or 64-bit cpus at home. Back then, 32-bit / 4 GB max RAM / FAT32 was
> good enough for everyone.

And that was already enough not to care whether a binary was 50 kB or 1 MB, as long as it did something useful.

> > Though mostly TP/BP7, not TP5.5 which was nearly completely forgotten by
> > then. It only came back into view due to Borland setting it free.
>
> Just to compare to 1997's GCC for x86: 2.7.2.x only supported 386 and 486
> [sic], and -Os didn't exist yet. But it was still good enough for Quake!

I never used DJGPP. The only thing I used gcc for was to compile a kernel.

> > And despite being in the last kickings of the mainstream DOS age, the
> > writing was clearly on the wall.
>
> MS had been trying to replace DOS since mid-'80s, with OS/2 (and of course
> NT, later on). They abandoned stand-alone MS-DOS in 1994.

And in that period, together with an explosion in computer use (elderly aunts getting PCs), the focus shifted from DOS to Windows. That is what I meant by the writing on the wall. I didn't like it back then; OTOH, I did like LFNs.

> They abandoned
> Win9x in 2001 with XP (and support ended in 2006, as I'm sure you
> remember). With XP, they declared [MS-]DOS "dead".

XP was the final nudge for holdouts (well, now the 64-bit migration hammers that home even harder). However, during that period it was already clear that DOS's mainstream use was over.

> But now even XP is dead. So are a bunch of older OS versions (Linux 2.4,
> Mac OS X PPC, OS/2 4.x, FreeBSD 6.x/7.x). So what?

(Of those, I sometimes still use OS X PPC.)

> > Even people still creating dos programs (like me then) wanted productivity.
> > Getting rid of the 640k barrier and LFN was the norm (nearly all ran under
> > win9x, even only to multitask dosboxes).
>
> DJGPP 2.0 already supported all of that in 1996.

I never cared for DJGPP, except as an FPC donor (think as/ld/make/gdb/extender). The same goes for any GCC derivative for an MS platform.

> > Of course sb would whine that BP7 would generate smaller binaries, and
> > pointed to some badly maintained LFN unit for LFN support and protected
> > mode (that killed the 640kb limit, but not the 64k limit) as scapegoats.
> > Then sb else would comment that TP6 generated even smaller binaries.
> >
> > Then sb would whip out TP4 and convert the results to .COM, finally sb
> > wrote an application that did a bit of 32-bit register access and an LFN
> > int in assembler, and proved that assembler was Turing complete. (not that
> > anybody asked)
>
> AFAIK, TP used generic MZ .EXEs, hence it was limited to 640 kb max .EXE
> size.

Real-mode TPs like 5.5 and 7 were 16-bit, so they had the 640 kB limit. Moreover, they had no huge memory model (or equivalent), so single allocations were limited to 64 kB. Worse, they had a 64 kB static data segment.
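
A minimal sketch of that last limit (my own example, not from the thread; error text from memory): the array below exceeds the ~65,520-byte ceiling, so real-mode TP rejects it at compile time, while 32-bit FPC accepts it without complaint.

    program DataLimit;
    var
      { 70,000 bytes: over the single 64 kB static data segment of
        real-mode Turbo Pascal, which stops with "Structure too large";
        32-bit FPC compiles and runs this fine. }
      Big: array[1..70000] of Byte;
    begin
      Big[1] := 42;
      WriteLn(Big[1]);
    end.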

> (Not sure about BP7 DPMI stuff. NE? That probably had different
> limits.)

Code limits were pretty far away. Memory stretched further, but suffered from selector exhaustion (8192 max), and the above 64 kB per-allocation limits still applied.
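
(Back-of-the-envelope: 8192 selectors times 64 kB per selector gives a 512 MB ceiling in theory, but since each heap allocation burns a selector, a few thousand allocations can exhaust the table long before any byte count does.)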

> However, the quest for smallest size is indeed useless for extremely
> trivial programs. Then again, is optimization ever useless? And if so, when
> do you draw the line?

Noticeable. A hundred 100-200 kB files on my 2.5 GB HDD back then were not noticeable (compared to the 20-50 kB they were in TP).
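
(Concretely: a hundred files at, say, 150 kB each is about 15 MB, or 0.6% of a 2.5 GB disk; at TP's 20-50 kB it would have been 2-5 MB, around 0.1-0.2%. Neither registers unless you go looking.)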

> How would you ever know if it's good enough?

That's the point. If you don't notice it without deliberately taking a closer look, there is no difference.

> Good for them for being apathetic (not!). Give them a medal.

They don't care. Apathetic is something else, namely being overstimulated or weary and therefore unable to take action.

> What makes people so accepting of what they see being spewed out by
> compilers?

Most simply consider it a given. But in this particular (size) case, they don't even care, within bounds.

> How would they ever know if it's good enough?

How do you?

> "Well, it's not
> gigabytes. Well, it still fits in RAM. Well, nobody complained!" Just
> pretend that it's already optimal, then you don't have to do anything.

A hypothetical optimal situation isn't even on their radar. They want to get things done. And even if they care about "optimal", they will focus their attention on optimizing something they will actually notice.

> > That's the point. It is only a *PROBLEM* in minimalists minds. Even back
> > then I had better things to do with my time. You only do something about
> > size, if it is really, really prohibitive, not because of some underbelly
> > feeling that it is not "right" (e.g. during a brief WINCE stint).
>
> Then why does everyone (e.g. FreeBSD, Cygwin, SourceForge) compress every
> single binary package?

Server load. Fewer bytes moved. Easier on the connection (theirs and the downloaders'). And no visible downsides, since compression is reversible. But they do things in bulk, and connections have had practical limits for much longer than HDD sizes have.

> For that matter, why optimize anything at all by default?

Because many optimizations you do notice, and they come without many downsides (except for the compiler builder). Size-optimizing binaries by reworking the RTL or by switching compilers is a wholly different matter.

Only a few extreme cases actually switch compilers purely because of performance.

> For that matter, why write portable code? It just takes longer. Obviously
> you have better things to do with your time.

That is certainly true, and I don't write portable code by default. E.g. my work apps are hopelessly Windows-only.

> Sure, "Hello, world!" is useless. So is worrying about 100 vs. 200 bytes.
> But that doesn't mean optimizations are useless. What do you think
> "optimize" means???

In general, or in this case? The compiler's optimizer applies easy and preferably safe optimizations for speed or size.

But binary size comes more from library architecture and target-specific factors than from compiler code generation.
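
A rough illustration (my sketch, using stock FPC command-line switches; exact sizes vary by version and target): for the same trivial program, smart linking and stripping, which operate at the library/linker level, shrink the binary far more than the code generator's size optimization does.

    program Hello;
    { Compare, e.g.:
        fpc hello.pas              (default build)
        fpc -Os -XX -Xs hello.pas  (size-optimized codegen (-Os),
                                    smart linking (-XX), stripped (-Xs))
      The smart-linking and stripping steps typically account for most
      of the size difference; the -Os codegen change is marginal here. }
    begin
      WriteLn('Hello, world!');
    end.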

> > Nobody is really planning any form of development traject. They just want
> > to be confirmed that they reached the pinnacle of their personal quest, the
> > smallest binary and demonstrate their "knowledge".
>
> But it is real concrete "knowledge" as opposed to just blindly saying,
> "Good enough!" without any sort of measurement.

It is only concrete within their frame of mind, sacrificing things others don't want to sacrifice: locale, language awareness, exceptions, RTTI, etc.

> Resources are inherently limited. Computers are full of all kinds of
> arbitrary limits, some intentional, others not. Anything that pretends
> "virtually unlimited" is a liar.

Of course. But you are talking about micro-optimization *AFTER* the sane things have been done. THAT's the "solution searching for a problem" context.

> > doing more to fix this. At least in FPC, and I have been doing it for a
> > long time now. It might be not enough in your eyes, but that doesn't mean
> > that nothing is done, and that there is not an eye for the worst excesses.
>
> I don't remember complaining here in this thread about FPC at all.

1. Look at the title.
2. You replied to my reply to w3*'s message, which was full of FPC-vs-TP5.5 comparisons based on size.

> We're
> just saying, in general, that most compilers are suboptimal.

I think something like FPC is fairly close to optimal. It could be closer, but the interest and the manpower for that are lacking.

> Just because
> you want to pretend it's not important or already good enough doesn't mean
> that's true.

I say it hardly exists as a real problem, and that it is mostly the hobby of a few people. As additional evidence, I present the fact that most of those people micro-optimize and minimize old compilers instead of actually carrying responsibility in OSS projects.

> Sure, people can complain about anything, but half the time
> they actually have a point.

That's not my experience. I think complaining is as much about relieving tension and stress as anything. But my opinion might be clouded by having spent a few years on a helpdesk.

 
