
DOS ain't dead




FPC 16-bit (Announce)

posted by marcov, 08.05.2013, 23:39

> > More importantly, I don't even see anything that you name as being on
> > the same level as FPC. They are not decade-long polished production
> > compilers, but individual attempts.
>
> I wasn't trying to compare anyone to anyone else. And certainly such fluff
> as "production" wasn't on my radar at all. All projects start small
> initially. Unfortunately, most of them fall apart later because they can't
> handle the complexity. Don't you think that is something that should be
> avoided?

No. Most are simply not worth keeping. The initial part is not the hard part, and I see no reason to expend extra effort to "save" it. Integration costs are probably higher than redoing the work.

And redoing the work is more favourable, because it can align better with the assumptions and architecture of the codebase you are integrating it into.

Growing from the initial situation to a sustainable project is usually the bottleneck, not the initial effort.

> > And again, your bootstrapping problem is mostly self-imposed. I don't
> > see an absolute need to have a 16-bit hosted compiler.
>
> No, of course not, only if someone could access nothing but 16-bit
> machines, which is rarer these days than 10 years ago. (But a 16-bit
> host is far from impossible, even for "real world production commercial
> compilers".)

Why do you think nearly none of the great compilers is also 16-bit hosted? While possible (at least for 286 protected mode), it only works if you design specifically for it, and the market after the main "turbo" lines wasn't big enough anymore to warrant that.

And even if somebody had tried, the 16-bit aficionados couldn't have been pried away from the previous-generation Turbo compilers with a crowbar.

> But let's be honest, my printer most likely doesn't care if I use 16-bit or
> 32-bit or 64-bit software.

Depends. AFAIK BIOSes went partially 32-bit internally because of USB, so if you have a USB printer ... :-)

> Most reasonable problems (and programs) don't (or shouldn't) care
> about bitsize at all.

They do, since they routinely use data structures larger than 64 KB, and in general don't expect limits in the memory subsystem. And they generally value features over memory conservation. That's the problem with 16-bit: it requires special effort; you can't just compile general code for it.

> If you can't write something useful in 500 kb, you can't write anything.

Baseless claim.

[P4/P5]
> This was not necessarily meant to be a production tool out of the box.
> You'll have to adapt it. As is, it's meant to be a portable
> self-bootstrapping example of the original language as standardized.

Yes. But it doesn't solve the basic chicken-and-egg problem in a way that matters. It tries to solve it with minimalism, and if you try to build on it, you fall into the same trap as Unix did.

IOW, if you specify bootstrapping in some limited dialect (like C or Standard Pascal), then to get proper bootstrapping, all core infrastructure must stay in that limited dialect.

Even GCC gave that up recently, when they moved to C++.

> I certainly don't expect it to influence FPC (nor you) at all, just
> mentioning it for completeness.

Completeness of what?

> > > <sarcasm>
> > > Forget 32-bit, try 40- ... scratch that, 48-bit.

> > Why put the holy limit at a 16-bitter with a 20-bit bus? Why not go
> > for some real iron, like an 8-bitter? :-)
>
> Already been done. Besides, I don't have any 8-bitters.

You misunderstood my point. It wasn't a dare to come up with even more outdated Pascal stuff. It was a dare to explain why 32-bit (or 64-bit) is so bad.

> > Or 1998 or so, when computers routinely got 16MB+
>
> You don't need 16 MB for a decent compiler. You don't need 4 GB (or even 1
> GB) for a decent OS.

Depends on "decent" of course. And on the size of your programs.

> > IMHO such efforts will never break free from mindless minimalism, and
> > are thus doomed from the start.
>
> I'm not suggesting an OISC esolang, just something that is reasonably
> sized and doesn't need everything and the kitchen sink.

While that sounds reasonable, all the examples are extremely minimal, and seem to favour size over function in an extreme way.

> > He is building a FPC backend. The frontend is unmodified. FPC is a pm
> > codebase. Do you actually READ my posts ? :)
>
> I'd be surprised if he supported full Delphi 7, but I guess it's not
> impossible.

He says that with huge model and protected mode there is a fair chance. Contrary to compilers, many classes of applications are not so memory-hungry that 16 MB is a problem.

> > (xdpascal)
> > > It's mostly TP3-ish, as he claims, targets 16-bit DOS, public
> > > domain, and is Delphi (32-bit Win32 .EXE) hosted.
> >
> > Yes. Exactly, useless.
>
> Useless because it lacks a full dialect (e.g. Delphi 7) or because you
> yourself will never need to use it?

It serves no useful purpose in any way. It is a subset of an obsolete toolchain that got three newer versions. Even a protected-mode TP7 compiler is a totally different world feature-wise.

> > > No, I'm not saying this particular one is better or super useful, just
> > > saying it's a start
> >
> > For what? Suicide by compiler frustration?
>
> I've not seen the FPC guts, but I doubt it's much cleaner.

It's not how it looks, but what you can do with it.

> > > an existing project with similar goals, so it's worth
> > > mentioning (IMO).
> >
> > Not a production compiler, not even in the same category.
>
> I never mentioned "production". I wasn't comparing anyone, just mentioning
> an interesting example.

And I never mentioned being interested in minimalist bootstrapping philosophy.

> All such harping like that just reminds me of BWK's
> "real work, not toys" bullcrap. (Some people just like to exaggerate.)

I think the point of this thread should rather be that one era's real work is a later era's toys.

> > For any compiler, creating a new backend (architecture target) is a
> > problem. It is simply a lot of work. But better a good compiler for a
> > few targets than something worthless that supports "everything"
>
> If that were true, then why do we have so many competing technologies?

Legacy, preference, slightly different application domains and tradeoffs.

> Oh, that's right, it's often political bias, not technical, that limits
> software.

It is technical too.

> > > and "must have all optimizations from the entire world".
> >
> > Yes. As you have shown, there is enough cruft already. Feature
> > dropping is not an option. Features are important. 16-bit hosted is
> > not, except for fun.
>
> You have to stop somewhere. You can't add indefinitely.

Like everything else, such requirements change with the times. No, I don't think there is a hard limit anywhere.

And, as a DOS user you are probably well aware that people tend to drop old features sooner than they stop creating new ones.

> Worse is when
> features start conflicting (e.g. GNU Pascal's Borland stuff, lots of
> workarounds for that). Eventually complexity falls in on itself.

IMHO GPC was a victim of its own implementation limitations and of the fact that the developers only reluctantly accepted the need for Borland compatibility. And even then not fully.

I sometimes got the feeling they were fighting gcc more than they were benefiting from it.

> > Modula2 and Pascal have no subset. They are different languages with
> > different keywords and different blockstructure principles.
>
> Not 100% identical, no, but very very similar, due to common ancestry.

Yes with regard to principles, not so much in the actual language. Block structure is different, operators are different, one is case-sensitive, etc. etc.

Only the unit/module and scoping systems match, but that is more a matter of language semantics than of the language itself.

> > (TeX discussion removed, didn't see the relevance, except to illustrate
> > that before 1985-early nineties period portability is a b*tch)
>
> Decent portability is still relatively unknown.

That's because it doesn't exist.

Nearly all portability attempts start by defining a subset that is supposed to be portable. And that subset is always either too minimal (making the lowest common denominator awfully small; think standard C here) or too big (throw in one slightly oddball platform and you already have a problem; think Windows and POSIX).

> People make horrible
> assumptions and just call it portable because it happens to work (sometimes
> accidentally). I don't admire any project that "only" works on Mac, Win,
> Lin. It's often less out of technical reasons than just developer whimsy.

Right! It HAS to work on FreeBSD too of course!

> > > GCC and GNAT can brag all they want about portability, but they are
> > > a pain to compile, too big, too POSIX-y, and just overkill.
> >
> > No. They are perfectly fine. Nothing wrong with it. Real production
> > compilers, no toys.
>
> Perfectly fine? GCC has undergone ten bazillion changes over the years and
> broken a lot of things. It's a pain to build, it's a pain to install. It's
> too POSIX heavy, and expecting it to work reliably on anything outside
> select targets (GNU/Linux using ELF, especially, their preference) is
> sometimes only wishful thinking.

Yeah. But while it is not universal, at least it can be used adequately for certain tasks.

 
