DOS ain't dead

marcov(R)

26.04.2013, 09:41
 

FPC 16-bit (Announce)

FYI

Initial 16-bit target support (small mm only for now) was merged into FPC trunk yesterday. The work was done by Nikolay Nikolov.

So FPC is now officially 16/32/64 :-)

Laaca(R)


Czech republic,
26.04.2013, 16:14

@ marcov

FPC 16-bit

Wow! It is cool! I will test it.

---
DOS-u-akbar!

marcov(R)

26.04.2013, 22:30

@ Laaca

FPC 16-bit

> Wow! It is cool! I will test it.

Disclaimer: I (= Nikolay) still haven't tested if these things still work after the
merge.

- simple programs work (see the sketch after this list). The system unit compiles and is implemented almost completely. All file and directory operations work, including long file name support. The only major things not yet implemented are ParamStr and ParamCount.
- the DOS unit also compiles, but it still contains functions which are not yet implemented
- the required external tools are the NASM assembler, the OpenWatcom linker WLINK, and the WLIB tool, also from OpenWatcom
- the object format used is OMF (Relocatable Object Module Format), which is the most popular DOS .obj file format and should be compatible with most 16-bit compilers: http://en.wikipedia.org/wiki/Relocatable_Object_Module_Format
- memory model is "small", with code <= 64KB and data <= 64KB, all pointers are near and SS=DS
- the default calling convention is "pascal", but is currently incompatible with Turbo Pascal 7 due to the different memory model - TP7 uses far calls to call external routines and all pointer parameters are far, not near. TP3 may be more compatible, due to its use of near calls, but it still uses far pointers, so functions with pointer parameters will not work.
- the heap manager works, but the heap is limited to 64KB minus stack and static data size
- smartlinking works and is in fact required, because the code for the whole system unit exceeds 64KB all by itself.
- there are still some 186/286 instructions generated, so code does not yet run on a real 8086/8088, but this should be fixed shortly
- floating point operations require an FPU. There's no soft float implementation yet
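
To make the status above concrete, here is the smallest possible test, with a hedged guess at the cross-compile invocation (the target name and switch are my assumption from current trunk; the real incantation may differ):

    { hello.pas - smallest possible test for the new 16-bit target }
    program Hello;
    begin
      WriteLn('Hello from 16-bit FPC!');
    end.

    { Cross-compile from a 32/64-bit host, roughly (switch assumed):
        fpc -Tmsdos hello.pas
      NASM, WLINK and WLIB must be on the PATH, as listed above. }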

DOS386(R)

28.04.2013, 14:53

@ marcov

FPC 16-bit

> FYI Initial 16-bit target support (small mm only for now) was merged into FPC

COOOOOOOOOOOOOOOOOOOOOOOL :-)

Hadn't you dropped 16-bit support in favor of "GO32V2" cca 20 years ago ? It's never too late to correct critical faults :-)

---
This is a LOGITECH mouse driver, but some software expect here
the following string:*** This is Copyright 1983 Microsoft ***

marcov(R)

29.04.2013, 10:17

@ DOS386

FPC 16-bit

> > FYI Initial 16-bit target support (small mm only for now) was merged into
> FPC
>
> COOOOOOOOOOOOOOOOOOOOOOOL :-)
>
> Hadn't you dropped 16-bit support in favor of "GO32V2" cca 20 years ago ?
> It's never too late to correct critical faults :-)

I had nothing to do with this port :-)

I left 16-bit for new projects +/- 1997 when I started with FPC, and ported all remaining stuff to it in 2000. (mostly because I started using Windows NT/2000 at work, and soon after at home too). I kept a 386 mobo for a while for minimalistic stuff, but as hardware started to pile up, I got rid of it.

Note that it is all still very initial, and I haven't heard the exact plans (asked for rationale, Nikolay said "because I can"). E.g. how many TP idiosyncrasies (think inline($54) etc) are going to be implemented.

Self hosting (16-bit compiler that generates 16-bit programs) is probably out of the question. (at least if my memory is correct and 286pm only does 16MB max) So probably it will always be (64/)32-bit -> 16-bit crosscompile.

Yesterday hello world (and thus the whole RTL startup) ran on an 80186 (LX200 iirc)

Laaca(R)


Czech republic,
29.04.2013, 12:52

@ marcov

FPC 16-bit

Crosscompiling from DOS32 (GO32V2) into DOS16 is OK - no need for a realmode FPC compiler.
The ability to generate pure 8086/88 code is nice but it would be even more cool to support (via some switch) also 386+ and 686+ code generation and a TP7 compatible memory model.
BTW: will the inline assembler be supported, with all modern instructions known to FPC?

---
DOS-u-akbar!

Rugxulo(R)


Usono,
30.04.2013, 14:00

@ marcov

FPC 16-bit

> > > FYI Initial 16-bit target support (small mm only for now) was merged
> into
> > FPC
> >
> > COOOOOOOOOOOOOOOOOOOOOOOL :-)
> >
> > Hadn't you dropped 16-bit support in favor of "GO32V2" cca 20 years ago
> ?
> > It's never too late to correct critical faults :-)
>
> I had nothing to do with this port :-)

IIRC, FPC (circa 1993) was originally compiled with TP (16-bit) and intended as 32-bit TP dialect replacement (since Borland defected to Windows) and eventually became self-hosted for 32-bit DOS (GO32v1) target only. Then everybody went cuckoo for Cocoa Puffs for Delphi (1997?).

> I left 16-bit for new projects +/- 1997

DEC brought out the Alpha in 1992, why didn't you jump ship earlier? They have a Pascal compiler. Heck, GPC might actually build there too. ;-)

> when I started with FPC, and ported all remaining stuff to it in 2000.

Almost funny, considering how weak the 1.x series is regarded in hindsight.

> (mostly because I started using Windows
> NT/2000 at work, and soon after at home too). I kept a 386 mobo for a while
> for minimalistic stuff, but as hardware started to pile up, I got rid of
> it.

Yes, admittedly, 2000 and XP were flawed but still somewhat serviceable for DOS stuff. But stuff after that ... not so much. It's kinda ugly having so many wars over competing tech.

> Note that it is all still very initial, and I haven't heard the exact plans
> (asked for rationale, Nikolay said "because I can"). E.g. how many TP
> idiosyncrasies (think inline($54) etc) are going to be implemented.

There are other 16-bit Pascal subset-ish compilers and VMs out there. Even P4/P5 is well-documented, but I've not delved enough to even pretend to port that (to 32-bit, much less 16-bit) as actual native compiler.

> Self hosting (16-bit compiler that generates 16-bit programs) is probably
> out of the question. (at least if my memory is correct and 286pm only
> does 16MB max) So probably it will always be (64/)32-bit -> 16-bit
> crosscompile.

You can use EMS on 8086 or XMS on 286. IIRC, Jason Burgon (GVFM dude) once said you can get about 128 MB from 16-bit Borland DPMI, maybe more with HDPMI16 or (which?) Windows or DPMIONE. And of course always swap to hard drive if needed.

You really don't need lots of RAM to make a compiler. You can either split off into several passes (like early Modula-2) or just don't try to support the full Delphi XE2 ten bazillion functions, operators, keywords, forms, IDE, smart linker, etc.

> Yesterday hello world (and thus the whole RTL startup) ran on an 80186
> (LX200 iirc)

I know people love those, but they're hard to find. My silly Android tablet has aDOSBox [sic], but it's dirt slow. I haven't bothered much with it. I assume eventually somebody could port something natively useful to FPC, which more or less claims to support Android and ARM in trunk (right?). Pascal-S is too wimpy, but P5 is (so far) too hard. Well, and people demand graphics, forms, whatever, so P5 is probably uninteresting to 99% of Pascal fans. (Though using GPC builds of P5 or even old TP55 under aDOSBox is probably possible, if desperate, but I've had no huge need.)

There's another (Delphi-compiled, HX friendly IIRC) DOS cross-compiler (.COM only) subset on SourceForge, public domain, though I don't know how useful or interesting it is.

http://sourceforge.net/projects/xdpascal/files/

I've intentionally tried not to cloud this thread, esp. since I'm no compiler expert. But suffice to say, it's complex. The two biggest issues seem to be trying to use all available memory (and peripherals) as well as interfacing with the outside world (libs, ABIs, linking).

Quite honestly, I know it's not quite the same, but I wonder if it would be equally worthwhile to port something simple like Pascal-S interpreter. (I'm not sure if it's been done reliably before. There are a lot of half-broken offshoots and maybe Ada- or Seed7-hosted versions floating around, dunno.) Though I know it already can compile quite easily (minimal changes) with FPC, but I'm thinking about translating into other languages (Modula-2? Oberon?), not necessarily C. Though it's fairly pointless except as an exercise. Heck, I've not even gotten a grip on ye olde PL/0 yet. (As cool as self-hosting is, I can't help but wonder if it'd be wiser to be more neutral. Knuth's original WEB and TeX were very Pascal subset-ish, meant to be easy to port to other languages. Though that's another ball of wax I've not delved into yet.)

marcov(R)

30.04.2013, 17:15

@ Rugxulo

FPC 16-bit

> > I had nothing to do with this port :-)
>
> IIRC, FPC (circa 1993) was originally compiled with TP (16-bit) and
> intended as 32-bit TP dialect replacement (since Borland defected to
> Windows) and eventually became self-hosted for 32-bit DOS (GO32v1) target
> only.

Correct

> Then everybody went cuckoo for Cocoa Puffs for Delphi (1997?).

Earlier: D1 (16-bit, Win3.x, TP compiler based) is from 1995. But I didn't migrate from TP, but from Topspeed Modula2. I used TP before, but I never explored some aspects (like its pm stuff).

> > I left 16-bit for new projects +/- 1997
>
> DEC brought out the Alpha in 1992, why didn't you jump ship earlier?

SGI did MIPS in 1990. But I used DOS and a bit of Linux/BSD then, and in those early times I mostly used W9x, as I had used DV/X before it, to run multiple DOS shells and swap between them.

The main reason for me to abandon 16-bit was the 640k limit. (there was an extender for TSM2, but it was expensive, and I didn't have it).

TSM2 also didn't have a HUGE memory model, so I don't have much experience with that either.

I toyed with the idea of going back to TP and digging into the 16-bit extender stuff, but I was playing with *nix, and knew that would get more important. (for my use I mean, since that was the dorm's server and the always-on machine)

> They
> have a Pascal compiler. Heck, GPC might actually build there too. ;-)

I never managed to build GPC on anything in that period. I also got very mixed signals about its viability (e.g. different versions (both GPC and GCC) on different targets, etc.), and there was little community around it.

Even if FPC hadn't existed, I doubt I would have ended up with GPC.

> > when I started with FPC, and ported all remaining stuff to it in 2000.
>
> Almost funny, considering how weak the 1.x series is regarded in
> hindsight.

The requirements were different back then too.

> Yes, admittedly, 2000 and XP were flawed but still somewhat serviceable for
> DOS stuff.

This was early w2k time. I ran betas even before it really came out. XP was much, much later.

And anyway, IMHO it was not serviceable. It was the proverbial last straw: DPMI memory limits, console programs starting up really slowly, application programs assuming LFNs already anyway.

> But stuff after that ... not so much. It's kinda ugly having so
> many wars over competing tech.

I was already way beyond that then, since I had working FPC code on *nix.

Dos was important to me (a platform I used for 10 years), but not the center of my world. The portable code had inherited assumptions (like LFNs), and DOS was becoming the weakest link, so I just decided to move on.

> > Note that it is all still very initial, and I haven't heard the exact
> plans
> > (asked for rationale, Nikolay said "because I can"). E.g. how many TP
> > idiosyncrasies (think inline($54) etc) are going to be implemented.
>
> There are other 16-bit Pascal subset-ish compilers and VMs out there.

Yes. I fail to see how it is relevant for this discussion.

> Even
> P4/P5 is well-documented, but I've not delved enough to even pretend to
> port that (to 32-bit, much less 16-bit) as actual native compiler.

P4/P5 is not strictly a 16-bit compiler. I don't know where you got that idea. It is released as source, and meant to be portable.

It compiles with FPC/trunk nowadays (and even mostly works), except for the bit where in ISO mode parameters are wired to file descriptors. A few paramstr() inserts, and it works, at least superficially.

> > Self hosting (16-bit compiler that generates 16-bit programs) is probably
> > out of the question. (at least if my memory is correct and 286pm only
> > does 16MB max) So probably it will always be (64/)32-bit -> 16-bit
> > crosscompile.
>
> You can use EMS on 8086 or XMS on 286.

Memory, as in addressable via the compiler's addressing scheme. Using EMS/XMS requires hand coding, or at the very least very coarse block-level allocation (with no fine-grained access). That won't do for a compiler's memory requirements.
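
To illustrate what "coarse block-level allocation" means in practice, here is a sketch of the EMS 3.x calls involved, using the standard Dos unit; untested, error checking omitted:

    { Why EMS is coarse: you allocate and map whole 16KB pages via
      INT 67h; there is no fine-grained pointer-level access. }
    uses Dos;
    var
      r      : Registers;
      Handle : Word;
      Frame  : Word;
    begin
      r.AH := $41; Intr($67, r);             { get page frame segment }
      Frame := r.BX;
      r.AH := $43; r.BX := 4; Intr($67, r);  { allocate 4 pages (64KB) }
      Handle := r.DX;
      r.AH := $44; r.AL := 0; r.BX := 0;     { map logical page 0 ...  }
      r.DX := Handle; Intr($67, r);          { ... to physical page 0  }
      { only now is the block reachable, at segment Frame, 16KB at a time }
      WriteLn('EMS page frame at segment ', Frame);
    end.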

> IIRC, Jason Burgon (GVFM dude) once
> said you can get about 128 MB from 16-bit Borland DPMI, maybe more with
> HDPMI16 or (which?) Windows or DPMIONE.

Afaik the 286 MMU was limited to 16MB. (the same limit as the ISA bus) Maybe you can get further with newer CPUs, but in that case you can also run the 32-bit version.

If your MMU can handle more, I would guess the hardware limit is having 65536 non-overlapping segments of 64KB, which is 65536 × 65536 bytes = 4GB, +/- some mappings required by DOS (like the first 1MB)?

> And of course always swap to hard drive if needed.

This is fairly useless for compilers. Swapping works fine if blocks of memory aren't used for a long time (iow if you have a working set of memory that at most times is sufficiently smaller than total memory)

Compilers mostly allocate in a fine-grained way.

> You really don't need lots of RAM to make a compiler. You can either split
> off into several passes (like early Modula-2) or just don't try to support
> the full Delphi XE2 ten bazillion functions, operators, keywords, forms,
> IDE, smart linker, etc.

Irrelevant. We are talking about retargeting an existing compiler to 16-bit, not new compilers and what you can dream up. If you want those dreams, get started :-)

> My silly Android tablet
> has aDOSBox [sic], but it's dirt slow. I haven't bothered much with it. I
> assume eventually somebody could port something natively useful to FPC,
> which more or less claims to support Android and ARM in trunk (right?).

Yes, but those are mostly not black and white things. There are even two ways to Android, but usually it is simpler to just root it to Debian, if you want development work done :-)

> There's another (Delphi-compiled, HX friendly IIRC) DOS cross-compiler
> (.COM only) subset on SourceForge, public domain, though I don't know how
> useful or interesting it is.
>
> http://sourceforge.net/projects/xdpascal/files/

Not useful and uninteresting. No strings, no sets, nothing. Even TP3/4 is better.

> I've intentionally tried not to cloud this thread, esp. since I'm no
> compiler expert. But suffice to say, it's complex. The two biggest issues
> seem to be trying to use all available memory (and peripherals) as well as
> interfacing with the outside world (libs, ABIs, linking).

Yes. That's the first sane remark in this post. But it misses the major problem. Debugging. Much more difficult than assembler or linker.

Watcom debug is said to support Dwarf, stabs and Codeview, but it is not clear which of those is supported for the 16-bit target.

> Quite honestly, I know it's not quite the same, but I wonder if it would be
> equally worthwhile to port something simple like Pascal-S interpreter. (I'm
> not sure if it's been done reliably before. There are a lot of half-broken
> offshoots and maybe Ada- or Seed7-hosted versions floating around, dunno.)

While I don't have a direct interest in the 16-bit target from a user perspective (it's merely retro sentiment for me), I would say there are enough minimalistic attempts. Better have a cross-only compiler that generates good code and supports a complete dialect than forcing an artificial 16-bit bottleneck on the compiler to be 16-bit itself.

And more importantly, I think it is impossible. 16MB is simply not enough. Solutions to use more probably also can run the 32-bit compiler anyway.

> Though I know it already can compile quite easily (minimal changes) with
> FPC, but I'm thinking about translating into other languages (Modula-2?
> Oberon?),

I wasn't aware that any of those have currently active 16-bit efforts. Or anybody else even.

> Heck, I've not even gotten a grip on ye olde PL/0 yet. (As cool
> as self-hosting is, I can't help but wonder if it'd be wiser to be more
> neutral.

I'm not sure what neutral means in this context. IMHO self hosting is the neutral choice (no added dependencies :)

marcov(R)

30.04.2013, 17:36
(edited by marcov, 30.04.2013, 18:03)

@ Laaca

FPC 16-bit

> Crosscompiling from DOS32 (GO32V2) into DOS16 is OK - no need for realmode
> FPC compiler.

Yes. A realmode-hosted compiler would probably detract too much from what matters: simply the ability to make good programs.

> The ability to generate pure 8086/88 code is nice but it would be even more
> cool to support (via some switch) also 386+ and 686+ code generation and a
> TP7 compatible memory model.

> BTW: will the inline assembler be supported, with all modern instructions
> known to FPC?

Technically possible and doable I think, but I don't know what Nikolay has planned in this department.

Short term, I guess his goals are compiling/testing increasingly complex programs(*) and laying the foundations in the compiler for the various memory models.

Some optimization (to generate better code than TP) would also be in order.
FPC has better general compiler abilities (register allocation, inlining etc), but TP has a hand-tuned, target-dependent optimizer.

(*) since that would open up compiling the FPC testsuite to check for regressions and omissions

Rugxulo(R)


Usono,
01.05.2013, 03:12

@ marcov

FPC 16-bit

> > > Note that it is all still very initial, and I haven't heard the exact
> > plans, e.g. how many TP idiosyncrasies (think inline($54) etc) are
> > going to be implemented.
> >
> > There are other 16-bit Pascal subset-ish compilers and VMs out there.
>
> Yes. I fail to see how it is relevant for this discussion.

In theory, efforts or sources could be combined. No reason to reinvent everything in a vacuum (unless that floats your boat). We're not the first people to ever talk about such things.

> > Even
> > P4/P5 is well-documented, but I've not delved enough to even pretend to
> > port that (to 32-bit, much less 16-bit) as actual native compiler.
>
> P4/P5 is not strictly a 16-bit compiler. I don't know where you got that
> idea. It is released as source, and meant to be portable.

I know, but I meant that somebody could (in theory) port it to whatever target they wanted. Even 16-bit wouldn't be totally out of the question, obviously.

Originally Scott implied that he wanted to eventually extend it to P6, a real native code compiler for an extended dialect (his Pascaline?). But he's been very busy lately.

> It compiles with FPC/trunk nowadays (and even mostly works), except for the
> bit where in ISO mode parameters are wired to file descriptors. A few
> paramstr() inserts, and it works, at least superficially.

Haven't tried lately. All I know is what we briefly discussed on the newsgroup (choked on "new(blah,blah2,blah3)"). Maybe I should try FPC snapshots on P5 again. It'd be cool if it worked in FPC. (Again, as nice as it is to self-bootstrap, perhaps there should be a bridge or two from other [more popular?] languages. Even Modula-2 or Oberon are more widely implemented than classic Pascal, AFAIK.)

> > IIRC, Jason Burgon (GVFM dude) once
> > said you can get about 128 MB from 16-bit Borland DPMI, maybe more with
> > HDPMI16 or (which?) Windows or DPMIONE.
>
> Afaik the 286 MMU was limited to 16MB. (the same limit as the ISA bus)
> Maybe you can get further with newer CPUs, but in that case you can also
> run the 32-bit version.

<sarcasm>
Forget 32-bit, try 40- ... scratch that, 48-bit. No reason for a modern compiler these days to not use 1 TB. If you've got it, use it, why waste it? It's there to be used. RAM is cheap. Let them all upgrade, it's not 2003 anymore.
</sarcasm>

Funny that the original Pastel (and the GCC C frontend) couldn't work on the 68000 because it read the whole input at once and kept it in RAM, which ate up too much stack. IMO, for a compiler (at least while bootstrapping), you have to keep things really, really minimal and avoid such situations. All the fancy features come later; first just get the basics working.

> > And of course always swap to hard drive if needed.
>
> This is fairly useless for compilers. Swapping works fine if blocks of
> memory aren't used for a long time (iow if you have a working set of
> memory that at most times is sufficiently smaller than total memory)

That's how they all did it "back in the day", tons and tons of passes. It's slower than keeping it all in RAM, but they didn't have that luxury.

> Irrelevant. We are talking about retargeting an existing compiler to
> 16-bit, not new compilers and what you can dream up. If you want those
> dreams, get started :-)

If you're going to target a huge language like Delphi (and I'm not saying Nikolay is, I don't know), you're going to be in for a world of hurt. You can indeed target 16-bit with 16-bit host without problems. Maybe not as easily if you stick to strict portability or demand WPO (or LTO), but it can be done. In fact, throw away the idea that optimizations are necessary at all ... aren't computers fast enough these days?? (Throw away ye olde Makefiles! Use batch scripting to compile.)

> > There's another (Delphi-compiled, HX friendly IIRC) DOS cross-compiler
> > (.COM only) subset on SourceForge, public domain, though I don't know
> how
> > useful or interesting it is.
> >
> > http://sourceforge.net/projects/xdpascal/files/
>
> Not useful and uninteresting. No strings, no sets, nothing. Even TP3/4 is
> better.

It's mostly TP3-ish, as he claims, targets 16-bit DOS, public domain, and is Delphi (32-bit Win32 .EXE) hosted. That seems relevant. (Besides, the whole idea is that it can be improved further.)

Strings? Same as "array [1..255] of char". Sets? Same as "array [1..32] of boolean". Don't think on what it lacks in proper ways, think semantic equivalents.
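
Something like this, in plain array terms (type names and sizes are my own choice, purely for illustration):

    { Emulating strings and sets with plain arrays - the "semantic
      equivalents" idea, as a minimal dialect would force you to. }
    type
      TString255 = record                 { stand-in for string[255] }
        Len : Byte;
        Dat : array[1..255] of Char;
      end;
      TCharSet = array[Char] of Boolean;  { stand-in for set of Char }

    procedure AddChar(var S: TCharSet; C: Char);
    begin
      S[C] := True;                       { like Include(S, C) }
    end;

    function HasChar(const S: TCharSet; C: Char): Boolean;
    begin
      HasChar := S[C];                    { like C in S }
    end;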

No, I'm not saying this particular one is better or super useful, just saying it's a start, an existing project with similar goals, so it's worth mentioning (IMO).

> > I've intentionally tried not to cloud this thread, esp. since I'm no
> > compiler expert. But suffice to say, it's complex. The two biggest
> issues
> > seem to be trying to use all available memory (and peripherals) as well
> as
> > interfacing with the outside world (libs, ABIs, linking).
>
> Yes. That's the first sane remark in this post. But it misses the major
> problem. Debugging. Much more difficult than assembler or linker.
>
> Watcom debug is said to support Dwarf, stabs and Codeview, but it is not
> clear which of those is supported for the 16-bit target.

In fairness, I forgot about error recovery and debugging, two other goals that people always add (as well as optimizations, ugh). Heck, even FASM doesn't have debugging, only "-s" for later conversion. (Probably less of a loss, as asm is so low level anyways.) There are, as always, too many competing and incompatible formats (that are only well-supported on certain platforms).

I'm just saying, you have to start somewhere. And I too don't know which debug format would be best. IIRC, OpenWatcom formerly used its own Watcom debugging format but eventually switched to Dwarf (2-ish). NASM can output Borland style for OMF. YASM supports Codeview (cv8), Dwarf, and Stabs. JWasm by now probably supports appropriate formats for all platforms.

> > Quite honestly, I know it's not quite the same, but I wonder if it would
> be
> > equally worthwhile to port something simple like Pascal-S interpreter.
>
> I would say there are enough minimalistic attempts. Better have
> a cross-only compiler that generates good code and supports a
> complete dialect than forcing an artificial 16-bit bottleneck
> on the compiler to be 16-bit itself.

I have no objections to any of that. I'm just saying, bootstrapping is a pain. Nothing (outside of Forth) ever easily bootstraps, and that (as well as monstrous demands for features) always kills language popularity.

I have no desire to limit dialect, but I also still think a decent 16-bit "hosted" compiler is far from difficult or impossible.

> And more importantly, I think it is impossible. 16MB is simply not enough.
> Solutions to use more probably also can run the 32-bit compiler anyway.

I don't understand your obsession with "no multiple passes, no swapping" and "must have all optimizations from the entire world". I'm talking reasonably simple stuff, not making production quality competition to Delphi here. (I hate to keep bringing up Delphi, but it's just huge. Perhaps I should say C++. Though I think even "simpler" C is bloated too, esp. C99 or C11. Everybody always overcomplicates everything.)

EDIT: I think RR Software dude (Brukhardt?) said that their Ada95 compiler never uses more than 16 MB of RAM. I mean, I know we have tons more these days, but you can have a decent compiler working in 16 MB!

> > Though I know it already can compile quite easily (minimal changes) with
> > FPC, but I'm thinking about translating into other languages (Modula-2?
> > Oberon?),
>
> I wasn't aware that any of those have currently active 16-bit efforts. Or
> anybody else even.

No, I don't know of any actively developed 16-bit compilers for those languages, I just meant in general, it might be easier to stick to a common Pascal-y subset of those languages and hand translate than trying to self-compile (bootstrap) with one specific (but using every feature in the book) language proper.

> > Heck, I've not even gotten a grip on ye olde PL/0 yet. (As cool
> > as self-hosting is, I can't help but wonder if it'd be wiser to be more
> > neutral.
>
> I'm not sure what neutral means in this context. IMHO self hosting is the
> neutral choice (no added dependencies :)

TeX / WEB (or whatever) avoided NEW/DISPOSE, nested procedures, WITH, and various other things (procedure parameters?). In 1982, I doubt Knuth was aware of Modula-2 (or even Ada), at least not in depth. And obviously not with later derivatives, but he did have portability in mind. The original Algol was a huge success, but everyone went on to various incompatible dialects for the same simple features! Without any bridge between them, they're all isolated to death (even if de facto or de jure standardized).

GCC and GNAT can brag all they want about portability, but they are a pain to compile, too big, too POSIX-y, and just overkill. At least Lua (for now) is reasonably small and sticks to ANSI C (with a few quirky assumptions). I just don't know why bootstrapping a language is this difficult.

marcov(R)

01.05.2013, 19:47

@ Laaca

FPC 16-bit

> BTW: will be supported the inline assembler with all modetn instructions
> known to FPC?

Nikolay said that the assembler reader is basically the i386 one, only he now has it artificially limited, to root out 186+ instructions more easily.
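
For reference, FPC inline assembler looks like this; that the 16-bit target already accepts exactly this form is an assumption on my part:

    { A minimal FPC inline-assembler sketch, 8086-only instructions. }
    {$asmmode intel}
    function DosVersion: Word; assembler;
    asm
      mov ah, $30    { DOS: get version }
      int $21        { AL = major, AH = minor -> Word result in AX }
    end;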

Rugxulo(R)


Usono,
03.05.2013, 10:40
(edited by Rugxulo, 03.05.2013, 10:54)

@ marcov

FPC 16-bit (80186 cpu + NASM info)

> > BTW: will be supported the inline assembler with all modetn instructions
> > known to FPC?
>
> Nikolay said that the assembler reader is basically the i386 one, only he
> now has it artificially limited, to root out 186+ instructions more easily.

NASM supports "cpu 8086" to avoid (most) invalid instructions. (I think it used to only miss catching near 386+ jumps. Perhaps out of range short jumps can be conveniently worked around by -O2, but I can't remember exactly without checking. That was useful for porting xWcopy, though.)

Any NASM version after 0.98.39 only compiles (officially) with C99 and a 32-bit host (probably due to higher requirements from adding 64-bit). In other words, the latest 16-bit DOS .EXE (0.98.39) was here. And IIRC there was a bug (only fixed in the unofficial Apple 0.98.40) regarding crashes on 256 or more externs. Also, I can't remember exactly, but the newer NASM manuals omit the (large) instruction reference, though at least the latest seems to have a small index of instructions and the cpus they were introduced with (though not as friendly, IMHO, without full descriptions).

If it helps, there's also a 186 asm/disasm ASCII chart (table) made back in the day on Simtel.net mirrors under /asmutl/ : disasm.zip

I also made my own wimpy reference for my own amusement:


                         +---------------------------------+
                         | Intel 80186, 80188 (aka, "186") |
                         +---------------------------------+
Features:
=========

    shl cx,5      ; (shr,sar,rol,ror,rcl,rcr,...)
   
    imul ax,5
    imul ax,bx,5
   
    push 5
    pusha
    popa
   
    enter         ; enter stack frame (nested procedure variables)
    leave         ; leave stack frame
   
    bound         ; array bounds checking (calls INT 5)
    into          ; overflow checking     (calls INT 4)

    insb, insw    ; read from port in DX to [ES:DI]
    outsb, outsw  ; write [DS:SI] to port in DX

Quirks:
=======

    shl ax,33     ; doesn't always clear AX anymore (now only uses 5 bits)
    pop cs        ; not allowed anymore
    push sp       ; still decremented before push, unlike 286
    aam 10        ; not supported by NEC V20/V30
    salc          ; undocumented but "should" still work


P.S. Feel free to ignore all of this, you don't have to pass it along to Nikolay. Though if he's messing with DOS, perhaps he should visit here himself!!

Rugxulo(R)


Usono,
03.05.2013, 10:53

@ marcov

P5 (PCOM/PINT) with FPC 2.7.1 snapshot

> P5 compiles with FPC/trunk nowadays (and even mostly works), except for the
> bit where in ISO mode parameters are wired to file descriptors. A few
> paramstr() inserts, and it works, at least superficially.

With a recent 2.7.1 (GO32V2) snapshot, it needs an explicit "if paramcount=0 then halt else assign(prr,paramstr(1))" for PCOM (etc). Though for some reason it doesn't fully write the output bytecode file unless an explicit "close(prr)" is done at the end. (I naively thought all files were closed and buffers flushed at every program exit.) Similarly for PINT, but then it seemed to work somewhat okay (though maybe a bug regarding set ranges). Not totally flawless, still needs work, but it's made good progress. I don't anticipate it passing all of Scott Moore's tests any time soon, but it's not an impossible dream anymore, thankfully. One of these days I'll have to compare the bytecode output from a few test files more closely (vs. GPC compiles' output).
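
In other words, a patch along these lines (prr is P5's code output file; the exact spot in pcom.pas is from memory):

    { The kind of patch described above, sketched from memory. }
    if ParamCount = 0 then
      Halt                            { no output file given }
    else
      Assign(prr, ParamStr(1));
    Rewrite(prr);
    { ... the rest of pcom runs ... }
    Close(prr);  { without this the last buffer never gets flushed }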

marcov(R)

03.05.2013, 15:04

@ Rugxulo

P5 (PCOM/PINT) with FPC 2.7.1 snapshot

> I don't anticipate it passing all of Scott Moore's tests
> any time soon, but it's not an impossible dream anymore,

I've no idea if that is actually Florian's dream. (zero mod standards compatibility).

marcov(R)

03.05.2013, 15:10

@ Rugxulo

FPC 16-bit (80186 cpu + NASM info)

> NASM supports "cpu 8086" to avoid (most) invalid instructions. (I think it
> used to only miss catching near 386+ jumps. Perhaps out of range short
> jumps can be conveniently worked around by -O2,

Good to know. But note that inline assembler and backend-assembler are two different things.

The inline assembler is based on converted tables of NASM (for x86-likes at least).

The backend assembler happens to be NASM for the 16-bit port, but is not used otherwise. E.g. the internal assembler can read an instruction as a mnemonic, but output it to the backend assembler using db's.

> Anything NASM version after 0.98.39 only compiles (officially) with C99 and
> 32-bit host (probably due to higher requirements from adding 64-bit).

That's no problem. There is no use for a 16-bit hosted one, since the compiler won't be 16-bit hosted anyway.

> enter ; enter stack frame (nested procedure variables)
> leave ; leave stack frame

Enter is not used in practice, due to nesting limits (31 or 32). Leave is sometimes used.

> bound ; array bounds checking (calls INT 5)
> into ; overflow checking (calls INT 4)

Not used afaik.

> insb, insw ; read from port in DX to [ES:DI]
> outsb, outsw ; write [DS:SI] to port in DX

Not instructions a compiler would generate (manual assembler in libraries, maybe).

> Quirks:
> =======
>
> shl ax,33 ; doesn't always clear AX anymore (now only uses 5 bits)
> pop cs ; not allowed anymore
> push sp ; still decremented before push, unlike 286
> aam 10 ; not supported by NEC V20/V30
> salc ; undocumented but "should" still work

Of those, probably only push sp could be generated by the compiler, and that change is fairly well known (detections are based on it).

And since I always had a 386+ (even in my 16-bit times), I'm not that well versed in this.

> P.S. Feel free to ignore all of this, you don't have to pass it along to
> Nikolai. Though if he's messing with DOS, perhaps he should visit here
> himself!!

He might already have a look from time to time.

marcov(R)

03.05.2013, 23:27

@ Rugxulo

FPC 16-bit

> In theory, efforts or sources could be combined. No reason to reinvent
> everything in a vacuum (unless that floats your boat). We're not the first
> people to ever talk about such things.

True. But talk can easily be combined over a couple of beers. For concrete projects that are built on different assumptions, that is usually impossible (other than testing with one to make the other behave the same).

More importantly, I don't even see how anything that you name is on the same level as FPC. They are not decade-long polished production compilers, but one-off attempts.

And again, your bootstrapping problem is mostly self-imposed. I don't see an absolute need to have a 16-bit hosted compiler.

> > P4/P5 is not strictly a 16-bit compiler. I don't know where you got that
> > idea. It is released as source, and meant to be portable.
>
> I know, but I meant that somebody could (in theory) port it to whatever
> target they wanted.

No, since it is not a complete solution (pint.p must be compiled by something), AND the result is not that interesting anyway. A very minimal interpreter. Big deal.
(rest of P4/P5 skipped. I'm only interested in it as a testing codebase, not for use)

> > Afaik the 286 MMU was limited to 16MB. (the same limit as the ISA bus)
> > Maybe you can get further with newer CPUs, but in that case you can also
> > run the 32-bit version.
>
> <sarcasm>
> Forget 32-bit, try 40- ... scratch that, 48-bit.

Why put the holy limit at a 16-bitter with a 20-bit bus? Why not go for some real iron, like an 8-bitter? :-)


> No reason for a modern
> compiler these days to not use 1 TB. If you've got it, use it, why waste
> it? It's there to be used. RAM is cheap. Let them all upgrade, it's not
> 2003 anymore.

Or 1998 or so, when computers routinely got 16MB+

> IMO, for a compiler (at least bootstrapping), you have to keep
> things really really minimal and avoid such situations. All the fancy
> features come later, first just get the basics working.

IMHO such efforts will never break free from mindless minimalism, and are thus doomed from the start.

> That's how they all did it "back in the day", tons and tons of passes. It's
> slower than keeping it all in RAM, but they didn't have that luxury.

Yes, they did. Imposing those same limitations now is masochism, not heroism. There is nothing to prove, it has been done.

> If you're going to target a huge language like Delphi (and I'm not saying
> Nikolay is, I don't know), you're going to be in for a world of hurt.

He is building an FPC backend. The frontend is unmodified. FPC is a pm codebase. Do you actually READ my posts? :)

> You can indeed target 16-bit with 16-bit host without problems.

If you start over, and focus on it, you can probably do it in 286pm. (even with TP you needed the pm compiler for larger programs). Go ahead, but it is totally irrelevant for the FPC backend.

(xdpascal)
> It's mostly TP3-ish, as he claims, targets 16-bit DOS, public domain, and
> is Delphi (32-bit Win32 .EXE) hosted.

Yes. Exactly, useless.

> No, I'm not saying this particular one is better or super useful, just
> saying it's a start

For what? Suicide by compiler frustration?

> an existing project with similar goals, so it's worth
> mentioning (IMO).

Not a production compiler, not even in the same category.

> > Yes. That's the first sane remark in this post. But it misses the major
> > problem. Debugging. Much more difficult than assembler or linker.
> >
> > Watcom debug is said to support Dwarf, stabs and Codeview, but it is not
> > clear which of those is supported for the 16-bit target.
>
> I'm just saying, you have to start somewhere. And I too don't know which
> debug format would be best. IIRC, OpenWatcom watcom debugging format was
> formerly used by them but eventually switched to Dwarf (2-ish).

Does the OW 16-bit target debugger support stabs or dwarf? I can only see lists that are general (and thus probably for 32-bit); for the 16-bit target it is probably only a handful of the older CV-like variants.

> NASM can
> output Borland style for OMF. YASM supports Codeview (cv8), Dwarf, and
> Stabs. JWasm by now probably supports appropriate formats for all
> platforms.

It is not who generates what (we do dwarf2/3 and stabs already), but what will read it.

> > I would say there are enough minimalistic attempts. Better have
> > a cross-only compiler that generates good code and supports a
> > complete dialect than forcing an artificial 16-bit bottleneck
> > on the compiler to be 16-bit itself.
>
> I have no objections to any of that. I'm just saying, bootstrapping is a
> pain.

For any compiler, creating a new backend (architecture target) is a problem. It is simply a lot of work. But better a good compiler for a few targets than something worthless that supports "everything".

> Nothing (outside of Forth) ever easily bootstraps, and that (as well
> as monstrous demands for features) always kills language popularity.

... and that proves the above point :)

> I have no desire to limit dialect, but I also still think a decent 16-bit
> "hosted" compiler is far from difficult or impossible.

Go ahead then. But it is irrelevant for the FPC project.

> I don't understand your obsession with "no multiple passes,

That would be an FPC architecture change.

> no swapping"

Swapping is fairly useless for programs that randomly access most of their allocated memory in a fine-grained manner. Like compilers.

The whole principle of swapping/paging is that the working set at any time is systematically smaller than the total memory demand. This usually doesn't hold for compilers.

> and "must have all optimizations from the entire world".

Yes. As you have shown, there is enough cruft already. Feature dropping is not an option. Features are important. 16-bit hosted is not, except for fun.

> > I wasn't aware that any of those have currently active 16-bit efforts.
> > Or anybody else even.
>
> No, I don't know of any actively developed 16-bit compilers for those
> languages, I just meant in general, it might be easier to stick to a common
> Pascal-y subset of those languages and hand translate than trying to
> self-compile (bootstrap) with one specific (but using every feature in the
> book) language proper.

Modula2 and Pascal have no common subset. They are different languages with different keywords and different blockstructure principles.

(TeX discussion removed, didn't see the relevance, except to illustrate that before 1985-early nineties period portability is a b*tch)

> GCC and GNAT can brag all they want about portability, but they are a pain
> to compile, too big, too POSIX-y, and just overkill.

No. They are perfectly fine. Nothing wrong with it. Real production compilers, no toys.

> At least Lua (for now) is reasonably small

Is a scripting language.

> and sticks to ANSI C (with a few quirky assumptions).

Yeah, another point against it.

> I just don't know why bootstrapping a language is this difficult.

It's not a language. It is the program that DOES the language.

Rugxulo(R)


Usono,
06.05.2013, 17:43

@ marcov

FPC 16-bit

> > More importantly, I don't even see how anything that you name is on the same
> > level as FPC. They are not decade-long polished production compilers, but
> > one-off attempts.

I wasn't trying to compare anyone to anyone else. And certainly such fluff as "production" wasn't on my radar at all. All projects start small initially. Unfortunately, most of them fall apart later because they can't handle the complexity. Don't you think that is something that should be avoided?

> And again, your bootstrapping problem is mostly self-imposed. I don't see
> an absolute need to have a 16-bit hosted compiler.

No, of course not; only if someone could access nothing but 16-bit machines, which is rarer these days than 10 years ago. (But a 16-bit host is far from impossible, even for "real world production commercial compilers".)

But let's be honest, my printer most likely doesn't care if I use 16-bit or 32-bit or 64-bit software. As long as you send it the right codes, it'll do its job. (Hypothetically speaking. Right now my printer won't work at all, heh.) Most reasonable problems (and programs) don't (or shouldn't) care about bitsize at all. If you can't write something useful in 500 kb, you can't write anything.

> No, since it is not a complete solution. (pint.p must be compiled by
> something) AND the result is not that interesting anyway. A very minimal
> interpreter. Big deal.
> (rest P4/P5 skipped. I'm only interested in it as testing codebase, not
> use)

This was not necessarily meant to be a production tool out of the box. You'll have to adapt it. As is, it's meant to be a portable self-bootstrapping example of the original language as standardized.

I certainly don't expect it to influence FPC (nor you) at all, just mentioning it for completeness.

> > <sarcasm>
> > Forget 32-bit, try 40- ... scratch that, 48-bit.
>
> Why put the holy limit at a 16-bitter with a 20-bit bus? Why not go for
> some real iron, like an 8-bitter? :-)

Already been done. Besides, I don't have any 8-bitters.

> Or 1998 or so, when computers routinely got 16MB+

You don't need 16 MB for a decent compiler. You don't need 4 GB (or even 1 GB) for a decent OS.

> IMHO such efforts will never break free from mindless minimalism, and are
> thus doomed from the start.

I'm not suggesting an OISC esolang, just something that is reasonably-sized and doesn't need everything and the kitchen sink.

> > If you're going to target a huge language like Delphi (and I'm not
> saying
> > Nikolai is, I don't know), you're going to be in for a world of hurt.
>
> He is building an FPC backend. The frontend is unmodified. FPC is a pm
> codebase. Do you actually READ my posts? :)

I'd be surprised if he supported full Delphi 7, but I guess it's not impossible.

> (xdpascal)
> > It's mostly TP3-ish, as he claims, targets 16-bit DOS, public domain,
> and
> > is Delphi (32-bit Win32 .EXE) hosted.
>
> Yes. Exactly, useless.

Useless because it lacks a full dialect (e.g. Delphi 7) or because you yourself will never need to use it?

> > No, I'm not saying this particular one is better or super useful, just
> > saying it's a start
>
> For what? Suicide by compiler frustration?

I've not seen the FPC guts, but I doubt it's much cleaner.

> > an existing project with similar goals, so it's worth
> > mentioning (IMO).
>
> Not a production compiler, not even in the same category.

I never mentioned "production". I wasn't comparing anyone, just mentioning an interesting example. All such harping like that just reminds me of BWK's "real work, not toys" bullcrap. (Some people just like to exaggerate.)

> For any compiler creating a new backend (architecture target) is a problem.
> It is simply a lot of work. But better a good compiler for a few targets,
> than something worthless that supports "everything"

If that were true, then why do we have so many competing technologies? Oh, that's right, it's often political bias, not technical, that limits software.

> > and "must have all optimizations from the entire world".
>
> Yes. As you have shown, there is enough cruft already. Feature dropping is
> not an option. Features are important. 16-bit hosted is not, except for fun.

You have to stop somewhere. You can't add indefinitely. Worse is when features start conflicting (e.g. GNU Pascal's Borland stuff, lots of workarounds for that). Eventually complexity falls in on itself.

> Modula2 and Pascal have no common subset. They are different languages with
> different keywords and different blockstructure principles.

Not 100% identical, no, but very very similar, due to common ancestry.

> (TeX discussion removed, didn't see the relevance, except to illustrate
> that before 1985-early nineties period portability is a b*tch)

Decent portability is still relatively unknown. People make horrible assumptions and just call it portable because it happens to work (sometimes accidentally). I don't admire any project that "only" works on Mac, Win, Lin. It's often less out of technical reasons than just developer whimsy.

> > GCC and GNAT can brag all they want about portability, but they are a
> pain
> > to compile, too big, too POSIX-y, and just overkill.
>
> No. They are perfectly fine. Nothing wrong with it. Real production
> compilers, no toys.

Perfectly fine? GCC has undergone ten bazillion changes over the years and broken a lot of things. It's a pain to build, it's a pain to install. It's too POSIX heavy, and expecting it to work reliably on anything outside select targets (GNU/Linux using ELF, especially, their preference) is sometimes only wishful thinking.

marcov(R)

08.05.2013, 23:39

@ Rugxulo

FPC 16-bit

> > More importantly, I don't even see how anything that you name is on the
> > same level as FPC. They are not decade-long polished production compilers,
> > but one-off attempts.
>
> I wasn't trying to compare anyone to anyone else. And certainly such fluff
> as "production" wasn't on my radar at all. All projects start small
> initially. Unfortunately, most of them fall apart later because they can't
> handle the complexity. Don't you think that is something that should be
> avoided?

No. Most are simply not worth keeping. The initial part is not the hard part, and I see no reason to expend extra effort to "save" it. Integration costs are probably higher than redoing the work.

And redoing the work is then more favorable because it can then align better with the assumptions and architecture of the codebase you are integrating it into.

Growing from the initial situation to a sustainable project is usually the bottleneck, not the initial effort.

> > And again, your bootstrapping problem is mostly self-imposed. I don't
> see an absolute need to have a 16-bit hosted compiler.
>
> No, of course not, only if someone could only access 16-bit only machines,
> which is more rare these days than 10 years ago. (But 16-bit host is far
> from impossible, even for "real world production commercial compilers".)

Why do you think that nearly none of the great compilers are also 16-bit hosted? While possible (at least for 286pm), it is only possible if you craft specially for it, and the market after the main "turbo" lines wasn't big enough anymore to warrant that.

And even if somebody had tried, all the 16-bit aficionados couldn't have been pried from the previous generation Turbo compilers with a crowbar.

> But let's be honest, my printer most likely doesn't care if I use 16-bit or
> 32-bit or 64-bit software.

Depends. Afaik BIOSes went partially 32-bit internally because of USB, so if you have a USB printer ... :-)

> Most reasonable problems (and programs) don't (or shouldn't) care
> about bitsize at all.

They do, since they routinely use data structures > 64kb, and in general don't expect limits in the memory subsystem. And they generally value features over memory conservation. That's the problem with 16-bit: it requires special effort; you can't just compile general code for it.
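
A one-line illustration (small memory model assumed):

    { Under a 16-bit small-model compiler this single declaration
      already exceeds the 64KB data segment and will not compile;
      any 32-bit compiler swallows it without a thought. }
    var
      Table : array[0..99999] of Byte;   { 100000 bytes > 64KB }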

> If you can't write something useful in 500 kb, you can't write anything.

Baseless claim.

[P4/P5]
> This was not necessarily meant to be a production tool out of the box.
> You'll have to adapt it. As is, it's meant to be a portable
> self-bootstrapping example of the original language as standardized.

Yes. But it doesn't solve the basic chicken-egg problem in a way that matters. It tries to solve it with minimalism, and if you try to build on it, you fall into the same trap as Unix did.

IOW if you specify bootstrapping in some limited dialect (like C or standard pascal), to get proper bootstrapping, all core infrastructure must be in that limited dialect.

Even GCC gave that up recently, when they moved to C++......

> I certainly don't expect it to influence FPC (nor you) at all, just
> mentioning it for completeness.

Completeness of what?

> > > <sarcasm>
> > > Forget 32-bit, try 40- ... scratch that, 48-bit.

> > Why put the holy limit at a 16-bitter with a 20-bit bus? Why not go for
> > some real iron, like an 8-bitter? :-)
>
> Already
> been done. Besides, I don't have any 8-bitters.

You misunderstood my point. It wasn't a dare to come up with even more outdated pascal stuff. It was a dare to explain why 32-bit (or 64-bit) is so bad.

> > Or 1998 or so, when computers routinely got 16MB+
>
> You don't need 16 MB for a decent compiler. You don't need 4 GB (or even 1
> GB) for a decent OS.

Depends on "decent" of course. And on the size of your programs.

> > IMHO such efforts will never break free from mindless minimalism, and
> are
> > thus doomed from the start.
>
> I'm not suggesting an OISC esolang, just something that is reasonably-sized
> and doesn't need everything and the kitchen sink.

While that sounds reasonable, all the examples are extremely minimal, and seem to favour size over function in an extreme way.

> > He is building an FPC backend. The frontend is unmodified. FPC is a pm
> > codebase. Do you actually READ my posts? :)
>
> I'd be surprised if he supported full Delphi 7, but I guess it's not
> impossible.

He says that using huge and pm there is a fair chance. Unlike compilers, many classes of applications are not so memory hungry that 16MB is a problem.

> > (xdpascal)
> > > It's mostly TP3-ish, as he claims, targets 16-bit DOS, public domain,
> > and
> > > is Delphi (32-bit Win32 .EXE) hosted.
> >
> > Yes. Exactly, useless.
>
> Useless because it lacks a full dialect (e.g. Delphi 7) or because you
> yourself will never need to use it?

It serves no useful purpose in any way. It is a subset of an obsolete toolchain that got 3 newer versions. Even a pm TP7 compiler is a totally different world feature-wise.

> > > No, I'm not saying this particular one is better or super useful, just
> > > saying it's a start
> >
> > For what? Suicide by compiler frustration?
>
> I've not seen the FPC guts, but I doubt it's much cleaner.

It's not how it looks, but what you can do with it.

> > > an existing project with similar goals, so it's worth
> > > mentioning (IMO).
> >
> > Not a production compiler, not even in the same category.
>
> I never mentioned "production". I wasn't comparing anyone, just mentioning
> an interesting example.

And I never mentioned being interested in minimalist bootstrapping philosophy.

> All such harping like that just reminds me of BWK's
> "real work, not toys" bullcrap. (Some people just like to exaggerate.)

I think the point of this thread should be more that one era's real work is a later era's toys.

> > For any compiler creating a new backend (architecture target) is a
> problem.
> > It is simply a lot of work. But better a good compiler for a few
> targets,
> > than something worthless that supports "everything"
>
> If that were true, then why do we have so many competing technologies?

Legacy, preference, slightly different application domains and tradeoffs.

> Oh, that's right, it's often political bias, not technical, that limits
> software.

It is technical too.

> > > and "must have all optimizations from the entire world".
> >
> > Yes. As you have shown, there is enough cruft already. Feature dropping is
> > not an option. Features are important. 16-bit hosted is not, except for
> > fun.
>
> You have to stop somewhere. You can't add indefinitely.

Like with everything, such requirements change with the times. No, I don't think there is a hard limit somewhere.

And, as you are probably well aware as a DOS user, people tend to drop old features sooner than they stop creating new ones.

> Worse is when
> features start conflicting (e.g. GNU Pascal's Borland stuff, lots of
> workarounds for that). Eventually complexity falls in on itself.

IMHO GPC was a victim of its own implementation limitations and the fact that the devels only reluctantly accepted the need for Borland compatibility. And even then not fully.

I sometimes got the feeling they were fighting gcc more than they were benefiting from it.

> > Modula2 and Pascal have no common subset. They are different languages with
> > different keywords and different blockstructure principles.
>
> Not 100% identical, no, but very very similar, due to common ancestry.

Yes wrt principles, not that much in the actual language. Block structure is different, operators are different, case sensitivity differs, etc. etc.

Only the unit/module and scoping systems match, but that is more language semantics than surface language.

> > (TeX discussion removed, didn't see the relevance, except to illustrate
> > that before 1985-early nineties period portability is a b*tch)
>
> Decent portability is still relatively unknown.

That's because it doesn't exist.

Nearly all portability attempts start by defining a subset that is supposed to be portable. And that is always either too minimal (making the lowest common denominator awfully small, think standard C here), or too big (throw in one slightly oddball platform and you already have a problem, think Windows and POSIX).

> People make horrible
> assumptions and just call it portable because it happens to work (sometimes
> accidentally). I don't admire any project that "only" works on Mac, Win,
> Lin. It's often less out of technical reasons than just developer whimsy.

Right! It HAS to work on FreeBSD too of course!

> > > GCC and GNAT can brag all they want about portability, but they are a
> > pain to compile, too big, too POSIX-y, and just overkill.
> >
> > No. They are perfectly fine. Nothing wrong with it. Real production
> > compilers, no toys.
>
> Perfectly fine? GCC has undergone ten bazillion changes over the years and
> broken a lot of things. It's a pain to build, it's a pain to install. It's
> too POSIX heavy, and expecting it to work reliably on anything outside
> select targets (GNU/Linux using ELF, especially, their preference) is
> sometimes only wishful thinking.

Yeah. But while it is not universal, at least it can be used adequately for certain tasks.

Rugxulo(R)


Usono,
15.05.2013, 19:00

@ marcov

FPC 16-bit

> Why do you think that nearly none of the great compilers are also 16-bit
> hosted?

Which ones are "great"? Please list them.

Historically, there were many many commercial DOS compilers. Quite a few are even still sold! But these days it sells more to target certain systems (Mac, Win, Linux on IA-32 or AMD64) than anything else.

16-bit PowerBASIC for DOS claims to fit compiler and libs in 300 kb. Extended Pascal's first implementation (AFAIK) was for 16-bit DOS.

> While possible (at least for 286pm), it is only possible if you
> craft specially for it, and the market after the main "turbo" lines wasn't
> big enough anymore to warrant that.

You get nothing for free. Any cpu requires handholding. It's either behind the scenes or you have to do it explicitly, but either way it's not by default.

"Market" is totally relative to popularity (ego, politics) and money, not technical reasoning.

> And even if sb would have tried, all the 16-bit aficionados couldn't be
> pried from previous generation turbo compilers with a crowbar.

More browbeating from newer OSes. Every geek wants to declare some other tech "obsolete" or "dead".

Did you know DOS is dead? OS/2 is dead? Pascal is dead? WinXP is dead? Silverlight is dead? Flash is dead? Blackberry is dead? Nokia is dead? Even many many people have tried to declare C dead (which it kinda is but still lives on with POSIX) in lieu of C++.

Do you see the pattern? It's all marketing. They want to sell their newer, "improved" version. And if yours works fine, you won't upgrade. So if they can't sell you on features alone, they declare (and coerce) other things to be less useful, deprecated, etc.

And as last resort, if they can't "win" through normal means, they lower the price (for a "limited time" ... sound familiar?). Then they can claim "millions of users" and jack the price up again, esp. with a "newer upgrade". Then it becomes established and you're stuck with it (and all the various changes that happen on a whim).

This is the nightmare that is "modern" technology. Chase the latest, ignore pre-existing. Often newer tech is more tightly nailed down than a fort. So many restrictions, so many rules, so narrow, so expensive, but that's how the "market" wants it. If anything, we only have things like GNU and *BSD as a complete opposite rejection of that attitude, even if they too fall into the trap sometimes.

FreeDOS ... FreePascal ... what a joke, eh? Why are we replicating old, dead, obsolete tech? Why aren't we all using whatever flavor-of-the-month? It's obvious nobody can get "real" work done with inferior tools, right?

> > If you can't write something useful in 500 kb, you can't write anything.
>
> Baseless claim.

The largest 8086 instruction is six bytes. The largest 286 instruction is 10 bytes. So even at the maximum encoding, 640 kb (the MZ limit) holds over 100,000 instructions - plenty of room for code. You never ever see an assembly program that even comes close to exceeding 640 kb, but with HLLs you almost never see anything below 100 kb. (Even .so or .dll doesn't help much anymore.) Remember DOS386's old signature? For some people, even "Hello, world!" can't fit in 100 kb. It's "impossible", they say.

> Yes. But it doesn't solve the basic chicken-egg problem in a way that
> matters. It tries to solve it with minimalism and if you try to build on
> it, you fall in the same trap as Unix did.
>
> IOW if you specify bootstrapping in some limited dialect (like C or
> standard pascal), to get proper bootstrapping, all core infrastructure must
> be in that limited dialect.

Unavoidable unless you find a way to do some kind of meta-programming to target more than one intermediate. It's possible, but very few developers try.

> Even GCC gave that up recently, when they moved to C++......

They didn't give up anything. Some developers forked and cleaned up some bits (with more to come) with a reasonable subset of C++. C++ is well-supported, popular, and standardized, all of which are things GNU cares about.

Re: bootstrapping, I can only assume they expect people to use an older GCC to build an older G++ to build the current GCC, if needed (assuming their platform was ever supported and working in the first place). E.g. 2.7.2.3 can build 2.95.3, and other K&R compilers can allegedly build 3.4.6.

> You misunderstood my point. It wasn't a dare to come up with even more
> outdated pascal stuff. It was a dare to explain why 32-bit (or 64-bit) is
> so bad.

It must be bad because nobody supports DJGPP (32-bit) anymore. Even working code is thrown away. I just don't understand that. It's not rational, it's apparently all about promoting whatever else they prefer instead.

> > Useless because it lacks a full dialect (e.g. Delphi 7) or because you
> > yourself will never need to use it?
>
> It serves no useful purpose in any way. It is a subset of an obsolete
> toolchain that got 3 newer versions. Even a pm TP7 compiler is a total
> different world featurewise.

So many people brag about C++ or Ada. Yet how many who bragged still use the same dialect they used back in bragging times? (Ada83 anyone? AT&T 2.0 anyone? Python 1.5.2 anyone? Perl 4 anyone? Scheme R4RS anyone? Ruby 1.8.4 anyone?) None. They all "upgrade" so as not to incur the wrath of the deprecation police (aka, embarrassment at using outdated, inferior tech). The only problem is that there's only so much you can add to something before you have to rename it entirely to avoid a stigma (many companies fell into that renaming trap, and it never helps anyways).

> > All such harping like that just reminds me of BWK's
> > "real work, not toys" bullcrap. (Some people just like to exaggerate.)
>
> I think the point of this thread should be more that one era's real work
> are toys in a later era.

They never even had access to the original tools, so they don't even know how it was supposed to work. So they create their own warped variant and call it superior. It's easy to laugh at our "inferior" elders. But tech is very short-lived, so eventually everything gets replaced (usually for no good reason, but anyways ...).

> > Oh, that's right, it's often political bias, not technical, that limits
> > software.
>
> It is technical too.

No, it's not, almost never. If anything, people prefer halfway working implementations to nothing at all. But it's obvious some people don't want to support certain things.

> > You have to stop somewhere. You can't add indefinitely.
>
> Like with everything, such requirements change with the requirements of the
> times. No I don't think there is a hard limit somewhere.

If you need 1000x more processing power and RAM (and libs and tools) than anyone else before you, you're either a) an uber genius, or b) dumb as a brick.

> > Perfectly fine? GCC has undergone ten bazillion changes over the years
> and
> > broken a lot of things. It's a pain to build, it's a pain to install.
> It's
> > too POSIX heavy
>
> Yeah. But while it is not universal, at least it can be used adequately for
> certain tasks.

Anything can be used for "certain" tasks. GCC 1.x was considered decent too. Same as 2.x. Same as 3.x. But nobody uses those anymore; they are "dead" and "inferior". Even if they still work (!), people throw them away. Is 4.x that much better? Only if you think "modern" is irreplaceable.

BTW, in case you haven't noticed, a lot of changes have happened due to indirect influence from Clang (written in C++), which is preferred by Mac OS X, FreeBSD, Minix, and even Embarcadero.

marcov(R)

15.05.2013, 21:27

@ Rugxulo

FPC 16-bit

> > Why do you think that nearly none of the great compilers are also 16-bit
> > hosted?
>
> Which ones are "great"? Please list them.

Current great ones are LLVM, MSVC, GCC and Intel ICC. (me sucking his thumb here).


> Historically, there were many many commercial DOS compilers.

I didn't say commercial. I said great. In the present tense.

> Quite a few are even still sold!

I'm told you can still buy wax cylinders for Graham Bell's phonograph.

> But these days it sells more to target certain systems
> (Mac, Win, Linux on IA-32 or AMD64) than anything else.

.... even though the phonograph doesn't really register in today's audio market. Maybe that's because even the 8-track is a good portion of a century newer.


> 16-bit PowerBASIC for DOS claims to fit compiler and libs in 300 kb.
> Extended Pascal's first implementation (AFAIK) was for 16-bit DOS.

On audiophile fora, people probably still lecture CD users that phonographs are vastly superior :-)

> You get nothing for free.

True. But even for the non-free, some things cost more than others.

... and provide less value for money.

> "Market" is totally relative to popularity (ego, politics) and money, not
> technical reasoning.

Yes. Phonographs are superior. I get it :-)

(skipping a lot of musings that didn't have any sane points)

> > > If you can't write something useful in 500 kb, you can't write
> anything.
> >
> > Baseless claim.
>
> The largest 8086 instruction is six bytes. The largest 286 instruction is
> 10 bytes. That is plenty of room for code. You never ever see an assembly
> program that even comes close to exceeding 640 kb (MZ limit), but with HLLs
> you almost never see anything below 100 kb. (

Yes. But I didn't say that you can't write anything useful in 500kb. I merely contested your statement that you can't write anything useful unless you can do it in 500kb.

> > IOW if you specify bootstrapping in some limited dialect (like C or
> > standard pascal), to get proper bootstrapping, all core infrastructure
> must
> > be in that limited dialect.
>
> Unavoidable unless you find a way to do some kind of meta-programming to
> target more than one intermediate. It's possible, but very few developers
> try.

Or simply limit portability to a few sane targets that make up the bulk of the world, and consider "total" portability a fun academic thought experiment.

> Re: bootstrapping, I can only assume they expect people to use older GCC to
> build older G++ to build current GCC, if needed (assuming their platform
> was ever supported and working in the first place). E.g. 2.7.2.3 can build
> 2.95.3 and other K&Rs can allegedly build 3.4.6.

Or just brand one golden version on one platform every 5 years. Save it, and use it to jumpstart when necessary.

IOW, the whole bootstrap-chain principle is very fun (and, together with compiler viruses, makes great small talk over IT department drinks). But in reality it is a manufactured problem.

> > You misunderstood my point. It wasn't a dare to come up with even more
> > outdated pascal stuff. It was a dare to explain why 32-bit (or 64-bit)
> is
> > so bad.
>
> It must be bad because nobody supports DJGPP (32-bit) anymore.

Compared to what well supported 16-bit open source compiler?

> Even working code is thrown away. I just don't understand that. It's not rational, it's
> apparently all about promoting whatever else they prefer instead.

My guess is that they are simply cutting the overhead of being an accountable public project.

So the needs of the devels prevail, a right that they simply seize, probably because they, and they alone, bear the costs of updating it for every new major version of GCC.

> > It serves no useful purpose in any way. It is a subset of an obsolete
> > toolchain that got 3 newer versions. Even a pm TP7 compiler is a total
> > different world featurewise.
>
> So many people brag about C++ or Ada. Yet how many who bragged still use
> the same dialect they used back in bragging times? (Ada83 anyone? AT&T 2.0
> anyone? Python 1.52 anyone? Perl 4 anyone? Scheme R4RS anyone? Ruby 1.8.4
> anyone?) None. They

Never bragged about any of them.

> all "upgrade" so as not to incur the wrath of the
> deprecation police (aka, embarrassment at using outdated, inferior tech).

No. Because they can't bear the costs (be it monetary or timewise) of continuing. I bet most of them only moved on grudgingly.

> They never even had access to the original tools, so they don't even know
> how it was supposed to work. So they create their own warped variant and
> call it superior. It's easy to laugh at our "inferior" elders.

It is not. At most we laugh at the people who mindlessly use their fruits now, not at the elders themselves. Most of them would probably even frown on the current use of their products.

> > > Oh, that's right, it's often political bias, not technical, that
> limits
> > > software.
> >
> > It is technical too.
>
> No, it's not, almost never. If anything, people prefer halfway working
> implementations to nothing at all. But it's obvious some people don't want
> to support certain things.

Technical also includes fitness for use. If you want to do something just for the sport of it, it becomes academic, and not technical.

> > Like with everything, such requirements change with the requirements of
> the
> > times. No I don't think there is a hard limit somewhere.
>
> If you need 1000x more processing power and RAM (and libs and tools) than
> anyone else before you, you're either a). uber genius, or b). dumb as a
> brick.

That is like refusing to use a microwave because rubbing sticks together gets it done too.

> > Yeah. But while it is not universal, at least it can be used adequately
> for
> > certain tasks.
>
> Anything can be used for "certain" tasks. GCC 1.x was considered decent
> too. Same as 2.x. Same as 3.x

Yes. ALL FOR THE REQUIREMENTS OF THEIR TIMES!

> Is 4.x that much better? Only if you think "modern" is irreplaceable.

No, of course not. But don't complain about support if you are not prepared to bear the cost. And most aren't. There are exceptions though, like DragonFly BSD starting from a fork of FreeBSD 4.

I don't agree, but I respect them.

> BTW, in case you haven't noticed, a lot of changes have happened due to
> indirect influence from Clang (written in C++), which is preferred by Mac
> OS X, FreeBSD, Minix, and even Embarcadero.

Bollocks. Even on LLVM's main targets, gcc still beats LLVM.

I'm an LLVM sympathizer, and have regularly followed the FreeBSD-and-LLVM talks in recent years (mostly at FOSDEM).

But that is from a license-based (and GCC-monopoly-scare) perspective, and doesn't mean I shut my eyes to realities.

LLVM's honeymoon is coming to an end. They have been promising great advancements due to a superior architecture for nearly half a decade now, and STILL can't routinely match gcc in depth (performance on their main target architecture), let alone in width (number of targets).

The tide is turning, and they had better start showing results soon, or even good PR and commercial friends won't save them, and the only thing in their favour will be the license.

Rugxulo(R)

Homepage

Usono,
15.05.2013, 22:44

@ marcov

FPC 16-bit

> > > Why do you think that nearly none of the great compilers are also
> 16-bit
> > > hosted?
> >
> > Which ones are "great"? Please list them.
>
> Current great ones are LLVM, MSVC, GCC and Intel ICC. (me sucking his thumb
> here).

Clang + LLVM doesn't support Windows very well. (There are "experimental" MinGW builds, but not for x64, IIRC.) Even two of the others (MSVC, Intel) don't support Itanium anymore, so they're arguably getting worse! And of course, it has to be said: MSVC, Intel, Embarcadero are very very expensive!

My point is that they may be big, but they're not infallible, and they're certainly not universally useful (only pandering to a small niche ... with big pockets, presumably).

> > > > If you can't write something useful in 500 kb, you can't write
> > anything.
> > >
> > > Baseless claim.
> >
> > The largest 8086 instruction is six bytes. The largest 286 instruction
> is
> > 10 bytes. That is plenty of room for code.
>
> Yes. But I didn't say that you can't write anything useful in 500kb. I
> merely contested your statement that you can't write anything useful unless
> you can do it in 500kb

I didn't mean it like that. I'm not that naive. But if you (or your compiler) can't fit an extremely useful amount of code in 500 kb, you're sunk.

> Or simply limit portability to a few sane targets that make the bulk of the
> world, and consider "total" portability a fun academic thought experiment.

Be careful what you wish for. By download statistics alone, Windows far far surpasses everything else. Then comes Mac. Then much much further is Linux. Everything else is mostly ignored (including FreeBSD). If certain parts of the commercial world weren't so anti-GPLv3, FreeBSD wouldn't exist anymore.

> Or just brand one golden version on one platform every 5 years. Save it,
> and use it to jumpstart when necessary.

Apparently (with very few exceptions) no one does that.

> > > You misunderstood my point. It wasn't a dare to come up with even more
> > > outdated pascal stuff. It was a dare to explain why 32-bit (or 64-bit)
> > is
> > > so bad.
> >
> > It must be bad because nobody supports DJGPP (32-bit) anymore.
>
> Compared to what well supported 16-bit open source compiler?

I don't understand the question.

My point was that "32-bit POSIX isn't enough anymore." So it's not 640 kb, it's not stability, it's not lack of tools, and it's not lack of free/libre: people just don't freaking care anymore.

> > > Like with everything, such requirements change with the requirements
> of
> > the
> > > times. No I don't think there is a hard limit somewhere.
> >
> > If you need 1000x more processing power and RAM (and libs and tools)
> than
> > anyone else before you, you're either a). uber genius, or b). dumb as a
> > brick.
>
> That is like refusing to use a microwave because rubbing sticks together
> gets it done too.

Not refusing the hardware, refusing the requirements! It's too much; when will it end? You don't need 1.5 GB of bandwidth just to download a compiler!! You don't need 2 GB for an OS!!

> > > Yeah. But while it is not universal, at least it can be used
> adequately
> > for
> > > certain tasks.
> >
> > Anything can be used for "certain" tasks. GCC 1.x was considered decent
> > too. Same as 2.x. Same as 3.x
>
> Yes. ALL FOR THE REQUIREMENTS OF THEIR TIMES !

What requirements changed? Do people eat 1000x more food? Pay 1000x more money? Speak 1000x more words?

> > BTW, in case you haven't noticed, a lot of changes have happened due to
> > indirect influence from Clang (written in C++), which is preferred by
> Mac
> > OS X, FreeBSD, Minix, and even Embarcadero.
>
> Bollocks. Even on LLVM's main targets, gcc still beats LLVM.

Only in very very minor ways (see below).

> I'm a LLVM sympathizer, and have regularly followed FreeBSD with LLVM
> meetings in the last years (mostly on FOSDEM)

AFAIK, FreeBSD 10 (x86, x64) will have only Clang. Ports will still prefer GCC 4.6.

> But that is from a license based (and GCC monopoly scare) perspective, and
> doesn't mean I shut my eyes for realities.
>
> LLVM's honeymoon is coming to an end. They have been promising great
> advancements due to superior architecture for nearly half a decade now, and
> STILL can't routinely match gcc in depth (performance on their main target
> architecture), let alone in width (number of targets)

No, it's still vibrant and ongoing. IBM recently contributed a bunch of patches for PowerPC and System Z. It's only approx. 7-15% slower there than GCC.

It's not overall performance that is a problem but the lack of existing cpu backend support (and pre-existing code that assumed nothing else would ever exist).

> The tide is changing, and they better start showing results soon, or even
> good PR and commercial friends won't save them, and the only thing in
> favour of them will be the license.

Apple, IBM, Intel (not ICC), Embarcadero, FreeBSD, Minix all use and support it. There are probably more groups than that, obviously. It's far from dying.

You didn't react, so I guess you already knew (or maybe not): Embarcadero uses (actually, sells) a modified Clang as their BCC64. That doesn't sound like they need saving! Quite shocking for me to hear, but not realistically that surprising. (I just wish I understood how some of these people calculate their prices; it seems extremely arbitrary.) Yes, even your beloved Delphi is rumored to be migrating to LLVM. Then, worse performance or not, you're stuck! Enjoy your upgrade!

Laaca(R)

Homepage

Czech republic,
16.05.2013, 10:15

@ Rugxulo

FPC 16-bit

I really think that a DOS 16-bit native version of FPC is not needed - assuming we have a good and reliable DOS 32-bit version which can cross-compile to 16-bit.
The FPC code base does not expect limited memory block sizes, so adapting it would be extremely complicated and the result would be very, very buggy.

However, if somebody really wants to run FPC on a 16-bit machine, it should be possible, and not so complicated, to use some kind of 386 emulation on an 8086 processor.
I don't know whether such an emulator exists, but it shouldn't be as complicated as one might expect, because only very basic emulation of the PC would be sufficient (no need for graphics modes, sound, etc.)

Anyway - I hope that 16-bit FPC will not suffer from the bugs present in the GO32V2 target.
Marcov, have you read this post of mine? http://www.bttr-software.de/forum/forum_entry.php?...amp;page=0&category=0&order=last_answer

---
DOS-u-akbar!

Rugxulo(R)

Homepage

Usono,
16.05.2013, 20:35

@ Laaca

FPC 16-bit

> I really think that DOS 16-bit native version of FPC is not needed

For the record, I never suggested, demanded, or expected such a thing - far from it. I just don't personally see the justification for all of the modern-day additions in software.

Modern just makes things harder to maintain and less stable, and drops support for 99% of the pre-existing computing world. (Remember when Ruby 1.9 dropped like 9 OSes? And yet it was that "old" 1.8 version that was ISO standardized. I'm afraid it will be yet another ignored standard that "nobody" cares about. Why even have standards if "nobody" cares for anything outside of POSIX [Mac and Linux] or Windows? Money and politics, there's no other justification.)

> with assumption that we have good and reliable DOS 32-bit version
> which can cross-compile into 16-bit.

I think this is far from a stable backend just yet. For now it's probably just one man scratching an itch.

> All FPC code does not expect limited memory block size so adaptation would
> be extremely complicated and the result would be very, very buggy.

I didn't expect anyone to care. I was just saying it would be possible. (Keep in mind that they can barely keep the 32-bit DOS port working, so it's not limited 640 kb memory or 16-bit that is holding anyone back!)

> However if somebody really wants to run FPC on 16-bit machine it should be
> possible and not so complicated to use some kind od 386 emulation on 8086
> processor.
> I don't know if such emulator exists but such emulator shouldn't be so
> complicated as someone could expect because only some very basic emulation
> of PC would be sufficient (no need for graphic modes, sound, etc.)

There was an old DOS shareware 386 real-mode-only emulator for 286 [sic] machines, IIRC. It's still available on Sac.Sk, lemme find it:


em3134b1.zip    UTILMISC    EMU386 v1.34 Beta 1 - 386 capabilities emulator for 286s       19060   1998-02-24 02:35:00
em386133.zip    UTILMISC    EMU386 v1.33 - Real-mode 386 Opcode Emulator for 286 PCs       18795   1998-01-21 02:51:00


Never tried it, but IIRC, it only traps the invalid-opcode exception (INT 06h, the DOS-world equivalent of SIGILL) and emulates the 32-bit arithmetic stuff.
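
If you're curious how such a thing hooks in, here's a minimal Turbo Pascal skeleton (my own hypothetical sketch, NOT EMU386's actual code) of the trap-and-emulate idea:

uses Dos;

var
  OldInt06: Pointer;

procedure Int06Handler(Flags, CS, IP, AX, BX, CX, DX,
                       SI, DI, DS, ES, BP: Word); interrupt;
begin
  { CS:IP (saved on the stack) point at the offending opcode.  }
  { A real emulator would decode Mem[CS:IP], carry out the 386 }
  { instruction with plain 16-bit code, then advance IP past   }
  { it. This skeleton only shows the hook itself.              }
end;

begin
  GetIntVec($06, OldInt06);       { invalid-opcode vector }
  SetIntVec($06, @Int06Handler);
  { ... run code that may contain 386 instructions ... }
  SetIntVec($06, OldInt06);       { restore on exit }
end.

The hard part is obviously the decoder/emulator body, and it only works on a 186 or later: the original 8086 doesn't raise INT 06h for unknown opcodes at all, it just executes garbage.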

(Obviously ANSI C requires long int, which is at minimum 32 bits, and many 16-bit C compilers support that, as do others like Turbo Pascal or Oberon-M. So it's not as if 32-bit arithmetic was so hard to provide on 16-bit machines.)
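
E.g. this (trivial, made-up) snippet compiles with any 16-bit Turbo Pascal; the compiler quietly emits multi-word 16-bit code, or runtime-library calls, for the 32-bit math:

var
  A, B: LongInt;    { 32-bit signed integers on a 16-bit compiler }
begin
  A := 100000;      { doesn't fit in 16 bits }
  B := A * 3 + 14;  { 32-bit multiply/add done with 16-bit registers }
  WriteLn(B);       { prints 300014 }
end.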

> Anyway - I hope that 16-bit FPC will not suffer by bugs present in GO32V2
> target.
> Marcov, have you read this my post?
> http://www.bttr-software.de/forum/forum_entry.php?...amp;page=0&category=0&order=last_answer

He probably read it, but he's not the DOS maintainer (there is none), so it's not his priority. And it's fair to assume he's not interested in prolonging DOS.

marcov(R)

16.05.2013, 20:46

@ Laaca

FPC 16-bit

> Marcov, have you read this my post?
> http://www.bttr-software.de/forum/forum_entry.php?...amp;page=0&category=0&order=last_answer

Yes. Nothing in my direct influence sphere.

Maybe some of them are solved. I'll see if I can merge those back to fixes soon (if not today). Apparently, several issues fixed last-minute in the 2.6.0 branch were not propagated back to fixes (and thus 2.6.2):

r24033 | pierre | 2013-03-28 14:04:54 +0100 (Thu, 28 Mar 2013) | 1 line
Changed paths:
M /trunk/packages/gdbint/src/gdbint.pp

Commit go32v2 specific fix in 2.6.0 branch rev 20576

r24227 | pierre | 2013-04-12 12:19:38 +0200 (Fri, 12 Apr 2013) | 11 lines
Changed paths:
M /trunk/ide/fpdebug.pas

Merge forgotten go32v2 2.6.0 branch changes back into trunk.

r24228 | pierre | 2013-04-12 12:21:34 +0200 (Fri, 12 Apr 2013) | 7 lines
Changed paths:
M /trunk/installer/install.dat

Merge forgotten go32v2 2.6.0 branch changes back into trunk.
------------------------------------------------------------------------
r20595 | pierre | 2012-03-23 14:45:47 +0100 (Fri, 23 Mar 2012) | 1 line

* Fix fppkg short source zip name
---
r24230 | pierre | 2013-04-12 12:28:40 +0200 (Fri, 12 Apr 2013) | 7 lines
Changed paths:
M /trunk/rtl/go32v2/system.pp

Merge forgotten go32v2 2.6.0 branch changes back into trunk.
---------------------------------------------------------------------
r24231 | pierre | 2013-04-12 12:33:06 +0200 (Fri, 12 Apr 2013) | 12 lines
Changed paths:
M /trunk/rtl/go32v2/v2prt0.as

Merge forgotten go32v2 2.6.0 branch changes back into trunk.
r20577 | pierre | 2012-03-22 16:35:26 +0100 (Thu, 22 Mar 2012) | 4 lines

// * Provide both environ and _environ inside startup file
// to avoid loading of old or new crt1.o object from DJGPP libc.
// Not merged as this is now treated in linker script.
* Make some labels local to be able to get a complete
disassembly of start function using GDB.

marcov(R)

16.05.2013, 21:19

@ Rugxulo

FPC 16-bit

> Clang + LLVM doesn't support Windows very well. (There are "experimental"
> MinGW builds, but not for x64, IIRC.) Even two of the others (MSVC, Intel)
> don't support Itanium anymore, so they're arguably getting worse! And of
> course, it has to be said: MSVC, Intel, Embarcadero are very very
> expensive!

Yes. No doubt about that. (Though the Itanium bit is a stretch; it is pretty much dead for new software development.)

> My point is that they may be big, but they're not infallible, and they're
> certainly not universally useful (only pandering to a small niche ... with
> big pockets, presumably).

No. Between them they account for 99.9% of software-tool usage. Maybe another 9.

That's why they are great.

> > Yes. But I didn't say that you can't write anything useful in 500kb. I
> > merely contested your statement that you can't write anything useful
> > unless
> > you can do it in 500kb
>
> I didn't mean it like that. I'm not that naive. But if you (or your
> compiler) can't fit an extremely useful amount of code in 500 kb, you're
> sunk.

No, since I might simply use another compiler if I had an extremely tight budget.

And then, suddenly, the whole 500kb limit is totally arbitrary and reveals your DOS roots. I program microcontrollers daily (I/O slaves based on Microchip) and my largest program there is 13kb (with gcc btw, +/- 2000 lines of C, libraries and headers excluded).

> Be careful what you wish for. By download statistics alone, Windows far far
> surpasses everything else.

True.

> Then comes Mac. Then much much further is Linux.

Still mostly true.

> Everything else is mostly ignored (including FreeBSD). If certain parts of
> the commercial world weren't so anti-GPLv3, FreeBSD wouldn't exist
> anymore.

Wrong, since GPLv3 is fairly recent. FreeBSD has a strong following among ISPs. It is not a client OS, so comparing downloads is a bit strange (since that is an end-user-centric metric).

> > Or just brand one golden version on one platform every 5 years. Save it,
> > and use it to jumpstart when necessary.
>
> Apparently (with very few exceptions) no one does that.

I think only Debian is fanatic about it.

> > > It must be bad because nobody supports DJGPP (32-bit) anymore.
> >
> > Compared to what well supported 16-bit open source compiler?
>
> I don't understand the question.
>
> My point was that "32-bit POSIX isn't enough anymore."

DJGPP is POSIX nowadays?

> So it's not 640 kb,
> it's not stability, it's not lack of tools, and it's not lack of
> free/libre: people just don't freaking care anymore.

Well, or they simply think that the burden/baton should pass to the actual users? Maybe there is a nice task for you there :-)

> > That is like refusing to use a microwave because rubbing sticks together
> > gets it done too.
>
> Not refusing the hardware, refusing the requirements!

Rubbing two microwaves together doesn't produce fire! The old requirements were perfectly fine! I rubbed sticks together for years just fine!

Whatever gets the food cooked :-D :-D

> > Yes. ALL FOR THE REQUIREMENTS OF THEIR TIMES !
>
> What requirements changed?

The balance that hardware like memory strikes in the total cost of software development: hourly programmer rate per unit of memory.

> > Bollocks. Even on LLVM's main targets, gcc still beats LLVM.
>
> Only in very very minor ways (see below).

Their PR wasn't about matching GCC, it was about passing it, leaving it in the dust.

Even if I bought your "very minor ways" (I don't), it is still quite a shortfall.

> No, it's still vibrant and ongoing. IBM recently contributed a bunch of
> patches for PowerPC and System Z. It's only approx. 7-15% slower there than
> GCC.
>
> It's not overall performance that is a problem but the lack of existing cpu
> backend support (and pre-existing code that assumed nothing else would ever
> exist).

It's a fail overall. They judged gcc on overall principles, so they will be judged on overall principles.

> Apple, IBM, Intel (not ICC), Embarcadero, FreeBSD, Minix all use and
> support it. There's probably more groups than that, obviously. It's far
> from dying.

All those wins are on license, not on technical grounds. So....

> Yes, even
> your beloved Delphi is rumored to be migrating to LLVM. Then, worse
> performance or not, you're stuck! Enjoy your upgrade!

Nope. The so-called nextgen compiler is still very incompatible (the recently released iOS XE4 is based on it).

This is because it does more than just switch to LLVM. It also tries to reinvent the language on .NET/Java footings, including immutable strings, meaning all string routines need to be reviewed/rewritten.
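
A made-up but typical example of what breaks: virtually all existing Pascal string code mutates strings in place, roughly like this:

procedure UpcaseFirst(var S: string);
begin
  if S <> '' then
    S[1] := UpCase(S[1]);  { in-place write into the string - }
end;                       { exactly what immutability forbids }

With immutable strings, every such routine has to be rewritten to build and return a new string instead.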

If Embarcadero continue on this course, I'll probably have a few years left on my XE3, and then migrate either to MSVC or FPC.

marcov(R)

16.05.2013, 21:26

@ marcov

FPC 16-bit

> I'll see if I can merge those back to fixes soon.

Done (r24515)

Rugxulo(R)

Homepage

Usono,
17.05.2013, 07:52

@ marcov

FPC 16-bit

> Yes. No doubt about that. (though the itanium bit is a stretch, it is
> pretty much dead for new software development)

"Itanium is dead, x64 wins!" (groan)

I have no idea, but Wikipedia claims it's still going strong, at least through 2017.

> No. Among them they are 99.9% of the software tools usage. Maybe another
> 9.
>
> That's why they are great.

Except for GCC, they are all very narrow in what they target and are ridiculously expensive. GCC's main problem is being too *nix-oriented, which means anything outside of POSIX/ELF and C++ or Fortran isn't very important to them (and tends to bitrot or get removed quickly). But at least they (somewhat) support other stuff at all.

(Heh, just for laughs I wanted to check Gautier's "Lang Index" on SourceForge and also see if similarly TIOBE finally updated for May. I think it's quite funny what they say, and you of all people may find it "interesting".)

> No. Since I might simply use another compiler if I had an extremely tight
> budget.

99% of compilers are expensive. If your budget is tight, you won't have enough money to keep buying more and more compilers. And let me save you the trouble: none are perfect. If you want something badly enough, you'll have to work around any issues in your tools.

> And then, suddenly the whole 500kb limit is totally arbitrary and reveals
> your dos roots.

Reveals my roots? That was my whole point! The IBM PC grew up with PC-DOS/MS-DOS, and (more or less) we still have such compatibility today, despite many people trying to change (and tightly control) it over the years.

> > Everything else is mostly ignored (including FreeBSD). If certain parts
> of
> > the commercial world weren't so anti-GPLv3, FreeBSD wouldn't exist
> > anymore.
>
> Wrong, since GPLV3 is fairly recent.

Five years is hardly recent. A lot has happened since then.

> FreeBSD has a strong following among
> ISPs. It is not a client OS, so comparing downloads is a bit strange.
> (since that is an end-user centric metric).

I meant third-party projects that explicitly target FreeBSD, not just random *nix compatibility in sources that halfway work. Those binaries have very very low download counts.

> > > > It must be bad because nobody supports DJGPP (32-bit) anymore.
> > >
> > My point was that "32-bit POSIX isn't enough anymore."
>
> DJGPP is POSIX nowadays?

Certified? No, nobody wants to waste the time or money (same as Linux or FreeBSD, from what I hear). And no, it doesn't (can't) support things like 2008 (mmap). My point was that even what is there (1992, parts of 2001) is ignored.

> > So it's not 640 kb,
> > it's not stability, it's not lack of tools, and it's not lack of
> > free/libre: people just don't freaking care anymore.
>
> Well, or simply think that the burden/baton should pass to actual users?
> Maybe there is a nice task for you there :-)

No, because even when it works they don't care.

> > > Yes. ALL FOR THE REQUIREMENTS OF THEIR TIMES !
> >
> > What requirements changed?
>
> The balance that hardware like memory makes in the total cost of software
> development. Hourly programmer rate / unit memory.

Costs always go up, never down. In fact, the free market loves to charge "whatever [it] wants!" So that's not a good measure. (U.S. is too greedy, but outsourcing is lowballing.)

Besides, the whole "RAM is cheap" cliche is a red herring. Nobody can truly install 256 GB or 1 TB anyways, the motherboard is way too limited. Even OSes (e.g. XP 32-bit) can't handle the full amount that the architecture supports, whether for technical or licensing reasons.

> > Apple, IBM, Intel (not ICC), Embarcadero, FreeBSD, Minix all use and
> > support it. There's probably more groups than that, obviously. It's far
> > from dying.
>
> All those wins are on license, not on technical grounds. So....

Faster compilation. Less memory usage. 75%-90% speed of GCC. Better tools support. Good support of standards.

That's far from just licensing (which reminds me, you never hear about PCC anymore. At one time that was seen as the one to follow. Guess nobody cares anymore. Sad, really.)

> > Yes, even
> > your beloved Delphi is rumored to be migrating to LLVM. Then, worse
> > performance or not, you're stuck! Enjoy your upgrade!
>
> Nope. The so called nextgen compiler is still very incompatible (the iOS
> XE4 recently released is based on it).

BTW, dare I say it, but ... is iOS really worth targeting? Well, I guess if something is trendy enough and makes enough money ....

> This because it does more than just LLVM. It also tries to reinvent the
> language on .NET/Java footings, including immutable strings, meaning all
> string routines need to be reviewed/rewritten.

In fairness, Delphi already probably had too many kinds of strings. (And other languages have had immutable strings too, perhaps due to garbage collection, dunno. E.g. Lua and Modula-3.)

> If Embarcadero continue on this course, I'll probably have a few years left
> on my XE3, and then migrate either to MSVC or FPC.

Migrating to MSVC sounds like a bad idea. Even if you loved C++ (since their C support is minimal, i.e. C++ wins over obsolete C), there are many other tools. But maybe you're one of those who can't live without fancy IDEs. That seems to be most people's favorite thing about MSVC. (Well, the console is indeed considered obsolete in many people's eyes, part of the reason Windows was promoted so heavily.)

During previous revisions to this post, I deleted some points. But let me exhume one: we are both living in deprecated worlds.

Windows: Win32 GUI (preferred), DOS console (legacy, deprecated)
Linux: GNOME or KDE under X11 (preferred), POSIX console (deprecated)
Mac OS X: Cocoa (preferred), *BSD terminal (deprecated)

So even if there is some compatibility, it's always shunned and ignored by some people. Heck, it may not even work very well or for long, because somebody always wants to replace it with something new and fresh and "better". It's not just "inertia". At some point you really have to stabilize on "something" or else nothing will ever get done. And constantly changing the goals of a project to appease every latest trend is not a recipe for success.

marcov(R)

05.06.2013, 13:34

@ Rugxulo

FPC 16-bit

> "Itanium is dead, x64 wins!" (groan)

One could argue whether it was ever alive to begin with.

> I have no idea, but Wikipedia claims it's still going strong, at least
> through 2017.

I think C=64 as a target sees more action than Itanium.

> Except for GCC, they are all very narrow in what they target and are
> ridiculously expensive.

Yes. There are discount and educational versions though.

> GCC's main problem is being too *nix-oriented,

All true.

But still that is what is mostly used.

> finally updated for May. I think it's quite funny what they say, and you of
> all people may find it "interesting".)

I think search analysis is a lousy way of determining language usage, so I don't take TIOBE very seriously. I see Delphi usage at every client I visit (usually in some engineering department), and it's no wonder that doesn't show up on the web.

> > No. Since I might simply use another compiler if I had an extremely
> tight
> > budget.
>
> 99% of compilers are expensive. If your budget is tight, you won't have
> enough money to keep buying more and more compilers.

One in twenty-plus years should be doable. 16-bit has been going the way of the dodo since before the mid-nineties.

Or simply live with an imperfect (gcc) one. But nothing justifies requiring silly and arbitrary 500k limits.

> Reveals my roots? That was my whole point! The IBM PC grew up with
> PC-DOS/MS-DOS, and (more or less) we still have such compatibility today,

Yes. But you didn't post an MS-DOS-specific statement; you posted that nothing worth doing requires more than 500k. That is a general statement.

> > > Everything else is mostly ignored (including FreeBSD). If certain
> parts
> > of
> > > the commercial world weren't so anti-GPLv3, FreeBSD wouldn't exist
> > > anymore.
> >
> > Wrong, since GPLV3 is fairly recent.
>
> Five years is hardly recent. A lot has happened since then.

Still, you don't really substantiate your statement above that anti-GPLv3 sentiment is FreeBSD's raison d'être.

> > FreeBSD has a strong following among
> > ISPs. It is not a client OS, so comparing downloads is a bit strange.
> > (since that is an end-user centric metric).
>
> I meant third-party projects that explicitly target FreeBSD, not just
> random *nix compatibility in sources that halfway work. Those binaries have
> very very low download counts.

That's exactly what I mean. That is an end-user (desktop) centric way of measuring a server OS.

> Certified? No, nobody wants to waste the time or money (same as Linux or
> FreeBSD, from what I hear). And no, it doesn't (can't) support things like
> 2008 (mmap). My point was that even what is there (1992, parts of 2001) is
> ignored.

A chain is only as strong as its weakest link. You are either compatible or not. A few residual bits of POSIX compatibility are useless.

> > Well, or simply think that the burden/baton should pass to actual users?
> > Maybe there is a nice task for you there :-)
>
> No, because even when it works they don't care.

Neither do users apparently, or they would step up.

> > The balance that hardware like memory makes in the total cost of
> software
> > development. Hourly programmer rate / unit memory.
>
> Costs always go up, never down.

In currency yes. But in price/MB, no.

> Besides, the whole "RAM is cheap" cliche is a red herring. Nobody can truly
> install 256 GB or 1 TB anyways, the motherboard is way too limited.

No, but you can install 4GB for under EUR 50, which is still way more than 500kb. When did you last buy new memory in sub-128MB quantities? 2000?

> > > Apple, IBM, Intel (not ICC), Embarcadero, FreeBSD, Minix all use and
> > > support it. There's probably more groups than that, obviously. It's
> far
> > > from dying.
> >
> > All those wins are on license, not on technical grounds. So....
>
> Faster compilation. Less memory usage. 75%-90% speed of GCC.

Two of those would be interesting. Unfortunately there are killer downsides:

- bad cross-platform/architecture support
- no OpenMP
- its own toolchain projects (linker/assembler/debugger) don't seem that active

> That's far from just licensing

True. But I think for your list the licensing bit was dominant.

> (which reminds me, you never hear about PCC
> anymore. At one time that was seen as the one to follow. Guess nobody cares
> anymore. Sad, really.)

As far as I know, OpenBSD clung to PCC as a last hope for a while. But it never really was "the one to follow" for the rest.

> > Nope. The so called nextgen compiler is still very incompatible (the iOS
> > XE4 recently released is based on it).
>
> BTW, dare I say it, but ... is iOS really worth targeting? Well, I guess if
> something is trendy enough and makes enough money ....

Not for me. Anyway, the problem with iOS targeting is that it's a bit of a lottery. Yes, there are people who win and develop successful iOS apps. From what I see, this group is small.

Most of them, however, seem to see it as just another client (a shop front in vertical markets), more a mobile form of website than a real application with functionality. It is expected of them though, so they grudgingly bear the cost, or even hope to score points against the competition. This is the largest group by far.

Then there is a group attracted by the mobile glory, who play with mobile toolchains but never ship anything real and just want to get on board.

> > This because it does more than just LLVM. It also tries to reinvent the
> > language on .NET/Java footings, including immutable strings, meaning all
> > string routines need to be reviewed/rewritten.
>
> In fairness, Delphi already probably had too many kinds of strings.

I'm not sure that is really the case. Yes, it has many, but most serve specialist yet viable uses. Only shortstring could be canned, I guess, but many argue even against that.

> (And
> other languages have had immutable strings too, perhaps due to garbage
> collection, dunno. E.g. Lua and Modula-3.)

Garbage collection is probably a factor, yes. But Delphi is not a GCed scripting language.

> > If Embarcadero continue on this course, I'll probably have a few years
> left
> > on my XE3, and then migrate either to MSVC or FPC.
>
> Migrating to MSVC sounds like a bad idea. Even if you loved C++ (since
> their C support is minimal, i.e. C++ wins over obsolete C), there are many
> other tools.

C++ wins over C, period :-)

> But maybe you're one of those that can't live without fancy
> IDEs. That seems to be most people's favorite thing about MSVC. (Well, the
> console is indeed considered obsolete in many peoples' eyes, part of the
> reason Windows was promoted so heavily.)

It's very telling that you are anti-IDE, but don't really explain why the other tools would be an option at all, or ask why I chose MSVC.

> During previous revisions to this post, I deleted some points. But let me
> exhume one: we are both living in deprecated worlds.
>
> Windows: Win32 GUI (preferred), DOS console (legacy, deprecated)

The Windows console also exists, and is preferred (cmd.exe vs command.com).

Since last year there are even console-only Windows (server) versions again.

> Linux: GNOME or KDE under X11 (preferred), POSIX console (deprecated)

Hard to say. Linux, like Windows, is a server OS, and the console belongs there.

> So even if there is some compatibility, it's always shunned and ignored by
> some people.

You are confusing "not the popular choice" with "outright deprecated".

Rugxulo(R)

Homepage

Usono,
06.06.2013, 00:03

@ marcov

FPC 16-bit

> > finally updated for May. I think it's quite funny what they say, and you
> of
> > all people may find it "interesting".)
>
> I think search analysis is a lousy way of determining language usage, so I
> don't take tiobe very seriously. I see Delphi usage at every client I visit
> (usually in some engineering department), and no wonder that that doesn't
> make the web.

My point is that every blowhard tries to declare certain tech as obsolete. It never ends.

> > Reveals my roots? That was my whole point! The IBM PC grew up with
> > PC-DOS/MS-DOS, and (more or less) we still have such compatibility
> today,
>
> Yes. But you didn't post a msdos specific statement, you posted about
> nothing worth doing requires more than 500k. That is a general statement.

No. I actually said that you should be able to write something quite reasonably useful, even a full-blown compiler, in 500 kb. If not, your tools are very poor or you don't know what you're doing.

(And yes, I was implying this for the IBM PC architecture and real mode DOS. This is a DOS forum and thread talking about 16-bit support from compilers!)

Totally arbitrary, but it shouldn't be that big of a stretch! (Even FASMW, which is far from being only a minimalistic assembler + IDE for Win32, is only 142 kb.)

> > > FreeBSD has a strong following among
> > > ISPs. It is not a client OS, so comparing downloads is a bit strange.
> > > (since that is an end-user centric metric).
> >
> > I meant third-party projects that explicitly target FreeBSD, not just
> > random *nix compatibility in sources that halfway work. Those binaries
> have
> > very very low download counts.
>
> That's exactly what I mean. That is an end-user (desktop) centric way of
> measuring a server OS.

You know full well that FreeBSD isn't limited to servers, not even by design. But in a world full of boring hype of "all new" (incompatible), it's a losing battle. FreeBSD is not very popular.

(Though why MS doesn't mimic Apple and port over some FreeBSD tools or even their Linux emulation is beyond me. Oh wait, I forgot, compatibility is bad for marketing when you want everyone to only follow you instead.)

> A chain is only as strong as its weakest link. You are either compatible or
> not. A few rest bits of posix compatibility are useless.

People cannot restrain themselves from using some non-portable features, even if not technically needed. While you may applaud them, I can't understand it. I'll never understand why targeting 1% of machines is better than targeting 99%. (Not an absolute measure, but in most software the differences aren't necessary anyways. Usually devs just didn't know or care about doing it a different way.)

> > > Well, or simply think that the burden/baton should pass to actual
> users?
> > > Maybe there is a nice task for you there :-)
> >
> > No, because even when it works they don't care.
>
> Neither do users apparently, or they would step up.

They did, but ... old code was removed, patches were rejected, etc. These are not technical limitations but human ones. They only want to promote themselves and their ideals, not help others port the actual technical code. (Yes, I'm mostly thinking GNU snobs here, but there are others too, see below.)

(Why do you think IE10 was originally Win8 only? They want to promote Win8, not make the web a better place for everyone else. Even if it is a decent web browser, it's of limited use to the majority of people ... unless they buy Win8, which is the whole point.)

But instead of people saying "Not portable enough, let's improve it", they instead say, "It supports all I care about. Everything else is obsolete and not supported. I don't have it nor use it nor care, therefore no one else should either."

> > Besides, the whole "RAM is cheap" cliche is a red herring. Nobody can
> truly
> > install 256 GB or 1 TB anyways, the motherboard is way too limited.
>
> No, but you can install 4GB for sub Eur 50, which is still way more than
> 500kb. Or when did you last bought new memory in sub 128MB quantities? 2000
> ?

So you don't think 2.5 MB for PPC386.EXE can be heavily improved?? You think that's totally reasonable?? (A quick check shows 0.99.05 to be less than 500 kb. Which would be great ... except I know that's not optimal code size by a long shot. Though obviously supporting many dialects and features affects things, but with "tons of RAM" these days, nobody has the motivation to even believe such a reduction is possible.)

Since computers are so fast, why don't you just "make -B" (GNU Make 3.81, "Unconditionally make all targets") every single time.

And since hard drives are so cheap, why don't you "tar xf" every single .tar.gz (.tar.bz2, .xz, etc) archive?

And since RAM is so cheap, just remove any calls to "dispose" in your Pascal code. Surely it will never need to be freed, there's more than enough! (No garbage collector needed!)

> Two of those would be interesting. Unfortunately there are killer
> downsides:
>
> - bad cross platform/architecture support.
> - no openmp
> - own part of toolchain (linker/assembler/debugger) projects are not
> that active it seems.

All of that has been improved lately. I don't know the details, but it's far from as pathetic as you imply. (Though I agree it's not perfect, but nothing is.)

> > > If Embarcadero continue on this course, I'll probably have a few years
> > left
> > > on my XE3, and then migrate either to MSVC or FPC.
> >
> > Migrating to MSVC sounds like a bad idea. Even if you loved C++ (since
> > their C support is minimal, i.e. C++ wins over obsolete C), there are
> many
> > other tools.
>
> C++ wins over C period :-)

I assume you catch my drift here. People love to deride others as useless and obsolete. I'm not picking on anyone in particular here, but seriously, some people (Bjarne!) want C++ to totally subsume C! I mean literally as in ye olde Pascal "level 0 and level 1" kind of feel, where C is level 0 and C++ is level 1.

(I mean, it almost makes sense, C/C++ are already widely implemented, but I don't honestly expect nor want every C vendor to totally implement C++. C is hard enough in latest standards.)

And of course, Java or C# or D each presumably feels the same way about C++, ironically enough. It never ends. (And I'm sure Ada would try to scoop up all Pascal-derivatives if it could.)

> > But maybe you're one of those that can't live without fancy
> > IDEs. That seems to be most people's favorite thing about MSVC. (Well,
> the
> > console is indeed considered obsolete in many peoples' eyes, part of the
> > reason Windows was promoted so heavily.)
>
> It's very telling that you are anti-IDE, but not really explain why the
> other tools would be an option at all. Or ask why I chose MSVC.

I don't really use an IDE, no. I'm not really against it, but again, hate to be so simplistic and minimal, but seriously, 1.5 GB (download size, last I heard) is a lot of space. I don't know, I just can't handle complexity well (who can???). Too much is just too much.

(I wasn't being sarcastic, seriously, the IDE is indeed why most people seem to love MSVC.)

> > During previous revisions to this post, I deleted some points. But let
> me
> > exhume one: we are both living in deprecated worlds.
> >
> > Windows: Win32 GUI (preferred), DOS console (legacy, deprecated)
>
> The windows console also exists, and is prefered. (cmd.exe vs
> command.com).

It's not preferred at all. I don't think MS would encourage it. Even the Delphi docs online seem to say quite explicitly that GUI is preferred instead.

I'm not saying console would (or should) go away or that there isn't a use for it (out of necessity), but that's not the way the modern world works. Console is seen as obsolete, less intuitive, etc.

> Since last year there are even console only windows (server) versions
> again.

They apparently fought kicking and screaming against this for years. It's debatable whether they will support it properly for long. This was really just to compete with *nix, not because they like the console. Remember, even SFU is officially deprecated nowadays. Everything like DOS and *nix to them is old style and hence not desired. (I know they have PowerShell, but that's more of a competing tech vs. scripting languages than anything, out of necessity. I have no idea if it works on WinRT. Probably, since it's .NET supported.)

Keep in mind that even C++ was second-class to them until recently, when they promised to support it fully. Apparently they prefer their own C# .NET instead.

> > Linux: GNOME or KDE under X11 (preferred), POSIX console (deprecated)
>
> Hard to say. Linux, like Windows is a server OS, and console belongs
> there.

I'm just saying, some people would totally kill the console if they could.

> > So even if there is some compatibility, it's always shunned and ignored
> by
> > some people.
>
> You confuse not the popular choice with outright deprecated.

I'm not necessarily talking official companies here (though they hire these same blowhards), I'm talking about popular opinion, even from developers. They don't care about some things, and they would indeed strip support for them entirely, if possible. They see no need for xyz, therefore xyz should die.

marcov(R)

01.07.2013, 22:29

@ Rugxulo

FPC 16-bit

> My point is that every blowhard tries to declare certain tech as obsolete.

Yes. But that is more the context of declaring Windows XP dead. Or Vista. Or pre-3.x Linux kernels. Not natively hosted 8086 compilers anno 2013 :-)

> > Yes. But you didn't post a msdos specific statement, you posted about
> > nothing worth doing requires more than 500k. That is a general
> statement.
>
> No. I actually said that you should be able to write something quite
> reasonably useful, even a full-blown compiler, in 500 kb.

Maybe. Maybe if I dedicate my life I can also do it in 250kb. Or 128kb. Or whatever.

> (And yes, I was implying this for the IBM PC architecture and real mode
> DOS. This is a DOS forum and thread talking about 16-bit support from
> compilers!)

TARGET support. You tried to change it to HOST support.

> Totally arbitrary, but it shouldn't be that big of a stretch! (Even FASMW,
> which is far from being only a minimalistic assembler + IDE for Win32, is
> only 142 kb.)

Yes, even FASM. ROTFL :-D

You know the end of a software-tools discussion is approaching when FASM is declared the pinnacle of development tools :)

> > That's exactly what I mean. That is an end-user (desktop) centric way of
> > measuring a server OS.
>
> You know full well that FreeBSD isn't limited to servers, not even by
> design.

I never said that. But that is its main use, and thus anything used to gauge its usage should account for that.

It is like gauging FreeBSD as an 8086 host based on the fact that it can run QEMU.

> (Though why MS doesn't mimic Apple and port over some FreeBSD tools or even
> their Linux emulation is beyond me. Oh wait, I forgot, compatibility is bad
> for marketing when you want everyone to only follow you instead.)

Because they are a Unix licensee and have had their own *nix since days of yore?

> > A chain is only as strong as its weakest link. You are either compatible
> or not. A few rest bits of posix compatibility are useless.
>
> People cannot restrain themselves from using some non-portable features,

True.

> While you may applaud them, I can't
> understand it.

I can be brief about that: I never considered POSIX a dividing line between portable and non-portable, only between Unix-like and non-Unix-like.

I'm sure as a DOS user you can sympathize.

> They did, but ... old code was removed, patches were rejected, etc. These
> are not technical limitations but human ones. They only want to promote
> themselves and their ideals, not help others port the actual technical
> code. (Yes, I'm mostly thinking GNU snobs here, but there are others too,
> see below.)

So the noble forces of the old and useless gathered and forked in response? Maybe I missed it: WHERE did they go to? :-)

Rugxulo(R)

Homepage

Usono,
07.07.2013, 01:52

@ marcov

FPC 16-bit

> Yes. But that is more context of declaring Windows XP dead. Or Vista. Or
> pre 3.x Linux kernels. Not natively hosted 8086 compilers anno 2013

Yes, because when I'm writing software, I really need to think first, not about what the project actually needs, but "what is popular?" and "what is the latest tech bandwagon to jump on?".

8086 software still runs, even on modern descendants. It doesn't care what year it is. If it didn't run, perhaps you'd have a point. However, people don't think of minimal requirements, they only see what's in front of them, and then only code for that.

The whole point of mentioning "8086" was to list a minimal subset for accomplishing some reasonable goals. It's not absolute. It was just an example of trying to restrict the millions of options available (that often conflict) in an effort to simplify and (maybe) target a larger audience than whatever "new" (incompatible) is being overly promoted these days.

> Maybe. Maybe if I dedicate my life I can also do it in 250kb. Or 128kb. Or
> whatever.

It wasn't meant to be an absolute metric, just an honest question: how much will ever be enough??

MS literally spent an extra billion dollars on the XBox 360 just to make them all come with 512 (instead of 256) MB of RAM. Nowadays, both XBox One and PS4 will come with 8 GB (though some big chunk of that is reserved for the OS, I think). The original XBox only had 64 MB, and older consoles (PS1?) much, much less (2 MB??). Heck, I think the Atari Lynx had 64 kb. Can't remember for Game Boy, GBC, NES, but it was probably 4 kb (plus whatever VRAM). The Atari 2600 had all of 128 bytes!!!

> TARGET support. You tried to change it to HOST support.

Because it's been done successfully dozens of times! Sue me for not thinking it's impossible.

> Yes, even FASM. You know the end of a software tools discussion
> approaches if fasm is declared the pinnacle of development tools

It's too tiring to be diplomatic with you here. You've obviously never used FASM nor read up on it, or else you wouldn't say that. FASM is very, very far from being "only" a minimal assembler.

> > You know full well that FreeBSD isn't limited to servers, not even by
> > design.
>
> I never said that. But that is its main use, and thus anything used to
> gauge its usage should account for that.

We already know what the "modern" world thinks of FreeBSD, especially in light of Mac OS X.

> Because [MS?] are Unix licensee and have had their own *nix since years of
> yonder?

SFU is deprecated and to be removed eventually, just like the POSIX subsystem. No, I don't think MS licenses UNIX (tm) at all, and they long long ago sold off all their Xenix stuff. These days, they are wholly interested in exclusively "drinking their own champagne" (only using their own tech).

> I can be short about that, never considered POSIX as a dividing line of
> portable and non-portable. Only as a dividing line of Unix-like and not
> Unix like.

POSIX was originally meant to be a compromise between SysV and BSD. Unfortunately, it's not really a good starting ground for software (unless you literally don't care about other platforms, like decent Windows support ... it's beyond ridiculous to require a POSIX shell there).

Sorry if I don't see "POSIX only" as any better than "Windows only".

> I'm sure as a dos user you can sympathize.

Nobody sympathizes with anybody. I can show you several software websites (well, at least two) that flat out "mock" their own older versions (Win9x and DOS, respectively), which gave them their start. If you can't even sympathize with yourself, your history, who the hell can you respect??

But why respect anybody when we're so obviously superior in "2013"?? Oh, those fools of 2007, 2001, 1995 ... how stupid they were! Good thing our favorite software (whatever flavor of the month) is going to function and be supported forever.

> > They did, but ... old code was removed, patches were rejected, etc.
> These
> > are not technical limitations but human ones. They only want to promote
> > themselves and their ideals, not help others port the actual technical
> > code.
>
> So the noble forces of the old and useless gathered and forked in response?
> Maybe I missed it, WHERE did they go too ?

Yes, you missed it. Of course you missed it, how could you not? If you don't stay in contact with such groups, how would you ever know? Especially if you haven't booted up a DOS (or compatible) in several years, how could you know?

"I don't see it, therefore it doesn't exist."
