Back in the late 1980s, how was commercial software for 8-bit home computers developed?






























When hobbyists wanted to write software for e.g. the Commodore 64, they either used the built-in BASIC interpreter (with all its limitations) or some native tools, like compilers for other languages or, most of the time, assemblers. This had a lot of drawbacks, like the limited screen size, the slow disk I/O, the limited RAM available for the tools and your own code, etc. So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.



There were some nice commercial applications, and a lot more nice commercial games, available back then, and I wonder how these were developed. Did the programmers indeed work on the target machine, with all its drawbacks, or did at least some of the development shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?







history software-development 8-bit-microcomputers














asked Jan 30 at 7:20 by Felix Palmen
edited Feb 1 at 9:39 by Peter Mortensen








  • Related: retrocomputing.stackexchange.com/questions/3314/…
    – tofro, Jan 30 at 8:04

  • Hmm, is it close enough to be a dupe? I think focusing on the ZX Spectrum is unnecessary. I'm really interested in how common it was to a) use the target machine directly (which seems very tough), b) use some custom hardware extensions, or c) do cross-development on a more powerful machine ...
    – Felix Palmen, Jan 30 at 8:13

  • No, I don't think it's a duplicate. That is why I only said "related" - the C64 might have had entirely different (or similar, I don't know) development methods than the Spectrum. But the related question has a few answers that go beyond the Spectrum.
    – tofro, Jan 30 at 8:35

  • Actually, if you are wondering about "some nice commercial applications and a lot more nice commercial games", it might be helpful to narrow your question down a bit. For example, WordStar, UCSD Pascal and The Hobbit are three completely different software titles; their development will have been completely different also! So your question is fairly broad. I don't know about "too broad" though.
    – Wilson, Jan 30 at 9:16

  • @Wilson I expected the challenges to be very similar (given you develop some decently complex software for an 8-bit home computer), no matter what kind of software you create -- do you think this isn't the case? Anyway, I just realised it will be hard to pick an "accepted" answer here, so maybe it is a bit broad. Thanks for the insights given so far!
    – Felix Palmen, Jan 30 at 10:20
















12 Answers






































It varied. There was no single method. Some people used assemblers on the target machine, others used cross-development tools.



As an example of a large product for an 8-bit machine, I worked on the BitStik CAD software for Apple II and BBC Micro systems from 1984 to 1986. That used Apple II machines with Z80 CP/M cards for coding (with WordStar) and assembling and linking (with Microsoft's M80 assembler, using its .6502 directive). We had a very simple networked hard disk system for source code, which used ribbon cables to connect machines to the disk cabinet.
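
(For flavour, a minimal sketch of what 6502 source under M80 looked like - this is not the actual BitStik source, the directive spellings are from memory and may not match exactly, and only COUT at $FDED, the Apple II Monitor's character-output routine, is a real, known address:)

            .6502           ; switch M80 from 8080/Z80 to 6502 mnemonics
    COUT    EQU     0FDEDH  ; Apple II Monitor character-output routine
            ASEG
            ORG     0C000H  ; assemble for a fixed load address
    START:  LDX     #0      ; index into the message
    LOOP:   LDA     MSG,X   ; fetch the next character
            BEQ     DONE    ; zero terminator -> stop
            ORA     #80H    ; Apple II text output wants the high bit set
            JSR     COUT    ; print it via the Monitor
            INX
            BNE     LOOP
    DONE:   RTS
    MSG:    DB      'HELLO',0DH,0
            END     START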



Executables would be written onto bootable Apple DOS disks, and then you'd reboot into Apple DOS and try out your changes. This meant we were using the same machines for coding and testing, but frequently switching between OSes on them.



We had machines with hardware debugger cards (can't remember the name) for difficult problems. It worked pretty well, all things considered.



Since WordStar has attracted attention, this was "Classic" WordStar 3.3 IIRC. It has a "Non-Document" mode where it works as a plain text editor, and if you turned off the menus and delays, it was decently fast on a 3.58MHz Z80. Its "Column Mode", which lets you define and operate on a rectangular block of text that doesn't have to contain complete lines, was pretty useful when writing assembler code.



Source control was entirely manual. There was a big wallchart for developers to claim control of source files. They had the copies they were working on in their personal areas of the hard disk, and would put their changed versions back into the shared area and cross out their initials on the chart. Backups were taken weekly, and before any major changes. More copies than that would have been hard work: the hard disk was only 20MB for five developers.



There were a lot of floppy disks around, and we had to be disciplined with them. After one new developer proved to be a problem with that, the rule became that any unlabelled disk was subject to summary destruction, by anyone who felt like it.






answered Jan 30 at 7:34 by John Dallman, edited Feb 13 at 23:13





















  • Coding with WORDSTAR!!?!?!!!????
    – slebetman, Jan 30 at 13:14

  • So old-fashioned! Nowadays many posters to S.O. seem to use the more modern Notepad++ as their "IDE".
    – Mawg, Jan 30 at 14:05

  • @slebetman: Yes, WordStar - added a bit more. There was no integrated editor and assembler at the time, and M80 was the best assembler available. Non-document mode WordStar worked fine with it, and WordStar's "column mode", which is still unique to it, could be quite helpful for assembler coding.
    – John Dallman, Jan 30 at 14:51

  • Re: editing with WordStar: it caused a bug in a colleague's code that I struggled to track down. An instruction was not being executed even though it was clearly there in the code. Digging into the problem, I found the machine code didn't have the instruction. Why not? Because WordStar had set bit 8 of an ASCII code somewhere in the line. It looked fine in WordStar, it looked fine in a printed listing, and the assembler didn't report an error - it just silently ignored the whole line!
    – Tony Whitley, Jan 31 at 12:42

  • @TonyWhitley: Someone had edited the file in document mode, which sets those top bits for some kind of tracking - can't remember just what after so many years.
    – John Dallman, Jan 31 at 20:15


































This had a lot of drawbacks, like the limited screen size, the slow disk I/O, the limited RAM available for the tools and your own code, etc.




Those are just drawbacks of having a slower or less capable computer. As that was the norm, I don't think anyone thought much of it. Even considering that, a lot of that may be alleviated by a simple setup involving a couple of Commodore 64s and a serial connection.



Consider Prince of Persia, that famous game for the Apple II. It was developed on the Apple II. I think the author had two Apple IIs which were connected; one to type his code up, and the other to test the game.



In these cases, to debug the program, something like an Action Replay cartridge might be used. These essentially send an NMI to the processor, and bank in some kind of monitor ROM, so that you can inspect and/or change variables etc. Then when you're done debugging, the program can (usually) resume.
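
As a rough 6502 sketch of that freeze idea (the real cartridge banks in its own ROM at this point, and the monitor UI itself is elided - this only shows the generic save/restore dance around an NMI):

    NMIVEC: PHA             ; save A on the stack
            TXA
            PHA             ; save X
            TYA
            PHA             ; save Y
            ; ... monitor runs here: inspect or patch RAM, registers, etc. ...
            PLA
            TAY             ; restore Y
            PLA
            TAX             ; restore X
            PLA             ; restore A
            RTI             ; resume the frozen program where it stopped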




Did the programmers indeed work on the target machine with all the drawbacks




Sometimes, as I have pointed out. But still, a lot of work was more easily done on paper. Back then, a lot of books were printed that showed how to design sprites on graph paper, translate them into hexadecimal, and type the hexadecimal codes into the computer. More rarely, a digitiser was used to get the graphical data into the machine. Similarly, maps and flowcharts were a good way to think about a solution before even beginning to type any code out, as they still are.
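
For example, converting one made-up graph-paper row of a C64 hardware sprite (24x21 pixels, so 3 bytes per row and 63 bytes in all) went like this - the .BYTE syntax varies by assembler:

    ; paper row:  ..XXXXXX XX....XX XXXXXX..
    ; binary:     00111111 11000011 11111100
    ; hex:        $3F      $C3      $FC
    SPRITE: .BYTE $3F, $C3, $FC     ; row 1
            .BYTE $3F, $C3, $FC     ; row 2, and so on for all 21 rows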




Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




Certainly at least some dev shops used contemporaneous, but more powerful or comfortable computers. For example, the Tatung Einstein was commonly used to develop software titles for the ZX Spectrum or other Z80 machines. And similarly, any PC clone or CP/M computer or anything like that can be used in much the same way. That's advantageous because these machines have more RAM than the ZX Spectrum 48K, so the assembler can more easily reside with (some of) the source code and (some of the) program, all at once.



Starting from the late 80s, it was feasible to develop on the Amiga for the Commodore 64. Now you have the conveniences of a much faster CPU and storage, even hard disks and version control, and Commodore 64 emulators on the Amiga itself. So that's not really that different from what C64 developers do today, except we're probably more likely to use Linux or Windows or whatever.



























  • Re, "drawbacks of [a lesser] computer.... I don't think anyone thought much of it." For anybody who ever got used to a VT-100 terminal (or better), hacking on a screen that could only display 16 lines of forty fuzzy characters each actually was pretty irritating.
    – Solomon Slow, Jan 30 at 14:26

  • @SolomonSlow Seems like you're comparing 1970s office equipment with some kind of TRS-80-tier computer. There's a wide gamut of solutions in-between also. Consider an Apple II with an 80-column card, or a C128 with a decent monitor. They're not bad for displaying text; it's what they're made for.
    – Wilson, Jan 30 at 16:42


































So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.




To start with, I still like to use my IIgs (or IIc Plus) when coding for the Apple II. Both are quite fast machines with more than enough memory to do the job. After all, editing source text doesn't get faster with a mouse and many colours. And all the 'helpers' of a modern IDE just add potential errors - like selecting the wrong function, just because it's so neatly presented. Yes, the Google disease of always picking the top entry has reached the programming community.




Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




The '80s on small computers were an extremely fast-changing environment, so there is no single answer to this. In other words: it depends :)





  • In the very beginning, cross-assemblers were the norm. How else to get a program running on a new CPU? Running them on time-sharing systems wasn't about luxury, but about running them at all, thus avoiding assembling by hand. They were not in any way great tools or easy to handle. Don't expect something like an Arduino IDE running on a PC, compiling and downloading the program in a click. We're talking extremely meagre functionality and clumsy handling at high cost. And the result was just another file on that distant mainframe.

    It was common to just dump the result to a TTY as hex (*1) and then key it into a single-board computer to have it run - or into a PROM programmer to store it and then run. People with a lot of money would have PROM programmers able to read paper tape, punched by the TTY. Controlling all of this by hand while moving rolls of paper around was considered easy handling and fast development.




  • This soon changed when development tools were moved to the machine itself. For example, one of the sales arguments for the JOLT was its "Resident Assembler Program (RAP)", advertised in their brochure with "costly time sharing services are not needed for cross assemblies" and "The Resident Assembler Program is compatible with the MOS Technology Cross Assembler".

    While such a resident assembler was just a small step, more capable assemblers running on the target system (or closely related ones) became available. A notable example was Intel's ISIS systems as professional development tools. Here, an 8080 system was used to develop programs for the whole Intel line up to the 8085.

    Now, while these were dedicated developer systems, they were neither comfortable nor any faster than their targets - often slower. Heck, even early 8086 development was done on these. The 8086 boards were just used as targets.




  • Around 1980, and for small developers, it was already a great gain to load an editor from tape, write/change the source code (loaded from tape as well), save it, load a compiler (all from tape), load the source again, compile, save the result, and run it. Sure, all of this could be sped up quite a bit by buying a disk drive - but that meant more than doubling the system cost. An original 8 KiB PET cost 800 USD, while a 2040 drive called for over 1,000 USD. And that's 1978 prices.

    So for a more professional setup (maybe with a printer ... whooohoo), 4,000-6,000 USD was a reasonable number - in today's money that's 25-30,000 USD. Not exactly lunch money.




At that point (1977-1982) it's important to remember that (game) development wasn't driven by big corporations with large development teams, but more often than not by a single person or very small teams of 2-4 developers. It's the time when soon-to-be-important companies started at the literal kitchen table, with the machine itself and a hunger to build.





  • As before, the question wasn't how inconvenient using the target machine was, but how great it would be to have a floppy at all.

    Considering this, a built-in BASIC with integrated editor, interpreter and debugger in one - so only the source/program had to be loaded and saved - had many advantages on early machines. A built-in IDE, after all.




No one thought at this time about having a faster machine. There was none. Using a different machine would only complicate the development process.





  • Soon, third-party developers released more integrated compiler environments to run on the target machine. Ones where, for example, the source could be held in memory when switching from editor to compiler.

    And for owners of lots of RAM - like 32 KiB or even more - there were integrated programs that held both the tools and the source in memory. Who needs disks anyway? :))




  • When floppies became more affordable, development tools were based around them, with automated loading of components. Most notable may be the UCSD p-code system (Pascal, Fortran, etc.), offering an integrated menu system to switch between components (editor, compiler, linker, debugger) with only a few key presses.

    Similar integration made Turbo Pascal an unparalleled success in 1983/84.

  • With the introduction of hard disks to the general public (read: becoming affordable at only double the machine's price) around the same time (1984...), development really did speed up. Not so much due to hard disk speed, but due to removing the need to swap floppies (*2).

  • The PC itself wasn't a really fast machine; its advantage was a large RAM configuration and, as mentioned, easy hard disk integration. Once again it wasn't about speed, but the ability to do certain tasks at all - like fast program switching, or large compilers without loading overlays.

  • All throughout the '80s, speed wasn't really a property of the CPU at all. There was essentially no speed difference between the slowest and fastest machines in the first half of the decade, and maybe 1:2 throughout the second half.



For a more general view, it's useful to keep in mind that most development until the late '80s wasn't so much about making life comfy, as today, but about being able to develop at all. Much as the most important feature of the (small) machines back then was to be a computer capable of running arbitrary programs at all. The choice wasn't between compiling a program with one click within seconds and getting beautiful output presented, but rather being able to compile at all. Switching disks 5 times per compile and waiting a minute or two until a 1,000-line program was compiled and linked is fun compared to no tools and no program.



Looking closer, it's helpful to follow a rough timeline and keep the different use cases in mind:




  • Very early on, ~1973-1978, cross-development was important and often the only way, as the CPUs to be developed for weren't available in usable machines. Program sizes were measured in pages of 256 bytes rather than in kilobytes.

  • When the first generally available machines (Altair, Commodore, Apple, Tandy) were in use (1977-1980), they were, for most users, far more convenient than any (expensive) cross-development, not least because programs could be developed interactively and tested right away. Still, code size was measured in single-digit KiB, and the development environment and runtime were the same. Sometimes a developer used additional tools in the form of plug-in ROMs (TIM for Commodore, or Programmer's Aid #1 for the Apple II).

  • With the advent of dedicated home computers, roughly the time from 1980 (*3) to 1984 (the video game crash, *4), they themselves became the best development platform for their software. Much of the code was platform-dependent and needed to be tested in place anyway. Also, the speed difference between these and 'professional' systems was more often than not reversed. And while professional development systems had, in general, better storage, their RAM offerings were just as small. As a result, it was more appropriate to soup up the target system - like adding floppies. Hard disks weren't an issue.

  • While the PC itself didn't change much here, despite the larger memory, it was the AT of 1984 and, more importantly, the 16-bit wave of 1985 that changed the playing field with faster machines. Development was done on the target machines themselves - after all, they were at the top end again at the time.

  • While faster platforms were now available that could ease development for the older generations, it still wasn't until the late 1980s that the software needed to emulate older machines became good enough to allow satisfying cross-development.

  • For the top-end machines, development was and is done on the machines themselves.



It wasn't until the mid-to-late '90s that off-the-shelf hardware could generally be used for cross-development ... and then again, it became less and less important, as the hardware converged toward today's PC.





*1 - Or a loader format, which is not much of a difference either.



*2 - In the early '80s, before hard disks were affordable, developers often had 3 or 4 drives at their machines to reduce floppy swapping.



*3 - 1980 (to be strict, the Christmas sales of 1979) is marked by the introduction of the first (dedicated, mainstream) home computers, the TI 99/4 and Atari 400/800.



*4 - The often-cited video game crash of 1983-1984 (with E.T. being the tip of the iceberg) was for the most part a US phenomenon, as Europe and Japan were only indirectly affected.
































  • You really think I'd use an "IDE" (or even a mouse) when doing cross-dev nowadays? Well, that's optional, and not what I was thinking about. All in all, a lot of your text is correct, but it misses the point of my question :( I'm not talking about programming a completely new machine, and I don't refer to "punch-card times"; my interest is in the daily business of a software shop in the late 80s. The other answers suggest both approaches existed back then, doing everything on the target machine and using some cross-dev setups. I guess expensive hardware could pay off in quicker time to market...
    – Felix Palmen, Jan 30 at 18:14

  • @FelixPalmen Well, I guess you're missing the point I tried to make - throughout most of the 80s, there was no faster/better machine to develop for a target system than the target system itself. Only the addition of memory/storage was (some) relief. Next, most machines during the 80s were new, and software had to be written on these new machines starting at a quite primitive level. The Atari ST makes a great example. While cross-development was possible on a PC, even using GEM, it was a pain in the ass. So early developers had to use dual floppies until Atari finally delivered a hard disk.
    – Raffzahn, Jan 30 at 19:08

  • Further, I never mentioned punch cards - did I? On the other hand, punched tape was used at a professional level way into the 80s, especially to develop for micros. And one last point, @FelixPalmen, I can't see anywhere in my answer an assumption about what you use - nor would I make one. So please abstain from personal implications.
    – Raffzahn, Jan 30 at 19:10

  • The original IBM PC version of Tetris (the first one that was usable internationally) was written with Turbo Pascal 2.0 or 3.0, which included an editor that could work with files up to 64K, could generate applications up to 64K of code, and on a machine with 192K or more could allow one to edit and run a program without any disk access whatsoever (though saving before running was often a good idea if one had made any significant changes to a program).
    – supercat, Jan 30 at 19:38

  • "throughout most of the 80s, there was no faster/better machine to develop for a target system than the target system itself" -- this is a rather bold claim, given workstations (like e.g. the Sun-1, introduced 1982) were indeed available. Some answers mentioned other machines I never heard of before, or, even later, the possibility to use Amigas. My question wasn't whether hardware that could be used for cross-dev existed, but whether (and how) this was actually done.
    – Felix Palmen, Jan 31 at 7:28



































They used cross-development kits back then too. I worked briefly at a UK game developer in 1990, and all their Commodore 64 and ZX Spectrum games were developed on a PC with a proprietary kit.



See for example Andrew Braybrook's diary covering the development of Morpheus on the C64, where they start to use Opus PCs and an Atari ST to develop on, connected to the C64 with an RS-232 cable. This diary appeared in Zzap!64 magazine at the time.

























  • Andrew also has a diary on developing Paradroid, which they did earlier using just Commodore machines: zzap64.co.uk/zzap3/para_birth01.html
    – tylisirn, Jan 30 at 22:21

































In It's Behind You: The Making of a Computer Game, Bob Pape describes his process when authoring the ZX Spectrum conversion of R-Type, alongside colleagues working on Atari ST and C64 ports. He writes:



The equipment I was using to write R-Type with [initially] was the same as for Rampage, a standard 48K Spectrum with Interface 1 and microdrives and everybody's favourite hardware copying/backup device Romantic Robot's Multiface 1. I had a copy of OCP Editor Assembler on the microdrive along with my source code and with a push of the button on the Multiface I could go from an empty Spectrum to one ready to assemble in just a few seconds, however the drawback to this method was that the assembler was resident in the Spectrum's memory taking up valuable RAM that I couldn't use.



So Rampage, which was released for the Spectrum in 1988, is at least one commercial piece of software from the late '80s that was developed directly on the machine it targeted, using comparatively fast external storage and a little off-the-shelf hardware assistance for state inspection.



During the course of R-Type's development:




Around this time the three of us took delivery of proper PC based development systems from Catalyst. Each of us received a then state-of-the-art Opus 80286 PC with monochrome monitor running DOS and a professional cross-development package.



The business end of the PC development was handled by a PDS (Programmer's Development System) board, which was really just a parallel I/O card that would connect to the target machine (at the time a Spectrum, Amstrad or C64) via a simple interface. On the target machine you'd load a small piece of code that would sit there polling the lines and waiting for a signal supplied by a custom assembler running on the PC - as soon as you'd assembled on the PC and used the SEND Command the target machine would transfer the object code and you'd be up and running in about a second. The download code for the target machines came in three versions, 'dumb', 'smart' and 'interrupt driven' with the latter running under interrupts allowing the PC to monitor, control and even change the code on the target machine while the game was running, which wasn't really a lot of use if you were trying to write time critical code but it did have a nice line in real-time Trace functions.




So R-Type, released later in 1988, was that author's turning point to cross-development, using an environment very much like the one you describe as that which you'd use today: write and build on a comparatively fast system, then test the result on the real hardware.
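
For the curious, here is a purely illustrative 6502 sketch of what such a 'dumb' download stub might amount to on the C64 side. The book doesn't document the real PDS protocol or pinout, so the handshake below (a strobe on PB7 and 7 data bits on PB0-PB6 of the user port) is an assumption, not the actual PDS wiring:

    PORTB   = $DD01         ; CIA2 port B data register (C64 user port)
    DDRB    = $DD03         ; CIA2 port B data direction register
    PTR     = $FB           ; zero-page pointer to the load address

            LDA #$00
            STA DDRB        ; all port B lines set to input
            STA PTR
            LDA #$C0
            STA PTR+1       ; place incoming code at $C000 (arbitrary choice)
    LOOP:   LDA PORTB
            BPL LOOP        ; wait for the host to raise the strobe (bit 7)
            AND #$7F        ; keep the 7 data bits
            LDY #$00
            STA (PTR),Y     ; store the byte at the current load address
            INC PTR         ; advance the 16-bit pointer
            BNE WAITLO
            INC PTR+1
    WAITLO: LDA PORTB
            BMI WAITLO      ; wait for the strobe to drop before the next byte
            JMP LOOP        ; a real stub would count bytes, then JMP to the entry point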
































  • Side note: everybody on this site should read Bob's book, it's a truly great read.
    – Matt Lacey, Feb 1 at 4:22

  • +1, it is amazing how he got an R-Type out of the Speccy.
    – Alan B, Feb 8 at 12:52

































Most of the "professional" outfits did use cross-development, although they often had to build their own tools. For example, they might have a Z80 assembler, but would need to make their own serial download app for the target machine to get the compiled code onto it. IBM PCs and compatibles were popular for this task.



There were also add-on cards for the 8 bit machines that made development work on them easier. These usually contained some kind of "monitor" application in ROM, that let the programmer freeze and inspect the state of the machine, make changes and so on. Because the application was in ROM it didn't eat up any precious RAM or interfere too much with the operation of the code being debugged.



Some development was done offline too. For example, graphics were often sketched out on graph paper and then manually converted to numbers and programmed directly into the game.



















































Like others said, it varied. Lots of small developers (like Llamasoft) programmed directly on the hardware; bigger developers used other computers to act as debuggers or cross-development systems. For example, there were a lot of cross-assemblers available for the Atari ST. Atari used special hardware and VAX computers to aid in the development of 2600/5200 games. The games would still run on (modified) console hardware, with the other machine connected to the console acting as a debugger.



So generally you used the bigger system to aid in development on the smaller system, or developed directly on the hardware. But here's one example where a C64 was used to prepare developers for the ST:



http://www.stcarchiv.de/stm1988/07/die-hexer-teil-1



Demo coders TEX, who later joined Thalion (Dragonflight, Amberstar), started programming in 68000 assembler before receiving their first Atari STs. According to the devs, they used a 68000 assembler/simulator to gain some experience.



Jez San (Starglider) used a 68000 emulator to simplify debugging (http://www.stcarchiv.de/stm1988/05/interview-jez-san).



















































Together with a friend, around 1984-1986, I developed a PCB CAD system "CADsoft" (not to be confused with the later company of that name, producing the Eagle CAD system) targeting the Z80 CP/M platform (with a specially-designed graphics card, see here) that was sold around 20 times, and used by some hardware companies to design the PCBs for their product portfolio.



We did it in Pascal/MT+ and a few assembly routines for hardware interfacing. The resulting executable was too large to fit into the RAM, but Pascal/MT+ supported overlays that were swapped in and out on demand.



We coded on our own CP/M machines, identical to the target ones:

  • 64 kB RAM
  • 4 MHz Z80A
  • 80*24 terminal with serial connection (9600 baud)
  • two 800 kB floppy drives (later extended by a 20 MB SCSI hard disk)

I don't remember the editor we used; quite probably WordStar.



Compiling and linking the application took a few minutes, but was possible without floppy swaps, IIRC.



We didn't have version control, only lots of backup floppy disks. We did most of the coding in coordinated sessions, so the risk of accidentally modifying the same source file was reduced by oral communication.



















































A friend of mine did the C64 conversion of Tetris for Mirrorsoft. AFAIR the game was supplied, pre-written, in Commodore BASIC, which was then compiled; then he used a disk-based assembler for the graphics and sound (he wrote his own sound player and composer app, which the composer used to supply the music) and assembled everything, all on a single C64 in his bedroom. Not sure how long each 'compilation' took, but I remember watching him play clarinet whilst waiting for the disk to stop whirring, which signalled it was ready to play-test.






























  • Having checked it out on YouTube, I can easily believe it, but am I right to take from this that Tetris was written in BASIC? Run through a BASIC compiler for release, but nevertheless written in BASIC?
    – Tommy, Feb 1 at 22:42

  • You are correct. The main logic and game play was compiled BASIC.
    – Neil, Feb 2 at 11:05

































I'll answer for me, as someone who did a few games.



On the C64, I used a stock machine with a floppy drive and an assembler; I forget the product. If I recall correctly, the assembler came on a cartridge and let you write assembly language as an extension of BASIC.



For the ZX Spectrum, I had a CP/M machine (Memotech) with dual floppies and a C compiler / assembler / linker toolchain. I then had some code to hexdump the (fixed-location) linker output and send it down a custom interface into the Speccy (this definitely took a cup of (instant) coffee while it loaded).



[Incidentally, when I grew up and became a firmware engineer with real budgets, my stack of choice involved an in-circuit emulator, EPROM emulator, logic analyser and a PDP-11 to run compilation, mostly controlled by a bank of VT-100-type terminals. You could have used such a thing to develop 8-bit home computer apps if you had the funds.]



















































It depends on how late into the late 80s we are talking about, and on your definition of "commercial titles".



By 1988-1989 I got my first job, developing/maintaining a C payroll package, using a Commodore 80286 running at 12 MHz with 1 MB of RAM (PC40?) at work, with a whopping 20 MB hard disk, which was upgraded to 30-40 MB a couple of months later.



For development I was using the Microsoft C compiler, the MASM assembler and MS-DOS 3.x. The editor was SideKick, for editing text files with WordStar-compatible keys, and I was writing a couple of routines in 8086 ASM. For compiling I was invoking Microsoft make with a carefully crafted Makefile.



The monitor was a green monochrome Hercules, and the backups of our source code were done on 5¼" floppy disks.



The software was delivered to the customers on either a 5¼" or a 3½" floppy disk, and later on as a compressed ZIP file with an installer I wrote in batch DOS commands.



The customers usually used less powerful machines; there were still a lot of XT machines and even Tandy 1000 DOS computers around. (By that time I also had an Olivetti PC1 XT at home.)



By that time, I had started writing a DOS ZX Spectrum emulator/debugger, with first versions running on Hercules at work (and CGA, later VGA, at home) without keyboard input, that was never released. It was capable of doing step-by-step debugging and used TASM to cross-assemble Z80 ASM files. Those were the basis for my WSpecem ZX emulator for Windows, released in 1996.
































  • Commodore 286? Or was it a Compaq 286?
    – manassehkatz, Jan 30 at 22:33

  • Commodore had their own PC line-up. I worked for the distributor at this time, which also had a software house, and they sold me a Commodore 386SX soon thereafter at a very steep discount. I also had access to the technical BIOS manuals.
    – Rui F Ribeiro, Jan 30 at 22:40

  • Interesting. I didn't even realize they had a 286 (or 386SX). I do remember SideKick.
    – manassehkatz, Jan 30 at 22:45

  • I used to have a Commodore XT technical manual with a commented ASM BIOS listing. It was a present from the team lead of their (local) hardware team.
    – Rui F Ribeiro, Jan 30 at 22:47

  • 1.2 MB 5¼" disks would be quite spacious for backing up a 20 MB or even a 40 MB hard drive. 35 5¼" floppies would allow backing up a jam-packed 40 MB hard drive; for the 20 MB hard drive, 17 would be needed. For comparison, 35 recordable SL DVDs would allow backing up 150 GB, and probably take longer to do it too... plus, I'm pretty sure even the rewritable ones can't be overwritten in place, so you need to re-do over 4 GB of backing up.
    – a CVn, Feb 1 at 21:53

































I wrote commercial TRS-80 software, on a TRS-80. Either in BASIC, or in Z-80 assembler run through Editor Assembler. Those were the days. I also used the TRS-80 to cross-compile 6805 ASM to burn onto Motorola chips for what we'd call embedded devices now. At my next job, we all shared and wrote software on an IMSAI with a Z-80 chip and bank-switched RAM using an MP/M-like timeshare OS, with a single giant 5 MB HDD that backed up to VHS cassettes. If the head programmer ran the 30-minute build of his Z-80 ASM (it was a phonebook-sized printout), we'd all feel it.



















































            79














            It varied. There was no single method. Some people used assemblers on the target machine, others used cross-development tools.



            As an example of a large product for an 8-bit machine, I worked on the BitStik CAD software for Apple II and BBC Micro systems from 1984 to 1986. That used Apple II machines with Z80 CP/M cards for coding (with WordStar) and assembling and linking (with Microsoft's M80 assembler, using its .6502 directive). We had a very simple networked hard disk system for source code, which used ribbon cables to connect machines to the disk cabinet.



            Executables would be written onto bootable Apple DOS disks, and then you'd reboot into Apple DOS and try out your changes. This meant we were using the same machines for coding and testing, but frequently switching between OSes on them.



            We had machines with hardware debugger cards (can't remember the name) for difficult problems. It worked pretty well, all things considered.



            Since WordStar has attracted attention, this was "Classic" WordStar 3.3 IIRC. It has a "Non-Document" mode where it works as a plain text editor, and if you turned off the menus and delays, it was decently fast on a 3.58MHz Z80. Its "Column Mode", which lets you define and operate on a rectangular block of text that doesn't have to contain complete lines, was pretty useful when writing assembler code.



            Source control was entirely manual. There was a big wallchart for developers to claim control of source files. They had the copies they were working on in their personal areas of the hard disk, and would put their changed versions back into the shared area and cross out their initials on the chart. Backups were taken weekly, and before any major changes. More copies than that would have been hard work: the hard disk was only 20MB for five developers.



            There were a lot of floppy disks around, and we had to be disciplined with them. After one new developer was a problem with that, the rule became that any unlabelled disk was subject to summary destruction, by anyone who felt like it.






            share|improve this answer





















            • 18





              Coding with WORDSTAR!!?!?!!!????

              – slebetman
              Jan 30 at 13:14






            • 10





              So old fashioned! Nowadays many posters to S.O seem to use the more modern Notepad++ as their "IDE"

              – Mawg
              Jan 30 at 14:05






            • 6





              @slebetman: Yes, WordStar - added a bit more. There was no integrated editor and assembler at the time, and M80 was the best assembler available. Non-document mode WordStar worked fine with it, and WordStar's "column mode", which is still unique to it, could be quite helpful for assembler coding.

              – John Dallman
              Jan 30 at 14:51






            • 9





              Re: editing with WordStar ========================= It caused a bug in a colleague's code that I struggled to track down. An instruction was not being executed even though it was clearly there in the code. Digging into the problem I found the machine code didn't have the instruction. Why not? Because WordStar had set bit 8 of an ASCII code somewhere in the line. It looked fine in WordStar, it looked fine in a printed listing and the assembler didn't report an error - it just silently ignored the whole line!

              – Tony Whitley
              Jan 31 at 12:42






            • 4





              @TonyWhitley: Someone had edited the file in document mode, which sets those top bits for some kind of tracking - can't remember just what after so many years.

              – John Dallman
              Jan 31 at 20:15
















            79














            It varied. There was no single method. Some people used assemblers on the target machine, others used cross-development tools.



            As an example of a large product for an 8-bit machine, I worked on the BitStik CAD software for Apple II and BBC Micro systems from 1984 to 1986. That used Apple II machines with Z80 CP/M cards for coding (with WordStar) and assembling and linking (with Microsoft's M80 assembler, using its .6502 directive). We had a very simple networked hard disk system for source code, which used ribbon cables to connect machines to the disk cabinet.



            Executables would be written onto bootable Apple DOS disks, and then you'd reboot into Apple DOS and try out your changes. This meant we were using the same machines for coding and testing, but frequently switching between OSes on them.



            We had machines with hardware debugger cards (can't remember the name) for difficult problems. It worked pretty well, all things considered.



            Since WordStar has attracted attention, this was "Classic" WordStar 3.3 IIRC. It has a "Non-Document" mode where it works as a plain text editor, and if you turned off the menus and delays, it was decently fast on a 3.58MHz Z80. Its "Column Mode", which lets you define and operate on a rectangular block of text that doesn't have to contain complete lines, was pretty useful when writing assembler code.



            Source control was entirely manual. There was a big wallchart for developers to claim control of source files. They had the copies they were working on in their personal areas of the hard disk, and would put their changed versions back into the shared area and cross out their initials on the chart. Backups were taken weekly, and before any major changes. More copies than that would have been hard work: the hard disk was only 20MB for five developers.



            There were a lot of floppy disks around, and we had to be disciplined with them. After one new developer was a problem with that, the rule became that any unlabelled disk was subject to summary destruction, by anyone who felt like it.






            share|improve this answer





















            • 18





              Coding with WORDSTAR!!?!?!!!????

              – slebetman
              Jan 30 at 13:14






            • 10





              So old fashioned! Nowadays many posters to S.O seem to use the more modern Notepad++ as their "IDE"

              – Mawg
              Jan 30 at 14:05






            • 6





              @slebetman: Yes, WordStar - added a bit more. There was no integrated editor and assembler at the time, and M80 was the best assembler available. Non-document mode WordStar worked fine with it, and WordStar's "column mode", which is still unique to it, could be quite helpful for assembler coding.

              – John Dallman
              Jan 30 at 14:51






            • 9





              Re: editing with WordStar ========================= It caused a bug in a colleague's code that I struggled to track down. An instruction was not being executed even though it was clearly there in the code. Digging into the problem I found the machine code didn't have the instruction. Why not? Because WordStar had set bit 8 of an ASCII code somewhere in the line. It looked fine in WordStar, it looked fine in a printed listing and the assembler didn't report an error - it just silently ignored the whole line!

              – Tony Whitley
              Jan 31 at 12:42






            • 4





              @TonyWhitley: Someone had edited the file in document mode, which sets those top bits for some kind of tracking - can't remember just what after so many years.

              – John Dallman
              Jan 31 at 20:15














            79












            79








            79







            It varied. There was no single method. Some people used assemblers on the target machine, others used cross-development tools.



            As an example of a large product for an 8-bit machine, I worked on the BitStik CAD software for Apple II and BBC Micro systems from 1984 to 1986. That used Apple II machines with Z80 CP/M cards for coding (with WordStar) and assembling and linking (with Microsoft's M80 assembler, using its .6502 directive). We had a very simple networked hard disk system for source code, which used ribbon cables to connect machines to the disk cabinet.



            Executables would be written onto bootable Apple DOS disks, and then you'd reboot into Apple DOS and try out your changes. This meant we were using the same machines for coding and testing, but frequently switching between OSes on them.



            We had machines with hardware debugger cards (can't remember the name) for difficult problems. It worked pretty well, all things considered.



            Since WordStar has attracted attention, this was "Classic" WordStar 3.3 IIRC. It has a "Non-Document" mode where it works as a plain text editor, and if you turned off the menus and delays, it was decently fast on a 3.58MHz Z80. Its "Column Mode", which lets you define and operate on a rectangular block of text that doesn't have to contain complete lines, was pretty useful when writing assembler code.



            Source control was entirely manual. There was a big wallchart for developers to claim control of source files. They had the copies they were working on in their personal areas of the hard disk, and would put their changed versions back into the shared area and cross out their initials on the chart. Backups were taken weekly, and before any major changes. More copies than that would have been hard work: the hard disk was only 20MB for five developers.



            There were a lot of floppy disks around, and we had to be disciplined with them. After one new developer was a problem with that, the rule became that any unlabelled disk was subject to summary destruction, by anyone who felt like it.






            share|improve this answer

            edited Feb 13 at 23:13
            answered Jan 30 at 7:34









            John Dallman
            • 18





              Coding with WORDSTAR!!?!?!!!????

              – slebetman
              Jan 30 at 13:14






            • 10





              So old fashioned! Nowadays many posters to S.O seem to use the more modern Notepad++ as their "IDE"

              – Mawg
              Jan 30 at 14:05






            • 6





              @slebetman: Yes, WordStar - added a bit more. There was no integrated editor and assembler at the time, and M80 was the best assembler available. Non-document mode WordStar worked fine with it, and WordStar's "column mode", which is still unique to it, could be quite helpful for assembler coding.

              – John Dallman
              Jan 30 at 14:51






            • 9





              Re: editing with WordStar: it caused a bug in a colleague's code that I struggled to track down. An instruction was not being executed even though it was clearly there in the code. Digging into the problem I found the machine code didn't have the instruction. Why not? Because WordStar had set bit 8 of an ASCII code somewhere in the line. It looked fine in WordStar, it looked fine in a printed listing and the assembler didn't report an error - it just silently ignored the whole line!

              – Tony Whitley
              Jan 31 at 12:42






            • 4





              @TonyWhitley: Someone had edited the file in document mode, which sets those top bits for some kind of tracking - can't remember just what after so many years.

              – John Dallman
              Jan 31 at 20:15
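            The stray top-bit problem described in the comments above is easy to check for mechanically. A minimal sketch in Python; the filename is hypothetical:

                # Scan an assembler source for bytes with the top bit set, as left
                # behind by editing in WordStar's document mode.

                def find_high_bits(path):
                    with open(path, "rb") as f:
                        for lineno, line in enumerate(f, start=1):
                            for col, byte in enumerate(line, start=1):
                                if byte & 0x80:
                                    print(f"{path}:{lineno}:{col}: byte ${byte:02X} "
                                          f"(looks like {chr(byte & 0x7F)!r})")

                find_high_bits("BITSTIK.MAC")   # hypothetical source file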














            30















            This had a lot of drawbacks, like the limited screen size, the slow disk I/O, the limited RAM available for the tools and your own code, etc.




            Those are just the drawbacks of having a slower or less capable computer. As that was the norm, I don't think anyone thought much of it. Even so, a lot of the pain could be alleviated by a simple setup involving a couple of Commodore 64s and a serial connection.
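            As a modern-day analogue of that two-machine idea: a sketch of pushing a C64 .prg file over a serial link to the test machine. The port, baud rate and receiver protocol are assumptions here; pyserial is a third-party package:

                # Send a C64 program over a serial link. PRG files start with a
                # two-byte little-endian load address, followed by the program bytes.
                import serial  # pip install pyserial

                def send_prg(path, port="/dev/ttyUSB0", baud=1200):
                    with open(path, "rb") as f:
                        prg = f.read()
                    load_addr = prg[0] | (prg[1] << 8)
                    payload = prg[2:]
                    with serial.Serial(port, baud) as link:
                        link.write(len(payload).to_bytes(2, "little"))  # size first
                        link.write(prg[:2])                             # then address
                        link.write(payload)                             # then the code
                    print(f"sent {len(payload)} bytes to ${load_addr:04X}")

                send_prg("game.prg")   # assumes a matching receiver on the far end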



            Consider Prince of Persia, that famous game for the Apple II. It was developed on the Apple II. I think the author had two Apple IIs which were connected; one to type his code up, and the other to test the game.



            In these cases, to debug the program, something like an Action Replay cartridge might be used. These essentially send an NMI to the processor, and bank in some kind of monitor ROM, so that you can inspect and/or change variables etc. Then when you're done debugging, the program can (usually) resume.
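            The freeze-and-inspect idea is simple enough to sketch. Below is a toy model in Python of what such a cartridge gives you - a break into a monitor that can peek, poke and resume. The command set is made up and nothing here is C64-accurate:

                # Toy freeze monitor: an NMI-style break drops into a monitor loop
                # that can inspect and patch memory, then the program resumes.

                memory = bytearray(64 * 1024)   # flat 64 KiB address space
                pc = 0x0801                     # pretend program counter

                def monitor():
                    while True:
                        cmd = input("monitor> ").split()
                        if not cmd or cmd[0] == "g":            # g          -> resume
                            return
                        if cmd[0] == "m" and len(cmd) == 2:     # m addr     -> peek
                            addr = int(cmd[1], 16)
                            print(f"${addr:04X}: ${memory[addr]:02X}")
                        elif cmd[0] == "p" and len(cmd) == 3:   # p addr val -> poke
                            memory[int(cmd[1], 16)] = int(cmd[2], 16) & 0xFF
                        else:
                            print("commands: m addr | p addr val | g")

                def nmi():
                    print(f"frozen at PC=${pc:04X}")
                    monitor()                    # the banked-in monitor ROM, in spirit
                    print("resuming")

                nmi()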




            Did the programmers indeed work on the target machine with all the drawbacks




            Sometimes, as I have pointed out. But still, a lot of work was more easily done on paper. Back then, a lot of books were printed that showed how to design sprites on graph paper, translate them into hexadecimal, and type the hexadecimal codes into the computer. More rarely, a digitiser was used to get the graphical data into the machine. Similarly, maps and flowcharts were a good way to think about a solution before even beginning to type any code out, as they still are.
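            The graph-paper-to-hex step is mechanical, which is why the books could teach it by hand. A minimal sketch; the 24x21 grid and 63-byte result match the C64's hardware sprite format, while the drawing itself is made up:

                # Convert a graph-paper sprite ('#' = set, '.' = clear) into bytes,
                # emitted as BASIC-style DATA lines ready to POKE into memory.

                ROWS, COLS = 21, 24          # C64 hardware sprite: 24x21 pixels
                sprite = [
                    "........########........",
                    "......############......",
                ] + ["." * COLS] * 19        # made-up drawing, padded to 21 rows

                data = []
                for row in sprite[:ROWS]:
                    bits = "".join("1" if c == "#" else "0" for c in row.ljust(COLS, "."))
                    for i in range(0, COLS, 8):          # 3 bytes per sprite row
                        data.append(int(bits[i:i + 8], 2))

                for i in range(0, len(data), 8):         # 63 bytes in total
                    print("DATA " + ",".join(str(b) for b in data[i:i + 8]))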




            Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




            Certainly at least some dev shops used contemporaneous but more powerful or comfortable computers. For example, the Tatung Einstein was commonly used to develop software titles for the ZX Spectrum and other Z80 machines. Similarly, any PC clone or CP/M computer could be used in much the same way. That's advantageous because these machines have more RAM than the ZX Spectrum 48K, so the assembler can more easily reside in memory together with (some of) the source code and (some of) the program, all at once.



            Starting from the late 80s, it was feasible to develop on the Amiga for the Commodore 64. Now you have the conveniences of a much faster CPU and storage, even hard disks and version control, and Commodore 64 emulators on the Amiga itself. So that's not really that different from what C64 developers do today, except we're probably more likely to use Linux or Windows or whatever.






            share|improve this answer

            edited Jan 30 at 10:45
            answered Jan 30 at 9:00









            Wilson








            • 4





              Re, "drawbacks of [a lesser] computer.... I don't think anyone thought much of it." For anybody who ever got used to a VT-100 terminal (or better), hacking on a screen that could only display 16 lines of forty fuzzy characters each actually was pretty irritating.

              – Solomon Slow
              Jan 30 at 14:26






            • 5





              @SolomonSlow Seems like you're comparing 1970's office equipment with some kind of TRS-80 tier computer. There's a wide gamut of solutions in-between also. Consider an Apple II with an 80-column card, or a C128 with a decent monitor. They're not bad for displaying text, it's what they're made for.

              – Wilson
              Jan 30 at 16:42














            18















            So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.




            To start with, I still like to use my IIgs (or IIc Plus) when coding for the Apple II. Both are quite fast machines with more than enough memory to do the job. After all, editing source text doesn't get faster with a mouse and many colours. And all the 'helpers' of a modern IDE just add potential errors - like selecting the wrong function, just because it's so neatly presented. Yes, the Google disease of always picking the top entry has reached the programming community.




            Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




            The '80s were an extremely fast-changing environment for small computers, so there is no single answer to this. Or in other words: it depends :)





            • In the very beginning, cross assemblers were the norm. How else to get a program running on a new CPU? Running them on time-sharing systems wasn't about luxury, but about running them at all, thus avoiding assembling by hand. They were not in any way great tools or easy to handle. Don't expect something like an Arduino IDE running on a PC, compiling and downloading the program in a click. We're talking extremely meagre functionality and clumsy handling at high cost. And the result was just another file on that distant mainframe.



              It was common to just dump the result to a TTY as hex (*1) and then key it into a single-board computer to have it run - or into a prommer to store it and then run it (see the sketch after this list). People with a lot of money would have prommers able to read paper tape, punched by the TTY. Controlling all of this by hand while moving rolls of paper around was considered easy handling and fast development.




            • This soon changed when development tools were moved to the machines themselves. For example, one of the sales arguments for the JOLT was its "Resident Assembler Program (RAP)", advertised with "costly time sharing services are not needed for cross assemblies" and "The Resident Assembler Program is compatible with the MOS Technology Cross Assembler" in their brochure.



              While such a resident assembler was just a small step, more capable assemblers running on the target system (or closely related ones) became available. A notable example was Intel's ISIS systems, sold as professional development tools. Here, an 8080 system was used to develop programs for the whole Intel line from 8023 to 8085.



              Now, while these were dedicated developer systems, they were neither comfortable nor any faster than their targets - often slower. Heck, even early 8086 development was done on these; the 8086 boards were just used as targets.




            • Around 1980, and for small developers, it was already a great gain to load an editor from tape, write/change the source code (loaded from tape as well), save it, load a compiler (all from tape), load the source again, compile, save the result, and run it. Sure, all of this could be sped up considerably by buying a disk drive - but that more than doubled the system cost. An original 8KiB PET cost 800 USD, while a 2040 drive called for over 1,000 USD. And those are 1978 prices.



              So for a more professional setup (maybe with a printer ... whooohoo), 4,000-6,000 USD was a reasonable number - in today's money that's 25-30,000 USD. Not exactly lunch money.
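            To make the "dump the result as hex" step above concrete: a minimal sketch that emits a binary as Intel HEX records, one common loader format of the period. The sample program bytes are made up:

                # Emit a binary as Intel HEX records (":LLAAAATT<data>CC"); a loader
                # on the far end reads these and stores the bytes at the address.

                def intel_hex(data, origin, reclen=16):
                    lines = []
                    for off in range(0, len(data), reclen):
                        chunk = data[off:off + reclen]
                        addr = origin + off
                        rec = bytes([len(chunk), addr >> 8, addr & 0xFF, 0x00]) + bytes(chunk)
                        checksum = (-sum(rec)) & 0xFF   # all record bytes sum to zero
                        lines.append(":" + rec.hex().upper() + f"{checksum:02X}")
                    lines.append(":00000001FF")         # end-of-file record
                    return "\n".join(lines)

                program = bytes([0xA9, 0x00, 0x8D, 0x20, 0xD0])   # made-up code bytes
                print(intel_hex(program, origin=0x0800))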




            At that point (1977-1982) it's important to remember that (game) development wasn't driven by big corporations with large development teams, but more often than not by a single person or very small teams of 2-4 developers. It's the time when soon-to-be-important companies started at the literal kitchen table, with the machine itself and a hunger to build.





            • As before, the question wasn't about how inconvenient using the target machine was, but how great it would be to have a floppy at all.



              Considering this, a built-in BASIC - editor, interpreter and debugger in one, so that only the source/program had to be loaded and saved - had many advantages on early machines. A built-in IDE, after all.




            No one thought at this time about having a faster machine. There was none. Using a different machine would only complicate the development process.





            • Soon, third-party developers released more integrated compiler environments to run on the target machine - ones where, for example, the source could be held in memory when switching from editor to compiler.



              And for owners of lots of RAM - like 32 KiB or even more - there were integrated programs that held both the tools and the source in memory. Who needs disks anyway? :))




            • When floppies became more affordable, development tools were built around them, with automated loading of components. Most notable may be the UCSD p-System (Pascal, Fortran, etc.), offering an integrated menu system to switch between components (editor, compiler, linker, debugger) with only a few key presses.



              Similar integration made Turbo Pascal in 1983/84 an unparalleled success.



            • With the introduction of hard disks to the general public (read: becoming affordable at only double the machine's price) around the same time (1984...), development really did speed up - not so much due to hard disk speed as to no longer needing to swap floppies (*2).


            • The PC itself wasn't a really fast machine; its advantage was a large RAM configuration and, as mentioned, easy hard disk integration. Once again it wasn't about speed, but about the ability to do certain tasks at all - like fast program switching, or running large compilers without loading overlays.


            • All throughout the '80s, raw CPU speed wasn't the issue at all. The speed difference between the slowest and fastest machines was negligible in the first half of the decade and maybe 1:2 throughout the second half.



            For a more general view, it's useful to keep in mind that most development until the late '80s wasn't so much about making life comfy, as it is today, but about being able to develop at all. Much as the most important feature of the (small) machines back then was being a computer capable of running arbitrary programs at all. The choice wasn't between compiling a program with one click within seconds and getting a beautiful output presented, but rather being able to compile at all. Swapping a disk 5 times per compile and waiting a minute or two until a 1,000-line program is compiled and linked is fun compared to no tools and no program.



            Looking closer, it's helpful to follow a rough timeline and keep the different use cases in mind:




            • Very early on, ~1973-1978, cross-development was important and often the only way, as the CPUs being developed for weren't available in usable machines. Program sizes were measured in pages of 256 bytes rather than in kilobytes.


            • When the first generally available machines (Altair, Commodore, Apple, Tandy) were in use (1977-1980), they were, for most users, far more convenient than any (expensive) cross-development, not least because programs could be developed interactively and tested right away. Still, code size was measured in single-digit KiB, and the development environment and the runtime were the same. Some developers used additional tools in the form of plug-in ROMs (TIM for Commodore, or Programmer's Aid #1 for the Apple II).


            • With the advent of dedicated home computers, roughly the time from 1980 (*3) to 1984 (the video game crash, *4), they themselves became the best development platform for their software. Much of the code was platform-dependent and needed to be tested in place anyway. Also, the speed difference between these and 'professional' systems was more often than not reversed. And while professional development systems generally had better storage, their RAM offerings were just as small. As a result, it was more appropriate to soup up the target system - like adding floppies. Hard disks weren't an issue.


            • While the PC itself didn't change much here, despite its larger memory, it was the AT of 1984 and, more importantly, the 16-bit wave of 1985 that changed the playing field with faster machines. Development was done on the target machines themselves - after all, they were at the top end of their time again.


            • While faster platforms were now available that could ease development for older generations, it still wasn't until the late 1980s that the software needed to emulate older machines well enough for satisfying cross-development existed.


            • For the top-end machines, development was and is done on the machines themselves.



            It wasn't until the mid-to-late '90s that off-the-shelf hardware could generally be used for cross-development ... and then again, it became less and less important as the hardware converged toward today's PC.





            *1 - Or a loader format, which makes no big difference either.



            *2 - In the early '80s, before hard disks were affordable, developers often had 3 or 4 drives at their machines to reduce floppy swapping.



            *3 - 1980 (strictly speaking, the Christmas sales of 1979) is marked by the introduction of the first (dedicated, mainstream) home computers, the TI 99/4 and Atari 400/800.



            *4 - The often-cited video game crash of 1983-1984 (with E.T. being the tip of the iceberg) was largely a US phenomenon, as Europe and Japan were only indirectly affected.






            share|improve this answer


























            • You really think I'd use an "IDE" (or even a mouse) when doing cross-dev nowadays? Well, that's optional, and not what I was thinking about. All in all, a lot of your text is correct, but misses the point of my question :( I'm not talking about programming a completely new machine, and I don't refer to "punch-card times", my interest is about the daily business in a software shop in the late 80s. The other answers suggest both approaches existed back then, doing all on the target machine and using some cross-dev setups. I guess expensive hardware could pay off in quicker time to market...

              – Felix Palmen
              Jan 30 at 18:14






            • 6





              @FelixPalmen Well, I guess you're missing the point I tried to make - throughout most of the 80s, there was no faster/better machine to develop for a target system than the target system itself. Only the addition of memory/storage was (some) relief. Next, most machines during the 80s were new, and software had to be written on these new machines starting at a quite primitive level. The Atari ST makes a great example. While cross-development was possible on a PC, even using GEM, it was a pain in the ass. So early developers had to use dual floppies until Atari finally delivered a hard disk.

              – Raffzahn
              Jan 30 at 19:08






            • 2





              Further, I never mentioned punch cards - did I? On the other hand, punched tape was used at a professional level way into the 80s, especially to develop for micros. And one last point, @FelixPalmen, I can't see anywhere in my answer an assumption about what you use - nor would I make one. So please abstain from personal implications.

              – Raffzahn
              Jan 30 at 19:10








            • 4





              The original IBM PC version of Tetris (the first one that was usable internationally) was written with Turbo Pascal 2.0 or 3.0, which included an editor that could work with files up to 64K, could generate applications up to 64K of code, and on a machine with 192K or more could allow one to edit and run a program without any disk access whatsoever (though saving before running was often a good idea if one had made any significant changes to a program).

              – supercat
              Jan 30 at 19:38











            • "thruout most o the 80s, there was no faster/better machine to develop for a target system than the target system itself" -- this is a rather bold claim, given workstations (like e.g. the sun-1, introduced 1982) were indeed available. Some answers mentioned other machines I never hear of before, or even later the possibility to use amigas. My question wasn't whether hardware that could be used for cross-dev existed but whether (and how) this was actually done.

              – Felix Palmen
              Jan 31 at 7:28


















            18















            So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.




            To start with, I still like to use my IIgs (or IIc-plus) when coding for the Apple II. Both are quite fast machines with more than enough memory to do the job. After all, editing source text doesn't get faster with a mouse and many colours. And all the 'helpers' of modern IDE are just adding potential errors - like selecting the wrong function, just because it's so neatly presented. Yes, the Google disease of always picking the top entry has reached the programming community.




            Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




            The '80s on small computers were an extremely fast changing environment, so there is not a single answer to this, or in other words: It depends :)





            • In the very beginning cross assemblers where the norm. How else to get a program running on a new CPU? Running them on time-sharing systems wasn't about getting more luxury, but running it at all, thus avoiding assembling from hand. They were not in any way great tools or easy to handle. Don't expect something like an Arduino IDE running on a PC compiling and downloading the program in a click. We talk extremely meagre functionality, clumsy handling at high cost. And the result was just another file on that distant mainframe.



              It was common to just dump the result to a TTY as hex (*1) and then key it into a single boarder to have it run - or into a prommer to store it and then run. People with a lot of money would have prommers able to read paper tape, punched by the TTY. Controlling all of this by hand while moving rolls of paper around was considered easy handling and fast development.




            • This soon changed when development tools were moved to the machine itself. For example, one of the sales arguments for the JOLT was its "Resident Assembler Program (RAP)" advertised with "costly time sharing services are not needed for cross assemblies" and "The Resident Assembler Program is compatible with the MaS Technology Cross Assembler" in their brochure.



              While such a resident assembler was just a small step, more capable assemblers running on the target system (or closely related ones) became available. A notable example was Intel's ISIS systems as professional development tools. Here, an 8080 system was used to develop programs for the whole Intel line from 8023 to 8085.



              Now, while these are dedicated developer systems, they were neither comfortable nor any faster than their targets - often slower. Heck, even early 8086 development was done on these. The 8086 boards were just used as target.




            • Around 1980 and for small developers, it was already a great gain to load an editor from tape, write/change the source code (loaded from tape as well), save it, load a compiler (all from tape), load the source again, compile, save the result, run it. Sure, all of this could be quite sped up by buying a disk drive - but more than doubling the system cost. An original 8KiB PET cost 800 USD, while a 2040 drive called for over 1,000 USD. And that's 1978 prices.



              So for a more professional setup (maybe with a printer ... whooohoo) 4-6,000 USD was a reasonable number - in today's money that's 25-30,000 USD. Not exactly lunch money.




            At that point (1977-1982) it's important to remember that (Game) Development wasn't driven by big corporations with large development teams, but more often than not by single person or very small teams of 2-4 developers. It's the time soon to be important companies started at the literal kitchen table with the machine itself and a hunger to build.





            • As before the question wasn't about how inconvenient using the target machine is, but how great it would be to have a floppy at all.



              Considering this, a built-in BASIC with integrated editor and interpreter and debugger in one, so only source/program has to be loaded and saved, had many advantages on early machines. A built-in IDE after all.




            No one thought at this time about having a faster machine. There was none. Using a different machine would only complicate the development process.





            • Soon third party developers released more integrated compiler environments to run on the target machine. One where, for example, the source could be held in memory when switching from editor to compiler.



              And for owners of lots of RAM - like 32 KiB or even more - there were integrated programs that did hold both in memory and the source. Who needs disks anyway? :))




            • When floppies became more affordable, development tools were based around them with automated load of components. Most notably may be the UCSD p-code system (Pascal, Fortran, etc.) offering an integrated menu system to switch between components (Editor, Compiler, Linker, Debugger) with only a few key presses.



              Similar integration made Turbo Pascal in 1983/84 an unparalleled success.



            • With introduction of hard disks to the general public (read: becoming affordable by only doubling the machine's price) around the same time (1984...) development did really speed up. Not so much due to the hard disk speed, but missing the need to swap floppies (*2).


            • The PC itself wasn't a real fast machine; its advantage was a large RAM configuration and, as mentioned, easy hard disk integration. Once again it wasn't about speed, but the ability to do certain tasks at all - like fast program switching or large compilers without loading of overlays.


            • All throughout the '80s, speed wasn't a thing of the CPU at all. Speed difference between slowest and fastest machines was not given in the first half and maybe 1:2 throughout the second half of the decade.



            For a more general view it's useful to keep in mind that most development until the late '80s wasn't so much about making life comfy as today, but being able to develop at all. Much as the most important feature of the (small) machines back then was to be a computer capable of running arbitrary programs at all. The choice wasn't to compile a program with one click within seconds and get a beautiful output presented, but rather being able to compile at all. Switching a disk 5 times per compile and waiting a minute or two until a 1,000 line program is compiled and linked is fun compared to no tools and no program.



            Looking close, it's helpful to follow a rough timeline and different different use cases in mind:




            • Very early on, ~1973-1978, Crossdevelopment was important and often the only way, as the CPUs to be developed for weren't available in usable machines. Programm sizes where measured rather in pages of 256 Bytes than Kilobytes.


            • When the first general available machines (Altair, Commodore, Apple, Tandy) where in use(1977-1980), they were, for most users, far more convenient than any (expensive) cross development, not at least due the fact that programs could be developed interactive and tested right away. Still, codesize was measured in single digit KiB, and development environment and runtime were the same. Eventually a developer used additional tools in form of plug in ROMs ( TIM for Commodore or Programmers Aid#1 for the Apple II).


            • With the advent of dedicated home computers, roughly the time from 1980 to 1984 (The Video Game Crash, *3), they became them self the best development platform for their software. Much of the code was platform dependant, and needed to be tested in place anyway. Also, the difference between these and 'professional' systems in speed where more often than not reversed. And while professional development systems had in general better storage, their RAM offerings where as small. As a result, it was more appropriate to soup up the target system - like adding floppies. Hard disks weren't an issue.


            • While the PC itself didn't change much here, despite the larger memory, it was the AT of 1984 and more important the 16 bit wave of 1985 that changed the play field with faster machines. Development was done on the target machines itself - after all, they were at the top end of the time again.


            • While now faster platforms available that could ease development for older generations, it still wasn't until the late 1980s that the needed software to emulate older machines good enough to allow satisfying cross development.


            • For the top end machines, development was and is done on the machines itself.



            It wasn't until the mid to late 90s, when of the shelf hardware could generaly be used as cross development ... then again, it became less and less important as the hardware converged toward today's PC.





            *1 - Or a loader format, which is no big difference either.



            *2 - In the early '80s, before hard disks were affordable, developers often had 3 or 4 drives at their machines to reduce floppy swapping.



            *3 - 1980 (To be strict christmas sales of 1979) is marked by introduction the first (dedicated, mainstream) home computers with the TI 99/4 and Atari 400/800.



            *4 - The often cited video game crash of 1983-1984 (With ET being like the tip of the iceberg) was in most parts a US phenomena as Europe and Japan where only indirect influenced.






            share|improve this answer


























            • You really think I'd use an "IDE" (or even a mouse) when doing cross-dev nowadays? Well, that's optional, and not what I was thinking about. All in all, a lot of your text is correct, but misses the point of my question :( I'm not talking about programming a completely new machine, and I don't refer to "punch-card times", my interest is about the daily business in a software shop in the late 80s. The other answers suggest both approaches existed back then, doing all on the target machine and using some cross-dev setups. I guess expensive hardware could pay off in quicker time to market...

              – Felix Palmen
              Jan 30 at 18:14






            • 6





              @FelixPalmen Well, I guess you're missing the point I tried to make - thruout most o the 80s, there was no faster/better machine to develop for a target system than the target system itself. Only addition of memory/storage was (some) relief. Next, most machines during the 80s were new, and software had to be written on these new machines starting on a quite primitive level. The Atari ST makes be a great example. While cross-development was possible on a PC even using GEM, it was a pin in the ass. So early developer had to use dual floppies until Atari finally delivered a harddisk.

              – Raffzahn
              Jan 30 at 19:08






            • 2





              Further, I never mentioned punch cards - did I? On the other hand, punchtape was used at professional level way into the 80s. Especially to develop for micros. And one last point, @FelixPalmen, I can't see anywhere in my answer an assumption what you use - nor would I do so. So please abstain from personal implications.

              – Raffzahn
              Jan 30 at 19:10








            • 4





              The original IBM PC version of Tetris (the first one that was usable internationally) was written with Turbo Pascal 2.0 or 3.0, which included an editor that could work with files up to 64K, could generate applications up to 64K of code, and on a machine with 192K or more could allow one to edit and run a program without any disk access whatsoever (though saving before running was often a good idea if one had made any significant changes to a program).

              – supercat
              Jan 30 at 19:38











            • "thruout most o the 80s, there was no faster/better machine to develop for a target system than the target system itself" -- this is a rather bold claim, given workstations (like e.g. the sun-1, introduced 1982) were indeed available. Some answers mentioned other machines I never hear of before, or even later the possibility to use amigas. My question wasn't whether hardware that could be used for cross-dev existed but whether (and how) this was actually done.

              – Felix Palmen
              Jan 31 at 7:28
















            18












            18








            18








            So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.




            To start with, I still like to use my IIgs (or IIc-plus) when coding for the Apple II. Both are quite fast machines with more than enough memory to do the job. After all, editing source text doesn't get faster with a mouse and many colours. And all the 'helpers' of modern IDE are just adding potential errors - like selecting the wrong function, just because it's so neatly presented. Yes, the Google disease of always picking the top entry has reached the programming community.




            Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




            The '80s on small computers were an extremely fast changing environment, so there is not a single answer to this, or in other words: It depends :)





            • In the very beginning cross assemblers where the norm. How else to get a program running on a new CPU? Running them on time-sharing systems wasn't about getting more luxury, but running it at all, thus avoiding assembling from hand. They were not in any way great tools or easy to handle. Don't expect something like an Arduino IDE running on a PC compiling and downloading the program in a click. We talk extremely meagre functionality, clumsy handling at high cost. And the result was just another file on that distant mainframe.



              It was common to just dump the result to a TTY as hex (*1) and then key it into a single boarder to have it run - or into a prommer to store it and then run. People with a lot of money would have prommers able to read paper tape, punched by the TTY. Controlling all of this by hand while moving rolls of paper around was considered easy handling and fast development.




            • This soon changed when development tools were moved to the machine itself. For example, one of the sales arguments for the JOLT was its "Resident Assembler Program (RAP)" advertised with "costly time sharing services are not needed for cross assemblies" and "The Resident Assembler Program is compatible with the MaS Technology Cross Assembler" in their brochure.



              While such a resident assembler was just a small step, more capable assemblers running on the target system (or closely related ones) became available. A notable example was Intel's ISIS systems as professional development tools. Here, an 8080 system was used to develop programs for the whole Intel line from 8023 to 8085.



              Now, while these are dedicated developer systems, they were neither comfortable nor any faster than their targets - often slower. Heck, even early 8086 development was done on these. The 8086 boards were just used as target.




            • Around 1980 and for small developers, it was already a great gain to load an editor from tape, write/change the source code (loaded from tape as well), save it, load a compiler (all from tape), load the source again, compile, save the result, run it. Sure, all of this could be quite sped up by buying a disk drive - but more than doubling the system cost. An original 8KiB PET cost 800 USD, while a 2040 drive called for over 1,000 USD. And that's 1978 prices.



              So for a more professional setup (maybe with a printer ... whooohoo) 4-6,000 USD was a reasonable number - in today's money that's 25-30,000 USD. Not exactly lunch money.




            At that point (1977-1982) it's important to remember that (Game) Development wasn't driven by big corporations with large development teams, but more often than not by single person or very small teams of 2-4 developers. It's the time soon to be important companies started at the literal kitchen table with the machine itself and a hunger to build.





            So, nowadays, you'd have to be crazy not to use a PC and some nice cross-development tools when targeting these old machines.




To start with, I still like to use my IIgs (or IIc Plus) when coding for the Apple II. Both are quite fast machines with more than enough memory to do the job. After all, editing source text doesn't get faster with a mouse and many colours. And all the 'helpers' of modern IDEs just add potential errors - like selecting the wrong function, just because it's so neatly presented. Yes, the Google disease of always picking the top entry has reached the programming community.




            Did the programmers indeed work on the target machine with all the drawbacks, or did at least some of the dev shops have more powerful workstations with e.g. fast hard disks and cross-assemblers?




The '80s on small computers were an extremely fast-changing environment, so there is no single answer to this. In other words: it depends :)





• In the very beginning, cross assemblers were the norm. How else to get a program running on a new CPU? Running them on time-sharing systems wasn't about getting more luxury, but about running them at all, thus avoiding assembling by hand. They were not in any way great tools or easy to handle. Don't expect something like an Arduino IDE running on a PC, compiling and downloading the program in a click. We're talking extremely meagre functionality and clumsy handling at high cost. And the result was just another file on that distant mainframe.



It was common to just dump the result to a TTY as hex (*1) and then key it into a single-board computer to have it run - or into a PROM programmer ('prommer') to store it and then run it (a small sketch of such a listing follows after these bullets). People with a lot of money had prommers able to read paper tape, punched by the TTY. Controlling all of this by hand while moving rolls of paper around was considered easy handling and fast development.




• This soon changed when development tools were moved to the machines themselves. For example, one of the selling points of the JOLT was its "Resident Assembler Program (RAP)", advertised with "costly time sharing services are not needed for cross assemblies" and "The Resident Assembler Program is compatible with the MOS Technology Cross Assembler" in its brochure.



While such a resident assembler was just a small step, more capable assemblers running on the target system (or closely related ones) became available. A notable example was Intel's ISIS-based development systems as professional tools. Here, an 8080 system was used to develop programs for Intel's whole line up to the 8085.



Now, while these were dedicated developer systems, they were neither comfortable nor any faster than their targets - often slower. Heck, even early 8086 development was done on these; the 8086 boards were just used as targets.




• Around 1980, and for small developers, it was already a great gain to load an editor from tape, write/change the source code (loaded from tape as well), save it, load a compiler (all from tape), load the source again, compile, save the result, and run it. Sure, all of this could be sped up considerably by buying a disk drive - but that more than doubled the system cost. An original 8 KiB PET cost 800 USD, while a 2040 drive called for over 1,000 USD. And those are 1978 prices.



So for a more professional setup (maybe with a printer ... whooohoo), 4-6,000 USD was a reasonable number - in today's money that's 25-30,000 USD. Not exactly lunch money.
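To make the 'dump it as hex and key it in' step concrete, here is a minimal sketch (Python, purely illustrative - no such tool is mentioned in the answer) of the kind of address-plus-bytes listing a TTY would print and a developer would then key into a monitor by hand. The 8-bytes-per-line layout and the load address are assumptions; real monitor formats varied by machine.

    # Minimal sketch: format an assembled binary as the kind of hex listing
    # that was dumped to a TTY and then keyed into a monitor by hand.
    # The 8-bytes-per-line layout and the 0x0200 load address are
    # illustrative assumptions; real formats varied from machine to machine.

    def hex_listing(code: bytes, load_addr: int = 0x0200, width: int = 8) -> str:
        lines = []
        for offset in range(0, len(code), width):
            chunk = code[offset:offset + width]
            addr = load_addr + offset
            lines.append(f"{addr:04X}: " + " ".join(f"{b:02X}" for b in chunk))
        return "\n".join(lines)

    if __name__ == "__main__":
        # A few arbitrary example bytes standing in for assembler output.
        print(hex_listing(bytes([0xA9, 0x41, 0x8D, 0x00, 0x02, 0x4C, 0x00, 0x02])))

Reading such a listing back over the phone or off a printout, digit by digit, was the era's 'download'.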




At that point (1977-1982) it's important to remember that (game) development wasn't driven by big corporations with large development teams, but more often than not by a single person or very small teams of 2-4 developers. It was the time when soon-to-be-important companies started at the literal kitchen table, with the machine itself and a hunger to build.





• As before, the question wasn't about how inconvenient using the target machine was, but how great it was to have a floppy at all.



Considering this, a built-in BASIC with editor, interpreter and debugger integrated in one - so only the source/program had to be loaded and saved - had many advantages on early machines. A built-in IDE, after all.




No one thought at this time about having a faster machine. There was none. Using a different machine would only have complicated the development process.





• Soon, third-party developers released more integrated compiler environments to run on the target machine - ones where, for example, the source could be held in memory when switching from editor to compiler.



And for owners of lots of RAM - like 32 KiB or even more - there were integrated programs that held both the tools and the source in memory. Who needs disks anyway? :))




• When floppies became more affordable, development tools were built around them with automated loading of components. Most notable may be the UCSD p-System (Pascal, Fortran, etc.), offering an integrated menu system to switch between components (editor, compiler, linker, debugger) with only a few key presses.



              Similar integration made Turbo Pascal in 1983/84 an unparalleled success.



• With the introduction of hard disks to the general public (read: becoming affordable by only doubling the machine's price) around the same time (1984...), development really did speed up. Not so much due to hard disk speed as to no longer needing to swap floppies (*2).


• The PC itself wasn't a really fast machine; its advantage was a large RAM configuration and, as mentioned, easy hard disk integration. Once again it wasn't about speed, but about the ability to do certain tasks at all - like fast program switching, or running large compilers without loading overlays.


• All throughout the '80s, CPU speed wasn't the deciding factor at all. The speed difference between the slowest and fastest machines was negligible in the first half of the decade and maybe 1:2 throughout the second half.



For a more general view, it's useful to keep in mind that most development until the late '80s wasn't so much about making life comfortable, as today, but about being able to develop at all. Much as the most important feature of the (small) machines back then was being a computer capable of running arbitrary programs at all. The choice wasn't between compiling a program with one click within seconds and getting beautiful output presented, but rather being able to compile at all. Switching a disk 5 times per compile and waiting a minute or two until a 1,000-line program is compiled and linked is fun compared to no tools and no program.



Looking closer, it's helpful to follow a rough timeline and keep the different use cases in mind:




• Very early on, ~1973-1978, cross development was important and often the only way, as the CPUs being developed for weren't available in usable machines. Program sizes were measured in pages of 256 bytes rather than kilobytes.


• When the first generally available machines (Altair, Commodore, Apple, Tandy) were in use (1977-1980), they were, for most users, far more convenient than any (expensive) cross development, not least because programs could be developed interactively and tested right away. Still, code size was measured in single-digit KiB, and development environment and runtime were the same. Eventually a developer used additional tools in the form of plug-in ROMs (TIM for Commodore or Programmer's Aid #1 for the Apple II).


• With the advent of dedicated home computers, roughly the time from 1980 (*3) to 1984 (the video game crash, *4), they became themselves the best development platform for their software. Much of the code was platform-dependent and needed to be tested in place anyway. Also, the speed difference between these and 'professional' systems was more often than not reversed. And while professional development systems had in general better storage, their RAM offerings were just as small. As a result, it was more appropriate to soup up the target system - like adding floppies. Hard disks weren't an issue.


• While the PC itself didn't change much here, despite the larger memory, it was the AT of 1984 and, more importantly, the 16-bit wave of 1985 that changed the playing field with faster machines. Development was done on the target machines themselves - after all, they were at the top end of their time again.


• While faster platforms were now available that could have eased development for older generations, it wasn't until the late 1980s that software existed to emulate older machines well enough to allow satisfying cross development.


• For the top-end machines, development was and is done on the machines themselves.



It wasn't until the mid-to-late '90s that off-the-shelf hardware could generally be used for cross development ... and then again, it became less and less important as the hardware converged toward today's PC.





            *1 - Or a loader format, which is no big difference either.



*2 - In the early '80s, before hard disks were affordable, developers often had 3 or 4 drives attached to their machines to reduce floppy swapping.



*3 - 1980 (strictly speaking, the Christmas sales of 1979) is marked by the introduction of the first (dedicated, mainstream) home computers, the TI 99/4 and Atari 400/800.



*4 - The often-cited video game crash of 1983-1984 (with E.T. being like the tip of the iceberg) was for the most part a US phenomenon, as Europe and Japan were only indirectly affected.







            share|improve this answer














edited Jan 31 at 18:03

answered Jan 30 at 16:04

Raffzahn
            • You really think I'd use an "IDE" (or even a mouse) when doing cross-dev nowadays? Well, that's optional, and not what I was thinking about. All in all, a lot of your text is correct, but misses the point of my question :( I'm not talking about programming a completely new machine, and I don't refer to "punch-card times", my interest is about the daily business in a software shop in the late 80s. The other answers suggest both approaches existed back then, doing all on the target machine and using some cross-dev setups. I guess expensive hardware could pay off in quicker time to market...

              – Felix Palmen
              Jan 30 at 18:14






            • 6





              @FelixPalmen Well, I guess you're missing the point I tried to make - thruout most o the 80s, there was no faster/better machine to develop for a target system than the target system itself. Only addition of memory/storage was (some) relief. Next, most machines during the 80s were new, and software had to be written on these new machines starting on a quite primitive level. The Atari ST makes be a great example. While cross-development was possible on a PC even using GEM, it was a pin in the ass. So early developer had to use dual floppies until Atari finally delivered a harddisk.

              – Raffzahn
              Jan 30 at 19:08






            • 2





              Further, I never mentioned punch cards - did I? On the other hand, punchtape was used at professional level way into the 80s. Especially to develop for micros. And one last point, @FelixPalmen, I can't see anywhere in my answer an assumption what you use - nor would I do so. So please abstain from personal implications.

              – Raffzahn
              Jan 30 at 19:10








            • 4





              The original IBM PC version of Tetris (the first one that was usable internationally) was written with Turbo Pascal 2.0 or 3.0, which included an editor that could work with files up to 64K, could generate applications up to 64K of code, and on a machine with 192K or more could allow one to edit and run a program without any disk access whatsoever (though saving before running was often a good idea if one had made any significant changes to a program).

              – supercat
              Jan 30 at 19:38











            • "thruout most o the 80s, there was no faster/better machine to develop for a target system than the target system itself" -- this is a rather bold claim, given workstations (like e.g. the sun-1, introduced 1982) were indeed available. Some answers mentioned other machines I never hear of before, or even later the possibility to use amigas. My question wasn't whether hardware that could be used for cross-dev existed but whether (and how) this was actually done.

              – Felix Palmen
              Jan 31 at 7:28





















            12














They used cross-development kits back then too. I worked briefly at a UK game developer in 1990, and all their Commodore 64 and ZX Spectrum games were developed on a PC with a proprietary kit.



See for example Andrew Braybrook's diary covering the development of Morpheus on the C64, where they start to use Opus PCs and an Atari ST to develop on, connected to the C64 with an RS232 cable. This diary appeared in Zzap!64 magazine at the time.






share|improve this answer

answered Jan 30 at 14:10

Alan B



















            • 2





              Andrew also has a diary on developing Paradroid, which they did earlier using just Commodore machines: zzap64.co.uk/zzap3/para_birth01.html

              – tylisirn
              Jan 30 at 22:21



















            10














In It's Behind You: The Making of a Computer Game, Bob Pape describes his process when authoring the ZX Spectrum conversion of R-Type, alongside colleagues working on Atari ST and C64 ports. He writes:




The equipment I was using to write R-Type with [initially] was the same as for Rampage, a standard 48K Spectrum with Interface 1 and microdrives and everybody's favourite hardware copying/backup device Romantic Robot's Multiface 1. I had a copy of OCP Editor/Assembler on the microdrive along with my source code and with a push of the button on the Multiface I could go from an empty Spectrum to one ready to assemble in just a few seconds, however the drawback to this method was that the assembler was resident in the Spectrum's memory taking up valuable RAM that I couldn't use.




So Rampage, which was released for the Spectrum in 1988, is at least one commercial piece of software from the late '80s that was developed directly on the machine it targeted, using comparatively fast external storage and a little off-the-shelf hardware assistance for state inspection.



            During the course of R-Type's development:




            Around this time the three of us took delivery of proper PC based development systems from Catalyst. Each of us received a then state-of-the-art Opus 80286 PC with monochrome monitor running DOS and a professional cross-development package.



            The business end of the PC development was handled by a PDS (Programmer's Development System) board, which was really just a parallel I/O card that would connect to the target machine (at the time a Spectrum, Amstrad or C64) via a simple interface. On the target machine you'd load a small piece of code that would sit there polling the lines and waiting for a signal supplied by a custom assembler running on the PC - as soon as you'd assembled on the PC and used the SEND Command the target machine would transfer the object code and you'd be up and running in about a second. The download code for the target machines came in three versions, 'dumb', 'smart' and 'interrupt driven' with the latter running under interrupts allowing the PC to monitor, control and even change the code on the target machine while the game was running, which wasn't really a lot of use if you were trying to write time critical code but it did have a nice line in real-time Trace functions.




So R-Type, released later in 1988, was that author's turning point to cross-development, using an environment very much like the one you describe as what you'd use today: write and build on a comparatively fast system, then test the result on the real hardware.
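As a rough modern analogue of that SEND workflow, here is a hypothetical host-side sketch in Python: assemble on the fast machine, then push the object code to a stub waiting on the target. The 'zasm' invocation, the serial port name, and the little address/length header are all invented for illustration - the book only tells us the real PDS used a parallel I/O card, a custom assembler, and a resident download stub on the target.

    # Hypothetical sketch of the host side of a PDS-style "SEND": assemble,
    # then push the object code to a stub polling on the target machine.
    # The 'zasm' assembler invocation, the serial port, and the tiny header
    # protocol are assumptions for illustration only.
    import struct
    import subprocess

    import serial  # pyserial

    def send_to_target(source: str, load_addr: int, port: str = "/dev/ttyUSB0") -> None:
        # Cross-assemble on the host (hypothetical tool and flags).
        subprocess.run(["zasm", source, "-o", "game.bin"], check=True)
        with open("game.bin", "rb") as f:
            code = f.read()

        with serial.Serial(port, 9600, timeout=2) as link:
            # Tiny header: load address and byte count, little-endian,
            # matching what the polling stub on the target would expect.
            link.write(struct.pack("<HH", load_addr, len(code)))
            link.write(code)
            # Wait for a one-byte acknowledgement from the stub.
            if link.read(1) != b"\x06":
                raise RuntimeError("target did not acknowledge download")

    if __name__ == "__main__":
        send_to_target("rtype.asm", load_addr=0x8000)

The appeal is the same as in 1988: the edit-assemble-download loop takes a second or two instead of a tape or microdrive round trip.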






share|improve this answer

edited Jan 30 at 18:48

answered Jan 30 at 16:59

Tommy


























            • Side note: everybody on this site should read Bob's book, it's a truly great read.

              – Matt Lacey
              Feb 1 at 4:22











            • +1, it is amazing how he got an R-Type out of the Speccy.

              – Alan B
              Feb 8 at 12:52
















            8














            Most of the "professional" outfits did use cross development, although they often had to build their own tools. For example they might have a Z80 assembler, but would need to make their down serial download app for the target machine to get the compiled code on there. IBM PCs and compatibles were popular for this task.



There were also add-on cards for the 8-bit machines that made development work on them easier. These usually contained some kind of "monitor" application in ROM that let the programmer freeze and inspect the state of the machine, make changes, and so on. Because the application was in ROM, it didn't eat up any precious RAM or interfere too much with the operation of the code being debugged.



Some development was done offline too. For example, graphics were often sketched out on graph paper and then manually converted to numbers and programmed directly into the game.
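As a small illustration of that manual step, here is a sketch (Python; the 8x8 one-bit layout is an assumption, as real sprite and character formats differed per machine) of turning a graph-paper drawing into the row bytes a programmer would have typed into DATA statements or assembler source:

    # Minimal sketch of the "graph paper to numbers" step: turn a sprite
    # sketched as an 8x8 grid of '#' and '.' into one byte per row, the
    # values a programmer would have typed in by hand.
    SPRITE = [
        "..####..",
        ".######.",
        "##.##.##",
        "########",
        "##....##",
        ".##..##.",
        "..####..",
        "........",
    ]

    def rows_to_bytes(grid: list[str]) -> list[int]:
        # Leftmost cell becomes the most significant bit of each row byte.
        return [
            sum(1 << (7 - col) for col, cell in enumerate(row) if cell == "#")
            for row in grid
        ]

    if __name__ == "__main__":
        for value in rows_to_bytes(SPRITE):
            print(f"{value:3d}  ({value:08b})")

Back then this arithmetic was done by hand, row by row, which is exactly why the graph paper mattered.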






share|improve this answer

answered Jan 30 at 9:30

user





























                    8














Like others said, it varied. Lots of small developers (like Llamasoft) programmed directly on the hardware; bigger developers used other computers as debuggers or cross-development systems. For example, there were a lot of cross assemblers available for the Atari ST. Atari used special hardware and VAX computers to aid in development of 2600/5200 games. The games would still run on (modified) console hardware, with the other machine connected to the console acting as a debugger.



So generally you used the bigger system to aid development on the smaller system, or developed directly on the hardware. But here's one example where a C64 was used to prepare developers for the ST:



                    http://www.stcarchiv.de/stm1988/07/die-hexer-teil-1



Demo coders TEX, who later joined Thalion (Dragonflight, Amberstar), started programming in 68000 assembler before receiving their first Atari STs. According to the devs, they used a 68000 assembler/simulator on the C64 to gain some experience.



                    Jez San (Starglider) used a 68000 emulator to simplify debugging (http://www.stcarchiv.de/stm1988/05/interview-jez-san).






share|improve this answer

answered Jan 30 at 22:25

user2369305




























                            5














Together with a friend, around 1984-1986, I developed a PCB CAD system, "CADsoft" (not to be confused with the later company of that name, producing the Eagle CAD system), targeting the Z80 CP/M platform (with a specially-designed graphics card, see here). It was sold around 20 times and used by some hardware companies to design the PCBs for their product portfolio.



We did it in Pascal/MT+ plus a few assembly routines for hardware interfacing. The resulting executable was too large to fit into RAM, but Pascal/MT+ supported overlays that could be swapped in and out on demand.
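For readers unfamiliar with overlays: several code segments share a single memory region, and calling into a segment that isn't resident first reloads it from disk. A toy Python sketch of the idea (the segment names are invented; Pascal/MT+ did this transparently at the object-code level, not like this):

    # Illustrative sketch (not Pascal/MT+ itself) of the overlay idea: only
    # one code segment lives in the shared "overlay region" at a time, and
    # calling into another segment first swaps it in from disk.
    loaded_overlay = None  # name of the segment currently in the overlay region

    def load_overlay(name: str) -> None:
        """Swap the requested segment into the single overlay region."""
        global loaded_overlay
        if loaded_overlay != name:
            print(f"(loading overlay {name!r} from disk, evicting {loaded_overlay!r})")
            loaded_overlay = name

    def run_editor() -> None:
        load_overlay("editor")
        print("editing PCB layout ...")

    def run_plotter() -> None:
        load_overlay("plotter")
        print("plotting PCB layout ...")

    if __name__ == "__main__":
        run_editor()   # loads the editor overlay
        run_plotter()  # evicts it and loads the plotter overlay

The price, of course, was a disk access on every switch between overlaid parts of the program.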



We coded on our CP/M machines, identical to the target ones:




                            • 64 kB RAM

                            • 4 MHz Z80A

                            • 80*24 terminal with serial connection (9600 baud)

                            • two 800 kB floppy drives (later extended by a 20 MB SCSI hard disk)


                            I don't remember the editor we used, quite probably WordStar.



                            Compiling and linking the application took a few minutes, but was possible without floppy swaps, IIRC.



We didn't have version control, only lots of backup floppy disks. We did most of the coding in coordinated sessions, so the risk of accidentally modifying the same source file was reduced by oral communication.






share|improve this answer

answered Jan 30 at 21:26

Ralf Kleberhoff




























                              5














                              Together with a friend, around 1984-1986, I developed a PCB CAD system "CADsoft" (not to be confused with the later company of that name, producing the Eagle CAD system) targetting the Z80 CP/M platform (with a specially-designed graphics card, see here) that was sold around 20 times, and used by some hardware companies to design the PCBs for their product portfolio.



                              We did it in Pascal/MT+ and a few assembly routines for hardware interfacing. The resulting executable was too large to fit into the RAM, but Pascal/MT+ supported overlays to be swapped in and out on-demand.



                              We coded on our CP/M machines, equal to the target ones:




                              • 64 kB RAM

                              • 4 MHz Z80A

                              • 80*24 terminal with serial connection (9600 baud)

                              • two 800 kB floppy drives (later extended by a 20 MB SCSI hard disk)


                              I don't remember the editor we used, quite probably WordStar.



                              Compiling and linking the application took a few minutes, but was possible without floppy swaps, IIRC.



                              We didn't have version control, only lots of backup floppy-disks. We did most of the coding in coordinated sessions, so the risk of accidentally modifying the same source file was reduced by oral communication.






                                answered Jan 30 at 21:26









Ralf Kleberhoff

                                    5














A friend of mine did the C64 conversion of Tetris for Mirrorsoft. AFAIR the game was supplied, pre-written, in Commodore BASIC, which was then compiled; he used a disk-based assembler for the graphics and sound (he wrote his own sound player and a composer app, which the composer used to supply the music) and assembled everything, all on a single C64 in his bedroom. I'm not sure how long each 'compilation' took, but I remember watching him play the clarinet while waiting for the disk to stop whirring, which signalled it was ready to play-test.






                                    • Having checked it out on YouTube, I can easily believe it, but am I right to take from this that Tetris was written in BASIC? Run through a BASIC compiler for release, but nevertheless written in BASIC?

                                      – Tommy
                                      Feb 1 at 22:42











• You are correct. The main logic and gameplay were compiled BASIC.

                                      – Neil
                                      Feb 2 at 11:05
















                                    answered Jan 31 at 13:21









Neil

                                    5














                                    I'll answer for me, as someone who did a few games.



On the C64, I used a stock machine with a floppy drive and an assembler; I forget which product. If I recall correctly, the assembler came on a cartridge and let you write assembly language as an extension of BASIC.



On the ZX Spectrum, I had a CP/M machine (a Memotech) with dual floppies and a C compiler/assembler/linker toolchain. I then had some code to hexdump the (fixed-location) linker output and send it down a custom interface into the Speccy (this definitely took a cup of (instant) coffee while it loaded).
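That hexdump-and-send step is easy to imagine in code. Here is a hypothetical re-creation in C, assuming a raw binary linked to a fixed address, an Intel-HEX-like record format, and a serial device on the host side; the file name, load address and device path are all invented:

    /* Hypothetical re-creation of the hexdump-and-send step: read the
     * linker output (a raw binary built for a fixed load address) and
     * emit it as Intel-HEX-style records over a serial link, where a
     * small loader on the Spectrum side pokes the bytes into place.
     * File name, load address and device path are invented.
     */
    #include <stdio.h>

    #define LOAD_ADDR 0x8000u   /* assumed fixed link address */
    #define REC_LEN   16        /* data bytes per record */

    int main(void)
    {
        FILE *in  = fopen("game.bin", "rb");   /* linker output (invented name) */
        FILE *out = fopen("/dev/ttyS0", "w");  /* serial link to the target */
        unsigned char buf[REC_LEN];
        unsigned addr = LOAD_ADDR;
        size_t n;

        if (in == NULL || out == NULL) { perror("open"); return 1; }

        while ((n = fread(buf, 1, REC_LEN, in)) > 0) {
            /* record: ':', length, 16-bit address, type 00, data, checksum */
            unsigned sum = (unsigned)n + (addr >> 8) + (addr & 0xFFu);
            fprintf(out, ":%02zX%04X00", n, addr);
            for (size_t i = 0; i < n; i++) {
                fprintf(out, "%02X", buf[i]);
                sum += buf[i];
            }
            fprintf(out, "%02X\n", (0u - sum) & 0xFFu);  /* two's-complement checksum */
            addr += (unsigned)n;
        }
        fprintf(out, ":00000001FF\n");                   /* end-of-file record */
        fclose(in);
        fclose(out);
        return 0;
    }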



[Incidentally, when I grew up and became a firmware engineer with real budgets, my stack of choice involved an in-circuit emulator, an EPROM emulator, a logic analyser and a PDP-11 to run compilation, mostly controlled by a bank of VT100-type terminals. You could have used such a setup to develop 8-bit home computer apps if you had the funds.]






                                        answered Feb 1 at 1:41









Rich

                                            4














It depends on how late into the late 80s we are talking about, and on your definition of "commercial titles".

By 1988-1989 I had my first job developing/maintaining a C payroll package, using a Commodore 80286 at work (a PC40?), running at 12 MHz with 1 MB of RAM and a whopping 20 MB hard disk, which was upgraded to 30-40 MB a couple of months later.

For development I was using the Microsoft C compiler, the MASM assembler and MS-DOS 3.x. The editor was SideKick, editing text files with WordStar-compatible keys, and I wrote a couple of routines in 8086 ASM. For compiling I was still invoking Microsoft make with a carefully crafted Makefile.

The monitor was green monochrome (Hercules), and backups of our source code were done on 5¼" floppy disks.

The software was delivered to the customers on either a 5¼" or a 3½" floppy disk, and later on as a compressed zip file with an installer I wrote in DOS batch commands.

The customers usually had less powerful machines; there were still a lot of XT machines around, and even Tandy 1000 DOS computers. (By that time I also had an Olivetti PC1 XT at home.)

By that time I had started writing a DOS ZX Spectrum emulator/debugger, never released, whose first versions ran on Hercules at work (and on CGA, later VGA, at home) without keyboard input. It was capable of step-by-step debugging and used TASM to cross-assemble Z80 ASM files. It became the basis for my WSpecem ZX emulator for Windows, released in 1996.
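A step-by-step debugger like that is built around a CPU core that executes exactly one instruction per call, then hands control back so the registers can be displayed. The C sketch below shows the idea with a tiny invented three-opcode subset; it is not code from WSpecem, just an illustration of the single-step loop:

    /* Minimal single-step CPU core, the heart of a step-by-step debugger.
     * Only three real Z80 opcodes are decoded here; a full emulator would
     * decode them all and count cycles. Purely illustrative.
     */
    #include <stdio.h>
    #include <stdint.h>

    typedef struct {
        uint16_t pc;            /* program counter */
        uint8_t  a;             /* accumulator */
        uint8_t  mem[65536];    /* 64 kB address space */
    } Z80;

    /* Execute exactly one instruction, so the debugger can stop after
       each step, show the registers, and wait for user input. */
    static void step(Z80 *cpu)
    {
        uint8_t op = cpu->mem[cpu->pc++];
        switch (op) {
        case 0x00: /* NOP    */                                 break;
        case 0x3C: /* INC A  */ cpu->a++;                       break;
        case 0x3E: /* LD A,n */ cpu->a = cpu->mem[cpu->pc++];   break;
        default:   /* unimplemented opcode: a debugger would trap here */ break;
        }
    }

    int main(void)
    {
        static Z80 cpu;                        /* zero-initialized, PC = 0 */
        cpu.mem[0] = 0x3E; cpu.mem[1] = 0x41;  /* LD A,0x41 */
        cpu.mem[2] = 0x3C;                     /* INC A */

        for (int i = 0; i < 3; i++) {          /* single-step three times */
            step(&cpu);
            printf("PC=%04X  A=%02X\n", cpu.pc, cpu.a);
        }
        return 0;
    }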






                                            • Commodore 286? Or was it a Compaq 286?

                                              – manassehkatz
                                              Jan 30 at 22:33











• Commodore added a PC line-up. I worked for the distributor at the time, which also had a software house, and they sold me a Commodore 386SX soon thereafter at a very steep discount. I also had access to the technical BIOS manuals.

                                              – Rui F Ribeiro
                                              Jan 30 at 22:40








                                            • 1





                                              Interesting. I didn't even realize they had a 286 (or 386SX). I do remember Sidekick.

                                              – manassehkatz
                                              Jan 30 at 22:45











• I used to have a Commodore XT technical manual with a commented ASM BIOS listing. It was a present from the team lead of their (local) hardware team.

                                              – Rui F Ribeiro
                                              Jan 30 at 22:47













                                            • 1.2 MB 5¼" disks would be quite spacious for backing up a 20 MB or even a 40 MB hard drive. 35 5¼" floppies would allow backing up a jam-packed 40 MB hard drive; for the 20 MB hard drive, 17 would be needed. For comparison, 35 recordable SL DVDs would allow backing up 150 GB, and probably take longer to do it too... plus, I'm pretty sure even the rewritable ones can't be overwritten in place, so you need to re-do over 4 GB of backing up.

                                              – a CVn
                                              Feb 1 at 21:53
















                                            edited Feb 1 at 21:55

























                                            answered Jan 30 at 21:45









Rui F Ribeiro

                                            3














I wrote commercial TRS-80 software, on a TRS-80: either in BASIC, or in Z-80 assembler run through Editor Assembler. Those were the days. I also used the TRS-80 to cross-assemble 6805 code to burn onto Motorola chips for what we'd now call embedded devices. At my next job, we all shared and wrote software on an IMSAI with a Z-80 chip and bank-switched RAM, running an MP/M-like timeshare OS, with a single giant 5 MB HDD that backed up to VHS cassettes. If the head programmer ran the 30-minute build of his Z-80 ASM (the listing was a phonebook-sized printout), we'd all feel it.






                                                answered Feb 2 at 23:36









Dithermaster
