Dr. Dobb's Journal
In an earlier life, as OS Manager for Commodore, Carl Sassenrath designed and implemented the multitasking Amiga OS kernel. Unsatisfied with existing programming languages, which he characterizes as non-intuitive and unnecessarily complex, he used his experience to design a new programming dialect. The result is REBOL -- a language with a cause.
More REBOL related programs at TechNetCast, including an interview with Marketing VP Steve Mason
Want to know more about REBOL? Check out the Rebol Technologies website
Submit advance questions, post and read messages about this interview with CS on the TechNetCast forum
Our guest today is Carl Sassenrath, founder and CEO of Rebol Technologies and a leading innovator in software technology for over 15 years. He has worked most notably at HP and Apple, and he's probably best known as the designer and implementor of the Amiga OS kernel.
When the original Amiga came out in 1985 it was the first true multimedia personal computer on the market, at a time when IBM PCs were limited to 16 colors. It featured a graphical user interface, a fast graphics subsystem, and support for animation and sound.
And all this was made possible by a highly efficient modular, multitasking operating system with small memory requirements. An operating system that featured a loadable file system, shared libraries and more importantly was open and designed to be extensible. Carl, welcome to the program.
CS: Thanks Philippe for the invitation to be on your program today.
TNC: It's great to have you on. Obviously the Amiga is the domain where you spent most of your life until you started Rebol. How did you get involved with Amiga?
CS: Well, the Amiga was a start-up company way back in 1983. At that time I was at Hewlett-Packard, where I had been working on a bitmap display system implementing a graphical user interface. It ran on a prototype of the Sun workstation that we got directly from Stanford University.
I was very much into bitmap displays and I thought their future was bright. And there was an invitation to get involved with Amiga. [As part of that invitation, I was told] that if I came to work there, I could write whatever operating system I wanted.
TNC: What products did Amiga offer at the time?
CS: Well, Amiga was a company in disguise. It was one of those early self-marketing attempts. They were making game controllers for the Atari. That was just a front for the Amiga computer that was being worked on behind the scenes. Jay Miner and Dave Morse founded the company. Jay was employee number three at Atari, so he had a lot of experience in video game machines.
TNC: So they hire you and you have a blank check to write an operating system. Did you have any experience writing operating systems before? What was your software programming experience at that point?
CS: At that point I was very much into operating systems and languages. Prior to Amiga, I had worked on the MPE kernel, the operating system for HP3000 computers, at HP. It's a fully robust commercial operating system and it's still in use today. And while I was there I studied essentially every operating system that was out there. I was sort of a fanatic about operating systems.
TNC: So you were the right guy for the job.
CS: Well, I had been dreaming about it for so many years.
TNC: What were some of the design constraints that guided your decisions as you started to build this new operating system?
CS: Well, [the constraints] were pretty extreme because the first objective of the Amiga was to be a video game machine. That's why the company was formed. In those days most video game programmers went right to the metal. What I wanted to do was implement an operating system that was efficient enough for that, yet one that also provided multitasking, for instance.
The balancing act was to try to make an operating system that would be useful for game programmers, but also have multi-tasking, dynamically shared libraries and devices -- essentially have an operating system architecture to it. It would be efficient enough that most games would go ahead and use it.
TNC: I guess multitasking is essential for a platform that would support animations, graphics, sounds, everything going on at the same time.
TNC: How did you go about implementing multitasking in the Amiga OS kernel?
CS: Well, back in those days, most multitasking systems were pretty large. They were on IBMs and DEC machines, etc. What we needed to do was create something that was very efficient. So, I came up with what I guess was one of the very first micro-kernel designs for a multitasking kernel. It's very thin, very lean and mean, to get the job done quickly.
TNC: Micro-kernels are now the rage, but at the time they were really a new concept.
CS: Yes, that's right.
TNC: What are some of the features of a micro-kernel as opposed to an operating system kernel that includes everything but the kitchen sink?
CS: Well, the idea is that a kernel of an operating system can be specifically oriented towards providing a relatively small number of services on which you can build layers.
What you want to do at the heart of an operating system is share resources, whether those are memory resources, processor resources, or I/O resources. What you're really after is to organize, manage and share that stuff. So, the smaller the code for that, the better off you are in terms of a micro-kernel. And then you build layers around it of all the other stuff.
TNC: You had memory constraints that larger systems didn't have. What were some of the conditions that you were working in as far as memory?
CS: The Amiga was an interesting machine because it was a 32-bit machine. It had 25 DMA (Direct Memory Access) channels that were all running simultaneously. It had a bit-blitter and a copper which were essentially running in parallel to the processor.
Pieces of memory had to be used for instance for the sound or for screen graphics, etc. [The OS] had to orchestrate all of that.
TNC: What CPU was used in the original machine?
CS: The original machine was a Motorola 68000.
TNC: A 32 bit processor?
CS: Yes, that was a 32-bit machine. And we had auto-configuration, essentially what they call plug and play today. There was no interrupt configuration necessary. And accelerated graphics, as I said, was bitmapped with 4,096 colors.
TNC: That was a huge amount at the time.
CS: Yes, oh yes, back in those days there were four colors on the PC.
TNC: What was the video sub-system like? Was there dedicated video memory?
CS: No, the video memory was organized so it actually came out of main memory, a split buffer so that the machine would be running in parallel. You could have the processor running in one area of memory while, for instance, the video was running in another area of memory. And all of this was highly multiplexed with the DMA channels that were going on.
TNC: Now, Carl, you also mentioned, just to go back to the software side, shared libraries. How did that work?
CS: When we started building the system there were other people that were working on the graphics part and the GUI and the audio library.
The traditional way of doing that back in those days was to statically link everything together into a single ROM. And it became clear to me that the projects were so different and so far out of sync that there was no way they would ever come together for a static ROM build.
Besides, that would have slowed down the entire development process. So, I began to think about how we could use some mechanism of separating out all those modules into pieces that could be loaded dynamically. And as a matter of fact, the ROM is not even statically linked on the Amiga.
When the machine boots, it actually scans the entire ROM, finds all of the different libraries and modules that are within the ROM and then links them all together. And that same model applies after the machine comes up off a disk: it scans the disk and looks for other kinds of libraries, etc. that are on it.
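The scan-and-link idea described here can be sketched very loosely in Python (the tag format and all names below are invented for illustration; the real Amiga used "RomTag"/Resident structures in 68000 assembly):

```python
# Loose sketch of boot-time scan-and-link: instead of a static link,
# walk a memory image looking for tagged modules and register each one.
# The ROM_TAG marker and dict layout are invented, not the real format.

ROM_TAG = "RTC"  # invented marker identifying a linkable module

def scan_image(image):
    """Walk a memory image, find tagged modules, 'link' each one in."""
    system_table = {}
    for entry in image:
        if isinstance(entry, dict) and entry.get("tag") == ROM_TAG:
            system_table[entry["name"]] = entry["init"]
    return system_table

# The ROM holds modules mixed with other data; nothing is statically linked:
rom = [
    {"tag": ROM_TAG, "name": "exec", "init": lambda: "exec up"},
    {"data": "fonts, icons, ..."},  # non-module contents are skipped
    {"tag": ROM_TAG, "name": "graphics", "init": lambda: "gfx up"},
]
libraries = scan_image(rom)
print(sorted(libraries))  # modules discovered at boot, not at link time
```

The same routine could then be pointed at a disk image, which is the "same model" Sassenrath describes for loading libraries after boot.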
TNC: Were these ROM libraries written in C or assembly?
CS: The micro-kernel is written totally in assembly code.
TNC: How were these libraries dynamically loaded? What did the interface between the kernel and these libraries look like?
CS: Each of the libraries had a set of standard vectors that were standardized by the operating system. In addition, each library could have additional vectors, entry points essentially, into the library. And every library could also have its private data area, so if it needed to keep track of information...
TNC: Static information?
CS: Yes, information about what things it had allocated, what it had running, it could do that. It had its own data segment for that too. And the device model actually was built on top of the library model and really the only difference between the library model and the device model is that the device model was an asynchronous model. It was message based rather than the library which was more of a C function call kind of [mechanism].
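The library model just described -- standard vectors every library must provide, optional extra entry points, and a private data area -- can be sketched as follows. This is a toy in Python rather than the original 68000 assembly, and every name in it (Library, open_count, the vector names) is illustrative, not the real exec.library layout:

```python
# Loose sketch of the Amiga-style library model described above:
# standard vectors, library-specific extra vectors, private data.

class Library:
    """A dynamically loaded library: standard vectors plus extras."""
    def __init__(self, name, extra_vectors=None):
        self.name = name
        self.open_count = 0  # private data: tracks who has it open
        # Standard vectors that every library provides:
        self.vectors = {
            "OPEN": self._open,
            "CLOSE": self._close,
            "EXPUNGE": self._expunge,
        }
        # Entry points specific to this library:
        self.vectors.update(extra_vectors or {})

    def _open(self):
        self.open_count += 1
        return self

    def _close(self):
        self.open_count -= 1

    def _expunge(self):
        return self.open_count == 0  # safe to unload only if unused

    def call(self, vector, *args):
        """Dispatch through the vector table, like jumping via an offset."""
        return self.vectors[vector](*args)

# A "graphics" library adds its own entry points on top of the standard set:
gfx = Library("graphics", {"DRAW": lambda x, y: f"draw to ({x},{y})"})
handle = gfx.call("OPEN")
print(handle.call("DRAW", 10, 20))  # draw to (10,20)
handle.call("CLOSE")
```

The private `open_count` plays the role of the per-library data segment mentioned above: state the library keeps about what it has allocated and running.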
TNC: You needed an asynchronous model to deal with latency. Did the drivers run in kernel mode?
CS: No, everything ran in user mode. Only a very small portion of the kernel actually ran in supervisor mode, as they called it back then -- the core scheduler/dispatcher of tasks. Interrupt processing, for instance, would also run in supervisor mode. Everything else ran in user mode, so it could use its own resources and not the kernel's resources for what it was doing.
TNC: And the source code was available for these internals?
CS: Well, not in those days. Actually, I don't think the kernel source code is available even to this day. It was proprietary.
TNC: I've seen postings from users that were recompiling the kernel. Are those sources variants of the original Amiga OS?
CS: I think what has happened is that people have actually gone in and disassembled it, taken it and picked it apart and reverse engineered it in the 13-14 years since its creation.
TNC: In retrospect, how successful was this extensible shared library architecture? Did it work the way you originally envisioned it when you designed it?
CS: It was much more than I envisioned. When you design something like this, when you design anything, especially when you're on the cutting edge, you're not quite sure and have some doubts here and there about certain elements of your design.
And you know that only history will tell whether those were the right decisions to make. And, yes, history has proven that the Amiga was a very good design.
TNC: What were some of the other major characteristics of the Amiga operating system?
CS: One of the things I felt strongly about was that it was a message-based kernel. In other words, device drivers would be separate tasks that would be sent messages with their I/O requirements. There are usually 20 or 30 independent tasks running right after you boot an Amiga, for instance.
Even back in 1985 Amigas were running printer, graphics, and sound drivers, and disk I/O, all as separate tasks. Passing these messages around was very important.
TNC: How was message passing implemented? I guess, the first requirement was that it be very fast.
CS: The message overhead was very, very low which allowed the machine to run [efficiently]. On the very first machine, a 7MHz 68000, [the kernel processed] on the order of 10,000 messages a second.
There was a standard message interface and a standard structure for communicating to device drivers.
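The asynchronous, message-based device model can be sketched with threads and queues standing in for tasks and message ports. This is a toy under invented names, not the real Exec structures or message format:

```python
import queue
import threading

# Toy sketch of the message-based device model described above:
# a device driver is its own task draining a message port, and it
# replies asynchronously on the sender's reply port.

def device_task(port):
    """Device-driver task: receive I/O request messages, reply when done."""
    while True:
        msg = port.get()
        if msg is None:  # shutdown sentinel, not part of the model
            break
        msg["reply_port"].put(f"{msg['command']} done")

device_port = queue.Queue()
driver = threading.Thread(target=device_task, args=(device_port,))
driver.start()

# A client task posts a request and is free to keep running; it only
# blocks when it chooses to wait on its reply port:
reply_port = queue.Queue()
device_port.put({"command": "READ", "reply_port": reply_port})
result = reply_port.get()  # rendezvous with the asynchronous reply
print(result)

device_port.put(None)  # tell the driver task to exit
driver.join()
```

The separation of request port and reply port is what makes the model asynchronous: the client can queue several requests, do other work, and collect replies later.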
TNC: Let's take this to the next level. What did the user interface look like? If I were an Amiga programmer, what API would I program to on the original Amiga?
CS: The original Amiga had a GUI with pull-down menus. It had multiple screens as well, and this is still a concept that the rest of the world hasn't really seen. You could run multiple resolutions at the same time on the Amiga, and have pull-down screens with different resolutions.
If you're playing a full-screen video game, you could also have on top of it your editor or your compiler, and you could switch between these windows of different resolutions quite easily.
TNC: What was the original resolution?
CS: The original, maximum resolution was 720x480 pixels. That was necessary to do over-scanned video output to television sets.
TNC: Obviously, one of the most compelling features of the Amiga was its graphical user interface. What did it look like? Was it also a descendant of what was going on at Xerox PARC and what Apple also used?
CS: Yes, very much. We studied the Xerox work. We also studied the Lisa at that time. The Macintosh had not come out the door quite yet. We had different opinions on various things. We also had to deal with color, and most of those systems had no color. So, that was a new consideration.
We also elected to go with a two-button mouse, for instance, which is kind of today's standard, for PCs at least. And that was based on the ability to both point at something on the screen and also be able to pull up menus on the screen.
TNC: I'm sure there were many design meetings leading up to that decision...
TNC: That's one of the holy wars of personal computing, the one-button mouse vs. the two-button mouse.
CS: We were very good at having discussions about these things and really ironing them out. And we came up with very good consensus decisions on things such as the two-button mouse.
TNC: Is that why the Amiga was so successful --at least technically in the early days-- because of this synergy between people working in a small group?
CS: Yes, I think that had a lot to do with it. It was a very small team of people that all sat in the same room and communicated constantly with each other on the design of the machine. There was also the spirit of what we were doing. It's just a great feeling to know you're making a new computer, and we had great dreams.
TNC: Let's go back to software development. What was it like to write software for Amiga in the early days? Were there any development tools for programmers? Did they have to write directly with a C compiler to the API? What compilers were available?
CS: With any new machine you have that bootstrap step of getting started with development tools. Amiga initially purchased the Green Hills C compiler and ported it over to the Amiga.
TNC: What tools did you use for your own development?
CS: I developed the kernel on an HP emulator workstation that allowed me to test the code even before the processor and the hardware were ready.
That was all done in assembly code. We also used 68000 based Sage Computers [that were available] back in those days. We used the Sage compilers for a while and then we ported the Green Hills Compiler.
And at one point Lattice, who made a C compiler for the PC, got involved and made a Lattice C [compiler for the Amiga]. And then after that Manx also got involved and [produced] an even better C compiler. A leapfrog effect started, with competition in the marketplace to make a better C compiler for the Amiga.
TNC: And before long the Amiga supported a number of different languages.
CS: Yes, oh yes. The traditional set pretty much.
TNC: I want to move away from the Amiga. Before we leave the topic however, I have to ask you about what's going on right now with the Amiga. Last year Gateway, I believe, purchased the Amiga technology. What's going on in the Amiga community these days?
CS: Gateway essentially spun off a division of the company that was independent of Gateway to take and pursue the Amiga vision of that type of product --a high-performance multimedia computer, but still available at kind of consumer-level prices. The following of the Amiga was so strong and it really pushed them in that direction.
They were primarily interested in the proprietary technology when they originally purchased the Amiga. The patent portfolio of the Amiga is quite broad since it did a lot of these things first. But after they received a couple hundred thousand e-mails from enthusiasts all over the world who believed very strongly in the Amiga, they began to think that maybe there was a good market for all these independent thinkers out there.
TNC: They bought more than just a set of patents. They acquired a movement.
There is a development effort underway as we speak to release a new Amiga operating system.
CS: Yes. There is a development effort going on right now. It's hot and furious, moving along with high energy. And they have some great ideas. I don't participate in it on a daily basis. I've talked to them a few times and consulted with them on some decisions. But I think they're making some very good decisions, very much following in the footsteps of that Amiga vision or dream for what a computer can be.
TNC: What are your thoughts on Be? Be's vision is in large part Amiga's original vision for a truly multimedia, high-performance, no-compromise personal computer. We talked a few times with the Be engineers down in Menlo Park and they're enthusiastic about the Amiga. They definitely see it as a forerunner of what they're doing right now. Do you know anything about their technology?
CS: Oh yes, yes. I've gone down there and visited them and talked to Jean-Louis Gassee and seen what they're up to, and I like what they're doing. They have a lot of really good ideas. Many of their ideas are very similar to the Amiga. It's sort of the next generation in terms of the performance and structure.
TNC: Do you follow operating systems very closely? The movement behind Linux is gaining momentum. What are your thoughts on Unix operating systems, for example?
CS: I've studied Unix for many, many years, all the way back to the HP days prior to Amiga. I studied the Unixes that were available at that time. Back then they were more research projects, university and grass-roots products. Over the years they've evolved into a lot more than that.
Linux is a very interesting operating system to watch because it has this open source momentum behind it. Of course, it has a very Unix-like flavor, but it also has the ability to evolve very quickly. And it seems to be incredibly efficient. And you know, it's actually a real pleasure to work with.
TNC: Do you think its performance is better than that of other Unix flavors?
CS: From what I've seen, yes. We have a number of boxes here because we port REBOL to so many platforms. I think Linux is our top performer. I haven't sat down and actually done a precise measurement, but from what I've seen of it, it does very well.
TNC: Although that may be hard to believe because it is such a large piece of software, Windows NT is built on top of a micro-kernel. Some people would say that it has "too many layers", but it is a layered operating system.
CS: Yes, it is a layered operating system and I think its kernel is probably pretty good. I have never seen the source code to the kernel, but the API seems to be pretty reasonable compared to other products from the same company. So, I think it has a basis that's pretty sound.
But like you said, on top of that there are layers and layers and layers of other APIs and other abstractions, many of which have different pursuits and visions. The overall complexity of NT is quite high. There are at least ten thousand, probably more like twenty thousand, API interfaces to that operating system. One person or a small shop of programmers has a hard time dealing with that kind of complexity.
TNC: And this will probably get worse with NT 5, which is significantly larger than NT 4.0.
You mentioned complexity. Complexity is one of your favorite topics, and this brings us to REBOL. One of the objectives of Rebol is to make programming easier and to hide some of the complexities of programming. Is that correct?
CS: Yes, that's correct. We're after productivity here. I've been involved with computer languages for 20 years, and I essentially invented Rebol [because] I never found a language that I felt fully productive in, one that really seemed to be well suited to getting the job done.
TNC: You know, it seems that the complexity of learning a language and using a language interferes with developing solutions. Programmers spend most of their time struggling with the language rather than struggling with the problem domain.
CS: That's why one of the fundamental principles behind REBOL is to get away from the language, to be one step removed from the actual computer language -- to speak, or to write your solution, in terms of the problem domain.
TNC: How does REBOL accomplish this?
CS: We call it dialecting. It turns out that it's not an entirely new concept. [Forth uses it], although Rebol is really nothing like Forth in terms of implementation and the way it functions. One of the concepts of Forth that was very good was that if you wanted to control a telescope, you should do it in terms of astronomy, in terms of stars and their locations, azimuths, etc. You'd write a sub-language within Forth that would be for controlling telescopes.
Or if you wanted to control a car engine, you're talking about spark plugs and cylinders and gear boxes, and you'd want to write your solution in terms of those elements.
This isn't really different from English. If you're a lawyer you speak in legal terms. If you're a doctor you speak in medical terms. You have your own vocabulary and even have your own grammar, the way you rearrange words to make it more specific to that domain. And that's one of the ways we humans have evolved to deal with complexity.
If humans had the same problems communicating through language as computers have, we would just come to a grinding halt as a society. But we're able to get around those and we adapt [our language to the problem domain]. That adaptability is very much what REBOL is about.
TNC: How can REBOL users build vocabularies or grammars specific to their domain? Do they need to teach the language?
CS: Yes. There is a predefined grammar within REBOL, a functional language itself that's underneath it all. And within that functional language there is the ability to handle this dialecting. We're still working out a lot of these ideas.
In many cases, you can build a grammar that's very simple without even knowing that you're doing it. In other words, combinations of words and values -- numbers, strings, that kind of thing -- and you can build a little ad-hoc way of interpreting those.
But we're also working on a way for that to be more formalized, so that you can actually specify the kinds of choices in the grammar, in a way similar to regular expressions or BNF grammar notation. That will make it a lot easier for people to develop their own grammars.
I don't think that everyone will be developing their own grammars.
Specialists in medicine, for instance, will develop a grammar that will be very useful to some kinds of doctors in research, and they'll provide that as sort of a layer to all of those doctors. So those doctors won't be writing directly in Rebol, they'll be writing in this dialect of Rebol that was meant for them.
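As an illustration of the dialecting idea -- a flat sequence of words and values interpreted by a small domain-specific grammar -- here is a toy in Python. This is not REBOL's actual machinery; the telescope vocabulary and every name below are invented:

```python
# Toy illustration of dialecting: a tiny "telescope" vocabulary with
# its own ad-hoc grammar, so the program reads in terms of astronomy
# rather than in terms of the host language.

def run_dialect(program):
    """Interpret a flat list of words and values as telescope commands."""
    tokens = iter(program)
    actions = []
    for word in tokens:
        if word == "point":
            # "point" expects two values: azimuth and altitude
            azimuth, altitude = next(tokens), next(tokens)
            actions.append(f"slew to az {azimuth}, alt {altitude}")
        elif word == "track":
            # "track" expects one value: a star name
            actions.append(f"tracking {next(tokens)}")
        else:
            raise ValueError(f"unknown word: {word}")
    return actions

# The "solution" is written in the vocabulary of the problem domain:
script = ["point", 120, 45, "track", "Vega"]
for action in run_dialect(script):
    print(action)
```

The user of the dialect never touches the interpreter; they just combine domain words and values, which is the layering Sassenrath describes for, say, medical specialists.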
TNC: I want to read a quote from you that I found on the web. It brings together the discussion we had on operating systems and your current endeavor, Rebol: "Once the language is completely in distribution, the second phase is to develop a small and flexible operating system which is integrated in a unique way with the language". Is it still your plan to move on and integrate Rebol into an operating system?
CS: Well, that's a long-term thing. My opinion is that these days you do not set out to write a new operating system. What you do is you set out to make things more productive, or you set out to make things simpler.
Or you set out to build on these ideas of distributed messaging or intercommunication, one of the things Rebol is doing. And you get that ability in there. You get that all figured out. And what ends up happening is you end up with all sorts of new applications -- things that exist today but also things that people have not imagined yet.
And you build that first, that whole base of applications. And then it becomes essentially meaningless what operating system is running underneath. At that point, when you no longer have a need for a particular operating system, you can remove it from the picture. And you know, that could happen.
Rebol applications are not bound to the operating system. All Rebol applications essentially rest on top of Rebol. They're isolated from all of the [OS internals] and we make [Rebol] machine independent.
TNC: Rebol would be both the language and the platform.
CS: Well, I don't want to really make that statement now. You know, people would think you're insane if you said you were going to go out and write another operating system.
TNC: Or an abstraction of an operating system.
CS: Yes. And I don't want people to think of Rebol in that way. I want people to think of Rebol in terms of messaging and dialecting and intercommunication, essentially easier ways of creating software.
TNC: Platform-independence, language and platform, these issues are central to Java. What are your thoughts on Java?
CS: Well, that's a pretty broad question. I think Java has had some very good ideas put into it. I met its implementor a number of years ago and I've watched his work. I've always liked James Gosling's work.
The thing about Java, though, is I don't think it's enough. In many ways it's very traditional. Of course, it is independent of the big software empire. It is also a step in the right direction in terms of object-oriented style, and it removes some of the dangers that C and C++ brought.
TNC: Are you big on object-orientation?
CS: I started out with object-oriented technology in 1982. HP was one of the original alpha test sites for the Xerox Smalltalk language. And I just ate that stuff up. I got very deeply involved in object-oriented technology and followed it for many years.
I was involved in the implementation of various languages that were object-oriented. I used to go to all the conferences on object-oriented programming, etc. I think objects are another tool to add to your tool belt in terms of how you solve problems. But in a lot of ways they're not a panacea. The world is not as object-oriented as everyone would like to think it is. I don't, for instance, tell the table that I'm sitting at to move across the floor. It takes a person to pick it up and carry it across the floor.
So those kinds of relationships and interactions aren't always expressed well in object-oriented programming. And what ends up happening is that things start to become brittle after a while; it starts to crack. People start reusing code not by deriving from their objects, but by going out and copying the source code of objects and adding what they want to it -- reusing them, essentially, in a way the object model was never set up for.
TNC: Well, one problem with C++ is that the object model is not dynamically extensible. There's no run-time binding between objects.
CS: That's right. We saw that early on too. When C++ came out I was at Apple Advanced Technology, and there were a lot of the Xerox guys still there at that time. And we used to joke about it being "C+-", because these Smalltalk people were looking at the thing without the dynamics. Without dynamic objects, you can't do what you need to do with objects.
And that's what C++ was. And they've tried various tricks to try to get around that. But essentially all of those have been corruptions of the language and the language design.
TNC: That may be why component object models are successful right now. They bring a solution to that one problem.
TNC: You mentioned Apple. What was your involvement with Apple?
CS: Apple got me involved in a parallel processor project that was very secret at the time. It's not well known, but they were developing a parallel processor that ran very, very fast, at about the speed of a Cray. And they needed an object-oriented operating system for it. And so they hired me to architect that object-oriented operating system.
TNC: It seems like there were many operating system projects going on at Apple in the late 80s...
CS: Yes, this was in the '86, '87, '88 timeframe.
TNC: Is this the project that evolved into the Taligent deal?
CS: Yes. The first part of that design did evolve into Taligent and what was going on there. But this was very early, this was before that.
TNC: To this day there is no new Apple operating system although Apple very early on identified the need for a modern, high-performance OS to support the type of applications they wanted to run.
CS: Yes, and it is ironic. And it is actually very frustrating, because I'm a big fan of Apple and I think that the Macintosh is also a very well done machine.
And you know, my feeling -- and this is probably blasphemy -- is that they should have ported their whole operating system over to the Intel architecture and let it compete directly with Microsoft. The world would be a better place today.
TNC: What are your thoughts on the evolution of technology and software development since that period? What are some comments that you make to yourself?
CS: I think we're still in the early days. Before computers, I was into neurology. The brain is very slow; the propagation of signals through neurons is incredibly slow.
But you look at everything the brain does and you have to ask, "Why is it so much more adaptive? Where is that power coming from?" Well, it's coming from parallelism. And we haven't even begun to tap the power of parallelism in computers.
TNC: One problem is the complexity of the algorithms involved.
CS: It's not only that. It turns out there are matters of bandwidth. The brain is directly connected: neurons synapse onto other neurons in a direct connection. Most computers use buses, where all of the data has to be transferred through a buffer that becomes a bottleneck to the whole process.
TNC: Do you think neuroscience may give a pattern, a model for computer science, for ways to develop computer systems?
CS: Yes, I think so. Back in the AI days there was a lot of that thought going into computer designs and into artificial intelligence designs. Every so many years we'll revisit that. I can't project when we'll actually be able to do this neural type of computer. But I suspect it will probably happen.