A Course in BTRON MicroScript for Beginners

Chapter 1: Introduction


BTRON MicroScript is a high-level visual programming language developed by Personal Media Corporation for the BTRON3-specification Cho Kanji operating system, which runs on IBM-PC/AT-compatible personal computers. It also runs on top of PMC T-Shell, which was developed for the T-Kernel real-time operating system that comes standard with the T-Engine series of embedded system development boards, thus making MicroScript a cross-platform programming language.

To users of Microsoft Corporation's Windows operating systems, the closest thing to MicroScript is Visual BASIC, which Microsoft no longer markets. To users of Apple Inc.'s Macintosh computers, MicroScript is similar to HyperTalk, particularly since it runs on top of a hypertext filing system, as did HyperTalk in the HyperCard environment. To users of GNU/Linux, MicroScript is best described as a shell scripting language, of which there are many in the GNU/Linux world.

In order to help the absolute beginner understand what MicroScript is and what it does, the best thing to do is to describe the two main ways of programming in the world of BTRON.

Professional programmers, such as those at Personal Media, use cross development based on the C programming language, usually called simply C language, as their standard application development method. What this means is that programs are developed on a completely different software platform, specifically the GNU/Linux operating system and the GNU development environment, and then transferred to the BTRON target system for testing. Accordingly, to program in this manner, one has to:

(1) Learn how to use the GNU/Linux operating system
(2) Learn how to use C language and the GNU development environment
(3) Learn how to use the BTRON operating system
(4) Learn how to use the BTRON software development kit
(5) Read and understand the BTRON operating system specification
(6) Read and understand the TRON Application Databus (TAD) specification

Even for a very motivated programmer, all of the above will take a year or two to master, and not all of the documentation is in English. Moreover, even if you know all of the above, programming in C language is a very time-consuming process, particularly when it comes to interface design for applications. Therefore, to help both the seasoned professional programmer and the hobby programmer create interfaces and build simple but useful programs, Personal Media developed the MicroScript programming language. To program in MicroScript, one has to:

(1) Learn how to use the BTRON operating system
(2) Learn how to use MicroScript

Compared to programming in C language on a separate development platform, programming in MicroScript can be mastered in a matter of a few to several months, even in the case of an absolute beginner. A highly experienced professional programmer could probably learn BTRON and master MicroScript in a month or two. So MicroScript, just like other high-level programming languages on other architectures, saves people a lot of time.

Just for the record, there are other programming languages that run on top of BTRON. For example, inside the free software collection that comes bundled with the Cho Kanji operating system, there is "sed/awk (gawk)." On the Web, one can find ports of the "Ruby" and "Mind" programming languages, and the "Perl" language was included in a freeware self-development environment on a CD-ROM disc in an issue of the TRONWARE bimonthly magazine. However, the most widely used alternative to standard BTRON programming or MicroScript is the WideStudio integrated development environment, which can be downloaded free of charge from the Web. WideStudio allows programs to be moved from one operating system to another, although the results may not be optimal and/or may require the rewriting of certain sections of code at the lower levels for the ported applications to operate smoothly.

For that reason, the best way to get started in programming BTRON is to learn MicroScript. This is particularly the case for the absolute beginner, but it is also true for the seasoned programmer. Learn how to use MicroScript first, then move on to other things.

Computer Languages, Programming, and Scripting

One of the biggest problems in the field of computer science is the cavalier use of words. Consider the term "word processor," for example. Even people who have never set foot in a computer science class know that everything inside a computer system has a number, so the obvious question is--how are words numbered in a word processor? In fact, a word processor does not deal with numbered words; it deals with numbered characters, which are the equivalent of the printing type of a mechanical typewriter. So a word processor is in actuality a "type processor," and even when you are using a word processor's spelling checker, you are comparing electronic printing type clusters in a document you created against the list of electronic printing type clusters in the spelling checker's dictionary.
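To see what "numbered characters" means in practice, here is a minimal sketch in Python (used purely for illustration here, since this course itself is about MicroScript) that prints the character codes hiding behind an ordinary word:

    # Each character in a word is stored as a number (its character code),
    # so the "word" exists only as a sequence of numbered characters.
    word = "type"
    for ch in word:
        print(ch, ord(ch))   # ord() returns the numeric code of the character

Running this prints one line per character, showing that the machine sees nothing but a sequence of numbers.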

Another misleading computer science term that has come into widespread use is "artificial intelligence." To understand what's wrong with this expression, consider what the term "artificial blood" would mean. Artificial blood would be something that provides some or all of the functions of human blood in the human body. Contrast that with the blood applied to the face of an actor in a cinematic fight scene. That blood would just have to look like blood, and it wouldn't have to possess any of the functions of natural human blood. Such blood could be termed "simulated blood." With this distinction between artificial and simulated in mind, it is obvious to even the casual observer that what goes on in the field of artificial intelligence is actually the creation of "simulated intelligence."

This brings us to the term "computer language," which is another very misleading computer science term. Many people know that the word "language" comes from the French word for "tongue," and that it refers to the system through which two humans normally interact with each other verbally. Notice what the word implies--a common means, lingual articulation, is used by two biological entities of the same type, origin, and capacity (both normal humans who speak the same natural language; processor A equals processor B) to interact with each other. However, in the case of human-to-computer interaction, humans and computers do not even share the same hardware; all a computer is really capable of doing at the hardware level is simple arithmetic and mechanical logic operations. Moreover, the interaction is one-way; the computer language allows only the human to interact with the computer hardware, not vice versa.

To truly understand what these inappropriately named computer languages are, we first have to understand what a modern electronic computer is. A computer is first and foremost a machine, but it is a unique machine. Unlike other machines, it does not move. Think of that--most machines do work by moving, but a computer doesn't physically move. In a physically moving machine, the movement is the methodology, and that methodology normally doesn't change. In a non-moving machine like a computer, the methodology can be changed, and it is for this reason that the computer is sometimes referred to as an "all-purpose simulator." And what is used to create and change the methodologies used inside a computer? Well, it is none other than computer languages, which are in fact "software tool sets for creating mechanical methodologies that operate inside a unique group of machines that do not move to do work assigned to them by humans."

There is a bewildering array of these software tool sets called languages, but they are generally divided into high-level and low-level languages. Low-level languages are aimed at the hardware, and they require intimate knowledge of computer hardware to use. High-level languages are, for the most part, aimed at particular computing tasks, which include everything from creating operating systems (C language) to writing software that simulates human intelligence (Lisp, Prolog). The most important thing to remember, however, is that they have evolved as computer hardware, operating systems, and networks evolved. In particular, as processing power and memory increased exponentially, operating systems grew exponentially with more and more functions. Consequently, the computer programmer, who rarely took the operating system into account on early computer systems, now had to study voluminous specifications to write standardized software, as was pointed out in the previous section.

Naturally, if programming became difficult for the professional who was actually paid to do it, it became all but impossible for the average person who had to spend most of the day doing something other than programming for a living. On top of that, the unique operating system specifications of each operating system vendor made the porting of software from one computer to another difficult for all but the most powerful software companies. And then there was the Internet, through which people wanted to share software. As a result, there was a need for languages that could operate across systems and across networks, and this need was met by the development of scripting languages, which have become the most important languages for software development in recent years. In particular, Java, Flash (actually ActionScript in the Flash environment), and JavaScript have become backbone programming languages of the Internet and the World Wide Web, although various others are also used.

One thing scripting languages usually have in common is that they operate on top of the operating system, using what's below as resources. In system charts, the scripting language is usually shown just below the application software level, so it should come as no surprise that Java applications are called applets. Another thing scripting languages usually have in common is that they are interpreted through an interpreter line by line, rather than being compiled through a compiler en bloc. On slower processing machines, the latter leads to faster execution, but on today's blazingly fast processors, even interpreted languages can do things only compiled languages could do previously. Perhaps the most important thing scripting languages have in common is that they insulate the programmer from the complexities of the operating system, and thus allow for the creation of smaller programs. Since programming errors decrease dramatically as programs decrease in size, software development and debugging times become shorter.

All of these features--high-level positioning, interpreted execution, insulation from the complexities of the operating system, and small program size--are what make scripting languages attractive both to professional programmers who have to rapidly develop an application, and to beginners who have neither the time nor the inclination to master standard application programming techniques. To them, scripting is a godsend, and one well designed language for the purpose of scripting is the MicroScript programming language that runs on top of the BTRON-specification operating system.

Scripting Gives Us Freedom, and That's What It's All About

In the previous section, we pointed out that as computer hardware grew exponentially, so too did operating systems. This in turn led to programming becoming more and more difficult for the average person, because there were large operating system specifications to read and follow. What should also be pointed out is that application software likewise grew in size exponentially in parallel to the advances in computer hardware and operating systems, and its development also became more and more difficult. Today's de facto standard business applications are so large and so feature-laden that it takes months to master them, and they have to be mastered if one wishes to successfully apply for work in a large corporation or government agency. Since one of the main purposes of education is to prepare people for the job market, this has greatly influenced computer education in schools. A good portion of K-12 computer education today is more likely to be centered on teaching students how to use de facto standard word processing, spreadsheet, e-mailer, and Web browser programs than on any form of programming. At the technical college level, students majoring in computer science will, of course, be taught the basics of programming in C language, but a huge portion of their education will be aimed at teaching them how to set up servers and maintain internal computer networks.

However, things weren't always this way. In the early days of personal computers, precisely because there were no standardized business applications, large numbers of children were taught how to program computers, and languages were specially designed to enable them to do so. While BASIC was originally developed at Dartmouth College to enable college students studying the humanities to write programs, Logo (a derivative of Lisp), developed at the Massachusetts Institute of Technology, and Smalltalk, a language developed at the Xerox Palo Alto Research Center that pioneered object-oriented programming, were specifically designed for teaching children how to program. Nicholas Negroponte, the brains behind the One Laptop per Child (OLPC) project, has lamented the fact that not much time is spent on teaching children how to program anymore, because he considers it "absolutely fundamental." His opinion is based on the work of Logo developer Seymour Papert, who discovered that children who write programs understand things differently, and that when they debug those programs, they come the closest to learning about learning. But this should be taken one step further: "understanding things differently" and "learning about learning" are two aspects of freedom. They set one apart from the herd, and they allow a person to function independently.

Since this course is aimed at beginners, we do not expect any beginner who reads it to reach the level of being able to write their own software applications and become totally free and independent of standard applications, be they commercial or open source. We do, however, expect beginners to reach a level where they can write clever little scripts that will make their lives easier and more entertaining. For example, imagine if you scanned some maps of the city you live in, surrounded them with a simple coordinate system, and then developed a small conversion program that could take the coordinates output by a GPS terminal and translate them into your simple coordinate system. You would have a moving map system that could be used free of charge, outside of a wireless network provider's moving map service. Even simpler things that come to mind are flash cards for learning a foreign language, or animated flash cards that show the user how to write Chinese characters. The possibilities are endless, and they depend only on the "imagination" and "motivation" of the MicroScript programmer. If you can see a simple application in your mind and you really want it, then you can develop it. And if you don't think this is true, take a look at the life histories of the young people who started the personal computer industry. They were all imaginative and motivated--highly motivated.
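To make the moving map example a little more concrete, here is a minimal sketch of the coordinate conversion written in Python, purely for illustration (an actual version for this course would be written in MicroScript, and all of the numbers and names below are hypothetical). It assumes the scanned map covers a small enough area that latitude and longitude can be mapped to pixel positions by simple linear interpolation:

    # Hypothetical corner coordinates of the scanned map image (in degrees).
    LAT_TOP, LAT_BOTTOM = 35.70, 35.60      # north and south edges
    LON_LEFT, LON_RIGHT = 139.70, 139.80    # west and east edges
    WIDTH, HEIGHT = 1600, 1200              # size of the scanned image in pixels

    def gps_to_map(lat, lon):
        """Convert a GPS reading to (x, y) pixel coordinates on the scanned map."""
        x = (lon - LON_LEFT) / (LON_RIGHT - LON_LEFT) * WIDTH
        y = (LAT_TOP - lat) / (LAT_TOP - LAT_BOTTOM) * HEIGHT   # y grows downward
        return round(x), round(y)

    # Example: a reading from the GPS terminal somewhere inside the map area.
    print(gps_to_map(35.65, 139.75))   # -> roughly the center of the image

The same arithmetic, expressed in MicroScript, would be the heart of the little conversion program described above; everything else is just displaying a marker at the computed position.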

In closing, for those who think scripting is way too difficult and that they'll never be able to master it, keep these simple facts in mind.

Anyone can learn how to write useful scripts, no matter what their age or background. A script that's useful to you doesn't have to be large at all. So let's take the plunge and dive into BTRON MicroScript scripting.


Copyright © 2011 Sakamura Laboratory, University Museum, University of Tokyo