Natural Language Programming

Status
Not open for further replies.

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Formulas, when explained to someone, are of course translated into ordinary language in order to explain the relations between the parameters used, what they mean, and so on. But when you use the formula in an equation you no longer use natural language, but rather a symbolic/numeric format.
Exactly. But there's a need for both kinds of things in almost all programs. Here's a line of our code in "symbolic/numeric format" from my original post:

Intel $8B85080000008B008B9D0C0000000103.

Meaningful to the machine, no doubt; but rather opaque to mere mortals. Now here's that line, in context, as it might appear in a Plain English program:

To add a number to another number:
Intel $8B85080000008B008B9D0C0000000103.


The routine header, "To add a number to another number:", serves two purposes: (1) it explains what that obscure hexadecimal string does in terms a human can easily understand; and (2) it provides the compiler with the information it needs to determine when and how such a routine should be called; for example, when the programmer says things like:

Add 1 to a counter.
Add the item's price to the total.
Add 1/2 inch to the box's width.


Best of both worlds! And it compiles, and runs.
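
For the curious: those sixteen bytes decode by hand to just four x86 instructions. Here's the decoding, sketched in Python only so there's something runnable to poke at (the comments are my reading of the bytes; the EBP offsets show the two numbers arriving by address on the stack):

# Hand decoding of $8B85080000008B008B9D0C0000000103 (a sketch;
# the stack layout -- pointers to the numbers at EBP+8 and EBP+12 --
# is inferred from the bytes themselves).

machine_code = bytes.fromhex("8B85080000008B008B9D0C0000000103")
assert len(machine_code) == 16

decoded = [
    ("8B 85 08 00 00 00", "MOV EAX, [EBP+8]   ; address of the number"),
    ("8B 00",             "MOV EAX, [EAX]     ; fetch the number"),
    ("8B 9D 0C 00 00 00", "MOV EBX, [EBP+12]  ; address of the other number"),
    ("01 03",             "ADD [EBX], EAX     ; add it to the other number"),
]

for raw, asm in decoded:
    print(f"{raw:<20}{asm}")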
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
Exactly. But there's a need for both kinds of things in almost all programs. [...] Best of both worlds! And it compiles, and runs.
Again, all your examples contain only the most rudimentary of problems experienced in the real world. If I needed a program to add two numbers together, the reason for my software would be pretty silly. For something as specific as adding two numbers this seems like a novel idea; in a large, complex system, however, it is cumbersome.

Let's take your pseudocode example: I would say something like "User inputs the two numbers." So your compiler is going to have to understand how to get the user input (which means I have to rely on your implementation), or I am going to have to code that myself. That would be followed by "Add the two numbers together." Again, I am relying on your compiler understanding what "add" means and which numbers are meant. And finally, "Display the numbers on screen." Again, it has to know how to display.

The benefit of writing pseudocode is to get a general, easy-to-explain-and-modify idea of how the code will flow. The actual logic is not important at that stage, and is omitted for ease of explanation to people who can't understand actual code.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Again, all your examples contain only the most rudimentary of problems experienced in the real world.
I'm trying to keep the examples short, but typical. If you want to see how our system actually handles more complex problems -- like compiling complete development systems, or laying out pages with text in different fonts, sizes, and colors, with both vector and bitmapped graphics, and direct manipulation of on-screen objects with the mouse (moving, sizing, flipping, rotating, enlarging, reducing, spell-checking, etc.) -- download the thing and see for yourself (write me for the link: gerry.rzeppa@pobox.com if you don't already have it).

If I needed a program to add two numbers together, the reason for my software would be pretty silly.
If that was all you needed your program to do, sure. But adding numbers is fundamental to stuff like I've described immediately above.

For something as specific as adding two numbers this seems like a novel idea; in a large, complex system, however, it is cumbersome.
Or not; take a look at the system and see for yourself. We actually had fun writing those 25,000 Plain English sentences; the word "cumbersome" did not come to mind. See below for why.

The important point here is that the technique described for "adding a number to another number" is fully scalable -- we use essentially the same technique throughout the whole compiler. In fact, the compiler does very little besides recognize and format calls to routines that the programmer(s) have defined. Like Charles Moore's FORTH language on steroids. Which makes it (1) small, (2) reliable, and (3) user-extensible.

Let's take your pseudocode example: I would say something like "User inputs the two numbers." So your compiler is going to have to understand how to get the user input (which means I have to rely on your implementation), or I am going to have to code that myself. That would be followed by "Add the two numbers together." Again, I am relying on your compiler understanding what "add" means and which numbers are meant. And finally, "Display the numbers on screen." Again, it has to know how to display.

You could do something like what you've described using the built-in console facility; here's the whole program:

To run:
Start up.
Read a number.
Read another number.
Write the number plus the other number.
Shut down.


Or you could make it fancier by adding a little helper routine:

To get a number and another number from that babe in accounting:
Write "Enter a number > " without advancing.
Read the number.
Write "Enter another number > " without advancing.
Read the other number.


The original program would then look like this:

To run:
Start up.
Get a number and another number from that babe in accounting.
Write the number plus the other number.
Shut down.


Note that the new sentence, "Get a number and another number from that babe in accounting", looks like the others; just another Plain English sentence. But this is a sentence that you "taught" the compiler how to "read" -- it wasn't part of the original system.

As you add more and more such "helpers" to your code the compiler becomes more and more accustomed to your particular "dialect" of English; it learns the common expressions and idioms that you like to use. After a while, you can just type in thoughts as they first occur to you, and it will understand. That's why the word "cumbersome" didn't occur to us when we were writing 25,000 lines of the stuff; whenever it seemed cumbersome to us, we added whatever routines were necessary to make it natural instead.
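
If it helps to see the mechanism outside Plain English, here's a toy sketch in Python (my illustration, with made-up names; not how our compiler is actually written). Defining a routine registers its normalized header, and any later sentence that normalizes to the same form compiles as a call to it:

# Toy sketch (illustrative only; not the Osmosian implementation).
# Defining a routine registers its header; later sentences that
# normalize to the same form compile as calls to it.

routines = {}  # normalized header -> routine

def normalize(sentence):
    # Drop punctuation and noise words so trivial rewordings still match.
    words = sentence.lower().rstrip(".:").split()
    return tuple(w for w in words if w not in ("the", "a", "an", "that"))

def define(header, routine):
    key = normalize(header)
    if key and key[0] == "to":
        key = key[1:]  # headers begin with "To ..."
    routines[key] = routine

def compile_sentence(sentence):
    key = normalize(sentence)
    if key in routines:
        return routines[key]  # emit a call to the taught routine
    raise SyntaxError(f"I don't know how to {sentence!r}")

define("To get a number and another number from that babe in accounting:",
       "<the helper routine>")
print(compile_sentence("Get a number and another number from that babe in accounting."))

The real compiler does far more than this (parameters, types, possessives), but "define a routine, gain a sentence" is the heart of it.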

The benefit of writing pseudocode is to get a general, easy-to-explain-and-modify idea of how the code will flow. The actual logic is not important at that stage, and is omitted for ease of explanation to people who can't understand actual code.
And if, with just a few (or even no) tweaks you can make that English-style pseudocode actually run, think of the time and effort you'll save! And how (automatically) well-documented your code will be!
 

Markbnj

Elite Member / Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
It's an interesting paper. I'm sort of surprised that you didn't encounter it earlier, since you were working in the same area at the same time, but it's not the first time that's happened.

The authors note early that users find it difficult to cope with the block-oriented flow control and boolean decision-making idioms in traditional languages, but they retain those features in their proposed language, as I believe you do in yours. This is another manifestation, in my opinion, of the main issue that has been pointed out numerous times in this thread: that "natural programming" appears to lead to more lucid "code" in simple examples, but becomes difficult to manage for large, complex problems. However, I don't think there is really anything further to be gained from debating it here.

I think that the software business is more amenable to change than almost any other area of human endeavor. Half the things I'm working with daily today I hadn't heard about two years ago. The average programmer has to field new ideas every day. I think this is one business where better mousetraps get used. It's also a business where a lot of things are proposed to be better mousetraps and turn out not to be. You may be right that the moment is ripe to revisit this idea. I don't think it is, and I am not sure it ever will be. The Star Trek computer analogy has captured all of our imaginations at some point or another. But if you really think about it, the people using that computer were interacting with a query and command interface. I don't think the writers ever said so, but I'm personally damned sure the systems that supported that interface were not built using it.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Wow. I have been following this thread for the last week or however long it's been going, and finally got bored enough to try it tonight. I've only tinkered with it about 30 minutes and skimmed maybe 1/3 of the documentation, and I'm just rambling about random thoughts as they pop into my head:

The font. My god, the font. It's horrible. Your already difficult-to-read code (more on that later) is made much harder to read by this quasi-handwritten font.

The GUI. My god, the GUI. It's horrible. To the point of being for all intents and purposes unusable. You literally take every convention of UI design and intentionally throw it out the window. I suppose you might argue that a non-computer-literate person could learn it, and I suppose there's some truth in that. However, (almost) literally every person over the age of 3 has been exposed to computing in some form and has some very basic expectations of what a UI should be like. You need some kind of reasonably logical grouping of menu options. If I think I know what I'm looking for but can't find it, you force me to look randomly through 26 menus. With any other well-designed UI, I usually have 10 or fewer menus, and can make some logical deductions about where to look for what I want. Also, you need buttons, or visual cues, or something -- I literally ended up randomly clicking all over the screen until I discovered how to move up a directory level. And please, support the scroll wheel and don't force your program to run in full screen mode.

Page 11 of your instructions pretty much answered the questions I had about the "natural language" capabilities. I don't really see natural language there, just syntax definitions for a very verbose programming language.

You seem to have some really weird design choices in the language and in your introduction. IMO, something is extremely strange when your implementation relies on inline machine code (machine code, for christ's sake? not even inline asm?), particularly when it's even necessary for trivial things like adding two bytes together. It seems particularly strange that you'd pollute your language with that considering that you already have the ability to call functions from DLLs; nor does it bode well for the maintainability or portability of your code base, IMO. And why on earth do you design a language targeted at non-programmers, and force them to do manual memory management? Not to mention introducing them to topics like event-driven programming in your very first example.

Lastly, since I've written way more than I intended, this code is extremely hard to follow. I was particularly horrified to see your instructions recommend that functions (statements? things?) be listed alphabetically. I will grant that this can be a problem with any language, but since you seem to have a penchant for 5 KLOC+ source files, it's particularly bad with yours. It also seems very antithetical to your hardline stances banning nested loops or if statements in the name of code quality.

Anyway, I think it's an interesting concept, but not something that's likely to go beyond the "toy" stage anytime soon (no offense). I'll tinker with it some more over the next few days if I have time.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
It's an interesting paper. I'm sort of surprised that you didn't encounter it earlier, since you were working in the same area at the same time, but it's not the first time that's happened.
I'm surprised I missed it myself. But there's a lot out there these days...

The authors note early that users find it difficult to cope with the block-oriented flow control and boolean decision-making idioms in traditional languages, but they retain those features in their proposed language, as I believe you do in yours.
We didn't keep as much as they did. There are none of the usual "end block" markers in Plain English. No nested IFs, no nested LOOPs and no ELSEs, either.

This is another manifestation, in my opinion, of the main issue that has been pointed our numerous times in this thread: that "natural programming" appears to lead to more lucid "code" in simple examples, but becomes difficult to manage for large, complex problems.
I think most programmers would consider the development of a Turing-complete native-code-generating compiler/linker with all the supporting facilities (including wysiwyg page layout) a "large, complex problem". That's why we chose that as our "proof of concept". We wanted to know, first hand, if "natural programming" really did fail when the problem was non-trivial.

However, I don't think there is really anything further to be gained from debating it here.

I think that the software business is more amenable to change than almost any other area of human endeavor. Half the things I'm working with daily today I hadn't heard about two years ago. The average programmer has to field new ideas every day. I think this is one business where better mousetraps get used. It's also a business where a lot of things are proposed to be better mousetraps and turn out not to be. You may be right that the moment is ripe to revisit this idea. I don't think it is, and I am not sure it ever will be. The Star Trek computer analogy has captured all of our imaginations at some point or another. But if you really think about it, the people using that computer were interacting with a query and command interface. I don't think the writers ever said so, but I'm personally damned sure the systems that supported that interface were not built using it.
It was probably a midget in a box. I think it's encouraging, though, that the big guys with lots of resources like Apple (with Siri), and super-math-heads like Wolfram (with Alpha), and even interactive fiction gurus like Nelson (with Inform) are recognizing that it's time we started teaching our machines to speak and understand more like we do.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Wow. I have been following this thread for the last week or however long it's been going, and finally got bored enough to try it tonight. I've only tinkered with it about 30 minutes and skimmed maybe 1/3 of the documentation, and I'm just rambling about random thoughts as they pop into my head:
Lay it on us, bro.

The font. My god, the font. It's horrible. Your already difficult to read code (more on that later) is made much harder to read by this quasi handwritten font.
Originally, the whole thing was in cursive! The font there now is my son's printing, digitized. We wanted a font that would look "friendly" to a youngster learning to code for the first time. We also wanted a font that was not copyrighted and that would be small enough to fit as a hexadecimal literal in the executable so we could be sure it would be available on everyone's machine. Turns out people either love it or hate it. We've gotten enough compliments over the years, however, to make us think seriously about selling it. You can, of course, select whatever font you want (in the text editor) and in text blocks on pages in the page layout facility.

The GUI. My god, the GUI. It's horrible. To the point of being for all intents unusable. You literally take every convention of UI design and intentionally throw it out the window.
Actually, we started with Apple's "User Interface Guidelines", and simply eliminated unnecessary components.

I suppose you might argue that a non-computer literate person could learn it, and I suppose there's some truth in that.
Now you're getting it. Shouldn't an interface be intuitive, even to the uninitiated?

However, (almost) literally every person over the age of 3 has been exposed to computing in some form and has some very basic expectations of what a UI should be like.
We wouldn't be very good iconoclasts if we just gave people what they expected.

You need some kind of reasonably logical grouping of menu options.
They are logically grouped: they're alphabetical. "Copy" is under "C" and "Print" is under "P" and "Save" is under "S". What could be simpler?

It appears you're thinking of menus like a "Table of Contents" in a book; our approach considers them more like the alphabetical Index at the back of the book. And sure, a Table of Contents might be handy when all you want is an overview of a book; but it's the Index that is chiefly used to find things once you're familiar with the thing.

If I think I know what I'm looking for but can't find it, you force me to look randomly through 26 menus.
Actually, no. Commands where a user might think of a synonym are listed both ways.

With any other well-designed UI, I usually have 10 or fewer menus, and can make some logical deductions about where to look for what I want.
Logical deductions are not needed here. This is phonics. Say the command you're looking for (out loud if it helps), concentrating on the beginning sound; then ask yourself which letter makes that sound -- and look there.

Also, you need buttons, or visual cues, or something -- I literally ended up randomly clicking all over the screen until I discovered how to move up a directory level.
You can move up a directory level by closing the current level with the Close command (it's under "C"); or you can hit the Escape key; or you can click on the Tab at the bottom of the screen.

And please, support the scroll wheel
Scroll wheels were not "standard equipment" when we developed the thing, and we were trying to reduce the number of hardware features required to run the program to a minimum in any case (to avoid "feature creep"). We'll be putting in scroll wheel support in the next release.

and don't force your program to run in full screen mode.
No can do on this one. We want to show not just what a simpler IDE might look like, but what a simpler operating system might look like as well. You can write, test, and even document an entire system without leaving our desktop; you can create, rename, delete, and backup files; you can edit; you can draw; you can create PDFs; you can print; you can do everything that needs to be done without having to look at a single Windows icon, button, scroll bar, or dialog box. And our IDE looks and behaves exactly the same on every version of Windows since Windows 95; so once you get used to it, well, you're done.

Besides, full screen mode is essential to good design in many instances: point-of-sale systems, advertising kiosks, critical systems in factories and power plants, games, etc.

Page 11 of your instructions pretty much answered the questions I had about the "natural language" capabilities. I don't really see natural language there, just syntax definitions for a very verbose programming language.
You obviously have a different definition of "natural language" than we do. We mean "sentences that would be perceived by a native speaker as the written equivalent of his language."

You seem to have some really weird design choices in the language and in your introduction. IMO, something is extremely strange when your implementation relies on inline machine code (machine code, for christ's sake? not even inline asm?), particularly when it's even necessary for trivial things like adding two bytes together.
Our compiler uses only 26 of the machine instructions available on an Intel x86 chip; we wanted to demonstrate that the hundreds of others are nothing but expensive, heat-generating bloat. Since we could assemble those 26 instructions easily enough in our heads, we didn't see the need for an inline assembler. And we never intended for a Plain English programmer to have to code machine language at all; that's why machine code appears, sparsely, in only the Noodle and the Compiler files. But of course machine code is needed to add two bytes together -- that's a primitive!

It seems particularly strange that you'd pollute your language with that considering that you already have the ability to call functions from DLLs...
It was the fastest and simplest way to get those bytes added together. When a DLL was more convenient and reasonably efficient, we chose that path.

...nor does it bode well for the maintainability or portability of your code base, IMO.
Machine code appears only in the Noodle and the Compiler files; and there are no DLL calls, to Windows or anyone else, above the Noodle level. Converting the code to run on a different operating system or CPU therefore requires changes in only two places. The desktop, file manager (the nasty stuff is in the Noodle), editor, dumper, and page layout facility are isolated.

And why on earth do you design a language targeted at non-programmers, and force them to do manual memory management?
Mainly because we haven't yet found a "garbage collection" algorithm that meets our ridiculously high standards. We were tempted to use the one Wirth used in Oberon, but got the feeling even he wasn't that sure about it. Besides, everyone -- even non-programmers -- should learn to clean up after themselves.

Not to mention introducing them to topics like event-driven programming in your very first example.
Oh, goodness. Event-driven programming is trivial. For anyone. In fact, I just read a study where non-programming fifth-graders were asked to describe how they thought certain video games worked, and by and large they described the process in event-driven terms.

Lastly, since I've written way more than I intended, this code is extremely hard to follow. I was particularly horrified to see your instructions recommend that functions (statements? things?) be listed alphabetically.
I think you're just not at home enough with the "CTRL-HOME CTRL-F start typing" kind of incremental find recommended in the instructions at the bottom of page 7. (It's essentially an implementation of Jef Raskin's seminal "Leap" mechanism.) And it goes a long way toward eliminating unnecessary scrolling.
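
For anyone who hasn't tried a Leap-style editor, the mechanism is easy to sketch (Python, illustrative only; a real editor hooks raw keystrokes rather than taking a ready-made string). Each character you type extends the pattern, and the cursor jumps to the match at once, so navigation is "start typing" rather than "scroll and scan":

# Sketch of Raskin-style incremental find (illustrative only).
# Each keystroke extends the pattern; the cursor re-seeks immediately.

def leap(text, keystrokes, start=0):
    pattern, position = "", start
    for ch in keystrokes:
        pattern += ch
        found = text.find(pattern, start)
        if found >= 0:
            position = found  # the cursor jumps as you type
    return position

source = "To add a number to another number:\n...\nTo subtract a number from another number:\n..."
print(leap(source, "sub"))  # index of the match inside "To subtract"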

I will grant that this can be a problem with any language, but since you seem to have a penchant for 5 KLOC+ source files, it's particularly bad with yours.
The two things (the incremental Find command and the five-thousand-line source files) go hand in hand; either, without the other, would not be as effective.

It also seems very antithetical to your hardline stances banning nested loops or if statements in the name of code quality.
I'm not sure what you mean here since any routine with nested LOOPs or IFs can be made clearer by eliminating them.

Anyway, I think it's an interesting concept, but not something that's likely to go beyond the "toy" stage anytime soon (no offense).
It's interesting how people classify languages. Here's how we do it: If a language is (a) Turing complete (with a reasonable degree of convenience and efficiency) and (b) can recompile itself, it's a real language; otherwise, it's a mule -- sterile, unable to reproduce. Most of today's popular languages (like Javascript, HTML/CSS, PHP, Python, Ruby, etc) are mules -- or at least virgins (languages that could reproduce themselves but haven't).

I'll tinker with it some more over the next few days if I have time.
Perhaps it will grow on you...
 

douglasb

Diamond Member
Apr 11, 2005
3,157
0
76
It's interesting how people classify languages. Here's how we do it: If a language is (a) Turing complete (with a reasonable degree of convenience and efficiency) and (b) can recompile itself, it's a real language; otherwise, it's a mule -- sterile, unable to reproduce. Most of today's popular languages (like Javascript, HTML/CSS, PHP, Python, Ruby, etc) are mules -- or at least virgins (languages that could reproduce themselves but haven't).

Every single "popular" language you listed has usually been (up to this point) interpreted, JIT'ed, or run within a browser.

Furthermore, you left out C, Java, C#, C++, and Objective C (all of which are probably more popular than any of the languages you listed) because they obviously do not fit your example.

However, upon further inspection, even the claims you make about "mules" and "virgins" seem to be without merit:
  • Ruby
  • Python has PyPy (and perhaps others)
  • Node.js can probably do it for JavaScript, although I don't know if anyone has actually done so
  • I don't know if anyone has done it for PHP, but it definitely can be done, and there are compilers for other languages written in PHP.
  • HTML/CSS are not programming languages and shouldn't even be in this discussion.

Like Markbnj said, the fact that Plain English is bootstrapped is unremarkable. What is truly remarkable to me is that you are still pushing this thing as some sort of "holy grail" when virtually all of the feedback you have been given here indicates otherwise. I would suspect that your experience with this language elsewhere over the last 7 years has yielded similar results, which is why nobody has adopted it. Best of luck to you.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Every single "popular" language you listed has usually been (up to this point) interpreted, JIT'ed, or run within a browser. Furthermore, you left out C, Java, C#, C++, and Objective C (all of which are probably more popular than any of the languages you listed) because they obviously do not fit your example.
I got the list from Codecademy (http://www.codecademy.com/). I presumed that everyone already knew that C and its derivatives are real (and real ugly!) languages. Curious that you left out FORTH, Pascal, and Oberon from your list!

However, upon further inspection, even the claims you make about "mules" and "virgins" seem to be without merit:
  • Ruby
  • Python has PyPy (and perhaps others)
  • Node.js can probably do it for JavaScript, although I don't know if anyone has actually done so
  • I don't know if anyone has done it for PHP, but it definitely can be done, and there are compilers for other languages written in PHP.
  • HTML/CSS are not programming languages and shouldn't even be in this discussion.
Thanks for the update on who's still a virgin and who isn't. My point was that the original developers of these languages apparently didn't like their own stuff enough to use it themselves. Kernighan and Ritchie wrote C compilers in C; Moore wrote FORTH in FORTH; Wirth wrote Pascal and Modula and Oberon compilers in Pascal and Modula and Oberon -- these are guys who really believed in and lived with their own languages.

Like Markbnj said, the fact that Plain English is bootstrapped is unremarkable. What is truly remarkable to me is that you are still pushing this thing as some sort of "holy grail" when virtually all of the feedback you have been given here indicates otherwise.
If I'm pushing the thing at all, it's as a thought-provoker -- something significantly different yet self-consistent enough to make it worthy of study. Entertaining, too.

So many things are wrong (and yet taken for granted in our industry) that we think it's important for people to be given the opportunity to see that it doesn't have to be that way. Case in point: Why isn't the post editor on this forum wysiwyg? Why do I have to edit here, preview there, and then -- when I find a mistake -- return to the original box and scroll around to find the corresponding place in the "source text"? Which of you stern critics think that's good interface design? And why doesn't somebody fix it? I suspect it's because (1) most programmers don't even realize how substandard it is; and (2) most programmers don't have the necessary skills to fix it even if they realized it needed fixing.

I would suspect that your experience with this language elsewhere over the last 7 years has yielded similar results, which is why nobody has adopted it.
Actually, we released the compiler in 2006, promoted it a bit, and then got distracted by other things. My elder son Dan got married and started a family, and my wife Sharon and I spent those years nurturing our "miracle baby" (slash "how do kids actually learn to speak natural languages" lab rat) -- born when Sharon was 57 -- into the eight-year-old Plain-English-programmer he is today. So I'm really just getting back into the thing, (re)testing the waters.

Best of luck to you.
Thanks. We've got a couple of guys working on converting it to "Plain Spanish" right now (an independent contractor in California, and a university professor in Argentina). After all, why should native Spanish speakers have to learn programming (and compiler design, etc) in English? We're considering "No necesitamos ningún apestoso Inglés para escribir programas" ("We don't need no stinking English to write programs") as a marketing slogan. Ya think?
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
I think I'm starting to see the problem with your language. You're trying to be the ultimate high level language that anyone can use, but also a serious language that real programmers can use for complex tasks. As a result you've ended up with something of a mess which has pitfalls for both sides.

Your documentation definitely plays up the simple and accessible aspect. You've made your own GUI and so on to try to appeal to non-programmers, but you've also saddled them with difficult concepts which you like (or maybe you just can't escape from?). Touching on a few examples that I already mentioned:

Event-driven programs may be trivial to you. Have you ever taught an introductory programming course? You may be surprised just how many college-level students have serious trouble grasping other trivial concepts that are much simpler than an event-driven UI.

Despite what your documentation claims, memory management is most definitely not trivial. "Just remember to destroy what you create" is, frankly, a bald-faced lie. Determining object lifetime and ownership is a challenging problem, even for experienced programmers.

There are other issues along similar lines, like the inclusion of pointers and linked lists (even though they may be called "things"), but I don't have time to delve into all of them.

For experienced programmers, you have a whole host of other issues:

At the moment, Notepad is a more functional editor than the one included in your program. It's interesting that on one hand you defend this terrible editor by claiming it eliminates preconceived ideas, but on the other hand defend it by saying that I'm just not familiar enough with your "Ctrl-Home, Ctrl-F" preconceived notion of how one should navigate an editor.

As far as I can tell, there is no way to decouple the language from the editor. Even if you love your editor, preventing people from using tools how they desire is a HUGE barrier to entry. Expose a command line interface for your compiler/linker/debugger and save everyone some headache.

The whole attitude of your documentation is a turn-off. Although I think it was probably written (at least partly) in jest, I suspect that roughly 98% of programmers will read your page on debugging, roll their eyes, and think some variation of "you've got to be kidding". Just as I did.

You obviously have a different definition of "natural language" than we do. We mean "sentences that would be perceived by a native speaker as the written equivalent of his language."

I mean that your "sales pitch" in this thread gives the impression that you can essentially just talk to the compiler, and it's able to parse your intent. That's not the case - your language adheres to a particular grammar, just like Java, C, or anything else. You just happen to have a grammar that's less rigid and more verbose than most other languages.

It's interesting how people classify languages. Here's how we do it: If a language is (a) Turing complete (with a reasonable degree of convenience and efficiency) and (b) can recompile itself, it's a real language; otherwise, it's a mule -- sterile, unable to reproduce. Most of today's popular languages (like Javascript, HTML/CSS, PHP, Python, Ruby, etc) are mules -- or at least virgins (languages that could reproduce themselves but haven't).

Eh, that's not a very high standard. Brainfuck is a Turing complete language, and it's a toy. IIRC, C++ templates form their own Turing complete language. You can do some amazing wizardry with template metaprogramming, but anyone using it to solve complex problems should probably be smacked upside the head.

Also, perhaps I'm missing something (I've never been that into the theory side of things) but shouldn't any Turing complete language be capable of self-compilation? I imagine most of the languages you mentioned are built on a C or C++ backend for performance reasons, but I'm not really seeing why they couldn't self compile if you wanted them to.

Perhaps it will grow on you...

Honestly, it seems like a language that was designed in response to the 80s/90s era when most people wanting to learn programming were stuck choosing between C and BASIC. Yeah, we've moved past that point. Like I said, the concept is interesting, but so far I really see no advantages over modern high level languages like Ruby or Python.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
I think I'm starting to see the problem with your language. You're trying to be the ultimate high level language that anyone can use, but also a serious language that real programmers can use for complex tasks.
Fair enough. Plus providing experienced programmers the opportunity to question some of their (possibly erroneous) preconceived notions about the subject. Plus laying the groundwork for an "apparently intelligent" machine like the HAL 9000. But yes. I definitely wanted a single, simple, consistent tool -- language and interface -- that I could use to teach my eight-year-old son everything from basic programming to compiler design.

As a result you've ended up with something of a mess which has pitfalls for both sides.
Well, it's a tall order.

Your documentation definitely plays up the simple and accessible aspect.
Was it Einstein who said, "If you can't explain it simply, you don't understand it yourself"?

You've made your own GUI and so on to try to appeal to non-programmers...
And as an example for others. When I see my dentist's receptionist with all sorts of windows and widgets on her screen that she doesn't need, it strikes me that programmers would benefit from being reminded that it's okay to "make your own GUI" to better serve your user.

...but you've also saddled them with difficult concepts which you like (or maybe you just can't escape from?). Touching on a few examples that I already mentioned:

Event-driven programs may be trivial to you.
And the fifth-graders in that study I mentioned.

Have you ever taught an introductory programming course?
Yes. I've taught database design and programming, in both classroom settings and private tutoring sessions, for the past 30 years; literally thousands of students, of all ages, have been under my tutelage.

You may be surprised just how many college-level students have serious trouble grasping other trivial concepts that are much simpler than an event-driven UI.
I've found that nothing is hard for any motivated student when (1) the student has the proper prerequisites, and (2) the new material is presented step-by-step.

Despite what your documentation claims, memory management is most definitely not trivial. "Just remember to destroy what you create" is, frankly, a bald-faced lie.
Or an accurate description.

Determining object lifetime and ownership is a challenging problem, even for experienced programmers.
Not the way we write code. I don't recall having a single "memory leak" during the development of our entire system (25,000 lines) that wasn't both the result of a mere minor oversight and easily fixed.

There are other issues along similar lines, like the inclusion of pointers and linked lists (even though they may be called "things"), but I don't have time to delve into all of them.
As I said above, we wanted a system that would be useful to both beginners and experts; that would support the writing of everything from simple console applications to advanced wysiwyg page editors to native-code-generating compiler/linkers with the very same language and interface. So we wrapped up our linked lists as "things" for the beginners, but left enough of the details exposed for those who wanted to delve deeper.

For experienced programmers, you have a whole host of other issues:

At the moment, Notepad is a more functional editor than the one included in your program. It's interesting that on one hand you defend this terrible editor by claiming it eliminates preconceived ideas, but on the other hand defend it by saying that I'm just not familiar enough with your "Ctrl-Home, Ctrl-F" preconceived notion of how one should navigate an editor.
It's almost always true that "user friendly is what the user is used to." And different tools are designed to be used in different ways. As I mentioned before, our editor is designed to be used with Raskin's "Leap" paradigm rather than the ubiquitous "scrollbar" paradigm. You will, of course, hate the thing if you try to use it as you would use Notepad (or some other, traditional editor). But use it as it was intended for a while, and I think you'll see the advantages of the thing.

As far as I can tell, there is no way to decouple the language from the editor.
Right. That's why it's called an integrated development environment. When the compiler finds an error, for example, it automatically takes you, in the editor, to the file and line where the error was discovered.

Even if you love your editor, preventing people from using tools how they desire is a HUGE barrier to entry.
"Barrier to entry" for whom? We're well aware that it's unwise to put new wine into old wineskins. If the advantages of our approach aren't more-or-less immediately obvious (or at least enticingly curious) to someone, that someone is most likely beyond our reach. Next person in line, step up!

Expose a command line interface for your compiler/linker/debugger and save everyone some headache.
By "everyone" I think you mean "that subset of humanity that is both heavily left-brained and experienced with command-line processing." Not going to happen. Neither is a completely "visual programming" interface going to happen. Some things are better expressed/done with words, some with pictures. The balanced interface (like the balanced brain) is what we're striving for.

The whole attitude of your documentation is a turn-off. Although I think it was probably written (at least partly) in jest...
We put a lot of humorous stuff in there, to be sure. But we're also quite serious about every point that is made.

I suspect that roughly 98% of programmers will read your page on debugging, roll their eyes, and think some variation of "you've got to be kidding". Just as I did.
No, not kidding at all. That's how we really did debug the thing -- and it's a non-trivial program. That's pretty much how programming greats like Niklaus Wirth used to debug, as well.

Whenever possible, I have the programmers who work for me work in teams of two. We set them up with two monitors connected to a single machine (both monitors displaying the same stuff) and we have one guy run the mouse while the other runs the keyboard. The mouse guy is the leader; the keyboard guy types what he is told (or what he anticipates the leader is thinking). The players switch roles from time to time as different kinds of expertise are called into play (typically one guy is more left-brained and does better with the math stuff; the other is more right-brained and does better with the interface stuff). We've found this technique exceptionally beneficial: every line of code is double-checked by two pairs of eyes before it is run; left- and right-brain biases are balanced in both the design and code; and nobody turns into a cranky and friendless introvert who prefers machines to people.

I mean that your "sales pitch" in this thread gives the impression that you can essentially just talk to the compiler, and it's able to parse your intent.
And you can, for the most part, once you code up enough "helper routines" that describe the kind of things that you want to say to the compiler. Remember, every routine you code becomes, in essence, part of the (now expanded) syntax of the language. But of course it's only a prototype, an experiment. A "proof of concept". It still needs lots of further work.

That's not the case - your language adheres to a particular grammar, just like Java, C, or anything else. You just happen to have a grammar that's less rigid and more verbose than most other languages.
All languages, including natural languages like English and Spanish, have a grammar. But our grammar and parsing are not like Java's and C's. Consider, for example, the keywords in those languages, words like "typedef" and "struct" and "void" and "volatile"; now consider our keywords: words like "a" and "the" and "of" and "in" -- articles and prepositions, for the most part. In other words, our compiler keys off the true "marker" words of the language, the words that naturally appear between the things you're talking about. Which allows the programmer to extend the syntax and grammar of the language simply by programming: every new routine not only accomplishes some end, but becomes an automatic and immediately operational template for additional sentence forms. If you must compare our language with others, try FORTH; the similarities there are much more pronounced.
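
A rough way to picture it (my sketch, with a hypothetical template; not the actual compiler): the marker words anchor the match literally, and whatever falls between them is handed off as an operand to be resolved against the types in the routine header:

# Rough sketch of marker-word matching (not the actual Plain English
# compiler). The literal words "add" and "to" anchor the match; the
# gaps between them become the operands of
# "To add a number to another number:".

import re

TEMPLATE = re.compile(r"^add\s+(.+?)\s+to\s+(.+?)\.?$", re.IGNORECASE)

def match_add(sentence):
    m = TEMPLATE.match(sentence.strip())
    if not m:
        return None
    # Resolving "a counter" or "the item's price" to actual storage
    # is the compiler's job; here we just capture the operand text.
    return {"amount": m.group(1), "target": m.group(2)}

for s in ("Add 1 to a counter.",
          "Add the item's price to the total.",
          "Add 1/2 inch to the box's width."):
    print(match_add(s))

Note that "to", an ordinary preposition, is doing the work a reserved keyword does in C; that's all "keying off the marker words" means.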

Eh, that's [Turing complete, able to recompile itself, with reasonable convenience and efficiency] not a very high standard.
But it is. How many programmers here, do you think, have ever created such a thing? And how many popular languages can't meet that standard?

Brainfuck is a Turing complete language, and it's a toy.
Because it doesn't meet the criteria above: it's not convenient.

IIRC, C++ templates form their own Turing complete language. You can do some amazing wizardry with template metaprogramming, but anyone using it to solve complex problems should probably be smacked upside the head.
Again, failure on the "convenient" part of the standard.

Also, perhaps I'm missing something (I've never been that into the theory side of things) but shouldn't any Turing complete language be capable of self-compilation?
Capable of, yes. But someone has to actually write the compiler before it meets our standard (else it's a mere virgin). And most such projects would fail either on the "convenience" or "efficiency" requirements.

I imagine most of the languages you mentioned are built on a C or C++ backend for performance reasons, but I'm not really seeing why they couldn't self compile if you wanted them to.
Many of them could. The question is why the original developers of such languages (1) produced something that performed so badly, and (2) didn't want to use them to reproduce themselves.

Honestly, it seems like a language that was designed in response to the 80s/90s era when most people wanting to learn programming were stuck choosing between C and BASIC.
Actually, it was designed in response to the era in which it was developed (2005-2006). Programming just wasn't fun anymore. Instead of a small language that could be mastered in a day, with a small library of intuitive and useful functions for manipulating the screen, disk, mouse, keyboard, printer, and communications port, we were faced with mammoth frameworks and convoluted APIs and ill-conceived object-oriented paradigms that forced us to spend our time learning about someone else's way of doing things and searching huge files for not-quite-the-right object to do what we wanted to do. I don't know what that is, but it's not programming.

Yeah, we've moved past that point.
But too far (or not far enough!) to get a wysiwyg post editor on this forum!

Like I said, the concept is interesting, but so far I really see no advantages over modern high level languages like Ruby or Python.
We think the "last" programming language will allow us to produce the "ultimate" in code: something like a math book: a natural language framework with snippets of specialized syntax (and even graphics) where appropriate. If we're right, Plain English is a step in that direction because it's a trivial matter to add specialized sub-compilers to handle those snippets to our system; but it's next to impossible to add Plain English processing to a language like Ruby or Python. That's the advantage, though it's a future one.
 

douglasb

Diamond Member
Apr 11, 2005
3,157
0
76
I got the list from Codecademy (http://www.codecademy.com/). I presumed that everyone already knew that C and its derivatives are real (and real ugly!) languages. Curious that you left out FORTH, Pascal, and Oberon from your list!

The key word was "popular" (hence the quotes when I mentioned it the first time). I was referring to languages that are still commonly used today. The 5 I listed are #1-5 in popularity according to the TIOBE index (which is probably a fairly decent metric for this sort of thing).

If you had listed Pascal as a popular language 20+ years ago, you would have been right. The other two never caught on. FORTH doesn't even crack the TIOBE top 50 and Oberon doesn't appear at all in the top 100 (which is where the list stops).


We're considering "No necesitamos ningún apestoso Inglés para escribir programas" as a marketing slogan. Ya think?

Love it. By far the best idea you've put forth yet on this forum.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Right. That's why it's called an integrated development environment. When the compiler finds an error, for example, it automatically takes you, in the editor, to the file and line where the error was discovered.
----------
"Barrier to entry" for whom? We're well aware that it's unwise to put new wine into old wineskins. If the advantages of our approach aren't more-or-less immediately obvious (or at least enticingly curious) to someone, that someone is most likely beyond our reach. Next person in line, step up!

Wow...

I seriously cannot think of any major programming language which requires you to use one single IDE for your development in that language. In fact, all of the best IDEs tend to be built around integration with many different languages: Visual Studio, Netbeans, Eclipse, Code::Blocks. Not to mention that a very significant number of developers strongly prefer working with full-featured text editors (Emacs, vim, etc) while using command line build tools.

Honestly at this point I can't tell if you're trolling us, or if you're genuinely so idealistic that you're ignorant about how your tool choices are a turn off for 95% of serious developers. Either way, I suppose I fall into the category that you consider "beyond reach", so I'm just going to say good luck and bow out of this thread.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Wow... I seriously cannot think of any major programming language which requires you to use one single IDE for your development in that language. In fact, all of the best IDEs tend to be built around integration with many different languages: Visual Studio, Netbeans, Eclipse, Code::Blocks. Not to mention that a very significant number of developers strongly prefer working with full-featured text editors (Emacs, vim, etc) while using command line build tools.
I think we just come from different backgrounds; different programming language "families." Pretty much every language I've programmed in was packaged with a single, integrated IDE -- from Applesoft Basic to QuickPascal to Delphi to Oberon.

If you have the time someday, take a look at this book:

http://www.ethoberon.ethz.ch/WirthPubl/ProjectOberon.pdf

It will give you a better idea of the kind of system (and the kind of programmer) we admire, and why we were led to develop the kind of fully integrated system we did.

Honestly at this point I can't tell if you're trolling us, or if you're genuinely so idealistic that you're ignorant about how your tool choices are a turn off for 95% of serious developers.
I think 95% may be a bit high, but yes, our way of thinking doesn't generally appeal to programmers brought up in the "C" and Linux and object-oriented (and even the Microsoft) way of doing things. Macintosh programmers don't typically think it's odd, though -- in fact, they generally expect a standard, integrated way of doing things.

Either way, I suppose I fall into the category that you consider "beyond reach", so I'm just going to say good luck and bow out of this thread.
I really don't know you well enough to do any categorizing. But thanks for your comments. I wish you well.
 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
Two questions. First - dang, dude, what happened to all those millions???

Second - did you ever fix that horrible font and interface thingy?

I am most certainly NOT a programmer, and when I think back to the more complicated programs I've written in LISP or FORTRAN I can't imagine how I'd structure them in plain English so that I'd get what I wanted. But not being a programmer - which seems to put me in the target demographic here lol - I am intrigued. Only I can't deal with any more ugly in my life.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
Two questions. First - dang, dude, what happened to all those millions???

Spent 'em. Took me nearly 30 years, and a great ride it was. Time to start again.

Second - did you ever fix that horrible font and interface thingy? I am most certainly NOT a programmer, and when I think back to the more complicated programs I've written in LISP or FORTRAN I can't imagine how I'd structure them in plain English so that I'd get what I wanted. But not being a programmer - which seems to put me in the target demographic here lol - I am intrigued. Only I can't deal with any more ugly in my life.

The font and interface on the prototype are still the same, but I suspect the final product this time around will be quite different -- especially since we're accepting design input from anyone who backs the project with $1 or more!
 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
Spent 'em. Took me nearly 30 years, and a great ride it was. Time to start again.

The font and interface on the prototype are still the same, but I suspect the final product this time around will be quite different -- especially since we're accepting design input from anyone who backs the project with $1 or more!
That's a great answer and I wish you well, but I cannot deal with that font. Your compiler was on my machine for literally two minutes. Just the instructions defeated me. That is a type of ugly for which a word has not yet been coined.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
That's a great answer and I wish you well, but I cannot deal with that font. Your compiler was on my machine for literally two minutes. Just the instructions defeated me. That is a type of ugly for which a word has not yet been coined.

Changing the font in the documentation quickly, I'm sorry to say, is difficult since each topic was laid out to fit perfectly on a single 8-1/2" x 11" printed page; an alternate font would have to have exactly the same metrics.

Changing the default font for the interface and editor, however, is a trivial matter. You simply find the line in the Noodle that reads...

Put "osmosian" and 1/4 inch into the default font.

...and change the font name from "osmosian" to the desired one.

Note that the font used in the text editor has always been user-selectable via a menu command. Just look under "F" when the text editor is open. Fonts in the built-in wysiwyg page editor are also user-selectable via the same commands.

If you like, I can email you a version of the system with the default font set to the font of your choice.
 

Gerry Rzeppa

Member
Dec 13, 2013
195
1
76
www.osmosian.com
I don't think it's a good idea, just because of the word order. Commanding a child or animal or other person is not the same as programming. If you want to program a door opening, it makes a difference to a compiler whether you write "open door", "door open", or "open the door"; likewise "shut door", "shut the door", or "close door". It would be possible, but then the programmer would have to remember all sorts of exact commands, which would be hard to sustain given the way we use our natural language; so it's easier for both compiler and programmer to remember door_open=1 or door_open=0 and use the command accordingly.

On the contrary, with a well-developed Plain English library the programmer doesn't have to remember anything -- he simply expresses a thought as it naturally occurs to him, and it compiles. "Clear the screen" or "Erase the screen", it doesn't matter -- since the same thought is being expressed.

Anyway, with natural language virtually anyone could become a programmer without hard effort, and I don't agree with this: if I have to work hard to become a doctor, lawyer, or firefighter, why not a programmer?

You may as well say that professional authors should write in a different language than the people who read their books lest those readers become tempted to become authors themselves. Writers and good writers are two different things, but it's not the language that makes the difference -- it's the skill with which the language is used. Ditto for programmers.
 

Markbnj

Elite Member / Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
On the contrary, with a well-developed Plain English library the programmer doesn't have to remember anything -- he simply expresses a thought as it naturally occurs to him, and it compiles. "Clear the screen" or "Erase the screen", it doesn't matter -- since the same thought is being expressed.

What are you using to extract the intention from unstructured language? In my current project at work I have implemented coarse geocoding (finding places in raw text) using the Stanford NLP model as well as OpenNLP. Are you using a similar approach?
 