Saturday, April 30, 2016

Writing an IDE Expert (Toronto Delphi User Group, April 2016)

Meeting notes, including links to sample code for writing your own IDE wizards, are over on the Toronto Delphi User Group site:

http://www.tdug.com/2016/04/april-meeting-follow-up-2/

That article has links to some open source repositories with my starter expert DLL and wizard BPL code samples.

Sample expert and wizard repositories on bitbucket:

https://bitbucket.org/wpostma/helloworld_delphi_expert/src/master/ (git)

https://bitbucket.org/wpostma/helloworld_delphi_wizard/src/master/ (git)
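
To give a flavor of what's in the wizard repository, here is a minimal sketch of a package (BPL) wizard built on the Open Tools API. The repositories above hold the real starter code; this sketch uses my own placeholder names like THelloWizard:

unit HelloWizard;

// Minimal package-based IDE wizard sketch using the Open Tools API.
// Build it into a design-time package and install that package in the IDE.

interface

uses
  ToolsAPI;

type
  THelloWizard = class(TNotifierObject, IOTAWizard)
  public
    // IOTAWizard
    function GetIDString: string;
    function GetName: string;
    function GetState: TWizardState;
    procedure Execute;
  end;

procedure Register;

implementation

uses
  Vcl.Dialogs;

function THelloWizard.GetIDString: string;
begin
  Result := 'MyCompany.HelloWizard'; // must be unique among installed wizards
end;

function THelloWizard.GetName: string;
begin
  Result := 'Hello World Wizard';
end;

function THelloWizard.GetState: TWizardState;
begin
  Result := [wsEnabled];
end;

procedure THelloWizard.Execute;
begin
  ShowMessage('Hello from inside the IDE!');
end;

procedure Register;
begin
  // The IDE calls Register when the package is installed.
  RegisterPackageWizard(THelloWizard.Create);
end;

end.

The expert DLL flavor differs mainly in how it gets loaded: instead of a package Register procedure, the DLL exports an initialization entry point that the IDE calls. The details are in the expert repository.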

Sunday, April 24, 2016

Patterns in the History of Computing: The Birth, Life and Death of the Tech Empire

As a student of history and a geek deeply interested in computers, Computer History is a personal passion of mine.

I am particularly interested in unlikely upstart success stories, most of which have a predictable arc:


  • A founder has an idea which is considered ridiculous by whatever industry he or she plans to disrupt.
  • A founder executes a minimum viable product, and iterates. 
  • Building an ever growing product line, the company flourishes, expands, and reaches a point I will call the Apogee, the highest point in an orbit.
  • Someone else has an idea which is going to disrupt the founder's business. The founder ignores the disruptive change and carries on with the original plan.
  • The company, after realizing too late that a change in the market is afoot, eventually dies or is acquired.
  • We fondly remember the company, and its founders, who made so many pivotal or important technologies, and which is now all but gone.
I think anybody here can list 100 of these, but today I'd like to talk about DEC, and Ken Olsen, and do a brief retrospective on his accomplishments, his brilliance, and his critical mistake.

What do we owe to DEC and Ken Olsen?  The original internet machines built by Bolt Beranek and Newman were built around DEC hardware modules.  The ultimate success of Ethernet networking was due to collaboration between Xerox and DEC.  Xerox could be another example of a failed company, but rather than dying, they're merely a niche imaging company instead of the godfathers of computing.  The idea of owning your own computer, of the computer being used directly by individual operators, a key element of Personal Computing, was first made possible by small DEC machines that were not even called "computers" in the earliest days, because the term was too strongly associated with the priestly caste of IBM mainframe programmers in their glass-walled temples.

And yet Olsen's failures of vision were twofold. First, he failed to move DEC towards RISC technology fast enough to realize its architectural benefits, benefits which have informed subsequent CISC designs; even though the pure RISC chips of that era are dead, the process improvements in silicon ULSI design and fabrication that RISC permitted have lived on.  Second, he famously derided the idea that personal computers, of the kind that Microsoft wanted to see proliferate, would eat DEC's entire cake, killing the VAX and the PDP-11, and almost every 1970s mainframe and minicomputer company along with them.

What is ironic to me is that DEC became the very thing it was originally intended to be an alternative to.  Today's developers would not see much distinction between an IBM System/360 and a VAX 11/780. Both are dinosaurs, artifacts.

I actually took a whole term course in 1990, not that long ago, on VAX assembly language. What the hell was the University of Western Ontario thinking when it set up my curriculum? VAX assembly language?  Yet I'm happy I learned it.  The VAX architecture was and is beautiful. The VMS operating system was beautiful.  Dave Cutler, the Microsoft alpha geek (ha, did you get that pun?) behind Windows NT, basically rewrote VMS, and it's running on your computer today: first the project was called NT OS/2, then Windows NT, later Windows XP, and today it's called Windows 10. It's the same kernel, and its architecture owes a lot to VMS. Like VMS before it, Windows is not the only operating system that runs on its hardware platform.  Unlike DEC, Microsoft at one point in its life made most of its money selling software. What would a Microsoft look like that makes most of its money selling cloud and SaaS, plus enterprise platforms and tools? We're about to find out.

Microsoft in 2016 is at the same point that DEC was at in 1988. You can see Microsoft hosting huge events like Build 2016.   They have money, they have influence, and they have developer mindshare everywhere except on mobile.   They have a brilliant CEO who, like Microsoft's founder, is also a competent technologist.  They understand that without internal change, Microsoft in 2016 is the same company that DEC was in 1988: a few years away from irrelevance and death, unless they pivot. IBM pivoted and is now 90% an IT services and consulting company and maybe 10% a mainframe hardware company.  IBM will still be around in ten years.

What does it mean to pivot?  Microsoft is executing one right now. Go look. At Microsoft, it's free Visual Studio Community, free Xamarin, Ubuntu bash running unmodified user binaries on Windows 10 desktops, and .NET Core, a radical (and beautiful) rebuild of the .NET platform for the next 30 years of cloud and corporate computing.   Will Microsoft break the chain of companies with one disruptive idea ("a computer on every desk") and, unlike DEC, still be around in 20 years? I think it will.  Will BlackBerry? I don't think so.

What about the things you build? What about your company? Will you and the leadership in your organization recognize disruptive change, and the need to pivot your organization to survive? What if today you are a software vendor but you need to become a SaaS IT provider to survive, or precisely the reverse? How will you know?  More thoughts on that later.  Only this in conclusion: the market will shift. Your skills and your current product will become a commodity, or worse, a worthless historical artifact, like buggy whips.  How will you adapt and change so that you and your organization flourish?

Tuesday, April 12, 2016

Linux Essentials for Delphi Developers

There is currently no way to target Linux from Delphi. Long ago there was a thing called Kylix that worked, barely, on one version of Red Hat Linux, back in the early 2000s. But according to the community road-map, targeting a fall release, there might soon be a way to target Linux servers.  Here's hoping.  Whether that happens on schedule or is delayed a bit, now is a fantastic time to hone your Linux skills.    I'm not going to tutor you.  You can probably google almost as well as I can.  But I am going to outline a plan of attack for a competent Windows developer to learn the essentials of Unix systems, with a focus on Linux.  I recommend this plan be carried out in a virtual machine inside your main Windows PC. You can NOT learn everything there is to know about Linux just by using the Windows Subsystem for Linux: there's no Linux kernel, no Linux networking stack, and no desktop environment in the WSL.  Learn on an Ubuntu VM.

My belief is that Linux matters on the Server because:


  • It is currently the right way to deploy for the web in 2016. 
  • It is the right technology for cluster scale technologies.
  • It is currently the right way to build systems that are easily administered remotely, whether in the cloud, or at remote sites, or in large numbers.
  • It is a lighter weight technology and currently has mature support for containers, big data technologies, and about 1000 other things in that vein.
  • It has a better way of upgrading, without requiring as many reboots.
  • It has mature binary dependency management (package installer tools), plus container and orchestration tools.

There are several aspects to becoming a competent Linux server developer:

  • You can install, upgrade, troubleshoot and maintain both client and server Linux systems.  You know the 50 most common command line tools and their everyday uses. You can log in, change your password, obtain root access, check what groups a userid belongs to, and install, remove, and upgrade packages.
  • You have installed and learned several different distributions.  The entire concept of distributions deserves some study by anyone who wants to know what Linux is. You know not only how to use apt-get (on Debian and Ubuntu) but also alternatives such as those on RedHat/CentOS and others.  You know roughly what changes from one major family of related distributions to another.  I recommend Ubuntu to every beginner, and Debian to every intermediate and advanced user.  In some corporate environments, you may find that RedHat Enterprise Linux (RHEL) or its open-source variants CentOS and Fedora are preferred.   I recommend you learn Ubuntu first, and learn a RedHat variant later.
  • You know how the Linux boot process works, from BIOS or EFI to the boot loader, to the kernel startup, to the init sequence and service startups, and you know what runlevels are, what systemd is, and what /etc/init.d is for.  You appreciate that, unlike Windows, a system that refuses to boot is usually not that hard to repair.
  • You are comfortable in the Unix style shells, such as bash, csh, and tcsh. You can write shell scripts and read and repair shell scripts.
  • You are familiar with the basics of C development in Linux, including the use of GCC and Clang, build tools, and associated parts. You can download something.tar.gz and unpack it, read the instructions, and build it from source.  When it breaks you can read the output and figure out what's wrong, and if googling the error doesn't help, you can dig in and fix it yourself.    You know what static and shared libraries are, and you can find and install the dependencies (libraries, tools) that some package needs to build.
  • You are comfortable with rebuilding the Linux kernel from source code, you know what kernel modules are and what lsmod and modprobe do, and you know how to reconfigure a kernel, turning options on and off.  You know how to upgrade or add an additional kernel to your system's boot loader.  This is actually really fun.  You may find that having a system you can completely and utterly modify to suit your own needs and requirements becomes a bit of a giddy experience.  I know that I feel like I'm actually in control of my computer when I run on Linux.  On Windows 10, I feel like my machine belongs to Microsoft, and they just let me use it sometimes, when it's not busy doing something for the boys in Redmond.  That being said, I really like Windows 10, and I still primarily enjoy developing for Windows systems.  But knowing both Linux and Windows is a very useful thing to me.
  • You have a decent understanding of system administration core concepts, including the wide set of tools that will be on almost every unix system you use. You can find files using several techniques. You can list processes. You can monitor systems. You know how to troubleshoot networking issues from the command line.
  • You will know you've gotten in deep when you have taken a side in the vi versus emacs debate, and become extremely proficient in the use of one or the other. (Hint: The correct choice here is vi. Die, emacs heretics, die die die.)
The above should give you enough to chew on for a year or two.  What should your first steps be if you know nothing?

  • You will need at least 20 gigs of free space.
  • Download the latest Ubuntu 15.xx as an .ISO file.
  • Install Ubuntu into a virtual machine.  I recommend Client Hyper-V, which is included in Windows 10, or, if you're still using that ancient Windows 7 thingy, download VirtualBox, which is free.  If your Linux install went perfectly, the client integration piece that lets your mouse move seamlessly in and out of the virtual operating system will just work. If it didn't, make sure to learn how to manually "free" your mouse pointer from the VM when it gets locked inside.
  • Play with virtual consoles (Ctrl+Alt+F1 through F8). Learn what they are.  Watch some tutorials on basic Linux stuff like logging in.  Learn a bit about bash shell.  Learn about the structure of unix filesystems, learn the basics of unix file permissions and ownership.
  • Learn about commands like ls, find, locate, grep, ps, pwd, more, less, wget, ssh, ping, chmod, chown, and others.  Use the man command to learn about them (man grep).
  • Learn to install and start up the Apache web server.  Learn a bit about configuring it.   Examine the configuration files in the /etc/apache2 folder.
  • Browse from your host (Windows) PC web browser to the IP address of your virtual machine.  Use the /sbin/ifconfig eth0 command to display your current IP address from a terminal prompt.
  • Learn to start and stop the X Server. When the X server is stopped, you have a text mode only operating system, which is ideal for server deployment. When it's running you have an opportunity to try some of the available IDEs that run on Linux.
  • Optional: Learn some Python and learn to write simple web server applications with Python.  (I do not recommend bothering to learn PHP, if you don't like Python then look into Ruby and Go as server side languages.)
  • Optional: Learn the fundamentals of compiling some small applications from source. Write some small command line applications in C, since that will give you a bit of a flavor for the classic Unix environment.  C programming on Unix is easily the single most important skill I have on Linux.  If you can get over your preference for begin/end and learn to work on Unix in C when it's useful to do so, you become a much more well-rounded developer.
  • Optional: Install an open source Pascal compiler.   Don't expect things to be exactly like Delphi, because they aren't, but you might enjoy messing around with FreePascal (compiler), or Lazarus (IDE). A tiny test program follows below.
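
If you try the FreePascal route, here is about the smallest possible smoke test. The file and program names are my own; on Ubuntu the compiler typically comes from the fp-compiler package (package names vary a little by release):

program hello;
begin
  WriteLn('Hello from Free Pascal on Linux!');
end.

Save it as hello.pas, then compile and run it from the shell:

fpc hello.pas
./hello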

Come to the dark side. We have fortune cookies...

Monday, April 11, 2016

Ubuntu on Windows is here, first thoughts.

For Windows 10 users who have the Insider preview enabled, with the slider all the way over to the bleeding edge side (the fast ring means all the way to the right), a new preview Windows build will become visible and ready to install within about 24-48 hours of switching.

After that you have to enable the new Windows Subsystem for Linux (beta) and make sure that Developer mode is enabled in the system settings. Then open a command prompt and make sure that the "Use legacy console" checkbox is not checked in your command prompt (conhost) properties.

Now open a console window and type bash. The system will install. If you get an 0x80070057 error and you skipped past the note above about the legacy console, go back, and listen to me next time.  If you get a different error, then try googling the error message.


Once you have it installed, you will be in one of several different mental states. If you are like me and you have been using Linux (and other Unix operating systems) since before anyone thought of slicing bread, then you will have lots of fun things you will want to try.  If you are familiar with the basics of working in the command-line environment of a Debian or Ubuntu variant of Linux, you will know that it uses apt-get to install packages from repositories, which are configured in /etc/apt/sources.list.  If you open that file you will see that this is not some customized set of binaries created by Canonical (the company behind Ubuntu) so that you can pretend to run Linux binaries.  These are real Linux binaries, unmodified from their real Ubuntu versions.  You are running a Linux userland on Windows.  On what does it run? Is there a Linux kernel? No.  If you know how POSIX environments work (broadly compatible Unix implementations that claim some level of interoperability and command-line shell compatibility), you know you type uname to find out about the kernel.  Let's do that:

root@localhost:/etc/apt# uname -a
Linux localhost 3.4.0+ #1 PREEMPT Thu Aug 1 17:06:05 CST 2013 x86_64 x86_64 x86_64 GNU/Linux

So right there I'm surprised.  I would not have expected Microsoft's Linux subsystem for Windows to report a real kernel version like 3.4.0+ at all. That ought to make a person think: there is no Linux kernel here, which means they implemented all the system calls (syscalls) that things like libc would invoke on a real system. This is zero overhead, extremely efficient, and a relatively large amount of API surface area for the Windows team to take on.   This is not Steve Ballmer's Microsoft, this is Satya Nadella's Microsoft, and it's kind of awesome.

The performance here is native. The ABI (binary interface) between userland and kernel is at a 3.4.0 level, though not exactly perfect: there will be APIs in Linux that the Microsoft emulation layer does not emulate perfectly, at least not yet.   This should impress you.  If it does not impress you, you don't really know what this is doing, and you should remedy that gap in your knowledge of Windows.   Subsystems are a powerful concept that has lain dormant in Windows since the death of the Windows NT POSIX subsystem, which Microsoft grudgingly brought about to win some big US government contracts, and then let wither and die.

Now let's talk about those of you who still have your heads in the sand about the importance of Linux. Why is Microsoft putting a pure Ubuntu "userland" experience for developers (not for production server use) into Windows?  They've been pretty clear: for Developers, Developers, Developers. If you are a developer and you still have no skills at all on Linux systems, then you have your head firmly in the sand, my friend, and should fix that.  If you have no prior knowledge of Linux at all, I highly recommend installing a full real Linux environment in a virtual machine and spending some time learning it and using it.   If you expect to be employable as a server side developer in the future, and you don't plan to work only on small desktop/workgroup codebases for the rest of your life, then Linux systems, containers, cloud technologies, cluster scale technologies, and big data technologies are not things you can just ignore.  Or, by all means, continue to play with your datasets and your data aware controls, and live in your own tiny 1990s world.

I will write a second post on getting started with the Linux shell on Windows, and on things that might be useful for Delphi developers to learn first. For now, I suggest you create a VM and install the latest Ubuntu.  No matter what you do, you will learn more from that than you will from playing with this beta Ubuntu on Windows.

Some things you might like to try:

apt-get install joe
Then run the joe editor:
joe hello.txt

Note that joe (Joe's Own Editor) uses those Ctrl+K Ctrl+B / Ctrl+K Ctrl+K style shortcuts you might remember as a Pascal/Delphi old-timer.  This Ctrl+K family of shortcuts actually predates Delphi and Turbo Pascal, and comes from the 1970s WordStar editor/text-processing system, which first appeared on CP/M.   Guess which platform Turbo Pascal supported even before it supported the IBM PC on DOS? That's right!  CP/M on the Z80.

Some more nostalgia, anyone?

apt-get install fp-ide
then run it
fp

Well, that wasn't really perfect yet. I guess this thing has bugs.  (Update 1: The screenshot below was messed up because the command prompt font was set wrong.)

[Screenshot: the FP IDE drawn with the wrong console font]

What else has Pascal in the description?  Type apt-cache search pascal.


This seems like a great place to be in 2016, with the public road-map for Delphi showing that Linux support is important to them.  I would love to be able to build and dev-test with a local gdb debugger against a server side service built in Delphi.

Update:  Here's the FP IDE with the font fixed in my command prompt (Lucida Console works!) and rocking out like it's 1992:

[Screenshot: the FP IDE rendered correctly in Lucida Console]

Sunday, April 10, 2016

Happy 20th Birthday to the ICS Internet Component Suite by Francois Piette

One of my favorite fellow Delphi coders, Francois Piette, has built a lot of great stuff, and the thing of his I have used in more Delphi shops than anything else is ICS.

Maybe Indy gets used more, and these days Indy comes pre-installed with Delphi, but I preferred to use ICS.  I used its SMTP email client and FTP client, and I used it as a TCP client and server base-class when I needed to write a Modbus-TCP client and a Modbus-TCP server component. Unfortunately I did not ask my former employers to open source that code, so I can't post links to it.   In retrospect I wish I had, as it might have come in handy.    There is an Indy-based Modbus-TCP component for Delphi up on GitHub, but I don't think my own exists anywhere outside the employer where I built it.  The general event-driven shape of an ICS client is sketched below.
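
Since that component is lost to history, here is a rough sketch, from memory, of the non-blocking, event-driven style that makes TWSocket such a pleasant base class. The device address and the BuildModbusRequest/ProcessReply helpers are hypothetical stand-ins, and the unit name follows recent ICS releases (older ones used plain WSocket):

uses OverbyteIcsWSocket;

procedure TMainForm.ConnectButtonClick(Sender: TObject);
begin
  FSocket := TWSocket.Create(Self);
  FSocket.OnSessionConnected := SocketConnected;
  FSocket.OnDataAvailable := SocketDataAvailable;
  FSocket.Addr := '192.168.1.50';  { hypothetical PLC address }
  FSocket.Port := '502';           { the standard Modbus-TCP port }
  FSocket.Proto := 'tcp';
  FSocket.Connect;  { returns at once; completion arrives as an event }
end;

procedure TMainForm.SocketConnected(Sender: TObject; ErrCode: Word);
begin
  if ErrCode = 0 then
    FSocket.SendStr(BuildModbusRequest)  { your protocol framing goes here }
  else
    ShowMessage('Connect failed, error ' + IntToStr(ErrCode));
end;

procedure TMainForm.SocketDataAvailable(Sender: TObject; ErrCode: Word);
begin
  { ICS hands you data as it arrives; accumulate and parse your frames here }
  ProcessReply(FSocket.ReceiveStr);
end;

That event-driven shape, one socket and no threads, is exactly why ICS made a good base class for protocol components.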

A beautiful clean architecture makes for elegant, maintainable solutions.  Congratulations to Francois and to his component on two whole decades!  Next year ICS will even be old enough to get a drink in Texas.

You can find this and much more at the overbyte site:

http://wiki.overbyte.be


Wednesday, April 6, 2016

On Data Files and Configuration in Delphi Programs

On a Delphi programmers' group on Facebook the other day, someone asked a question about configuration files. They were going to use an INI file to hold their data, and someone suggested the Registry.  I chimed in that I find INI files superior to the Registry, with the proviso that one exception might be an application that needs to configure where its textual configuration files live. The reasons why textual configuration files (in JSON or INI format) are better than the registry are:


  • Because it's easier to back up and restore application state.
  • Because for your support people, reading and inspecting a text file is easier than finding and inspecting the registry, which is in the end like one enormous INI file that describes the state of your entire system.
  • Because even if the system won't boot any more, it's easy to copy that configuration data off the machine, just by booting from a live CD image like GParted, or by slapping a dead PC's disk into another machine after the motherboard dies.  In my career this has turned out to be a good choice when customers were told to make backups and didn't; easy data recovery has saved my customers' bacon more than once.
  • In my automated bug report submission systems, using MadExcept (or you could use EurekaLog), I find it helpful to attach log files (showing what happened to the app recently, on the client and the server side) and even some of the configuration files, so you know what options the user has configured.
  • In some domains we might even check the configurations into version control, or keep backup copies of configurations. That is harder to do with the registry.
Some interesting technical questions that remain:

  • Where should data files, including configuration files, be stored?  For state which is different for each user, I believe in the user's AppData folders, either Local or Roaming, under a subfolder named for your company, with a subfolder under that for your product name. (A small Delphi sketch of this pattern appears after this list.)
    Similarly, folders under C:\ProgramData should be used for global data that does not differ per logged-in user. So for AwesomeSoft's product SuperTool, on my computer, C:\Users\warren\AppData\Local\AwesomeSoft\SuperTool is used. If the installed version is known and this folder is backed up, then your entire application state can be backed up and restored without even backing up the program files folder.  Perhaps a local database or data file might also automatically live here if it hasn't been configured and moved elsewhere.
  • Even back in the Windows XP era (2002), Microsoft gave clear guidance that non-system-privileged processes should not have write access to the Program Files folders.  To preserve system stability, only the installer, which runs at an elevated privilege level and is started by a person with an appropriate administrative role, should be able to write to the program files folder.  I consider hacking NTFS permissions on Program Files on end user machines to be a bad practice.   What I think is kind of hilarious, though, is that in 2016, Microsoft SQL Server 2016 still defaults to storing primary SQL table data files under the Program Files folders, instead of moving that global state to ProgramData, where Microsoft's own guidance would suggest it belongs.   What do you do when even Microsoft's left hand doesn't follow Microsoft's right hand?  I would say, you do what is right, what is a solid engineering principle: protect users and their systems by separating your binary code and non-user-defined static state (maintained by installers and patch updaters) from your user-defined state (your data folders).
  • When, if ever, are binary configuration files appropriate? I would say only when the problem domain or technical requirements cannot be met with a text file; for example, when the configuration contains a million entries and really just needs to be a key-value store. In that case, you should use a proven binary container like SQLite, and not invent your own shabby one, or adopt some unmaintainable binary blob technology, like that B+ tree implementation you found somewhere random on the internet.  Binary files are opaque, testing binary reads and writes is a harder problem, and inspecting a binary file for damage becomes a difficult task, unless you choose something like SQLite that already has tests and file integrity checking built in.
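
To make the AppData pattern above concrete, here is a minimal sketch using the standard TIniFile class. AwesomeSoft and SuperTool are the same placeholder names as above, and SaveWindowPosition is just an invented example setting:

uses
  System.SysUtils, System.IniFiles;

function SettingsFileName: string;
var
  Dir: string;
begin
  { %LOCALAPPDATA% resolves to C:\Users\<name>\AppData\Local }
  Dir := IncludeTrailingPathDelimiter(GetEnvironmentVariable('LOCALAPPDATA'))
    + 'AwesomeSoft\SuperTool';
  ForceDirectories(Dir);  { create the company\product folders on first run }
  Result := Dir + '\settings.ini';
end;

procedure SaveWindowPosition(ALeft, ATop: Integer);
var
  Ini: TIniFile;
begin
  Ini := TIniFile.Create(SettingsFileName);
  try
    Ini.WriteInteger('MainWindow', 'Left', ALeft);
    Ini.WriteInteger('MainWindow', 'Top', ATop);
  finally
    Ini.Free;
  end;
end;

Everything your support desk needs is then one readable text file that can be attached to a bug report, copied off a dead disk, or checked into version control.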

Tuesday, April 5, 2016

Ask that "Dumb" Question, and No You Aren't Dumb. Or maybe if you are, that's OK.

I found this blog post called "Being a 'Dumb' Girl in Computer Science" incredibly awesome, so I have to share it with everybody.  As someone who wants to see more women in STEM (Science, Tech, Engineering, Math) in school and in the workforce, I'm encouraged to see how brave and resourceful, and actually totally not dumb, some of the young women in university these days really are.

University was a long time ago for me, but there are some things that haven't changed, whether it's school or the workforce.  A really big problem I have seen in the workplace is when people don't ask questions.  We leave people behind. We don't get buy-in, we don't have a shared understanding of the problem domain. We all lose. The company loses. The customer loses. The team is less effective. The software product quality suffers.

Ask the dumb questions. They are not dumb questions after all.  One of the comments on that post puts it perfectly:

20-year veteran here, CTO of my company, yadda yadda yadda. It’s my publicly-proclaimed policy to ask the “dumb” questions. Get those answered and there seldom remain any “smart” questions to be asked. Skipping the “dumb” questions usually means asking irrelevant ones. Beware the “geniuses”—they write the worst code. At first you’ll feel dumb not understanding it, but you’re blaming the victim. Simple, easily-understood code is the product of hard work more than cleverness. Mysterious code is a menace. I hope you keep your beginner’s mindset. You’ll learn far more and be a better team player.
This fits perfectly with something that Confucius said a long time ago:

If I am walking with two other men, each of them will serve as my teacher. I will pick out the good points of the one and imitate them, and the bad points of the other and correct them in myself.

Don't be the person who thinks he's a "genius" and has nothing more to learn. Learn from Cat that you should ask the question.   Learn how not to be the people in her class who made her feel bad for asking it.  Correct in yourself the pride and smugness you find; you are like a full cup, and can receive nothing more.  Begin with the knowledge that you know nothing, feel free to call yourself "dumb", just don't call other people dumb.  And know that if you learned something, it wasn't a dumb question at all.   Maybe someone else at your workplace or school wanted to ask that question, didn't understand either, and needed you to step forward.

I went through a phase where I preferred to write the most inscrutable code, and considered that greatness.  If people can't make powerful concepts simple, build useful systems out of understandable parts, and share knowledge, then our schools are failing, and our companies will fail soon after.  Ask the question if you have it. Give the answer if you have it.  The knowledge economy is a gift economy.