Technology is easy, but...

This is the first of a two-part series of articles. The essential content of both parts was originally written as a single article scheduled for release on February 14, but I found myself unhappy with how it flowed. It’s been reworked a bit, and the second part will be published on March 7.

At some point I will stop working, though I am not ready for that yet. I actually like what I do. At some point, too, I may have a grandchild or two. If I do, I will probably be guilty of telling them a story along the lines of, “I was present as the Internet was being created and played a part in its creation.”

Let’s be totally clear about this: I was too young and not nearly smart enough to have been part of the team that created the Internet protocol, which defines how computers on the Internet find and speak with each other. I wasn’t in the room when Bob Metcalfe invented Ethernet, nor did Tim Berners-Lee call and ask for my advice as he was placing the metaphorical cornerstone of the World Wide Web using his NeXT Cube, a computer I desperately wanted but could not afford. In making the statement above, I wish to avoid creating an uncomfortable Al Gore moment for myself: Al didn’t create the Internet, nor did I, and I do not wish for my comments to be misconstrued (as I believe his were).

But I was present as the Internet was “taking off,” though my contributions were quite small in the grand scheme of things. However, they were important for the human service agency I worked for at the time. The Executive Director of the organization, Ginny, is a remarkably far-sighted leader who was also the agency’s founder. She had identified technology as one of the factors that would be critically important to the future of the organization. At the time, I was working in finance but, without question, had a stronger interest in technology, and I had been “supporting” what the agency was doing for years on an ad hoc basis. In the early 1990s, she asked if I’d be interested in heading up an effort to create a plan for the agency to be more purposeful about its technological endeavors.

This was an assignment that I was delighted to have! I jumped headfirst into the task and did what all “good” managers do in such a situation—I created a committee!

Seriously, I did that, but I’ve never been accused of being a “conventional” manager, and Ginny, likewise, has never been accused of being a “conventional,” well, anything. She always sees the world differently than others do. We agreed that the committee would be open to anyone in the organization who had an interest, and that it was enough of a priority that managers were tasked with accommodating their staff’s participation by adjusting schedules so they could attend. Nice!

At any given point, there were a dozen to a dozen and a half people working with the group. As I recollect, we also had the participation of one or two members of the agency’s governing bodies. It was a really good working group, and we created a report and a plan that was remarkably advanced for a smallish (at the time) agency serving people with developmental disabilities. Such organizations have historically faced the constant issue of constrained funding, so being “forward thinking and acting” was always a challenge, though back then, at least, this agency punched far above its weight.

Back in those times (nearly thirty years ago), the Internet played very little part in most businesses and essentially no part in the lives of regular people. Its use in non-profit organizations was very much exceptional. For a management retreat in 1994 at which I was “pitching” the promise of technology, I mocked up a Web site using newly created content from the “agency brochure” (which was far more than a simple pamphlet) and displayed it from my computer (not a portable) that I lugged to the conference center and somehow connected to a large television. For many of the participants, it was the first time they had seen in person what the Web was all about, and for some, I am sure, it was the first they had heard of the “Web.” The presentation did create a buzz, and there was strong support for making technology an organizational priority.

Every organization has financial limitations that force hard choices. For non-profits, these limitations are very often far more constraining than what would be found in “regular” businesses. To make the technology initiative work—there would be no investment from public funding sources—a campaign was developed to solicit donations from family constituents. The fundraising campaign was successful and, with the “seed funding” in place, work began on implementing the committee’s vision. Amazingly, over the subsequent years, we essentially executed on the plan we had set forth. It took vision, leadership support, hard work, and contributions of both time and money by numerous individuals who came and went over the years.

During those early years, we made a lot of good bets. Shortly after the aforementioned “Web demo,” the agency installed a connection to the Internet at its main administrative and educational site (a whopping 56K circuit). The next year, we connected our other main “program” site over our first wide area network, and shortly thereafter we upgraded the Internet connection to 1.5 Mbps over a T1 circuit. We questioned at the time whether we might be overdoing things!

We were early users of email in our field and even hosted accounts for some of our partner agencies before they had such services themselves. The agency operated a number of residential sites, and by 1998 or so we had all of them connected to the Internet as well, so that we could use email everywhere in the enterprise for communication and coordination. We exceeded the committee’s plans throughout the decade in almost every way and, in the non-profit sector at least, we were leaders in how we used technology to operate the enterprise.

Personally, I made a handful of key predictions that turned out pretty well and that were not obvious in 1995 or 1996. At the time, I kept a note taped to my computer stating, “It’s the network, stupid.” It was there to remind me that my primary mission was to connect everything, which is obvious now, but was not as clear back then. I was committed to connecting everything as quickly as possible.

One of my best technical predictions was that TCP/IP would eventually “eat” all other networking protocols and take over the transmission of every type of data, thus destroying or transforming “old” industry after “old” industry. I believed the transition would be complete when TCP/IP took over video distribution and killed cable, something that is happening quickly these days. As a cord cutter for the past several years, I can attest that streaming services and really nice “over the air” devices that connect an antenna to your home network make cable TV unnecessary. It’s long been too expensive, and I can’t figure out why anybody would still subscribe to cable service. It’s already dead, but just doesn’t know it yet.

In 1998 or so I declared that all future custom application development within the agency would be done with Internet technologies and that the Web browser would be the deployment target. This was back when the mainstream view was that client-server applications (predominantly running on Microsoft Windows) would rule enterprises, but even in the Web’s primitive state at that point, I saw the writing on the wall. This was another good bet, and it earned the agency a CIO Magazine 50/50 Award for Innovation for one of our first Web applications.

I believe myself to have had a pretty decent track record with regard to seeing important inflection points in technology over the past several decades, even if I often did not recognize precisely how the technologies would be commercialized or who would reap the benefits.

My father’s purchase of an Apple II+, which I thought an absolutely ridiculous investment when he made it, was the first such personal revelation. When I realized that I could control the machine by entering individual commands, I was intrigued. When I realized that I could store these commands in a program to be run again at any time, I was hooked and my father’s purchase ended up having a profound influence on the course of the rest of my life.

I started using a Macintosh in a work setting in 1985, a year after its introduction, and immediately recognized that someday all computers would employ a graphical user interface. Ordinary people were not going to interact with a command line, so the GUI was critical for bringing a whole new set of people to computing. It was immediately obvious to me.

When I first encountered the Internet, the World Wide Web had been invented but hadn’t achieved “liftoff,” as the “point and click” browser did not yet exist. In my first exposure to the Internet, I did get to play around with the “Gopher” protocol and saw the (very) raw potential of a globally connected network open to all comers. When I first saw the Mosaic browser in action, I knew that the Internet was going to explode.

Discovering Linux in the late 1990s represented another of those revelation moments. I was already experienced with UNIX-based operating systems. However, seeing what individual contributors from around the globe could accomplish through the easy connectivity enabled by the Internet, combined with a revolutionary software license and really competent leadership and management, I knew I was looking at something that would change the world. The brilliance of Open Source was that it created a playing field on which David (individual developers) could compete cheaply and effectively with Goliath (big companies). When I first saw Linux, you could not be a competent CIO/CTO without considering what Microsoft was working on (they seemingly drove everything, as IBM had before). Largely because of the proliferation of Open Source, it’s been a long time since I had to concern myself with any of their initiatives.

The mobile cell phone was obviously a huge innovation, but it wasn’t until I played with an iPhone that I recognized its transformative power. For many years, companies had been trying to design and sell a “network computer,” a vision in which a lightweight end-user device served as an interface to services that were fundamentally Internet-based. In one elegant stroke, Apple put them all to shame with exactly the right approach in an unbelievably well-integrated package of software and hardware. The computer had been shrunk to the point where it could be carried in your pocket, all the while attaching to anything on the Internet from pretty much anywhere in the world at pretty much any time.

In our current time, I think Artificial Intelligence and at least the concepts surrounding blockchains are the newest contenders for key technologies of the future, though I consider both to be in the “Gopher” stage of development. Neither is yet comprehensible enough to regular people to grow into the monster I believe each will become. Both are awaiting their “Mosaic” moments. That’s for another article, though.

The point of all this is that if you’re paying careful attention, the critical technologies of “coming” years can be relatively apparent. Prognosticating about fundamental technology developments isn’t necessarily easy, but it’s also not a “dark art.”

However, when it comes to the “human” aspect of technology, well, that’s really, really hard. How will people really use new technologies? For what purposes will they employ them? What are the unforeseen side-effects of adopting a new technology?

The ability of technologists to push progress forward at an accelerating pace appears to be overwhelming our ability to collectively put new technologies to truly beneficial and productive use. When I played that small part in the Internet’s “take off,” we had ideas about how it would impact people with developmental disabilities. We were pretty confident we understood how it would impact regular people and organizations. We got some things right, but, overall, we (myself included) severely underestimated both the breadth and the depth of the impacts of technological innovations that were coming at an accelerating pace. The word “severely” is completely insufficient to express the level of the miss. We were not really “wrong,” but we didn’t see enough to have been really “right.”

That said, I can’t think of anybody who really got even a significant part of the total picture “right.” A lot of smart people have made predictions and sometimes appeared to have invented the next big things, only to turn out to be completely and spectacularly wrong. Humanity, as always, manages to be as eminently predictable as it is unbelievably surprising.

Technology is easy. Using it for our collective benefit is hard. However, our ability to utilize it effectively for the benefits we need in the 21st century may be the most critical factor in the success of this period of history. New technology is being developed at a speed never seen before, and the genie will not go back in the bottle. So, we have much to learn in a relatively short period of time. We may indeed find ourselves, and our ability to grow, tested as never before.