Why is Education a Cyber Target?

Greg Price

As a computer technologist, I carry an innate bias about the word “technology”; whenever I hear it, I immediately think of computers and software. Similarly, when a reference to security arises, I instantly think of cybersecurity.

Our modern-day society is predicated on many forms of technology, and our collective desire to progress is inextricably intertwined with the advancement of those technologies. Among them, undoubtedly, are computers, applications and a fascinating blend of things yet to be contemplated.

So, for these comments, please indulge my predilection that technology inherently suggests some form of computer technology.

Our schools are reliant on technology. The business of learning and fostering knowledge is deeply steeped in efficient, reliable technology.

Computers provide access to boundless resources. We no longer refer to libraries as libraries; rather, they are media centers. I haven’t seen a card catalog in two decades – the physical volumes of the media center are cataloged within a database. Student ID cards reveal identity and serve as digital passports for access to food services, secured structures, sporting events and the media center. Classrooms exhibit smartboards, digital displays, interactive media and mobile devices.

The hallways are guarded by closed-circuit television. Textbooks are often paperless. Computer labs are an anachronism – some schools issue tablets or laptops to students. With the proliferation of high-speed wireless networks, students and faculty are always “plugged in.”

I doubt any of these comments are shocking to anyone.

How are these technologies sustained?

A new version of my cellphone appears every fall; every three weeks my software provider announces a new update; every day my computer installs new antivirus and anti-malware defenses; new firmware for my home router arrives; my wireless cameras exceed their storage space; and on and on and on.

Take those individual pieces and multiply them by a few thousand, then by several thousand. The annoying becomes overwhelming.

Yet, technology is easy, right?

Developers march forward, seeking greater expansion and application of the newer and the better. Vendors offer their wares as the next generation of the latest and greatest. Rapid development techniques and intuitive user interfaces suggest that greater advancement coincides with simpler management, lower costs and ease of use.

But, don’t be fooled.

Today’s technology is incredibly complex. The digital architectures upon which our devices operate and our information flows require constant observation and maintenance. The rapid development of software results in flawed, error-prone products. Our penchant for chasing the connection of all things creates an awkward mash-up of interconnected devices.

Managing thousands of digital devices, software packages and users requires resources.

Most organizations, including educational entities, do not have adequate information technology resources.

As Frankenstein networks emerge, combined with increasingly fragile software and high-speed cyber highways, the opportunity for security risks rises significantly.

Not every school has replaced textbooks with tablets; not every classroom is equipped with a smartboard and digital display. Without a doubt, variability in the use and adoption of technology exists among our schools. However, the one thing common to all of these entities is security concern.

Technology adoption will increase. With that growth, security concerns will flourish. Inadequate support resources coupled with frightening risk are a recipe for disaster.

And the bad guys know it.

Why do would-be bad actors target education?

Opportunity is abundant, and the environment is rife with desirable goods.

Educational organizations house treasure troves of personal information: employee and student biographical data, health data, financial data, performance data.

Data is the new currency. With data, a bad actor can buy, sell or trade for practically anything. With data, a bad actor can embarrass, attack or impersonate another.

Technology presents fabulous opportunity for students and teachers. Similarly, through unmanaged risk, technology presents opportunity for exploitation and manipulation by those who endeavor to cause harm.

Recent events underscore the value of adequately addressing cybersecurity needs in our schools. Ransomware has crippled school systems, phishing scams have drained funds, hijacked credentials have ruined reputations, and the list goes on.

In a recent discussion about computer resources being held hostage, a participant stated to the group that “we can teach without the computers.” I agree to an extent. We can also teach in temporary shelters following a natural disaster, but should we?

Technology isn’t going away; we must increase our awareness of the threats it presents and work to safeguard our students and employees from the effects of cyberthreats.

In order to close the gap in our defenses, the community must commit to supporting educational technologies comprehensively.

If you employ technology, you have risk. If you collect student and employee data, you possess a commodity desired by those who have the knowledge and means to do “evil”.

What should we do?

Support is needed. A structured, pragmatic approach to managing and mitigating cyber risk is essential. Prescribing awareness and best practices is a solid step forward. However, to achieve maximum effectiveness, we must provide the proper resources and guidance to ensure that adequate controls are in place.

Additionally, we need to expect and request more from our technology developers and integrators – we’re not alone in this voyage.

Election Run Amuck

Greg Price

A simple definition of technology follows: the application of resources to achieve a goal. Often the goal is a scientific endeavor; other times, it’s an efficiency objective; and let’s not forget a more obvious desire: to solve a problem.

We live in a world littered with fascinating technology, rife with seemingly constant change. If you take a few moments and ponder the major changes that have occurred in your life over the past few years, it’s likely that technology can be found among those events.

As a technologist, I will testify that technology is often imperfect. In fact, I worked in design for many years and the process of developing a new “computer” technology isn’t immaculate. I’m certain you’re familiar with the old saying, “you don’t want to see how the sausage is made”.

Technology development can follow the scientific method. Careful review, testing and analysis are the hallmarks of pragmatic development. However, nowadays, the desire to reach a goal, such as a new app, frequently leads to abandoning rigorous testing. As a result, poorly designed software has become the norm for many of us.

We have become the testers, the evaluators, the frustrated audience for the rapid development of new software technologies. If enough of us complain and allow error logs to be whisked away, patches will arrive. Well, maybe patches will arrive.

Are there bad consequences of these approaches?

No doubt. Crashed apps are common, frustrated users are normal, and technologists fear moving away from stable software platforms.

Just this week, we were reminded of the real-world consequences of poor software development. The Iowa Democratic caucus was Monday night. It’s 2020, an election year.

What does that mean? An app of course.

As the nation held its breath and waited with anticipation of the results from a crowded Democratic field, the new technology didn’t fare well.

Last month, the Iowa Democratic Party announced that it planned to use a mobile app to report precinct results. Despite requests by many, the Party refused to reveal much about the app. Independent security companies asked to review the app’s source code (the underlying instructions that constitute the app); those requests were denied. Some sought the testing process and its results; denied. Who developed the app? No comment.

According to the Wall Street Journal, elected officials asked for details about the app; those requests were met with the same refusal from the Democratic Party.

In the aftermath, we know what happened; at least, we made observations and have notions of what happened.

The Iowa precinct chairs could not get the app to work properly. It crashed repeatedly. The app was built hastily, and testing was woefully inadequate.

What are some lessons learned from the Iowa Democratic app debacle?

As a starting point, let’s appreciate the importance of an election. There are few things more personal and important than one’s right to cast a vote. In doing so, we place our confidence in the systems and people who manage the technologies that facilitate our desire to voice our choice. The process of voting should be transparent and devoid of obstacles.

Based on the responses to inquiry before the Iowa caucus and the aftermath of the event, one thing is certain: the notion of rapid software design failed.

It’s important to state that the app did not cast votes. Rather, it was designed to deliver precinct results quickly to the state party. So, based on our basic definition of technology, it appears that the problem being addressed was expediency: deliver the results quickly. After all, all eyes were on Iowa – who had time for slow results reporting? We want what we want right now.

If you’re in the business of running an election, transparency of your technology is essential.

Whether you’re using pencils and paper ballots or computer-based voting machines, allowing inspection and review of the technology, and explaining to the voters what’s being used, builds voter confidence. If I ask for information on the pencils and paper ballots and the response is “you’ll see, don’t worry about it,” I’m instantly worried.

So what are you to do?

First, explain what’s going on. Mount an open campaign about the technology and explain the purposes and reasons for the approach.

Next, allow, even require, independent inspections. Consider the value of positive validation of your technology from someone not directly involved in the process.

Test, test, test. Inadequate testing of software is irresponsible, especially given the purpose behind an app assigned to a voting process. Proper, rigorous testing will reveal deficiencies and allow for mitigation efforts – hoping for success and accepting likely failures as part of the process is disingenuous.

Lastly, provide adequate resources for success. Technical support resources should be highly available. Planned contingency efforts are a must. And, without a doubt, realistic time for all of the above is mandatory. Reports suggest that the Iowa Democratic app was developed and deployed within two months. That is a tight timeline.

Conspiracy theories are running wild in the aftermath of the caucus. Russians are a favorite; the app developer, Shadow, Inc., has been beaten up. But, in reality, the explanation is far simpler.

Rushed software and unrealistic expectations gave way to an unfortunate experience.

The bottom line? Technology development should account for the intended use and proceed accordingly. For voting, the technology must be accurate and open.

Trust in our election processes is essential. Failed technology is always disappointing, but, in this case, the failure eroded confidence in existing voting technologies and brought their design into question.

Be safe.