Election Run Amuck

Greg Price

A simple definition of technology follows: the application of resources to achieve a goal. Often the goal is a scientific endeavor; other times, it’s an efficiency objective. And let’s not forget a more obvious desire: to solve a problem.

We live in a world littered with fascinating technology, rife with seemingly constant change. If you take a few moments and ponder the major changes that have occurred in your life over the past few years, it’s likely that technology can be found among those events.

As a technologist, I will testify that technology is often imperfect. In fact, I worked in design for many years and the process of developing a new “computer” technology isn’t immaculate. I’m certain you’re familiar with the old saying, “you don’t want to see how the sausage is made”.

Technology development can follow the scientific method. Careful review, testing and analysis are the hallmarks of pragmatic development. However, nowadays, the desire to reach a goal, such as a new app, frequently leads to abandoning rigorous testing. As a result, poorly designed software has become the norm for many of us.

We have become the testers, the evaluators, the frustrated audience for the rapid development of new software technologies. If enough of us complain and allow error logs to be whisked away, patches will arrive. Well, maybe patches will arrive.

Are there bad consequences of these approaches?

No doubt. Crashed apps are common, frustrated users are normal, and technologists fear moving away from stable software platforms.

Just this week, we were reminded of the real-world consequences of poor software development. The Iowa Democratic caucus was Monday night. It’s election year, 2020.

What does that mean? An app of course.

As the nation held its breath and waited in anticipation for the results from a crowded Democratic field, the new technology didn’t fare well.

Last month, the Iowa Democratic Party announced that it planned to use a mobile app to report precinct results. Despite requests by many, the Party refused to reveal much about the app. Independent security companies asked to review the app’s source code (the underlying instructions that constitute the app); those requests were denied. Some sought details of the testing process and its results; denied. Who developed the app? No comment.

According to the Wall Street Journal, elected officials asked for details about the app; those requests were met with the same refusal from the Democratic Party.

In the aftermath, we know what happened – or at least we have observations and notions of what happened.

The Iowa precinct chairs could not get the app to work properly. It crashed repeatedly. The app was built hastily, and testing was woefully inadequate.

What are some lessons learned from the Iowa Democratic app debacle?

As a starting point, let’s appreciate the importance of an election. There are few things more personal and important than one’s right to cast a vote. In doing so, we place our confidence in the systems and people who manage the technologies that facilitate our desire to voice our choice. The process of voting should be transparent and devoid of obstacles.

Based on the responses to inquiries before the Iowa caucus and the aftermath of the event, one thing is certain: the notion of rapid software design failed.

It’s important to state that the app did not cast votes. Rather, it was designed to quickly deliver the results of precinct votes to the state party. So, based on our basic definition of technology, it appears that the problem being addressed was expediency: deliver the results quickly. After all, all eyes were on Iowa – who had time for slow results reporting? We want what we want right now.

If you’re in the business of running an election, transparency of your technology is essential.

Whether you’re using pencils and paper ballots or computer-based voting machines, allowing inspection and review of the technology and explaining to voters what’s being used builds voter confidence. If I ask for information on the pencils and paper ballots and the response is “you’ll see, don’t worry about it,” I’m instantly worried.

So what are you to do?

First, explain what’s going on. Mount an open campaign about the technology and explain the purposes and reasons for the approach.

Next, allow, or better yet require, independent inspections. Consider the value of positive validation of your technology from someone not directly involved in the process.

Test, test, test. Inadequate testing of software is irresponsible, especially given the purpose behind an app assigned to a voting process. Proper, rigorous testing will reveal deficiencies and allow for mitigation efforts – hoping for success and accepting likely failures as part of the process is disingenuous.

Lastly, provide adequate resources for success. Technical support resources should be highly available. Planned contingency efforts are a must. And, without a doubt, realistic time for all of the above is mandatory. Reports suggest that the Iowa Democratic app process was executed within two months. That is a tight timeline.

Conspiracy theories are running wild in the aftermath of the caucus. The Russians are a favorite, and the app developer, Shadow, Inc., has been beaten up. But in reality, the explanation is far simpler.

Rushed software and unrealistic expectations gave way to an unfortunate experience.

The bottom line? Technology development should take into account the intended use and proceed accordingly. For voting technology, the technology must be accurate and open.

Trust in our election processes is essential. Failed technology is always disappointing, but, in this case, the failure eroded confidence in existing voting technologies and brought their design into question.

Be safe.

Hello Facebook

Greg Price

Facebook’s business model is based heavily on the collection and sale of user data.

Fostering digital “friendships” and promoting likes are some of the beguiling tools used to keep you clicking and browsing your feeds – maintaining engagement equals income for Facebook.

Despite claims by Facebook and its leader to value online privacy, the continued issues and perplexing security conundrums suggest the company is struggling to maintain a positive image.

In 2018, following the Cambridge Analytica debacle, Facebook promised to restrict developer access to user data.  Recent announcements by Facebook suggest the new privacy policies haven’t been applied to every developer – possibly over one hundred application designers continue to have access to the personal data of users in Groups.

Data harvested by the developers include names, profile photos, phone numbers and Facebook reactions, such as your “likes”.  According to Facebook, despite the neglect and continued release of the data, the data hasn’t been abused or used inappropriately – trust me, I’m from Facebook.  Who knows if the data has been misused; most users don’t even know it’s being used by other firms.

The incredible irony in these continued abuses is Mark Zuckerberg’s statement that “the future is private”.  Is the statement dishonest or the result of poor engagement?

Here’s a simple fact.  If you use Facebook, your data is being sold.  Stop, don’t argue, don’t venture any further.  That’s Facebook’s primary source of income.  After all, you are allowed to use Facebook for “free”.

This week’s latest Facebook controversy involves a bizarre issue on the Facebook app for Apple iOS.

When you look at an image or video within the Facebook app, the Apple device’s camera activates on its own, for no known reason.  When the issue was reported, nobody had any idea why the app opened the camera.

When you open a photo within the app, swipe down and you will see that your phone’s camera is running live in the background.  Why?

Facebook has corrected the issue through a hastily-delivered fix to the Apple App Store.  Simply visit the App Store and download the latest version of the app.

The very peculiar thing for me, when I tested the app on a lab phone, was that not once did the Facebook app ask for permission to use the camera.  At first, I thought the behavior was intentional: perhaps a mock camera interface, or a shortcut designed to launch the camera rapidly.  However, when I moved the phone, the surroundings on screen changed – the camera was live.

I could not reproduce the problem on an Apple device running an older version of iOS; only the latest version, 13.2.2, presented the problem.

I haven’t seen a formal notice of the issue from Facebook, simply the push of a new version of the app that appears to resolve the matter.

Was the problem the result of buggy software?

Maybe.

If you’re running the latest version of Apple iOS, you have a few options.

First, delete the Facebook app.

Not only will you resolve the current camera problem, but you’ll also sidestep all future failures of the social media platform.

But, seriously, you don’t have to use the app to check Facebook.  You can use a web browser such as Safari or Firefox and interact with your account through a common tool.

If you’re not ready to abandon ship just yet, obviously, the easiest thing to do is update the Facebook app to the most current version.

Lastly, if for whatever reason, you can’t update the app, disable the camera access for the Facebook app in the phone’s privacy settings.  Simply visit the Settings app, select Privacy and then tap Camera.  Find the Facebook entry and toggle the green switch to off to disable the camera access.

While you’re there, take a look at the other apps that you’ve granted access to your camera.  See something you don’t like or don’t recall enabling?  Disable those too.

If you can’t tolerate the thought of deleting Facebook, I urge you to consider restricting what Facebook knows about you.  In order to do so, you must make your profile settings as private as possible.

Keep in mind, adjusting the settings to reduce data collection will not make you immune to the inspection and exchange of data; but, perhaps, tightening your settings will allow you to control more of your data and reduce what Facebook collects.

Facebook provides a privacy checkup – but only on the desktop version, for now; you cannot perform the privacy checkup from the mobile Facebook app.  The privacy checkup is supposed to reveal what data is being shared.  As you review that data, you can restrict some of it.

The downside?

Your tailored, or customized, ads and recommendations will be less specific to you – from my perspective, the creepiness will be reduced – not a bad thing.

How do you run the Facebook privacy checkup?

Click the question mark at the top of any Facebook page.  Then select Privacy Checkup.  Three options should appear: Who can see what you share, How people can find you on Facebook and Your data settings on Facebook.

Click each of the three options and adjust the settings based on your personal needs.

As you step through the privacy checkup, you will see which apps are sharing your data and which data is presented to the public. 

I recommended the privacy checkup to a friend recently.  He searched for the feature within the app for a day or so before he emailed me.  Remember to use a desktop device and a web browser to check the settings and make adjustments.  You can’t do this from within the mobile app.

Interestingly enough, after perusing the settings and associated data, he emailed me and asked how to remove the Facebook app and delete his profile.

Be careful as you look behind the curtain, you might not like what you see.

Be safe.