[dropcap]O[/dropcap]N December 2, 2015, health inspector Syed Rizwan Farook and his wife Tashfeen Malik, armed with semi-automatic pistols and rifles, opened fire on a San Bernardino County Department of Public Health gathering. The rented banquet room was hosting a training event and holiday party; the couple killed 14 people and seriously injured 22 more.

This event was one of the deadliest shootings in California's history and the deadliest terrorist attack on US soil since September 11, 2001, which likely prompted the Federal Bureau of Investigation (FBI) to open a counter-terrorism investigation.

Most notably, the investigation uncovered an iPhone that had been issued to Farook by his employer. Investigators suspected it contained data related to the planning of the shooting, but last month the FBI publicly announced that it was unable to unlock the iPhone because of its encryption.

Farook, like most smartphone users, had set a passcode on his iPhone. A passcode is combined, or "entangled," with a 256-bit secret key unique to each device to derive the key that actually encrypts and decrypts the device's data. When the correct passcode is entered, the phone repeats this computation, producing the matching key and unlocking the device.
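The entangling idea can be sketched in a few lines of Python. This is a simplified illustration, not Apple's actual Secure Enclave implementation; the use of PBKDF2, the iteration count, and the key sizes here are assumptions chosen for clarity.

```python
# Simplified sketch of passcode "entangling" -- NOT Apple's actual
# implementation; algorithm choice and iteration count are illustrative.
import hashlib
import os

DEVICE_UID = os.urandom(32)  # stand-in for the 256-bit per-device secret key


def derive_unlock_key(passcode: str) -> bytes:
    """Combine the user's passcode with the device-unique secret so the
    resulting encryption key can only be computed on this device."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UID,          # the device secret acts as the salt
        iterations=100_000,  # deliberately slow, to hinder guessing
    )


# The same passcode on the same device always yields the same key...
assert derive_unlock_key("123456") == derive_unlock_key("123456")
# ...while a wrong passcode yields a different key, so decryption fails.
assert derive_unlock_key("123456") != derive_unlock_key("654321")
```

Because the device secret never leaves the hardware, the derivation cannot be run on a faster external machine: every guess must be checked on the phone itself.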

The FBI tried to unlock the phone with a brute-force passcode-cracking method, which repeatedly enters different codes until the right one is found. However, a secondary, user-enabled security feature sets a limit on the number of permitted passcode guesses; once exceeded, the phone erases the encryption key and permanently renders the data inaccessible. The FBI does not know whether Farook enabled this limit, and the stakes are too high to find out the hard way.
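A toy model makes the FBI's dilemma concrete. The attempt limit, the 4-digit keyspace, and the function names below are illustrative assumptions, not real iOS behavior.

```python
# Toy model of brute-force guessing against a wipe-after-N-attempts limit.
# MAX_ATTEMPTS and the 4-digit keyspace are illustrative assumptions.
import itertools
import string

MAX_ATTEMPTS = 10  # user-enabled limit before the key is erased


def brute_force(check_passcode, wipe_enabled: bool) -> str:
    """Try every 4-digit numeric code; stop if the wipe limit trips."""
    for attempts, guess in enumerate(
        ("".join(d) for d in itertools.product(string.digits, repeat=4)), 1
    ):
        if wipe_enabled and attempts > MAX_ATTEMPTS:
            return "key erased -- data permanently inaccessible"
        if check_passcode(guess):
            return f"unlocked with {guess} after {attempts} attempts"
    return "exhausted keyspace without success"


# With the limit enabled, any code beyond the first 10 guesses is unreachable:
print(brute_force(lambda g: g == "0042", wipe_enabled=True))
# -> key erased -- data permanently inaccessible
print(brute_force(lambda g: g == "0042", wipe_enabled=False))
# -> unlocked with 0042 after 43 attempts
```

The asymmetry is the point: if the limit is on, brute force destroys the very evidence it is meant to recover.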

To work around this, the FBI obtained a court order for Apple to create software that would disable the guess limit and allow the FBI to decrypt the phone by brute force. It also wanted Apple to ensure that this crippled software ran from memory rather than being written to disk, in order to preserve the forensic soundness of the data. Even then, the proposed cracking method could take over five years to run through the billions of possible six-digit alphanumeric codes.
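The five-year figure follows from simple arithmetic. The numbers below are assumptions for a back-of-the-envelope estimate: a 36-symbol alphabet (lowercase letters plus digits), six positions, and roughly 80 milliseconds of key-derivation work per guess.

```python
# Back-of-the-envelope estimate of the brute-force time quoted above.
# Alphabet size, code length, and per-guess cost are assumptions.
ALPHABET = 26 + 10          # lowercase letters and digits
LENGTH = 6
SECONDS_PER_GUESS = 0.08    # assumed cost of one key derivation on-device

keyspace = ALPHABET ** LENGTH                    # 36^6 possible codes
worst_case_seconds = keyspace * SECONDS_PER_GUESS
worst_case_years = worst_case_seconds / (365 * 24 * 3600)

print(f"{keyspace:,} possible codes")            # 2,176,782,336
print(f"about {worst_case_years:.1f} years to try them all")  # about 5.5 years
```

Under these assumptions the worst case works out to roughly five and a half years, which matches the order of magnitude cited in the case.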

Apple immediately rejected the order and asked the court to rule it unlawful. CEO Tim Cook penned an open letter explaining Apple's position, defending its customers' privacy rights and challenging the FBI's demands.

In his letter, Cook described the FBI’s request as “asking Apple to hack our own users and undermine decades of security advancements that protect our customers—including tens of millions of American citizens—from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.”

Apple is not dismissing the tragic effects of the San Bernardino attacks, but it is wary of the ramifications of creating crippled software. Adherence to the FBI's order would create a legally accessible "backdoor" into any Apple device; the creation of a tool to break into one iPhone would make all iPhones vulnerable. Cook states that this amounts to Apple "being forced to expose its customers to a greater risk of attack" from anyone with the software, whether government agencies or cybercriminals. Ultimately, a New York judge ruled in a similar case that Apple did not have to comply with the FBI's demands, forestalling the creation of a backdoor.

Such a ruling makes it unlikely that a comparable case would succeed in Canada, but Canadian police already have a significantly wide range of options for procuring digital records. An interesting example is Bill C-13, which became law in 2014 as the latest in a series of attempts by Liberal and Conservative governments to pass "lawful access" legislation. Presented as a bill to end cyberbullying, it has many less obvious repercussions because its language is intentionally vague: it can potentially be used to order mobile carriers and third-party service providers to hand over decryption keys for encrypted stored data, whether held locally or in a secure cloud.

Tragic, inhumane events like the San Bernardino massacre provoke questions about the balance between individual privacy and public safety. Some individuals will prioritize the prevention of future terrorist attacks at all costs, while others fear that such preventive measures may give the government too much power over the personal lives of citizens.

It becomes much easier to diminish the importance of privacy in times of catastrophe and loss, when society faces an immediate safety threat. At those moments, we must remember that privacy gives us the freedom to pursue our interests without interference. To that end, Canadian identity must prioritize both privacy and safety.

These rights are meant to be balanced, not sacrificed for one another. Decreasing privacy rights does not necessarily produce a matching gain in public safety. A by-product of our increasingly tech-centered lives is the significant digital footprint we leave online. Loosening limitations on law enforcement's ability to access our devices and data could leave students and other young people vulnerable.

Terrorism can harm the populace, but so can governments with too much power. Moving forward, we must aim to curb both possibilities: to preserve privacy and to ensure citizen safety.