Standing still in a changing world. The fight to keep BC's kids competitive in the age of code.

Before I get started, the usual disclosures. What follows is a personal political opinion; it does not speak on behalf of, and is not endorsed by, any organization I am associated with.

The announcement.

If there was a theme to the recent BC Tech Summit, it was the cliché line, "These are not the doors of a billionaire Richard!", a quote from the hit TV show Silicon Valley. The summit, an $850-a-ticket meetup for big business and the subsidy sector, was seen as a failure by many tech observers.

The presentations alternated between the worst qualities of Qualcomm's Born Mobile CES keynote and infomercials for Baron-IT products... and then the Premier's announcement speech talked more about how we should all be chasing the dream of owning a Ferrari than about promoting the ideals of a prosperous life involving family and civic duty. It was simultaneously insulting and cringeworthy in so many ways, but mostly it was just counter-productive -- big announcements on corporate welfare, foreign workers, and unicorn-mania all came off as a big screw-you to the revenue-generating, tax-paying small tech businesses and high-technology workers of BC.

There were a couple of positive announcements in the mess, however:

  1. The BC Developers Exchange (DevX) is experimenting with more open and lightweight procurement models (which I'll cover another day in a separate post as the project launches).

  2. Tech education for kids, and the promise that a mandatory curriculum in "code" will be embedded into the K-12 system.

The former, the DevX announcement, was ignored by the media. I mean really, who in the MSM -- between the layoffs -- is going to take the time to parse what is potentially the largest government procurement shift in generations?

The code-for-kids controversy.

Code for Kids hit the MSM in a big way. Clark promised to educate kids in "code" and the BCTF reps were having none of it. A media storm erupted, with comparisons to empty LNG commitments, and the teachers were out on the radio slagging the idea. "Not everyone needs to learn to code" was even uttered by a prominent educator. The BC NDP jumped on board and criticized the premier for the announcement. They brought up the digital divide (a real problem, but presented as if every kid in BC were learning by candlelight)... and claimed it was evidence that the plan was unfunded and unworkable.

To date, for clarity, the plan is simply this: We're going to teach kids to "code" in the existing K-12 system. Nothing more. Nothing less.

The usage of the word "code" in this context is carefully chosen. It's not synonymous with "programming" or "encipherment", but rather carries the same expanded derivation of the word as you would find in "building code", "code of ethics" or "genetic code". It's a broad term that encapsulates all the foundations of digital literacy, of which programming is a very small, almost insignificant slice -- which is why it's such a buzzword right now.

In reaction to the controversy -- which I can only pin on a fear by teachers that they're all going to lose their jobs to a legion of web developers -- the Education Minister put his foot in his mouth, saying "You don't actually have to be sitting in front of a computer to learn coding. There's lots of different ways to do that."

Now, the Minister isn't wrong: teaching coding has nothing to do with more screen time or the ability to download feature films on school internet backbones -- but as so often happens with statements made by people who actually know what they are talking about, it came off as idiotic and counter-intuitive. The NDP pounced:

"In response to the lack of funding behind the Christy Clark government’s commitment, the Minister of Education actually said, with a straight face, that students don’t need computers to learn computer coding. That is like telling a kid to learn to ride a bike without a bike."

This is where I started to get really grumpy. As an open-source programmer and a community advocate I spend so much of my time giving back and working on these issues -- I've spent years sitting on boards arguing over the structures for community investment programs, organizing hackathons for kids, writing tech books, engaging in infrastructure research and connecting with civil servants and politicians in this space.

Occasionally we get through to both the politicians and the kids. 

Occasionally we can create millions in funding for non-profits. [Disclosure: I am on the Board of Directors for CIRA]

So to see the BCTF and New Democrats pushing back against a long overdue commitment to teach code in our schools is upsetting to me, and I suspect to the thousands of other folks trying to work on these key issues of digital literacy. I'm sure partisanship plays into it a bit for the NDP, but this is not an issue to make a wedge of -- the outcome is simply too important.

Getting to the bottom of things.

Not satisfied with the suggestion that our kids' schools are candle-powered, I filed an FOI request to determine the actual state of affairs. I was trying to get to the bottom of just how many schools have computers and internet access in BC, and was hoping to scope out the state of the digital divide. Are we tech-forward? WiFi-fearful? Or are the kids really reading dead trees by candlelight?

This isn't my first attempt at tracking down the digital divide -- when the Connecting BC Agreement was launched, I went looking for a broadband map of BC backbones and the state of our last-mile availability. Long story short, the telecoms wouldn't share the information because -- and this is an epic policy failure -- they consider it confidential, proprietary information. Why is it confidential? Because a competitor might connect an area they don't serve!

My FOI request was filed with the province (Education, Premier's Office and MTICS) to have them produce their statistics on the digital divide in BC schools. I suspected the kids weren't out there reading by candlelight, but the extent to which the digital divide affects BC schools should have been illuminating.

The FOI response came back a few days later, well before the 30 days I had expected. It was an easy answer: all schools in BC have access to the Internet, but the number of computers in each school is not tracked by the province. There were no responsive records for the computers-in-schools portion of the request.

School District Autonomy

I was incredulous at the response -- how could the province not know how many computers are in our schools? This seems like basic stuff, and certainly core to any policy objective around digital literacy.

Turns out, there's a really good reason the province doesn't have this data -- the school districts are autonomous in this regard. They're given a budget and expected to achieve outcomes, but are not otherwise monitored in how they achieve those outcomes. Provincial exams don't kick in until high school, and the province apparently cannot step on the toes of the districts.

There's a key policy change needed here: basic stats about the state of our province's education system should not be out of reach for our provincial ministries. An answer conceding that they don't track basic information should be scandalous, not excused as deference to the school districts.

State of Computers in our schools.

I went searching open sources for any evidence of computer levels in BC schools. Between globs of information about how schools, districts and neo-Luddite parents had fought against WiFi and technology being introduced into our schools, I managed to find some info about our region.

In reporting about upgrades in the SD63 school district, VicNews' Natalie North noted that there were some 2,500 computers in the district, and that a 'typical school' like Stelley's had about 300 computers connecting to its server. Source.

Far from reading by candlelight, at least in Saanich, our kids have access to technology and should have no problem learning to code -- either in the lab or on the playground.

Unpacking Teaching Kids to "Code"

When the Education Minister says it doesn't take a computer to teach kids to code, he's totally right. You don't seat a kindergarten class in a computer lab and tell them to follow along with the instructor. Instead, you upgrade hopscotch to teach computational literacy. A little later, you use a deck of cards to teach conditional logic. In later grades, you reconfigure the math classes to teach Boolean algebra, De Morgan's laws and computational logic... you teach Bayesian algorithms and how to make decisions using probabilities. You teach computational joins and set theory adjacent to the existing education in Venn diagrams. You teach digital citizenship and new media literacy in English class -- kids today don't need to learn formal memo writing, but they sure need to understand how to parse new media, evaluate a body of conflicting information and learn from online sources. They need to know how resume writing has transitioned into portfolio building, digital presence, networking and reputation. None of this has anything to do with screen time. It's about thousands of small changes to existing educational outcomes to create a curriculum supporting digital literacy. It's about teaching a foundational understanding of the technology and digital culture that underpins our modern society.
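
To make the "no computer required" point concrete, here's a hypothetical classroom-style example of my own (not part of the government's plan). De Morgan's laws are a paper-and-pencil identity about AND, OR and NOT that kids can verify with playing cards or coins -- and that happens to translate directly into any programming language. Expressed in PHP only to show that it really is "code":

<?php
// De Morgan's laws as executable logic -- a hypothetical example of mine,
// illustrating "code" concepts that need no screen time. Negating an AND
// flips it into an OR of negations, and vice versa. We check all four
// truth assignments, exactly as a student would with a truth table.
foreach ([[false, false], [false, true], [true, false], [true, true]] as $case) {
    list($p, $q) = $case;
    assert(!($p && $q) === (!$p || !$q)); // NOT (P AND Q) == (NOT P) OR (NOT Q)
    assert(!($p || $q) === (!$p && !$q)); // NOT (P OR Q)  == (NOT P) AND (NOT Q)
}
echo "De Morgan's laws hold for all four truth assignments" . PHP_EOL;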

Learning to "code" has nothing to do with learning HTML or JavaScript or any sort of vocational training in Java, C# or Python. If you think teaching code in K-12 exclusively or even majorly involves sitting students in a computer lab or teachers becoming developers, designers or engineers -- frankly you're doing it wrong and simply don't have a clue.

So where does one get started learning "code"? Well, code.org is a good start ... Khan Academy, MediaSmarts, Maker Movement, Mozilla Webmaker, Lego Mindstorms, Raspberry Pi, Arduino, and the list goes on. Don't even think about picking up a book on the programming language of the day.

The gender gap argument

Another criticism heard in opposition is that "code" is the domain of boys, focused on rockets and killer robots. The reality is that these folks are just wrong. Girls more often outperform boys in digital literacy, and in fact women invented the field [see Ada Lovelace et al] -- but somewhere along the line this 'code is for boys' mythos emerged, and it has had disastrous social consequences. Think "code is for boys" and you need a stereotypical gender-targeted approach to break through the preconceptions? Try these options out...

Hour of Code - Frozen
Adafruit GEMMA V2

More importantly, History Class needs to teach girls about Ada Lovelace.

The curriculum

Teachers are also decrying the lack of a curriculum, while ignoring, of course, that if the province pushed one upon them they would be riotous about being told how to do their jobs. The province can set the high-level requirements and be accountable for competitiveness, digital literacy rates and the like, but our educators have to step up in order to implement the new commitments. They have to adapt to a changing world like the rest of us, and without the expectation of add-on funding and formal retraining -- the job is changing, but all our jobs are changing, and teachers must adapt as well.

So where can educators find the curriculum? Well, an example can be found here and is a good start, but every community is different and integration into local priorities and learning outcomes will be key.

The money

So I hope I've established that our students are not learning by candlelight, and that learning code is not about spending time in the computer lab. What about the arguments about a lack of funding for this commitment?

I won't wade into the political minefield that is the BCTF-versus-province relationship -- there's enough bad blood on both sides of that equation that picking sides is impossible. One can't even wade into districts versus ministries; after incidents like the OpenStudent debacle and some districts refusing to balance their budgets, the sector is a mess. No one holds the moral high ground.

What I can say is that there is money and there are resources available, at least in the NGO space. Most members of the Canadian tech sector are now involved in some level of community investment work; telecoms like Telus have massive community investment programs with money firmly aimed at bridging the digital divide. Small businesses are organizing around NGOs and holding hackathons and Hour of Code meetups. Code.org, Khan Academy and the like are offering free and open courses and curriculum to use.

So, long story short: if you need computers for your school, get in touch -- there's a massive amount of NGO resources trying to help you, both with direct grant funding and with the time and human resources to teach kids to code.

Why is "code" important / we're not all going to be computer programmers.

The final anti-argument goes something like "we're not all going to be computer programmers, so we don't need to teach kids to code; that's what college is for"...

We truly are entering the age of code -- not since the industrial revolution will the nature of work have changed as much over the course of a single generation. In the next few decades, if a job can be automated, it will be. If a job is a known commodity, it will be handled by the arbitrage of trade. Robots will self-assemble, reconfigure and create their own processes. Cars will eventually drive themselves. Mining, biology and finance will be robotic. Manufacturing and product creation will be bespoke, not institutional, and craft will dominate, spurred by the democratizing influence of technology. Even traditionally offline tradesmen, like mechanics and woodworkers, now spend much of their time configuring flow-jet cutting machines, CNC milling machines, laser cutters and the like. Apple farmers now rely on tech to water and feed their orchards -- and this not in some tech-forward experimental field, but on Gulf Islands like Gabriola. Traditional media is dying, replaced by new media. Traditional print and TV media personalities fail to achieve the readership of bloggers and YouTube stars -- ask your kids about Bethany Mota, Zoella and PewDiePie, then take a look at their reach statistics. Podcasts like Serial garner up to a million unique listeners per episode.

What all this means is that the language of work will change -- has already changed. The ability to learn how to learn, without being formally taught, will be the most important skill-set for the coming generation. It's hard to convey the pace of this change, and its acceleration, but equipping the students of today for change is going to be the key to future competitiveness. Countries that teach kids "code" will lead the transition to the cognitive economy of tomorrow, while countries that don't will find themselves uncompetitive in it.

"Code", therefore is the foundation of the new economy, and our kids better be ready for it. It's time to stop fighting change and pushing back against digital literacy. We must make this commitment a reality for every BC school and every BC student.

Challenge: Key space division in TOTP

One of the things I'm working on right now is a 2FA solution for some professional work, and it's got me thinking about TOTP again. Years ago I took a trip down the rabbit hole on TOTP, after Google Authenticator launched and brought it into the mainstream. The TOTP algorithm stands unbroken, and the cryptographic community consensus is that TOTP codes don't reveal anything about the key. That is, you could observe thousands of them and they wouldn't help you find the shared secret. But I've always had a nagging feeling that the whole thing behaves as some sort of perpetual motion machine in the Shannon space.

I can't prove it; I'm not an academic cryptographer and my math skills just aren't at that level, but there's something very visceral that my brain 'just knows', as if on instinct. I get this feeling from time to time and I've learned to trust it.

So with my work returning to the 2FA realm, I'm still nagged by the following intuitive assertion:

That every observed TOTP code N at time C divides the secret key space K.

That is, the number of candidate keys that can match the actual key decreases with each observation, and the set of observations has a relationship to the key space. The greater the number of N @ C pairs, the fewer the possible Ks that can meet all the observations. The back-of-envelope information theory is simple enough: a 6-digit code carries roughly log2(10^6) ≈ 19.9 bits, so 32 observations carry far more information than the 128 bits of a 16-byte key. I believe this relationship to be extremely divisive, and that only a small number of keys will produce the same 32 TOTP codes at the given times.

For clarity, I am referring to knowledge of the public input C, paired with the output N, reducing the space of possible keys K -- not to the number of secrets in the 16-byte space. I refer to a reduction in the set of possible keys one would need to search, based upon the set of N @ C pairs.

As a result, I have a challenge, set up as follows: using a popular off-the-shelf PHP TOTP library (https://github.com/Spomky-Labs/otphp), I've generated a 16-byte key using OpenSSL's random pseudo-bytes function and produced 32 TOTP codes, each one day apart. I've printed the output of the following script.

<?php

// Composer autoloader for the Spomky-Labs otphp library.
require __DIR__ . '/vendor/autoload.php';

use OTPHP\TOTP;

// Generate a random 16-byte shared secret; $strong reports whether the
// bytes came from a cryptographically strong source.
$strong = null;
$secret = openssl_random_pseudo_bytes(16, $strong);
echo "Strong: " . var_export($strong, true) . PHP_EOL;

// Standard TOTP parameters: 6 digits, SHA-1, 30-second interval.
$totp = new TOTP;
$totp->setLabel("kevin@example.org")
     ->setDigits(6)
     ->setDigest('sha1')
     ->setInterval(30)
     ->setSecret($secret);

echo "Secret: " . base64_encode($secret) . PHP_EOL;

$time = time();
echo "Run Time: " . $time . PHP_EOL;

// Emit one code per day (86400 seconds) for 32 days.
for ($i = 0; $i < 32; $i++) {
    $timestamp = $time + ($i * 86400);
    echo "Timestamp: " . $timestamp;
    echo " TOTP: ";
    echo $totp->at($timestamp) . PHP_EOL;
}


The result is:

Strong: true
Secret: <redacted>
Run Time: 1449789268
Timestamp: 1449789268 TOTP: 291494
Timestamp: 1449875668 TOTP: 216134
Timestamp: 1449962068 TOTP: 287667
Timestamp: 1450048468 TOTP: 439154
Timestamp: 1450134868 TOTP: 930590
Timestamp: 1450221268 TOTP: 056836
Timestamp: 1450307668 TOTP: 372930
Timestamp: 1450394068 TOTP: 604009
Timestamp: 1450480468 TOTP: 907565
Timestamp: 1450566868 TOTP: 806569
Timestamp: 1450653268 TOTP: 377693
Timestamp: 1450739668 TOTP: 047923
Timestamp: 1450826068 TOTP: 207111
Timestamp: 1450912468 TOTP: 676863
Timestamp: 1450998868 TOTP: 717443
Timestamp: 1451085268 TOTP: 444988
Timestamp: 1451171668 TOTP: 480191
Timestamp: 1451258068 TOTP: 131072
Timestamp: 1451344468 TOTP: 562432
Timestamp: 1451430868 TOTP: 119421
Timestamp: 1451517268 TOTP: 807229
Timestamp: 1451603668 TOTP: 221399
Timestamp: 1451690068 TOTP: 750264
Timestamp: 1451776468 TOTP: 624718
Timestamp: 1451862868 TOTP: 120224
Timestamp: 1451949268 TOTP: 461853
Timestamp: 1452035668 TOTP: 079031
Timestamp: 1452122068 TOTP: 139722
Timestamp: 1452208468 TOTP: 590428
Timestamp: 1452294868 TOTP: 979260
Timestamp: 1452381268 TOTP: 587875
Timestamp: 1452467668 TOTP: 643794

Can you find the key that produced these codes? If you can, you've just earned some serious bragging rights in the crypto community, as this is supposed to be computationally infeasible. On the other hand, if you can find a lot of other keys that produce all 32 codes at these times, you may prove the algorithm secure against this line of attack. I have the key, so prove my intuition right or wrong.

That is the challenge.
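
For anyone who wants to probe the intuition without attacking the full 16-byte space, here's a sketch of the kind of experiment I have in mind -- my own illustration, assuming the same otphp API as the script above. Shrinking the key to 3 bytes makes the whole space (2^24 keys) enumerable, so you can count how many candidates survive each successive observation; if a 6-digit code really divides the space by roughly 10^6, about 2^24 / 10^6 ≈ 17 keys should survive one code, and essentially only the true key should survive two. (Fair warning: 16 million TOTP computations in PHP will take a while.)

<?php
// Hypothetical key-space division experiment (mine, not the challenge code).
require __DIR__ . '/vendor/autoload.php';

use OTPHP\TOTP;

function codeAt($secret, $timestamp)
{
    // Same parameters as the challenge script: 6 digits, SHA-1, 30s interval.
    $totp = new TOTP;
    $totp->setLabel("test@example.org")
         ->setDigits(6)
         ->setDigest('sha1')
         ->setInterval(30)
         ->setSecret($secret); // raw-byte secret, as in the script above
    return $totp->at($timestamp);
}

$time     = time();
$truth    = openssl_random_pseudo_bytes(3); // tiny 3-byte key, for searchability
$observed = array(codeAt($truth, $time), codeAt($truth, $time + 86400));

$afterOne = 0;
$afterTwo = 0;
for ($k = 0; $k < (1 << 24); $k++) {
    $candidate = substr(pack('N', $k), 1); // low 3 bytes of a 32-bit integer
    if (codeAt($candidate, $time) !== $observed[0]) {
        continue; // first observation already excludes this key
    }
    $afterOne++;
    if (codeAt($candidate, $time + 86400) === $observed[1]) {
        $afterTwo++;
    }
}

echo "Survivors after 1 code: " . $afterOne . PHP_EOL;  // expect ~17
echo "Survivors after 2 codes: " . $afterTwo . PHP_EOL; // expect ~1 (the true key)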

What's wrong with BC's tech investment policy.

First, about me.

I've been in the BC tech sector since my first web design gig in 1998. Born and raised in Victoria, I incorporated my company federally in 2003. In Internet time, that makes me older than Facebook -- I've created jobs, both contractor and employee, and I've paid dividends. I've written books on web development, helped to set tech policy, and now sit on a couple of elected boards, including the Open Data Society of BC and the Canadian Internet Registration Authority [CIRA], the organization that manages dot-ca and helps to build a better online Canada.

My corporation, StormTide Digital Studios Inc, occupies the 0-4 employee small business sector -- sometimes called freelance web development. My primary business is selling PHP development services to the e-commerce sector. Between myself and a web designer I work with, we handle the e-commerce presence for more than 50 Canadian and American retail operations. Shopify we're not, but we do a respectable business, with a customization and integration ability unmatched in the industry. Want to put a million parts online, have them index well, work with your Dynamics platform and ship from a dozen pick-and-pack locations? Come talk to me. Want to sell 30 SKUs on the Internet? Go see Shopify.

Additionally, as a twice-elected (members' slate) board member at CIRA, I am involved in granting millions of dollars to Canada's not-for-profit tech sector, as well as overseeing the administrators of more than 2 million dot-ca domain names.

What I write here is a personal political opinion, and does not represent any organization I am associated with.

The idiocy of chasing growth.

At some point in my career as an entrepreneur I came across Dan Pink's "The Surprising Science of Motivation", and it really spoke to me; he talks a lot about mastery and purpose, and that is largely what has driven me in business. It's not a forget-the-money proposition, but one that says money doesn't buy happiness, but mastery and purpose can.

I pay myself a fair salary from StormTide, somewhere near the tax sweet spot, leaving just enough retained earnings to even out the highs and lows. I get no pension or benefits and I rent a condo in Victoria. I'm no tech billionaire, but I also work 9-5, get the chance to serve on boards, be political and work on amazing projects with no bureaucracy. I get to write books on PHP when publishers come calling. I get to hack on some serious works of public interest, like online voting and cybersecurity research. I also do civil rights work, and can be a massive pain in the rear to people who would abuse tech for dystopian means. In short, I might not be a billionaire, but I have mastery and purpose in abundance.

It is through that lens that I look at BC tech policy and shake my head. Hard.

Consider a doctor with a general practice, your typical family doctor down the road; now consider a society that measured their success by whether they had yet founded a hospital. That's basically BC's tech policy today: forget the family practice, the going concern; we're all going to build hospitals, be billionaires and have scores of people working for us. The archetypes to aim for, we're told, are Bill Gates, Steve Jobs and Mark Zuckerberg. If you're not landing a billion-dollar valuation, well, you're not a /real/ player.

I went looking for the stats today -- you can see them here [pdf] -- and they confirmed what I already knew: 98% of BC businesses are small businesses, and 79% of BC businesses have 0-4 employees. That puts my StormTide venture somewhere between tragically mainstream and the winter doldrums. It also means your chance of becoming a tech billionaire is smaller than your chance of beating Muhammad Ali in a fist fight or scoring the winning touchdown at the Super Bowl.

One would expect, then, that BC schools would be doing what they do for doctors: setting expectations of a lifetime of family practice, of careers in noble institutions, of decent wages and work/life balance. That they would be aiming expectations at the Mr. Cleavers of the world -- the wage earner with a home and kids, and the time to see them -- rather than suggesting we're all going to be the next Zuckerberg.

But that's not the tech policy. No, the official government policy is that we're all going to be rockstars, and everyone's a special snowflake. Everything is sacrificed on the altar of growth, and those who fly too high like Icarus, well, they just couldn't handle the heat.

It's an idiotic model, and one that needs to change.

100 Million in Series A nonsense. 

Today Christy Clark announced she's creating a new public venture fund to the tune of 100 million dollars. The fund will ostensibly provide Series A funding to politically connected BC startups. They're targeting the Jobs, not the Woz.

I've been around a lot of tech startups for nearly two decades now; my experience teaches me that public money is toxic to startups. I could get into a rant here about SR&ED and such, but Ben Fox over at Medium has already penned a great 10-minute read on the topic, entitled "BC Startups: The government is not your friend."

Beyond this mess, all real startups are incorporated in Delaware. They call this being VC-ready, and it's not optional if you want to play the series funding game. The founders might live in Vancouver, sure, but name a successful VC-path company that isn't incorporated in Delaware. Is BC going to invest in Delaware-based corporations?

So if the real VC startups are out, and the real small business community (the 0-4'ers) is out... who is interested in a pile of BC government cash? No surprise there: it's the BC-based government-grant-writing subsidy suckers, the blue-chip corporate lobbies parading as tech councils.

The BCTIA, which was standing next to Clark when this mess was announced, has a wonderfully varied board that mixes execs from Telus, surveillance experts from MDA, and people from Electronic Arts -- the progressive firm that helped bring you the high-technology worker designation and the gem that is EA Spouse. Want to know who is looking for $100m of BC's public capital? Take a gander at the board makeup.

These folks have nothing in common with your typical BC tech business -- the small 0-4 sector guys and gals just trying to earn a living in tech; the folks who dream of one day owning a house in their hometown and being able to walk down the city streets without seeing tents set up in the park. More to the point, they've also got nothing to do with BC startups, focusing instead on the Delaware C Corp markets like everyone else in the vulture game.

Think you're going to pour $100 million of startup funding into Canadian-controlled private corporations (CCPCs) and watch them become the next Facebook while staying CCPCs paying taxes to the Crown? Dream on. Think you can require CCPC status? You've just killed that startup.

Think family practices, not hospitals. Horses, not unicorns.

The typical BC Tech Business 

Your typical BC tech business is an entrepreneur and some contractors loosely set up to do something cool; if they've hired their first real employee (read: filled out a TD1), then they're doing well.

Throw a stone in this town and you'll find 50 of 'em. They come out to meetups; they're the geogeeks and open hackers, the Open Data kids, the civic developers. They're the designers who meet at the bar on Friday and discuss the latest moronic client to ask them to do work, for free, on spec -- because the client's idea is 'just that good'. They're all profitable, tax-paying, and for the most part working off of their revenues -- you won't find convertible notes, venture funding or credit-default swaps in this crew. They're incorporated in Canada, not Delaware. They think a ratchet is a tool for working on a car -- and that's a good thing.

When the venture crowd moves into town, they immediately suck the oxygen out of the room. They hire up a bunch of developers at 20%-50% over market rates, and build offices with kegerators, automated coffee machines and playground equipment -- the rest of the revenue businesses do their best to keep up, but can't really compete; after all, they're competing not with the marketplace but with some nebulous pot of cash that has to be spent. The excess should be scandalous, but everyone seems to count it as just the cost of doing business in proximity to unicorns. Never mind the rent in Silicon Valley, or the social problems gentrification is causing when there's no corresponding increase in affordable housing.

90% of these venture-backed firms fail in the first few years. A fraction of 1% of the funded 'make it', go public and convert that investor equity into serious cash from Joe Public's retirement account.

The result for the industry is a boom/bust cycle: rapid wage inflation and equally fast deflation; a tragic instability in billable rates, service revenues and the availability of qualified help. After all, none of this results in a single new qualified programmer being added to the marketplace, and startups don't pay for student tuition. There's no extra worker capacity in this system, and high-tech workers already bill salary rates considerably higher than the median income.

Worse, training a developer is a 20-year proposition, starting from grade school -- I can count on one hand the number of developers I've met who started programming after graduation. It's not a field you retrain into; it's a craft you master. It would be as if the NBA looked to create the next great team by training adults to play basketball for the first time.

You want to train programmers and grow the pool of talent and create good new jobs? You make sure every kid has a computer in the home and free access to the best self-paced learning opportunities outside the school system. You teach kids how to safely talk to strangers, and allow them to hang out on IRC channels and on mailing lists. You identify the difference between social media and internet communities. You kickstart digital literacy, and attack the digital divide.

But that's just the vulture capital model everyone knows and loves. What happens when the SR&ED sapsuckers get involved is even worse. Some ~$5 billion in totally unnecessary corporate welfare is poured on the Canadian SR&ED sector each year, and tech gets the lion's share of it. The program was supposed to increase our innovation pace, but like copyright and patents, it stopped doing that a long time ago. You can read a bit about that here.

Thanks to the corporate welfare programs, these venture folks are billing a big chunk of their workers' salaries back to Ottawa and adopting a profit-comes-later attitude. Even the market giants ignore revenue and continue the lie that they're all going to be the 1000x'ers once they finish building market share. It's a killer combination. The fair market rates for services crater. The tech industry calls it disruption, but really it's just temporary market instability, as no one really expects these companies will ever turn a profit or that the lower prices will become the new normal. There's a world of hurt out there for your typical Canadian family just trying to co-exist with the jerk-tech sector. Consumers love it for a while, as the low prices seem almost too good to be true. But the catch for governments is that they really are just that -- too good to be true. Every so often, perhaps once per generation, someone invents the printing press or the automated loom and an industry changes forever; but most of the time, it's just sock puppets and Super Bowl ads.

The market effect is killer: the PHP developer who was previously freelancing for a fair wage is now facing competition from a venture firm that's billing 50% of its developers' cost back to Ottawa and making payroll through Series A funding from the province. That developer is now 'disrupting PHP' and giving away development time for 'free'. The whole thing is an unsustainable joke, but with the presence of founder ratchets, down-round protections and the other 50 tools in the vulture toolkit, there's still lots of profit to be made for the financiers, even when things predictably go off the rails. In fact, while most founders believe they have the greatest investors ever, if there's an unfavourable ratchet in the agreement, chances are your investors are actively working to ensure the next down-round.

You could call me cynical, but I've seen the cycle more times than I can count. The list of unicorns pushing 10 years old is exceptionally short. What's not short is the list of failed startups and bankrupt tech geniuses who made bad business decisions.

This is a failure architecture, and we need to give up on it already. You're not going to be a unicorn; realize the game, see the matrix, and aim for sustainability, mastery and purpose. Aim for craftsmanship.

The developer exchange.

If there's one light in the BC tech policy pipeline right now, it's the BC Developers Exchange. It's pretty much under wraps, but they're developing it in a quasi-open way, with public servants working publicly on GitHub, so little gems are spilling out here and there.

The idea here is to create some sort of pay-for-pull-request model where civil servants can get the freelance crowd to hack on their project backlogs. It's a solid idea, and it should be supported.

But there's an issue: it seems to be following the same architecture as the prior Open Data program, in that there's no significant legislative commitment, no big projects to kick it off and no real meaningful funding announced. They're in the singles bar, but afraid to go talk to anyone because, well, "they've been hurt before".

Worse, when the freelance crowd has managed to get governments on board (as they did in the OpenStudent debacle), the ministries involved have always gotten cold feet and gone for the IT barons' products. I think they're following the no-one-ever-got-fired-for-buying-IBM model -- a convenient defence mechanism that prevents real opportunity from ever taking root.

IT done like this dies a death of being pecked by ducks, and it's why the government can't stand up a website for anywhere near what it costs in the private sector.

What they should have done.

My prescription for the BC Tech sector would be to get out of our way. 

- Ease the securities regulations that make Kickstarter and crowdfunding essentially illegal in BC.

- Allow for social enterprises to incorporate, and develop an appropriate legislative framework around the concept. Kickstarter itself just incorporated as a B Corp, so we're behind the times already.

- Reform the treatment of stock options and shares so that taxes are collected at legitimate liquidity events and not on a calendar basis (remember the JDS Uniphase fiasco? The market does). Folks are happy to pay taxes on legitimately realized gains, but there should never be taxes assessed on purely paper gains.

- Develop things like flow-through shares and other capital mechanisms that encourage casual investment without the absurdities that come with venture-capital-style funding. These models actively encourage CCPCs over the Delaware model.

- Eliminate the concept of a qualified investor. The public can be accountable for its own bad investment decisions, and limiting risky investment opportunities to the rich while allowing folks to buy million-dollar homes on leverage is totally inconsistent.

- Ban the Double-Irish tax avoidance nonsense. There's no point in seed funding the next Facebook if they're just going to turn around and pay no taxes.

- Get rid of SR&ED credits entirely and instead reduce payroll taxes. That's $5 billion that could go a long way toward eliminating the need for CPP remittances.

- Cancel the 100 million dollar Series A fund and address the reasons BC Tech businesses think government is not their friend.

- Reduce the overhead of maintaining a corporation; my biggest single expense after salaries and computer hardware is accounting and compliance services.

- Open all the things: Open Data, Open Source, Open Government, Open Corporates. Put some real $ behind it. There's a billion-dollar civic tech market at the doorstep, and it will pass us by for lack of a few peanuts' worth of investment to address the cost-recovery problem and the lack of a data warranty.

- Kick the Robber Barons out of BC government IT procurement. Learn how to develop in-house capabilities again and how to contract and work collaboratively with a vibrant market of small vendors. Become ready to work with 79% of BC's businesses. The 0-4 employee sector drives this economy. Ditch the warranty/insurance requirements and validate the work in-house like anyone else accepting a pull request on a private sector project.

- Develop a diverse set of programming languages, application servers and technology platforms; reject the Microsoft monoculture. Embrace BYOD. Embrace infrastructure as a service.

- Buy a Raspberry Pi and an Arduino for a kid every now and again.

- Learn how to evangelize BC tech. Making connections is key to any tech business. Trade missions and other tools that can connect small business with export markets will pay dividends. Exporters aren't all rip+strip+ship; there are thousands of BC small businesses that export tech services, and they contribute to the GDP, both inter-provincially and internationally.

- Most of all, don't pick winners and losers in the tech sector: ban corporate welfare, grant writing and funding applications for commercial entities. These actions hurt the 0-4 sector and are downright counterproductive to market forces.

At the end of the day, the best tech policy would be one where we don't know there's a tech policy and have no reason to advocate for one. 

Rethink the plan, BC.

The long-term solutions City Council doesn't want to hear on homelessness

With the Times Colonist reporting that City Council doesn't want to hear about long-term solutions to homelessness at their upcoming town hall, I thought I'd publish some comments I had been preparing. They're rough, and not all of them workable, but hopefully they help folks understand some of the issues we have to discuss if we're going to solve this crisis. Repeated bouts of criminalization and "quick fixes" are doomed to failure, and waste both the City's and advocates' resources on court challenges and needless harassment of our citizens.

 

In the spirit of truth to power:

 

Initiatives to improve market affordability by creating market supply. 

  1. End the CRD urban containment boundary.

  2. Eliminate DCCs (development cost charges) that are not directly attributable to a project.

  3. Eliminate new development/business parking requirements.

  4. Eliminate the extortive phrase 'amenity package' from council vocabulary.

  5. Reduce the number of zoning types from 628 zones to a handful representing residential, urban, commercial, and industrial zones.

  6. Eliminate zoning variance and spot zoning practices.

  7. Reduce the tax mill rate on residential units with assessments less than $1 million.

  8. Enact use-it-or-lose-it bylaws that require occupancy or an active development permit, or face speculation taxes.

  9. Have council set development policy, and have staff enforce approvals/rejections. End public hearings and council involvement in the approval of every shed built in the city. Return to a concept of strong and well-defined property rights, and allow civil courts, not council, to deal with NIMBY/BANANA-related disputes.

  10. Pay developers a bonus for every unit they create equal to 10% of the new taxes that will be generated for 10 years. (Incentives for creating newly taxable value)

  11. Improve transit options to allow for car-free living and eliminate parking costs.

 

Initiatives to deal with core causes of homelessness. 

  1. Safe consumption/injection site paired with well-funded rehabilitation programs.

  2. Self-exclusion programs for liquor retailers.

  3. Institutional care options for the most severe mental health issues when related to repeated criminal convictions.

  4. CrASBO (Criminal Related Anti-Social Behaviour Orders) framework for repeat-offender cases of theft, vandalism, intoxication in public, etc. End the revolving door cycle of arrest, release and re-offending for minor crimes.

  5. Greater funding for fiscal self-sufficiency programs. (Education in money management)

  6. Work-Ready programs to ensure everyone has valid identification, a social insurance number, up-to-date tax filings, bank account access, and access to clean clothing and personal hygiene services. Assist with filing bankruptcy and achieving a 'clean-slate' where applicable.

  7. Casual/at-will labour opportunities within the city; work opportunities based on a single day's effort or a unit of production. We can do better than collecting refundable cans. Seek claw-back waivers from welfare programs to allow retention of benefits while doing a minor level of qualified casual work.

  8. Public outreach programs telling citizens of Victoria not to give to panhandlers, highlighting better donation opportunities for the same charitable dollar: food banks, Our Place, etc.

  9. Adopt a case-management approach to each person experiencing homelessness -- tailor personalized solutions and interventions appropriate to each individual. Start with chronic criminal reoffenders.

 

Initiatives to deal with youth and young-adult homelessness.

  1. Fund more young-adult care options. Too many at-risk children age out of care and are thrown to the streets with no safety net or supports.

  2. Better funding for social work programs for in-home interventions to deal with parental abuse, mental health and addictions issues.

  3. Positive youth opportunities for casual community contribution and paid work.

  4. Create stable and appropriate market housing opportunities for families.

  5. Deal with sources of societal and family marginalization including supports for LGBTIQ youth.

  6. Fund more anti-bullying/harassment programs and support systems for victims of this behaviour.

  7. Better integrate community policing and restorative justice programs in a way that allows youth to see police and social workers as friendly partners, rather than always in a disciplinary or negative interaction setting.

  8. Provide better self-learning opportunities for literacy, numeracy and computer skills. Provide a free and self-paced path to cognitive employment and a Dogwood Diploma.

  9. Provide free pathways to pardon services for the rehabilitated. Allow young-adults to escape the stigma of their past actions and achieve a 'clean slate' upon which to build.

  10. Ensure there are market housing options affordable at 30% of a median individual salary: about $680/mo on $27,200/yr (2013). Source: http://www.statcan.gc.ca/tables-tableaux/sum-som/l01/cst01/famil107d-eng.htm

 

Initiatives to deal with the symptoms of homelessness

  1. Chattels protection. (Lockers placed throughout the city with time-release locks where homeless can place belongings for a period of time, and with a disclaimed expectation of privacy enforced by user-agreement such that police can search as appropriate)

  2. Post-office box services where homeless can receive mail. An address is core to receiving many government and employment services.

  3. Basic tenting platforms with bike/chattels lockers in city parks experiencing camping. Usage by permit, available at homeless shelters and needs-tested. Permits can be revoked for those who offend the social order (e.g. public-view drug use, chattels not within tent or locker, etc.). To reduce the security/neighbourhood impact, a maximum of 6 individual platforms per acre should be targeted. Tents must be taken down during the day, but may be stored in park lockers. Tents (see the red-tent campaign design) would be supplied with the permit, and only the approved tents may be used on the platforms.

  4. A public washroom station (including a shower), sharps container and emergency call station placed in every city park.

 
These are just a few additional ideas, and they aren't intended as a replacement for the valuable and needed contributions of affordable and project housing, homelessness supports and counselling services provided by dozens of organizations throughout our town.
 

Updated 3: Yubico reinvents the Yubikey

Last January I did a comprehensive review of the Yubikey for a client and published the results to this blog. My overall verdict was disappointment and a non-recommendation for the technology. I still hold that opinion when it comes to Yubico's OTP technology, but it appears they took my review (and similar ones throughout the industry) to heart and have spent considerable time and effort reinventing the technology. I'm pleased to have the opportunity to review the new technology, for the same client, and with the same end-goal in mind -- replacing an installed-cert PKCS12/CA PKI solution with a more easily administered solution that maintains the same (or ideally better) security level.

Yubico has launched 3 new products: the NEO, the NEO-n and the U2F Security Key. You're probably asking yourself 'but there was already a NEO?', and in some ways you're right -- the early NEOs shipped under the same name but were a very different product. I panned them in my previous review for trying to make the OpenPGP application do things it was never meant to do, for having user-replaceable applications and for generally having poor administrative tools. The good news is that those concerns have been addressed with the new NEO, and it truly is an entirely new device worthy of a second look.

The new keys are all based on similar technology (the NXP A700x MCU) and have 3 distinct modes of operation: the U2F key is limited to U2F mode, while the NEO and NEO-n can operate in OTP, CCID or U2F modes. I'll get into U2F shortly, but of more immediate interest is the CCID mode. They've fixed the user-management of the card applications, and the keys now ship fully configured with a static configuration. The end user cannot upgrade the apps, but neither can a hacker replace them with malicious versions. It's a solid security move and one they should be commended for.

In CCID mode the keys implement a new applet, now supporting the PIV-II standard, which is an appropriate and ideal choice for identity and authentication. With PIV-II supported, the keys can interact with pretty much any standard PKCS11-supporting software, from OpenSSH to Firefox. PIV works with proper x509 certificates, CSRs and the like, and does so in a way that is both predictable and stable. The keys still support OpenPGP, but, critically, only for use with OpenPGP itself and not for other authentication models to which it is not suited. This is as it should be, and one can tell that they've considerably upped their game with the PIV approach.

I’ve done a thorough test and review on the PIV applet, and while there were some challenges with building, and using, the key personalization tools, these challenges did not considerably interfere with my ability to setup the key. In this regard, better end-user testing on multiple platforms and better user documentation would go a long way, but a determined sysadmin is going to work through them as I did. Once the keys are personalized, they behave as expected and actually, in my testing, flawlessly with the OpenSC 0.13.0’s PKCS11 interface. CCID smartcards are a hard technology to get right, and at least in this respect the Yubikey NEO now appears to be leading the pack.

From the perspective of a user of a personalized key, the only barrier to entry is the installation of OpenSC and the configuration of the PKCS11 modules -- which may be advanced topics for some end-users, but ones that largely go with the CCID territory. In testing I was able to get SSH key authentication, with an onboard-generated RSA 2048 key, working as intended against an Ubuntu 14.04 server.

Overall if you are in the market for a CCID smartcard, I would recommend the new Yubikey NEO and NEO-n devices.

Moving on, however: it's generally recognized that CCID technologies are not going to be accessible to everyday end-users, who really only want to use a smart card for web-browser-based authentication and expect a driverless, plug-and-play experience.

Enter U2F and the FIDO alliance.

U2F

The FIDO alliance is a massive collaboration of industry giants on what 2nd-factor authentication should look like on the web -- and it's quite good and very ambitious. The technology involves a bidirectional authentication architecture, borrowing heavily from site-pinning technologies and certificate authentication solutions. Interestingly, it has succeeded in simplifying the user experience for certificate enrolment over traditional PKCS11 solutions, and as such is likely usable enough for mass adoption.

From a security standpoint, the FIDO U2F standard is mostly solid, but has some caveats that can weaken or even defeat the security architecture provided. I have great reservations about the pluggable design of the technology, and fear that cheap tokens and software emulation will weaken the security level desired by site operators. The technology gives the user considerable rope with which to hang themselves -- and that's an architecture decision I question in a security product. Fortunately these issues don't apply directly to the Yubikeys, but rather to the ecosystem and server middleware around them.

The only hard security concern I have with U2F relates to how keys are stored on the token devices. U2F specifies that each site will receive a unique key and that keys will not be reused between sites. This is a great design feature and ensures both privacy and security. It does, however, require a fairly large amount of on-device memory to store what could be hundreds of security keys. To work around the secure storage requirements, the spec also allows tokens to use a wrapped-key pattern, where the U2F private keys are stored, encrypted, outside of the secure element -- with only the device master encryption key being stored securely in the element. The idea is that so long as the master key is secure, the encrypted-but-stored-in-insecure-memory keys should be too. I fundamentally disagree with this design decision, as it allows for token cloning by breaking the encryption scheme or by factoring one single key. This element alone likely makes the technology unsuitable for applications requiring high-level FIPS certifications, and for use in the management of classified or sensitive information. That said, the spec allows for tokens that do not wrap keys to exist, allowing end-users to choose more secure tokens from the market (though none exist yet!). The Yubikey NEO is a device with a very limited amount of secured memory (there is ~80KB of EEPROM in the NXP A700x), and I therefore posit that it is implementing a wrapped-key pattern. No technical description confirming this was available from Yubico at the time of writing. (Update 2: Yubico has since released some detail on their wrapping -- actually a derivation -- mechanism. Details at https://www.yubico.com/2014/11/yubicos-u2f-key-wrapping/) Because of the use of wrapped keys, it is my opinion that these devices should not be recommended for high-security use where device cloning is a concern. Update 2: The Yubikeys are implementing a remotely-referenced derived-key pattern; the security of the scheme relies primarily on the pre-image resistance of HMAC-SHA256. This allows the Yubikey to derive a key based on a publicly shared combination of nonce and application ID, by mixing in a master key.
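
To illustrate what a derivation scheme of this shape looks like, here's a rough sketch of my own -- not Yubico's actual implementation, and deriving raw key bytes where the real device would derive an ECDSA private key. The point is that the token stores only a master key; the per-site key is re-derived on demand from a public (nonce, application ID) "key handle" that the relying party stores and presents back at authentication time.

<?php
// Hedged sketch of a derived-key pattern -- my illustration, not Yubico's code.
function derivePerSiteKey($masterKey, $appId, $nonce)
{
    // HMAC-SHA256 as the derivation function; its pre-image resistance is
    // what keeps the master key safe even though nonce and appId are public.
    return hash_hmac('sha256', $appId . $nonce, $masterKey, true);
}

$masterKey = openssl_random_pseudo_bytes(32); // lives only in the secure element
$nonce     = openssl_random_pseudo_bytes(16); // generated at registration

// Registration: derive the per-site key and hand the (nonce, appId) key
// handle to the relying party. No per-site key is stored on the device.
$siteKey = derivePerSiteKey($masterKey, 'https://example.org', $nonce);

// Authentication: the relying party returns the key handle and the token
// re-derives exactly the same key on demand.
$rederived = derivePerSiteKey($masterKey, 'https://example.org', $nonce);

var_dump($siteKey === $rederived); // bool(true)

Note the trade-off described above: the storage problem disappears, but the cloning resistance of the whole scheme now rests on the secrecy of that single master key.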

One caveat to the above: the U2F specification does include a counter that should prevent long-term dual use of a cloned token. However, in testing I determined that the Yubikey U2F developer reference application does not correctly handle the counter feature: it treats a bad counter as a mere authentication failure, rather than an event that locks out the token. This allows an attacker to find the valid counter by starting at the cloned value and iterating upwards until a login is allowed. The user of the real token would see one auth failure, but would likely try again and succeed. This is a serious security failure in the reference implementation. I fear that relying on server implementers to correctly handle complicated security protocol elements like this counter will lead to failures -- we've seen it time and time again, from verifypeer to bad random sources. Basically, if there's a way for a developer to screw it up, they will. Interestingly, I also found issues with the way randomness was being generated in the reference application; however, it was promptly fixed after a bug report, and it was not likely to affect most implementations.
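
To show what defensive handling might look like, here's a sketch of my own (not the reference implementation): treat any counter that fails to increase as evidence of a clone and lock the key handle outright, instead of returning a retryable failure that an attacker can iterate on.

<?php
// Hypothetical relying-party counter policy -- my sketch, not Yubico's code.
class U2fCounterPolicy
{
    private $lastCounter = 0;
    private $locked = false;

    public function verify($receivedCounter)
    {
        if ($this->locked) {
            return false; // token previously flagged as cloned
        }
        if ($receivedCounter <= $this->lastCounter) {
            // Counter regression: a clone is replaying or guessing counters.
            // Lock the token and alert -- returning a plain "auth failed"
            // here is the reference-app bug, letting an attacker walk the
            // counter upward until a value is accepted.
            $this->locked = true;
            return false;
        }
        $this->lastCounter = $receivedCounter;
        return true;
    }
}

$policy = new U2fCounterPolicy;
var_dump($policy->verify(5));  // true: counter advanced
var_dump($policy->verify(3));  // false: regression detected, token locked
var_dump($policy->verify(10)); // false: stays locked, even with a higher counter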

The specification and technology require a significant level of expertise for relying parties to implement the server side correctly, and this implementation should not be taken lightly. I would recommend anyone implementing U2F engage a reputable security firm and perform hostile testing against the service. Aggressive logging of certain authentication failures is an absolute must.

UPDATE 1: Oct 30, 2014

Another serious security bug in the U2F reference application was found by another developer. It's recorded as Issue 6, where the reporter 'araab' points out that the U2F challenge-response mechanism is not being properly validated. This again points to the fragility of the implementation side of the U2F equation, and how hard it is to program crypto software securely.

Beta Software

Moving beyond the specification, we get to the implementation software running the U2F experience for the end-user. Currently only Google Chrome is supported, and only with an extension. In this author's testing the product launched in a horribly broken state, with the browser reliably and predictably crashing. I have verified the result on a number of machines, and the reviews of the plugin report similar issues. This is truly disappointing; the device should never have launched with such glaring problems in the accompanying software stack. That Google also launched Security Key is truly surprising, and I am somewhat taken aback that this got past Google's technical review. The issue affects Chrome 38 on OSX 10.10 and would prevent deployment of this technology until a fix ships.

After complaining on Twitter, I received some helpful advice -- to try the beta channel of Chrome. So I did, and I'm happy to report that Chrome 39 has resolved the main crashing problem. However, problems remain with regard to multiple tokens inserted at the same time.

The issue of multiple tokens might not seem like a big deal, but if one uses a CCID-mode token for SSH authentication (say, a permanently connected NEO-n) and then tries to make use of U2F tokens as well, the technology currently breaks down.

It is also not possible to concurrently use U2F and CCID mode on a single token. (Updated 3: This is now fixed with Yubico NEO Manager 1.0 [released Nov 19, 2014]; multi-mode use is now enabled and works as expected.)

The NEO tokens are also slightly hard to configure. You need to download an application, referenced by a link in a PDF document, to switch the token's mode from the default OTP to U2F. For this reason, I would strongly recommend that anyone implementing U2F specifically target the blue U2F version of the device, which ships in U2F mode and is ready for registration upon first insert, without further user configuration.

Here the documentation about how to get started with the token could be significantly improved. 

The Hardware

The hardware feels surprisingly resilient, with the NEO-n being nearly indestructible. The NEO key format fits nicely into a USB slot on a MacBook Air, but could be snugger in the slot, as the key flops around as you try to press the button.

I believe the keys will stand up to reasonable wear and tear and are suitable for purpose.

Supply Chain

In the last review I noted that the Yubikeys were being shipped from a residential address, which gave me pause with regards to the chain of custody. I'm happy to report that my new keys arrived having been shipped from a commercial address in Palo Alto.

Conclusions

Overall, I will likely recommend the Yubikey U2F solution to my client, but conditionally on the market adoption of Chrome 39+ and on the resolution of the multi-token problems. This likely means a pause in adoption for now, while recommending future support in the product map and switching the functionality on at some future date when the browser experience stabilizes.

I would, however, not recommend the Yubikeys to anyone requiring a high level of security whose threat model includes cloned devices. This non-recommendation is due to the wrapping (Update 2: derivation, based on HMAC-SHA256 and its pre-image resistance properties) and storage of keys (Update 2: nonce + MAC) outside the secure element, and to the problems with the counter-verification model. (Update 2: I'm also looking at how the keys handle counters internally, but have not yet completed my analysis.) While I am not aware of any cloning attacks on these devices, the architecture of the key leads me to believe they are possible, and that such attacks may not be reliably detected due to defects in server implementation models. I would recommend Yubico develop a security checklist for implementers that clearly specifies what to do with a counter conflict and other authentication failures. I would also recommend that they open-source their key-wrapping technology so as to invite peer review of the approach.

Cloudflare, Keyless SSL and the selling of 'secure'.

Cloudflare is an amazing content-delivery-network (CDN) platform that powers a lot of the web. They make the web faster and more stable by distributing content around the globe so that it is nearer visitors, blocking abusive users and offering a school-of-fish approach to web security. They also have a unique reputation in the security field, both for making predictions about security that turned out to be wrong, and for having the openness and transparency to make their assertions publicly and to admit mistakes. Their Heartbleed challenge and their general approach to security helped the community take a theoretical problem in TLS (the technology that tries to secure the web) and prove that it was an actual issue deserving of immediate mitigation and attention.

Yesterday, Cloudflare announced a new product offering called Keyless SSL. They held the technical details back from the press until today. Now, with the full technical details released, I write this post as my excitement over a claimed innovation turns, once again, into disappointment at overhyped marketing and a product that doesn't deliver on the promise of secure end-to-end computing in the cloud.

The marketing material boasts that you can have "secure" TLS operation without divulging your Private Keys. The key claim made in their blog post yesterday ( https://blog.cloudflare.com/announcing-keyless-ssl-all-the-benefits-of-c... ) was:

"The lock appeared and the connection was secured, end-to-end." [emphasis mine]

There's only one problem: it's simply not accurate, and the connection is not secure end-to-end. That is, the connection isn't secured between the user and the organization they're doing business with. The reality is that the connection is being intercepted, in the middle, by a third party, with all the inherent security implications native to MITM proxies.

To their credit Cloudflare has put up a great technical description post of what they have built and you'll find the technical blog post at https://blog.cloudflare.com/keyless-ssl-the-nitty-gritty-technical-details/

Here's where the claims of security and end-to-end encryption break down: the Session Keys. In the Keyless SSL architecture, the Session Key is shared with Cloudflare. Whether it's the confidentiality of user input (think accounts and passwords), the content of the website (think bank balances), the security policy of web scripting (think same origin), persistent storage access (think cookies), and so on and so forth, it's the Session Keys that form the foundation of the web's security and confidentiality model. Without confidentiality of the Session Keys, you don't have any security.

With the Session Key you have read/write ability to view and modify any of the data flowing between the user and the organization they are doing business with. This gives rise to a host of legal problems that come from the /ability/ to break TLS connections, but the short of it is: if you can read and write the confidential data in the middle of the connection (like at Cloudflare), then the session is no longer end-to-end secure, and at some point, if you're big enough, you should probably expect a government will come knocking (or even hacking).

If you're an organization and you willingly give those Session Keys to a third party, you're just deputizing them for your entire online business. The product is anything but "Keyless", and involves significant confidential data disclosure to Cloudflare. While the service is running (read: the normal state of things), they have all the same authorities and problems that you do as it relates to session security and maintaining the confidentiality of your user information.

So let's go back to banking. Banking is hugely complicated, but one of the key obstacles to its adoption of cloud technology is that strong privacy-protection rules govern banking confidentiality. These prohibit a third party from receiving confidential information about account holders, and they make it impossible to share a Private Key or break encryption to allow for the use of a CDN service. Such prohibitions apply despite the benefits to the bank of using a CDN, such as improved fault tolerance and service resiliency. Kicking the proverbial can down the road from the SSL certificate Private Key to the Session Key doesn't change anything as it relates to the confidentiality of banking information, like seeing the user's username, password, or account balances. In both the Private Key and Session Key scenarios, Cloudflare's service can read your client's information (compelled, accidentally, or hackerishly), and that's the rub: it's just not end-to-end secure.

Law Enforcement and Political Risk

There's also the issue of Law Enforcement Agencies (LEAs) using the Cloudflare service to break into user accounts. In effect, a Cloudflare-like system expands the range of actors that can be forced to disclose sensitive key material to government agencies. This includes ongoing and logged access to the generated Session Keys and TLS parameters, which could be compelled by government order.

We all know about the cheeky spy-agency 'SSL added and removed here' slide, but even outside the clandestine, we have witnessed the willingness of American authorities to compel American companies to disclose this type of information before. The case in point is Lavabit. In the Lavabit case, the company, which was strongly marketing privacy and security as key service selectors, was forced by legal process to provide cryptographic material to facilitate the installation of interception devices against the entire Lavabit service. The technical design of the system did not protect against such a brazen, service-wide approach to user-account access by law enforcement. Thankfully Ladar Levison, after a long legal fight, shuttered the Lavabit service as a result of the political damage.

This is all to say that technical security to guard against oneself as a bad actor is extremely important. It takes more than rhetoric and the hope of doing good to build truly secure services that are immune from the political risks we saw in Lavabit. If you want to know more about the Lavabit nightmare, Moxie Marlinspike has an absolute must-read on the subject: http://www.thoughtcrime.org/blog/lavabit-critique/

The lesson here is that Cloudflare, and other services trying to implement TLS offerings, must truly understand the scope of the political risk they create when they start managing keys, even Session Keys, on behalf of others.

They must be up-front with partners about these very real risks.

Practical Abilities - Private Keys vs Session Keys

The only significant difference between the Session Key and the Private Key in a TLS setup is that the Private Key comes first in the sequence. The end goal of the handshake is to derive a shared Session Key, and it's this Session Key that provides the abilities and confidentialities expected of TLS. The TLS Handshake/Private Key process can be thought of abstractly as a Session Key minting machine, but it's the Session Keys that actually work the locks. Once you have the Session Key for a connection, the Private Key is no longer relevant to that connection.
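To see why, here is a minimal sketch of the TLS 1.2 key schedule (RFC 5246), assuming an RSA key-exchange suite for simplicity. The certificate Private Key's entire role is to recover the premaster secret; everything after that runs on the Session Keys alone.

    # Sketch of TLS 1.2 key derivation (the RFC 5246 PRF, SHA-256 based).
    # Once the premaster secret exists, the Private Key never appears again.
    import hmac, hashlib

    def prf(secret: bytes, label: bytes, seed: bytes, length: int) -> bytes:
        seed = label + seed
        out, a = b"", seed
        while len(out) < length:
            a = hmac.new(secret, a, hashlib.sha256).digest()
            out += hmac.new(secret, a + seed, hashlib.sha256).digest()
        return out[:length]

    def session_keys(premaster: bytes, client_random: bytes, server_random: bytes) -> bytes:
        # premaster: recovered with the certificate Private Key in RSA suites
        master = prf(premaster, b"master secret", client_random + server_random, 48)
        # MAC keys, cipher keys and IVs for, e.g., AES_128_CBC_SHA (2x20 + 2x16 + 2x16)
        return prf(master, b"key expansion", server_random + client_random, 104)

Everything in the list below, from reading passwords to impersonating the service, follows from holding the output of this derivation, not from holding the Private Key.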

With access to a service's Session Keys you can (a short demonstration follows this list):

- Intercept confidential user information. (usernames, passwords)
- View and change the content of user web pages. (MITM attack, see bank balances, etc)
- Identify and isolate specific website users. (user profiling)
- Be subject to government requests for data interception and manipulation.
- Suffer a security breach the same as a bank could.
- Access user information like cookies and browser storage.
- Script against the user's browser as the domain.
- Cryptographically impersonate the service and authenticate with an end user.
- (Future? Bind the service to a specific TLS key.) https://tools.ietf.org/html/draft-ietf-websec-key-pinning-20
- Operate an app firewall/CDN service and inject captchas and similar into secure sessions.
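As a toy demonstration of the first two items, anyone holding the symmetric Session Key can both read and rewrite "secure" traffic. This sketch uses the third-party Python 'cryptography' package; the key and nonce handling are stand-ins, not a real TLS record layer.

    # Toy illustration: a middlebox holding the Session Key reads and
    # rewrites traffic at will. Not a real TLS record layer.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    session_key = os.urandom(16)      # the key shared with the middlebox
    aead = AESGCM(session_key)

    nonce = os.urandom(12)
    record = aead.encrypt(nonce, b"balance: $12,345", None)   # user <-> bank

    # The middlebox can read the plaintext...
    assert aead.decrypt(nonce, record, None) == b"balance: $12,345"
    # ...and can substitute its own in place of the original record
    # (re-using the record's nonce, as an in-place substitution would).
    forged = aead.encrypt(nonce, b"balance: $0", None)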

Only one very small part of the TLS operation is performed by the certificate Private Key in question, and it's none of the things we really care about, like the ability to maintain communications confidentiality and respond to law enforcement as a first party. Really, the only significant new ability the Keyless solution offers over a shared Private Key solution is the ability to turn off the creation of new Session Keys; in the case of Cloudflare's service becoming compromised, it's a better kill-switch. If that's the only contingency you're planning for, Keyless is right for you.

If that's not your contingency, and you have practical issues with third-party data access (legal, policy, malicious attackers), well then, it's as Moxie described of Lavabit:

"Unfortunately, their primary security claim wasn't actually true."

To translate to Cloudflare Keyless SSL, I'll posit:

Unfortunately, their end-to-end security claim wasn't actually true.

Reinventing the wheel

The really bad news, though, is that what they "invented" using Raspberry Pis and fabled stories of skunkworks development was already largely found in commercial off-the-shelf products known as networked hardware security modules (Net-HSMs).

Thales and Amazon, among others, make networked HSMs. You put your Private Keys in them, they stay in the datacenter, and then you point a webserver (or a group of webservers, like Cloudflare's CDN) at them using an OpenSSL engine (among other methods). The HSM handles the Private Keys, signs the secrets and, in effect, provides a similar kind of service to what Cloudflare is doing with their signing daemon. The webserver just offloads the cryptographic signing operations to the HSM via OpenSSL directly.
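The pattern shared by both approaches is a signing oracle: the web tier never holds the Private Key, it just ships to-be-signed handshake data to a key server and gets a signature back. A minimal sketch, using the Python 'cryptography' package, with the network transport, authentication and framing omitted:

    # Sketch of the signing-oracle pattern common to Keyless SSL and
    # networked HSMs. In reality key_server_sign would be a remote call.
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # --- key-server / HSM side: the Private Key never leaves this process ---
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    def key_server_sign(tbs: bytes) -> bytes:
        # Note: the oracle signs whatever it is sent.
        return private_key.sign(tbs, padding.PKCS1v15(), hashes.SHA256())

    # --- web-server / CDN side: holds only the public half ---
    public_pem = private_key.public_key().public_bytes(
        serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo)

    handshake_params = b"client_random|server_random|ServerDHParams"
    signature = key_server_sign(handshake_params)   # remote call in practice

Note that the oracle, as sketched, signs whatever it is sent; that property is exactly the confused-deputy risk discussed below.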

So really, there's nothing all that new to see here; networked HSMs have been around for a long time, and they do practically the same job, albeit at a high cost. However, due to the problems inherent in trusting keys (even Session Keys or oracles) to third parties, they have never really been popular for third-party access or use. They're primarily used within an organization as a defence-in-depth technique to limit the damage caused by network intruders.

New security risks

Contrary to claims that the service provides the equivalent of on-premise security for SSL keys, I find there are a number of entirely new risks presented by the model. Just a few of the newly expanded security vulnerabilities include:

- Trusted Cloudflare staff compromising the service. (Third party employee risk)
- Government agencies "hunting sysadmins" at Cloudflare. (LEAs, NSA, CSEC, GCHQ, etc)
- Hacker risk both at Cloudflare and with oracle access control. (unauthorized use)
- Technical downtime risk. (more points of failure)
- Oracle attack risk.
-- This is where the oracle is tricked into revealing something about its internal keys. The post covers a couple of these known attacks, padding oracles and timing attacks, but at least for the latter it doesn't solve them; it just pushes responsibility off to the OpenSSL library which, while the best we have, suffers regularly from new attack vectors and zero-day vulnerabilities. I would be curious to see how the signing-oracle design will stand up over time to the increasing sophistication of adaptive chosen-plaintext/ciphertext attacks, among others.
- Confused deputy risk. (attacking the oracle to sign malicious data)
-- This is where something goes wrong at the service and you sign bad data anyway: for example, the signing of specially crafted data. There are a number of TLS security concerns regarding the signing of crafted data, and this presents an entirely new risk when a third party is involved and able to get you to blindly sign data.

and the list goes on and on. In the world of TLS security, extra eyes and extra legal organizations (possibly in different jurisdictions) with access to data create massive risk. Oracles that blindly sign or decrypt arbitrary cryptographic data are generally frowned upon as an architecture, as are deputized daemons that cannot verify the source of their inputs; as a result, I have to question the idea that the Cloudflare Keyless SSL solution provides the same level of security as on-premise keys.

So, with the Keyless SSL security claims, in my opinion, thoroughly debunked, we're still left with the fairly 'classic' E2E problem: how can we leverage the cloud's benefits while maintaining the confidentiality of the communications?

Cracking that E2E security nut to work with intermediaries will be one of the key research projects of our generation. It's one of the reasons I was so excited when Cloudflare announced they'd managed to build an E2E-secure CDN, and why I was optimistic that if anyone could solve it, it was them.

Sadly, with the announcement of the Keyless SSL technical details, it seems that future is still over the horizon.

As always, I welcome a response from the vendor and will happily update this post to include their response if provided.

Victoria Amalgamation - Grasshoppers and Ants.

Amalgamation is back in the news today and the polls look supportive, but is polling data that is ignorant of the financial consequences useful or actionable?

I’m a data guy, and when it was suggested that Victoria put a question to voters, “Are you in favour of reducing the number of municipalities in Greater Victoria through amalgamation?”, I thought about the issue and realized I had no information on which to base that decision.

Amalgamation is a super complex subject. Victoria has 13 municipal governments plus the CRD, a lot of redundancy and, as evidenced by the sewage issue, problems making decisions that don’t boil down to not-in-my-backyard. Amalgamation could be hugely helpful here, which would tend to bend my thinking to the YES side of the question, but then there’s this nagging question in my head: WHAT WILL IT COST?

I sought to answer that question. I pulled in all the favours from all the data agencies I could think of. Apologies to the Data BC team and Citizen Services, as I requested and FOI’d data from the province (which, it turns out, they don’t even track). I had a simple question to solve:

What is the financial position of each of the 13 municipalities in Greater Victoria?

Turns out, no one, not even the province, can answer that question, and I have the negative FOI response to prove it.

All municipalities are required to submit an annual report to the province, which includes data about debt, reserves, income, etc. It even has some data on non-financial assets (those things like sewers and roads that municipalities are principally responsible for). But, and here’s the rub, the data is historical: how much they spent, and how much the spend depreciated. An old city like Victoria, with its aging infrastructure, looks a lot smaller than it is on paper because so much of the infrastructure was installed in the early 1900s. So where’s the financial position really sitting? Is the value of a city its assets as classically understood? Are its liabilities really just financial instruments, spends and depreciation, or is the liability really the fact that the city has to maintain its infrastructure service level? We can’t turn off sewers or water mains, or stop maintaining bridges and roads.

There’s a surprising lack of sophistication in tracking that liability, and it has led to a phenomenon known as ‘borrowing from the pipes’, wherein a municipality defers critical maintenance to pay for politically popular amenities. It’s certainly hit the Greater Victoria region, and hit it hard over a number of councils, and is generations away from being fixed. This is a long-term problem that requires long-term solutions.

I set out to answer that question, though: what is the financial position of the 13 regional municipalities? And I’ve started to get answers, but only via very time-consuming FOI requests. No one has studied this, and the poll-accessible public has no idea what this amalgamation thing will cost them.

Amalgamation supporters suggest that studies come after the question, but for me the question is unanswerable. I would support amalgamation as a philosophy, but not if my Dad’s household taxes (he’s a Saanich resident) go up while services go down, to cover off municipalities that have failed to maintain their assets.

Aesop’s fable of the Grasshopper and the Ant comes into play here. While the ant dutifully toiled all summer to put away food for winter, the grasshoppers just sang and played. When winter came, the grasshoppers were banging at the ant’s door for food, only to be given a hard lesson in planning for the future. This region’s municipalities range from ants to grasshoppers.

Not one to take no for an answer, I have begun to FOI the region’s municipalities for data on this ‘borrowing from the pipes’ question. The City of Victoria was the only municipality to proactively publish the information on its website, and I now have three other FOI responses, from Saanich, Esquimalt and Oak Bay.

The data’s up at Google Docs; forgive the formatting, as it’s a transitory dataset and will be cleaned up when I’m done.

Here’s the 30-second version: you take the book value from the annual report (what the municipality considers the financial value of the assets in the ground) and divide it by what it will cost to replace those assets when their useful life is up (the replacement-cost figure). This gives you a ratio; let’s call it the McArthur Infrastructure Ratio. This isn’t a perfect measure, and lots of problems have been pointed out with it (inflation and appreciation of fixed assets being the biggest issues), but we can factor out most of these as they are comparable between cities. On a per-city basis the ratio isn’t particularly informative, but when compared to its neighbours, it tells a story.

So far the ratios in Victoria break down like this;

Format
City - McArthur Infrastructure Ratio (Book Value : Replacement Cost) [Future Liability R-B]

Saanich - 39% ( $758,105,520 : $1,946,400,000 ) [$1,188,294,480]
Esquimalt - 34% ( $77,312,184 : $219,560,000 ) [$142,247,816]
Victoria - 18% ( $342,756,413 : $1,708,000,000 ) [$1,365,243,587]
Oak Bay - 10% ( $49,548,291 : $485,039,900 ) [$435,491,609]
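For those who prefer code to prose, here is the same calculation as a tiny sketch, checked against the Saanich row above:

    # McArthur Infrastructure Ratio: book value over replacement cost,
    # with the future liability as the difference. Figures in CAD.
    def infrastructure_ratio(book_value: int, replacement_cost: int):
        return book_value / replacement_cost, replacement_cost - book_value

    ratio, liability = infrastructure_ratio(758_105_520, 1_946_400_000)
    print(f"Saanich: {ratio:.0%} ratio, ${liability:,} future liability")
    # -> Saanich: 39% ratio, $1,188,294,480 future liability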

I’m working on getting all 13 prior to November, but FOI is a slow process.

What this tells us is that Saanich is full of ants, prudently paying for their infrastructure as it ages and deferring amenities until they can be afforded. Oak Bay, not so much; lots of happy singing and chirping coming from that corner of the region. Victoria sits in the middle. Most importantly, there are billions in future infrastructure liability for our next generation.

So with that in mind, what does the amalgamation question look like? Well, it looks like Saanich residents are going to get a pretty raw deal: on a financial basis they bring more than double what Victoria does to the table, and four times as much as Oak Bay. They would certainly lose in an amalgamated structure. Oak Bay, on the other hand, would do very well financially, and it is this fact that, I believe, is so strongly driving this agenda in the wealthier circles of town.

All of this is to say, it’s way too early to ask the question “Are you in favour of reducing the number of municipalities in Greater Victoria through amalgamation?” I would instead ask, “Do you support committing funding to study the issue of municipal amalgamation?” That would be the democratic question. Asking for the opinion of an uninformed public is little more than ‘distraction’, and the result isn’t useful. Sadly, making a case based on data doesn’t seem to be on the agenda for the pro-amalgamation lobby, and we saw that again today with this poll.

Metadata, privacy, access and the public service.

On May 15, 2014 the OIPC (Office of the Information and Privacy Commissioner) released Order F14-13 [pdf], denying a Section 43 application (to disregard an FOI request). Being the data/privacy policy wonk that I am, I tend to read all the orders put out by the OIPC; there’s usually something interesting. This one was really interesting.

Someone had filed a request for the metadata associated with government emails — that is, who emails whom, and when — but excluding the content of those emails. The Open Data community has long mused about filing such a request, as it could be the single most important dataset for understanding how our government works, however, it was always considered extremely audacious to file as the public service was sure to have a strong reaction to an unprecedented level of analysis of their communications. On May 15, I had no idea it had been filed, or that there was even a case before the commissioner.

So, upon seeing the OIPC ruling, I filed an FOI request with Citizens Services (now denied) for the Section 43 application and the supporting documentation that resulted in the order. I was hoping to learn why the province felt it should ignore this request, and under what justification. I also contacted the privacy commissioner’s office to see if there was any way to become an intervenor on the file and provide an amicus-type opinion for the commissioner’s consideration.

Through the opendatabc mailing list, I posted the story, and Paul Ramsey came forward and shared that it was his request. For those who don’t know, Paul is a brilliant data geek, having helped build the PostGIS extension to the PostgreSQL database software that powers much of the internet; if anyone has the ability to work with this information, it is he.

Moving ahead 30 days, I have my FOI answer: records prepared for an officer of the legislature (i.e. the OIPC) are outside the scope of FOIPPA, and my request for the Section 43 application and documentation was denied outright by Citizens Services. The OIPC process wasn’t fruitful either, as the Section 43 matter had already been ruled on and they weren’t sure the file was going to come back to them, so no avenue for comment there. (I’m now told, via Paul, that the request has been denied again subsequent to the Section 43 ruling and has gone back to the commissioner for another round. I’m still hoping to be able to provide comments.)

This issue might be the single most controversial FOI request filed in BC history, and it will set a lasting and groundbreaking precedent. At question is whether the public service is accountable to the public in its metadata records. The public interest in the metadata cannot be overstated, nor can the complexity of the access rights in question.

As a comparison, however: CSEC, Canada’s signals-intelligence agency, spends obscene amounts of money analyzing the metadata of foreign governments, under the guise of increasing Canadian economic advantage. Will the FOI legislation, which allows citizens to oversee our own government, be given the same funding and economic priority as, say, CSEC spying on Brazil’s government?

A core question is whether it is ‘just metadata’. Privacy commissioners have disagreed, citing privacy implications, while spy agencies have argued it’s no big deal, carrying different privacy expectations than, say, a telephone wiretap. But, and here’s the crucial part, when it comes to transparency of the public service, where there are explicitly waived privacy expectations found in email policy documents and a crucial right of public access, what will the balance be for public service metadata?

In my opinion, this could be the single most valuable dataset ever released under FOI and this request will likely define public sector metadata policy for generations to come.

It is crucial that we get it right.

The state of Open Data 2014

I was reading an old blog post I wrote in 2011 about the state of Open Data in BC ( http://www.unrest.ca/2011-state-of-open-data ) and thought I’d pen another annual update. I should do this every year, but sadly, I’ve not had the time to really blog lately.

In 2011, I highlighted rights and privacy clearing, cost recovery, scaling and license diversity as major failures and opportunities for course correction in the emerging open data field — and I’m sorry to say, many of these problems materialized.

But we’ve also had a lot of successes since 2011 — the Open Data Society of BC (disclosure: I’m a member of the board of directors) has held two highly successful Open Data Summits that have convened hundreds of people from across Canada and even the world to talk about Open Data. My favourite memories of these events were the edgy talks, like Pete Forde’s entitled ‘People are dying to get the data’ ( https://www.youtube.com/watch?v=s7rpKYSZUDo ) because they really bridge a gap between the civil service and the data activism that is occurring all over the web today. These events help bring together people who would otherwise never meet, and invite them to learn from each other.

The Data BC group of the provincial government has been doing a great job with what limited resources they have. In the last couple of years they’ve facilitated the publishing of unprecedented transparency/accountability information in the form of fiscal data, the personal information directory and geographic data that has been hugely helpful to a number of stakeholders. They’ve done considerable work on licensing and on trying to source data, even where it doesn’t exist. I’ve come to like and respect the work they’ve done for BC in a challenging environment.

But there’s a problem in the foundation of this group as well: they don’t have a budget to replace funding for datasets that are currently being charged for (the cost-recovery problem), they don’t have the statutory ability to command data release from other ministries, and they don’t have the resources needed to implement the commitments made in the G8 Open Data Charter, especially the transformative commitment to an ‘Open by Default’ policy. This fix will have to come from cabinet, take the form of significant budget increases and involve the creation of a legislative framework. Moreover, the architecture of data release will have to change; a central team fetching data for a portal won’t scale. Data release has to be federated within each ministry, and just as each ministry has an officer responsible for handling FOI requests, so too should it have one to handle data requests. It’s 2014; it’s time to make data exchange as seamless and as common as email in the public sector.

The lawyers are also hurting the economics of open data — while much progress has been achieved on licensing, there are still very real debates about conformance to the Open Definition and serious problems with the legal disclaimers of liability for intellectual property and privacy rights clearing. It is my belief that these issues are hurting commercial investment in Open Data.

Across the country, other groups are also making positive progress. The Federal Government included a large funding commitment for Open Data in their 2014 budget, they’re hosting hackathons (which they misspell as appathon [because hackers = bad, of course]) and Treasury Board President Tony Clement is taking every opportunity to talk about the benefits of open data and the future promise of a more transparent public service. There have been major wins with the digital filing of access-to-information requests and with citizen consultation in this area. The publication of valuable datasets like the Canada Land Inventory and Border Wait Times is also impressive.

There, too, there are big failures. Canada Post is suing innovators over their use of postal codes ( CBC Story ) and DFO’s hydrographic data remains closed, mostly collecting dust. The government seems to be ignoring responsibility for Canada Post’s behaviour, but most will point out that it has jurisdiction over the Canada Post Corporation Act and could make a simple, common-sense legislative change to resolve this embarrassment to our federal open data commitments.

We’re making progress municipally. The City of Vancouver has made amazing strides in digital archiving, making digitized archives available on the local intranet in a unique and groundbreaking way that deals with intellectual-property concerns. The City of Victoria has embraced open data: they launched VicMap (making their cadastral data open), began webcasting council meetings and published an open data portal. They even hosted a hackathon, with Mayor Dean Fortin and Councillor Marianne Alto helping the Mozilla Webmaker folks teach children about digital literacy and creating the web [ link ]. The City of Nanaimo continues to lead the pack with realtime feeds, bid opportunities, maps of city-owned fibre resources, digitally accessible FOI processes and so much more.

In the private-sector and NGO space there are many notable projects, the GSK-backed Open Source Malaria project being my favourite. There are also successes like Luke Closs' and David Eaves’ recollect.net in the civic app space.

The hacker space is also seeing some success, with proof-of-concept prototype applications developed by citizens at hackathons going on to inspire civil servants to create their own versions and publish them widely. The BC Health Service Locator App and the Victoria Police Department App both get credit for listening to citizen input. Other apps have been created and have seen little to no uptake, like those developed to help citizens better understand freshwater fishing regulations (mycatch), or storefront display apps to help the homeless find shelters with available space (VanShelter). The next steps here are clearly to create bidirectional projects that allow both civil servants and citizens to work collaboratively on applications together using the open source methodology. (Who wants to be the first to get the Queen in right of British Columbia into using GitHub?)

Other projects have failed to find traction due to a lack of data, or bad-quality data. My OpenMoonMap.org site is failing due to unreliable WMS-only access to data from NASA, which is down more often than it is up. The lesson here: online services are no replacement for downloadable primary-source data. mycelium.ca (my House of Commons video service) is in its 7th year of operation, and continues to prove that even simple prototype apps can be useful and long-lived, drive policy change (House of Commons Guidelines) and find feature uptake (Hansard now has video clips embedded). Hopefully same-day recording download, clipping and linking will be added to ParlVU, and this app will no longer be needed.

For the coming year the Open Data Society of BC is crowdsourcing its agenda, and I’d encourage you to participate in that discussion and to join or support the society, via OpenDataBC-Society.

I know I missed some people and agencies who are doing great things, so please leave comments if I missed you. (Tweet me @kevinsmcarthur for an account, as I don't monitor this site's admin pages often.)

(Updated) Evaluating the security of the YubiKey

The folks over at Yubico have responded to this article, and I'm happy to post their letter. It gives a little additional context to the issues I presented and a critical 'other side' response. I'm happy to see the company actively engaging and addressing the issues really quickly. There are a couple of bits that need clarification. For example, the nonces I point out are actually used in other places than inside the synclib, and the 'facilities' issues re the Las Vegas 'shipping center' were purposefully left vague to avoid exposing what appears to be a residential address.

-- Yubico Letter --

At Yubico, we enjoy reading articles from security experts such as yourself, and we appreciate the visibility you provided Yubico through your detailed evaluation of our Yubikeys. Our security team at Yubico takes your assessment very seriously, but there are some clarifications and intricacies that we wanted to share with you that we’re confident will convince you that the Yubikeys offer the highest grade of enterprise security in a comparative product class. Please feel free to contact us if you have any further questions/comments…

The Protocol.

- The Yubikey personalization app saves a .csv logfile with the programmed key values, meaning a malware-based attack may discover the log files on block devices even when the files have been deleted.

In the most popular scenario, customers choose to use YubiKeys configured by Yubico, where the cryptographic secrets are generated using the YubiHSM’s TRNG and programmed into YubiKeys in Sweden, the UK or the US at our programming centers, which use air-gapped computing [at least 1 air gap between the programming station with its control computer and any network]. The plain-text secrets database generated on the YubiHSM is encrypted by the customer’s validated public PGP key and signed by the programming station control computer’s secret key; the plain-text file is zapped in place and the secrets are securely deleted from disk and memory on the programming station. At Yubico, we call this the “Trust No One Model”!

The Yubico personalization app provides customers the flexibility to program their own keys at their convenience. Yubico does acknowledge that customers programming their own keys may not be aware of this risk that the AES keys are in the .csv file and we are working to change the default behavior and provide additional warnings to inform users of the potential risks.  

- Replay prevention and API message authentication are implemented on the server receiving the OTP. This has resulted in a number of authentication attacks, like https://code.google.com/p/yubico-pam/issues/detail?id=18, which are now corrected in later versions of the protocol. The design, however, trusts the server with the authentication security and thus presents a problematic architecture for shops that do not have crypto-aware developers and system admins able to verify the security setup is working as intended.

As you have astutely observed, we’ve fixed the issue you’ve seen in later versions of the protocol. Customers who don’t have adequate crypto-aware developers and system admins to secure authentication servers should work with solutions from our trusted partners.

- The replay prevention is based heavily on the server tracking the session and use counters and comparing them to a last-seen nonce. It also depends on application admins comparing public identities. This should ensure that the keys cannot be intercepted and replayed. Some servers do not properly validate the nonce or the HMAC, or properly protect their databases from synchronization attacks. Some implementations do not even match the token identities, and accept a status=ok as a valid password for any account. (Try a coworker's yubikey on your account!) The weak link in the Yubikey-with-custom-key scheme seems to be the server-application layer.
- The Yubikey protocol, when validating against Yubico’s authentication servers, suffers from credential reuse. It is vulnerable to a hostile server that collects a valid OTP and uses it to log in to another service, and it's vulnerable to hacking or maliciousness of the authentication servers themselves. You are delegating the access yes/no decision to Yubico under the cloud scheme.

Customers who are concerned about using the YubiCloud infrastructure for YubiKey OTP validation should consider implementing their own authentication and validation servers. Yubico provides all the necessary server components as free and open source code. Customers may also choose to configure and use the YubiKey with their own OATH-based authentication servers.


The code.

- The yubikey-val server contained a CURLOPT_SSL_VERIFYPEER = false security vulnerability in the synchronization mechanism.

We have fixed this issue about certificate validation when using https for sync between validation servers. We do however want to point out that the vulnerability had only limited repercussions. This case is closed on GitHub, https://github.com/Yubico/yubikey-val/issues/15

- The nonce and other yk_ values, however, were not, and could be modified by a MITM attack. The attack presents a DoS against the tokens (by incrementing the counters beyond the tokens') and possibly a replay issue against the nonces; however, replay concerns require further study and I have not confirmed any exploitable vulnerability.

The nonce used in the ykval protocol from the php client is predictable but it is unclear if this is an issue. We will be doing further review to address any possible exploit vectors if they exist. The case is still open https://github.com/Yubico/php-yubico/issues/5

- There were also instances of predictable nonces. The php code 'nonce'=>md5(uniqid(rand())) is used in several key places. This method will not produce a suitable nonce for cryptographic non-prediction use.

The server_nonce field is only used inside the synclib code to keep track of entries in the queue table so we deem this as acceptable. We provided further explanation on GitHub https://github.com/Yubico/yubikey-val/issues/14

- The php-yubico api contains a dangerous configuration option, httpsverify, which if enabled will disable SSL validation within the verification method. Again, a defence-in-depth approach protects the transaction, with the messages being HMAC'd under a shared key, mitigating this as a practicable attack.

We are working to resolve this highlighted issue in order to provide defense-in-depth protection and eliminate the possibility of turning off https certificate validation in the php client. This case is still open, https://github.com/Yubico/php-yubico/issues/6

- The C code within the personalization library contains a fallback to time-based random salting when better random sources are not available [ https://github.com/Yubico/yubikey-personalization/issues/40 ]; however, I cannot think of a case when a *nix-based system would not have one of the random sources it tries before falling back to time salts.

We made the relevant changes to the code and addressed this issue about salting when deriving a key in the CLI personalization tool on Windows. Thank you for pointing this out. https://github.com/Yubico/yubikey-personalization/issues/40

The Neo

- As a result, for both personalization issues and third party software, these modes aren’t useful in a real-world deployment scenario, but may be useful to developer users. That said, other smartcard solutions support RSA signing in a smarter way than layering the OpenPGP application on a JavaCard, so are likely a developer’s choice over the YubiKey Neo in CCID mode.

We don't quite agree with your analysis of our NEO product and want to point out that there is a distinction between YubiKeys used for development work versus production roll-outs. We intentionally allow users to re-configure the keys, and while this allows for possible attack vectors, it does not mean the product or protocol is insecure. In a production roll-out, most of our customers choose to protect their Yubikeys with a configuration password.

Therefore, for the NEO, we allow development users to turn the CCID functionality on and off as they choose. In a production roll-out, this function is pre-configured and protected. In addition, it is not obvious to us what makes the NEO in CCID mode less usable than any other CCID + JavaCard smartcard product implementation. We have also introduced a PIV applet that allows for all flavors of PKI signing, including RSA.

Logistical Security

We respect the due diligence done by you to find out about our locations through public information available on the web. However, although Yubico resides in a shared facility which houses other popular internet companies, we have a dedicated office which is accessible by authorized Yubico employees only. Our current focus is on delivering products with the highest level of quality and security, and the big corporate office will come soon ☺.

Just-In-Time Programming & Efficient Packaging offers Yubico a competitive advantage:
Because of the size of the YubiKey and our unique packaging technology, one operator and the Yubico Just-In-Time Programming Station can program over 1,000,000 YubiKeys per month. YubiKeys are handled in slabs of 10 trays of 50 YubiKeys [500 YubiKeys in total], with 4 slabs per 10kg box [2,000 YubiKeys]. A pallet of 100,000 YubiKeys weighs less than 500kg. Therefore, Yubico logistics and programming can be performed in facilities that are not available to other authentication hardware companies. Our logistics and programming team have all been with Yubico for more than 5 years and are among our most loyal and trusted employees. We pay particular attention to the security of our programming centers, and update our processes consistent with the advice of our market-renowned security experts.


Thank you!


-- Original Article Below --

Every so often I get to take a look at a new security device with the hope of replacing our existing PKI-based systems, which, while very secure, are an administrative nightmare and don't lend themselves to roaming profiles very well. This time it's the Yubikey Nano and Yubikey Neo devices from Yubico that I'm evaluating.

All Yubikey devices support a custom OTP generator based on a home-rolled OTP implementation. There's a notable lack of formal proof for this scheme, but the security boils down to an AES-128 shared-secret design, which is reasonably secure against in-transit interception and key leakage. This appears to be the primary threat model the Yubikey is trying to protect against, and may be its most secure property.

The Protocol.

The protocol appears to have seen limited review against privileged attacks, and presents a number of security concerns, including:

- The ability to reprogram the devices to a chosen key before account association, due to default configurations being shipped both programmed and unlocked. Users are warned against reprogramming their devices as they will lose the API validation ability; however, an upload mechanism exists to restore it. Check to see if your key starts with vv rather than cc when using the Yubico auth servers, as this may indicate reprogramming.

- The Yubikey personalization app saves a .csv logfile with the programmed key values, meaning a malware-based attack may discover the log files on block devices even when the files have been deleted. Needless to say, with the AES keys from the CSV, the security of the scheme fails.

- Replay prevention and API message authentication are implemented on the server receiving the OTP. This has resulted in a number of authentication attacks, like https://code.google.com/p/yubico-pam/issues/detail?id=18, which are now corrected in later versions of the protocol. The design, however, trusts the server with the authentication security and thus presents a problematic architecture for shops that do not have crypto-aware developers and system admins able to verify the security setup is working as intended.

- The replay prevention is based heavily on the server tracking the session and use counters and comparing them to a last-seen nonce. It also depends on application admins comparing public identities. This should ensure that the keys cannot be intercepted and replayed. Some servers do not properly validate the nonce or the HMAC, or properly protect their databases from synchronization attacks. Some implementations do not even match the token identities, and accept a status=ok as a valid password for any account. (Try a coworker's yubikey on your account!) The weak link in the Yubikey-with-custom-key scheme seems to be the server-application layer.

- The Yubikey protocol, when validating against Yubico’s authentication servers, suffers from credential reuse. It is vulnerable to a hostile server that collects a valid OTP and uses it to log in to another service, and it's vulnerable to hacking or maliciousness of the authentication servers themselves. You are delegating the access yes/no decision to Yubico under the cloud scheme.

The code.

Yubico has taken a radical transparency approach and published all their source code at https://github.com/Yubico. Despite what follows below, this approach should breed confidence in the product over time. When compared to closed-source products, the Yubico product would appear to have a leg up when it comes to the identification and correction of security flaws. They are also taking a defence-in-depth approach to API security by signing and checking values even over TLS links. However, while this mitigates a number of coding concerns, it may introduce new ones, and I remain concerned about the use of an HMAC signature, over user-controllable data, encrypted under TLS, as it is a case of a MAC-then-encrypt scheme.
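For readers implementing against the validation API, that defence-in-depth layer amounts to checking a MAC over the response parameters. Here is a sketch of the check, modelled on my reading of Yubico's validation protocol (the 'h' parameter carrying an HMAC-SHA1 over the sorted key=value pairs under the shared API key), with a constant-time compare so the check itself doesn't become a timing oracle:

    # Sketch: verify a validation-server response signature. Parameter
    # names follow my reading of the protocol docs; treat as illustrative.
    import base64, hashlib, hmac

    def verify_response(params: dict, api_key_b64: str) -> bool:
        params = dict(params)                 # don't mutate the caller's copy
        claimed = params.pop("h")             # signature sent by the server
        key = base64.b64decode(api_key_b64)
        msg = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
        mac = base64.b64encode(
            hmac.new(key, msg.encode(), hashlib.sha1).digest()).decode()
        return hmac.compare_digest(mac, claimed)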

I did some basic analysis of the PHP and C code, and found a number of concerning items. The yubikey-val server contained a CURLOPT_SSL_VERIFYPEER = false security vulnerability in the synchronization mechanism. Thankfully the developers had taken a defence-in-depth approach to the API, and the session and use counters were restricted from being decremented. The nonce and other yk_ values, however, were not, and could be modified by a MITM attack. The attack presents a DoS against the tokens (by incrementing the counters beyond the tokens') and possibly a replay issue against the nonces; however, replay concerns require further study and I have not confirmed any exploitable vulnerability.

There were also instances of predictable nonces. The php code 'nonce'=>md5(uniqid(rand())) is used in several key places. This method will not produce a suitable nonce for cryptographic non-prediction use.
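The fix is a one-liner: draw nonces from the operating system's CSPRNG rather than composing them from predictable values. Shown in Python for illustration; in modern PHP the equivalent would be bin2hex(random_bytes(16)).

    # A suitable unpredictable nonce: 128 bits from the OS CSPRNG.
    import secrets

    nonce = secrets.token_hex(16)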

The php-yubico api contains a dangerous configuration option, httpsverify, which if enabled will disable SSL validation within the verification method. Again, a defence-in-depth approach protects the transaction, with the messages being HMAC'd under a shared key, mitigating this as a practicable attack.

The C code within the personalization library contains a fallback to time-based random salting when better random sources are not available [ https://github.com/Yubico/yubikey-personalization/issues/40 ]; however, I cannot think of a case when a *nix-based system would not have one of the random sources it tries before falling back to time salts.

Logistical Security

I also took the opportunity to look at the security of the Yubico logistics process, and came up with a number of questions, not the least of which was that my Yubikey was apparently shipped from a residential address in Las Vegas, Nevada. This gives me pause with regard to Yubico’s claims that the keys are “Manufactured in USA and Sweden with best practice security processes”. I have questions about the chain of custody of the keys.

A public-sources investigation into the addresses provided on the Yubico site suggests the addresses are shared with other firms and seem to be co-working spaces, rather than the typical corporate offices one would expect of a company providing security products to companies like Facebook and Google.

The Neo

The Yubikey Neo includes a JavaCard-based CCID/smartcard device and the ability to support the OpenPGP app. In testing, this is clearly beta software; it requires kludgey ykpersonalize commands with arguments like -m82. It's obviously not intended for widespread use, and as such it was discounted in favour of more mature CCID products.

Some of the YubiKeys also provide challenge/response and HOTP/TOTP clients. They're implemented via middleware that isn't first-party, is commercial software on OSX, and requires users to learn hotkeys to use.

As a result, for both personalization issues and third party software, these modes aren’t useful in a real-world deployment scenario, but may be useful to developer users. That said, other smartcard solutions support RSA signing in a smarter way than layering the OpenPGP application on a JavaCard, so are likely a developer’s choice over the YubiKey Neo in CCID mode.

Summary

In the end, I can't recommend the Yubikey to replace our PKI systems. The Yubico authentication server API represents a serious information leak about when users are authenticating with the service, and puts Yubico in a trusted-authenticator position. It's also the worst form of credential reuse: a hostile or malware-infected server can compromise unused OTPs and use them against other services.

What this means is that, when using the default pre-programmed identity, if someone were to break into any Yubikey-cloud based system we use, they could break into all of our systems by collecting and replaying OTPs that have not yet authenticated with the cloud. While you're using application x, it's using that credential to log into application y and pilfering data. Despite being told not to, most users reuse passwords across services, so the login credentials from one service will usually work with another, and the attacker has everything needed for a successful login. This is in stark contrast to PKI-based systems, which always implement a challenge/response using public-key cryptography. This also applies to any corporation using pre-programmed keys in a single-sign-on type scheme, as any hacked application server can attack other servers within the SSO realm.

Yubico’s, or our own, loss of the API keys (not the token’s AES key) used for server-to-server HMAC validation would also silently break the security of the system.

Because of the problems with the authenticator service architecture, we would have to program the keys into a single security zone, significantly weakening our currently partitioned services (currently, multiple PKI keys can be stored and automatically selected via certificate stores and/or PKCS#11). In the best-case scenario, this would leave us to program the Yubikeys ourselves and ship them out to end-users, adding significant costs to an already expensive device and weakening our security architecture in the process.

In short, the device’s default configuration is not sufficiently secure for e-commerce administration, and pre-programming the devices ourselves is not financially viable due to shipping and administration costs. The SSO architecture creates a single security zone across many services, which is neither desirable nor best practice.

I will continue to seek out and evaluate solutions that offer PKI-level 2nd factor authentication security without the headaches of administering a production PKI.
