Thursday, April 2, 2015

The Osmosis Lie: Crowded Trading Floors

From a recent job ad for analytics positions in a Canadian hedge fund:
"... no remote workers. We all sit on the trading floor."

Having worked in quant finance, I now use this as my litmus test. There is no good reason why researchers and programmers need to be physically co-located on a loud, crowded trading floor, in a bullpen, or in a cubicle bank, especially given modern workplace chat programs and convenient issue-tracking tools. Any organization that says programmers or quant researchers need to sit on the trading floor so that they can absorb information, learn the tradecraft osmotically, or whatever, is blowing smoke.

It's business-speak for "our company is mired in legacy tech, bureaucracy, and outdated standards." Such jobs have to pay above-market rates, award bonuses, and give disproportionate raises each year, because that's the only way people who are thoroughly burnt out from the Sisyphean task of overcoming the daily noise to focus will even bother coming to work. For better or worse, it seems that the only people who stay in these jobs for long do it for the money -- there is a huge culture of self-selling and an extreme willingness to bottomlessly compromise on tech-culture standards, like a proud martyr. If you join the firm as a technologist with actual skill and some common-sense opinions about software culture and best practices, you just get chewed up by the politics and nothing changes. As a result, the steady state of most of these organizations is one in which all of the transient signals from good engineers and culturally progressive technologists have been beaten down to noise, either because such folks last only a short while before quitting, or because they ultimately decide that keeping their heads down and collecting the paycheck is better than fighting endless culture battles.

Another red flag is a requirement that a programmer be "comfortable in any language" (what the hell does that even mean?) or, possibly, a false promise that "you can use whatever language you want." It speaks of bespoke reinventions of wheels all over the place and no modular, tested central library of common tools. I once interviewed with a firm that seized on my interest in Haskell. They said one of the perks of the job would be that I would personally be in charge of maintaining my own rapid-prototyping tools, so if I wanted to write them in Haskell I would be free to do so. At first blush this sounds interesting -- being paid well to write and learn Haskell! On closer inspection, it's total poison. Wait until you're forced to port some of your nifty Haskell constructs into R or Java and interoperate with some legacy system. Then you'll badly wish there had been enforced standards from the beginning, even if those standards had removed your so-called freedom to homebrew all of your own Haskell gadgets in the first place. This is the kind of dubious ploy that is attractive to inexperienced programmers who haven't lived through a shitstorm of migration and integration headaches. Progressive technology culture isn't about draconian, bureaucratic standards (the things that led us to enterprise Java), but it also isn't about having zero standards and zero foresight into the modularity of the larger system.

Basically, these jobs make you sit physically tethered to a trader or portfolio manager, beholden to that person's real-time whims about what needs to be computed. It's literally retail data exploration. You are one of those caricatures of scientists and engineers that you cynically mock in Michael Bay movies, butchering tech jargon while showing the Joint Chiefs a monitor full of gibberish that will save us from the Russian alien robots. You don't have time to blink; you just type! If it means hacking something together in an ungodly amalgamation of Excel, C++, Python, and R, and slapping some duct tape on it so the trader can use it before lunchtime, then by god that's what you must do. Documentation? Who has time? There was a standard library module that already did everything you just spent four hours reinventing? Too bad there was no time to research a rational approach to the system's design. Someone needs to reproduce a result you calculated six months ago? Good luck, buddy: here are my 5,000 lines of undocumented R code.

The kicker is that there is such a macho attitude of "this is just how finance is" that the company is usually proud of being like this, instead of recognizing that it's the very definition of developer hell and, more importantly, not at all required or implied as a domain constraint of finance. Instead of assigning traders beholden computation assistants, you could simply require traders to know how to do real programming. Or you could trust data scientists, statisticians, and other inference domain experts who can program to also develop trading strategies, instead of underemploying them as data secretaries. You can organize teams around common tools, common analysis motifs, etc., and build with reuse, modularity, and QA in mind, so that writing best-practices-compliant, tested software is not incompatible with meeting intense intra-day deadlines.

I wouldn't feel so much resentment about all of this if it weren't for the macho pride, the attitude that "this is just how finance is" -- crappy hacks, zero unit tests, slinging Excel sheets around with zero data provenance, hacking everything onto 20-year-old legacy C++ and Java, and allowing traders and economic domain experts infinite freedom to pick their computing tools (while letting them push the needless headache of integration onto someone else).

It's something the industry should be ashamed of and should be working exceptionally hard to change. But it's not even admitted as a problem -- it's lauded as if it were heroic: we don't need your wimpy best practices here; we just roll up our sleeves and do whatever it takes to make the system work. It's almost by definition a culture of mediocrity, since the only definition of success is whether the system as-is is making money -- very little thought is given to the counterfactual money left on the table by not doing things in a fundamentally better way. Just because a thing is good doesn't mean something else wouldn't be better.

Sadly, there are all sorts of political and market inefficiencies in the industry. Clients don't do a good enough job of holding investment managers accountable for returns, research, or technological best practices. When large institutional clients are involved, the deals between them and their investment managers are often born of bizarre political alliances, nepotism, internal career politics, and so on. Firms spend inordinate amounts of money constructing marketing apparatuses to spin every market outcome as though it either could not possibly have been avoided or proves the investment acumen of the firm, and institutional clients sadly buy it.

It's sad to say, but the most likely way for this nonsense to be fixed is outsourcing. Cloud computing frameworks will be created that present simple interfaces to the domain experts, and the start-ups that create those interfaces will be separate and free to employ whatever technological best practices they see fit. Simply by shifting the burden of the problem outside the stodgy, politically dysfunctional finance firm, it can suddenly be solved. It could easily imply significant layoffs for this sort of commoditized retail data-analyst role. It reminds me of the Max Planck quote: "Science progresses one funeral at a time."

Thursday, March 12, 2015

¬ Agile

Some of my thoughts on Agile got some Twitter press recently. I reproduced them below, with some light editing to make the series of comments into a single essay.

One complaint I have with Agile is that it takes the avoidance of a long-term plan too far — to the cartoonish extreme in which user feedback is essentially hooked up as a control system. If user feedback were coherent, this might not be a problem, but in general user feedback is all over the map, contradicts itself, and calls for feature implementations or bug fixes that are highly speculative from an engineering-investment point of view. If you are lucky enough to work on an Agile team where engineers are actually consulted about the feasibility of this kind of thing, then it might be OK, and Agile might really be little more than a set of bookkeeping conventions for you. But that's not at all how it plays out in practice. In practice you've got sales-facing and business-facing managers in the loop, pushing various agendas and always looking for an angle to play based on the work in the backlog and pointless metrics. The metrics are also a substantial detractor. Burndown is a joke, particularly when actual burndown is held up against an idealized linear burndown, and also when it is compared across teams and projects that vary wildly. Again, I suppose in some theoretical vacuum it might be OK if managers actually understood that you can't just look at half a dozen burndown graphs and generalize them into meaningful management decisions. But that's not what many middle managers do: instead they warp, twist, and misrepresent Agile metrics to suit their changing political circumstances and to lobby for their self-promotional agendas. It's really depressing as an engineer to see in real time (as story points are being partitioned out) how some political agenda is being lobbied for with useless metrics and how it directly affects your personal work. I've always found this article about estimation games particularly enlightening about all of that.

Agile is also extremely costly and wasteful, particularly of time. Take an average entry-level developer salary, which is not trivial these days, and tally up what a 15-minute stand-up every day equates to in actual dollars [1]. The false assumption is that such planning was going to have to happen anyway, but that's not really true. For all the clamoring that Agile streamlines planning, I've always found rapid iterations to result in more total planning time over the long run, kind of like the fractal behavior of measuring the coastline of Britain. It artificially feels like mini-planning every few weeks keeps you lean and focused on just the important stuff, but that's a mistake. You aren't aware of your global context or what the whole user-story feedback loop is steering you toward, and so the rapid planning is frequently needed to pivot away from a bad development path that would have been avoided entirely with more long-term, up-front planning. I'm not saying there should be 3+ months of Ivory Tower planning before you start a project, but the extreme of micro-planning is also too much. The right amount ebbs and flows, changing with personnel and the nature of the project. No framework will solve every problem, so why not be open to occasionally approaching something with a longer-term, research-focused planning period? Why preclude it by the very nature of the tightly maintained iteration schedules?

From a philosophical point of view, the lack of global context is the dealbreaker for me. Software is a fundamentally creative enterprise, and the whole reason it's worthwhile to incur the insane cost of software's complexity is that you can design it: you can use intelligent optimization power to consider the vast space of possible designs, and you can intentionally, deterministically, and predictably steer your work into a pocket of design space that meets your many needs.

When you agree to be steered by a feedback loop made essentially of a lightly business-filtered stream of customer feedback, you throw this optimization power out the window. You throw away something deterministic and efficient, like gradient descent, and replace it with something random and inefficient, like simulated annealing. Quite literally, you are making somewhat random tiny changes and sitting back to witness whether customers randomly accept or reject them. You are evolving your product in a manner more like Darwinian evolution, which is not sensible when the alternative is direct, creative optimization. If the nature of the design problem is so challenging that you are forced to optimize this way, it should be lamented, not celebrated as if it were a superior business strategy. And you should vigilantly search for ways to rescope the problems so that they are amenable to direct engineering rather than control-system steering from user feedback.
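The contrast between directed optimization and feedback-driven mutation can be made concrete with a deliberately tiny toy (my own illustration, not from the original argument): minimizing f(x) = (x - 3)^2 by following the gradient versus by proposing random tweaks and keeping only those that happen to be "accepted" as improvements.

```python
import random

def f(x):
    """Toy design-quality landscape: best design is at x = 3."""
    return (x - 3) ** 2

# Directed design: gradient descent marches straight to the optimum.
x_gd = 0.0
for _ in range(100):
    x_gd -= 0.1 * 2 * (x_gd - 3)  # step along -f'(x)

# Feedback-driven evolution: random tweaks, kept only if "users" like them.
random.seed(0)
x_rs = 0.0
for _ in range(100):
    candidate = x_rs + random.gauss(0, 0.5)
    if f(candidate) < f(x_rs):  # blind accept-if-better
        x_rs = candidate

print(x_gd, x_rs)  # gradient descent lands essentially on 3.0
```

With the same iteration budget, the gradient method converges deterministically, while the mutate-and-test loop merely drifts toward the optimum at the mercy of its random proposals, which is the point of the analogy.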

There’s also a lot of the whole “the road to Hell is paved with good intentions” in Agile. Earnest tech managers may really want to prioritize refactoring workflows, redesigns, optimizations, and so on. But all it takes is some slippage, some important business deadline for Q2 that has all the sales people in a nervous mood, and suddenly, no matter how much you had been promising to take that big refactoring item off the backlog and turn it into legitimate user stories for the upcoming sprint, you’ll have to break the bad news to everyone again: you’ve got to hack even more new features on top of the bad infrastructure. This is not necessarily Agile’s fault, but Agile also doesn’t set up any kind of framework that actively makes this entropy growth less likely. Part of the reason is that doing so — creating a brand of QA that explicitly makes it hard for short-term business concerns to override refactoring concerns — is antithetical to the Agile metrics (which is not good for the middle managers who require such stats for their own advancement) and to the time frame of Agile workflows.

Lastly, and this one is just my own personal gripe, Agile is infantile. Agile as a system oppresses me: it says the system doesn’t care if I am clever, or if I have thought of a great new way to implement something, or a clever hack I want to try out, or a new gadget I can code up and share with my team. It discourages tinkering and discourages creativity in a big way, especially big-picture creativity about the global optimization of the project. Agile infantilizes you because it says: don’t be a whole person. Don’t bring all of your skills to bear on your work; bring only the minimal set needed to solve the specific user stories shoveled onto your plate. In fact, leave every other part of your professional self at home, so as to avoid burning out, feeling bored, feeling shunted from progress, or otherwise becoming unhappy with the cog-like way you are expected to churn through discretized, measured progress. This might be necessary if you manage 10,000 programmers, most of whom suck, but it also means you will never, ever be able to get those few really good programmers who can make all the difference.

In response to the inevitable contortions made to defend whatever “pure Agile” might mean apart from Scrum and things like Scrum, I feel this captures my sentiment: “whenever a majority of users tend to misuse a tool, beyond a certain point it becomes the tool’s fault.”

If Agile (or any given framework) can be easily subverted by politics, then regardless of how good its theoretical intentions are, the framework is a failure. Nothing in the basic Agile principles addresses the need to explicitly encode a mechanism that makes it hard for short-term business needs or political concerns to supersede legitimate engineering concerns; in fact, the Agile tenet of “Responding to change over Following a plan” can in many cases be read as an endorsement of exactly the opposite. Given that, it’s not particularly surprising or even interesting to point out what a failure Agile is, and it’s neither disingenuous nor extreme to flatly say that Agile adds nothing to our communal wheelhouse that common sense had not already put there. Yet it subtracts a lot, by way of all of the previous criticisms.

To restate it: there is nothing disingenuous about flatly saying that Agile fails, that Agile adds nothing worthwhile, that Agile is very easy to subvert, and that as a result Agile is not worth defending (certainly not in contrast to the likewise idiotic doctrine of Waterfall), and I don’t feel the least bit one-sided or extreme in saying so. I don’t give Agile any kudos. It didn’t fix anything, and its creation does not deserve credit or praise. Young developers should boycott jobs that will debase them with mandatory adherence to Agile or any other fixed, reactive, prescriptive framework that short-circuits their ability to learn by tinkering on the job. The opportunity cost of accepting such jobs is often too high. Society at large should be pissed off about all the aggregate value that has been frittered away due to Agile. That some value has nonetheless been created within Agile implementations (indeed, in spite of Agile) is no credit to Agile.

To me, the many responses that faithfully recite the so-called Agile principles as a defense of the desirability of Agile always read like a No True Scotsman fallacy. If you want to define the term “Agile” to mean “only good things and never bad things,” then “Agile” is vacuously always good and never bad.

All of the principles that Agile tries to appropriate for itself are desirable. I don’t know of any formal software development process, whether Waterfall or Agile or any other, that would explicitly deny any of those points. Where in Waterfall does it say to deliver late, crappy software that doesn’t meet the client’s new specs? Yet some development processes are better than others at meeting these goals.

Agile doesn’t own the copyright on good-quality software, whatever that is. The ideas of delighting your customer, planning when you need to plan, requiring specs when you need to require specs, using common sense, collaborating with business domain experts, … these things have been around, in one form or another, since the dawn of commercial software itself. They weren’t just dreamt up in 2001.

If someone wishes to define “Agile” as == all good software-quality standards, more power to them. But that makes it a useless definition, and it reminds me of the Yudkowsky quote, “If you are equally good at explaining any outcome, you have zero knowledge.”

I choose to define Agile by whatever common business practices emerge in its presence, and I am confident that, defined this way, Agile drops the mask and reveals the ugly truth: as a system it merely pays lip service to quality while remaining all too easy to politically subvert.

[1] Here's a quick-and-dirty estimate of the cost of just the daily standup meetings prescribed in most implementations of Agile. Let's suppose, conservatively, that you've got about 5 engineers on a single team (in practice this is all over the map, even ranging upwards of 30 on a team I was a part of, though we never officially tried to have all 30 in the same standup). Junior-to-medium-level developer salaries in most major metro areas fall between $85k and $120k; call the average $100,000. And let's say you've got two more senior developers, with average salaries in the $150,000 range. Again, these are conservative estimates.

If we assume roughly 250 working days per year and 8 hours of useful productivity per day (again, conservative), the junior devs are paid $12.50 for their time in a 15-minute meeting, and the senior devs $18.75. This amounts to $3,125.00 per year per junior developer and $4,687.50 per year per senior developer. With 5 junior and 2 senior developers on a team, the team's total annual cost for the daily standup alone is $25,000.00. That's 25 grand per Scrum team per year for one daily meeting. And we're not even factoring in the costs of insurance, commuter benefits, etc., which make the actual annual cost considerably higher.

If we assume this ratio of 5 junior devs to 2 senior devs holds roughly constant throughout an organization, then on average every 7 developers cost you at least $25k per year just for Scrum standups. If you employ 70 developers, that becomes $250k. If you employ 700 developers, it's $2.5 million. If you employ 10,000 developers, you're talking about more than $35 million. Just. for. standups. (And that's not counting insurance or other benefit costs, nor whatever productivity drag comes from having the meetings at all.)
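The footnote's arithmetic can be sketched as a short script, using the same assumed figures (5 junior devs at $100k, 2 senior devs at $150k, 250 working days, 8 productive hours per day, one 15-minute standup daily):

```python
# Estimated annual cost of daily 15-minute standups, per the
# footnote's conservative assumptions.
WORKING_DAYS = 250
HOURS_PER_DAY = 8
STANDUP_HOURS = 0.25  # 15 minutes

def standup_cost(salary):
    """Annual cost of one developer's standup attendance."""
    hourly = salary / (WORKING_DAYS * HOURS_PER_DAY)
    return hourly * STANDUP_HOURS * WORKING_DAYS

team_cost = 5 * standup_cost(100_000) + 2 * standup_cost(150_000)
print(f"per junior dev:    ${standup_cost(100_000):,.2f}")  # $3,125.00
print(f"per senior dev:    ${standup_cost(150_000):,.2f}")  # $4,687.50
print(f"per 7-person team: ${team_cost:,.2f}")              # $25,000.00
print(f"per 10,000 devs:   ${team_cost * 10_000 / 7:,.0f}")
```

Scaling the per-team figure linearly with headcount reproduces the $250k, $2.5 million, and $35+ million numbers above.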

How to Write a Technology Job Ad

We are seeking an idealistic, quality-driven technologist to think creatively about unstructured, high-level problems and the trade-offs needed for their pragmatic solutions. If you have the maturity level to avoid considering yourself a ninja or guru of anything, read on!

You'll love working in our private, quiet offices from 8am to 4pm Monday through Friday, where we don't stock any beers or gourmet coffee!

You (an adult human mammal) are expected to figure out your own meals, but don't worry: we pay you enough money to eat and also to afford a structure to protect you and your family from the weather while you eat.

We're looking for someone who gets things done, such as the things in the job description that we mutually agree on and not other, surprise things.

Do you shop at Target instead of Wal-Mart? Well then you probably have more than enough higher education for us.

We don't want to know which buzzwords you selected for your resume. We just want to talk to you about your skills and experience and see if we can collaborate like amicable adult colleagues.

Benefits: actual insurance, actual vacation time, actual help for retirement.

Because it's 2015, please submit your resume online and don't fill out 6 pages of info that we could just read from your resume... we'll do that.


Saturday, January 3, 2015

A (an?) Haiku

Inspired to totally rip off a phrase from this:

I am a quilt of // loosely cobbled-together // coping strategies.

Tuesday, October 21, 2014

Bad Job Avoidance

I've been unemployed for just over four months. I've been searching for jobs every day: I've conducted over 30 phone interviews (not counting the dozens of third-party recruiters I've spoken with), filled out well over 100 applications, participated in more than 10 later-stage on-site interviews, and rejected 5 formal job offers. It's definitely stressful and demoralizing to go through such a difficult search.

I spoke recently with a third party recruiter who did not like my preferences: I expect to have a job with satisfactory compensation and benefits, satisfactory vacation time, a healthy work/life balance, freedom to use computational tools that are best for the job, the opportunity to gain quality work experience, the opportunity to learn new things, and the ability to work in a suitably quiet and private space to facilitate software development productivity.

I believe these are minimally acceptable preferences: a job that fails to meet even a single one of these items is not just less than ideal; it is unhealthy and should be flat out rejected.

The recruiter asked, with smug arrogance, "So, how's the job search going?"

"Great," I replied. "Here I am enjoying the fact that I'm not stuck in a horrible job."