I am an identity and access management (IAM) specialist, an area of technology which has good reasons for being a niche specialisation, as do many other technology areas. But you can imagine my surprise at receiving this unsolicited email from a recruiter:

“My client, a leading British Bank, require a Role Based Access Control Manager to join their offices in Edinburgh on a 6 Month Contract basis with the possibility of extension.
The RBACM will help design the Role Based Access for technology partners throughout my client’s Separation and Business Proving stages, all the way through to transitioning the service to BAU functions. It’s important to note that this role is a Technical ‘Hands-On’ role which also requires Project Management capabilities and the ability to plan.”

Let’s begin with the meaningless drivel, the kind many supposed HR experts suggest we fill our CVs with, starting with ‘technical hands-on’ and ‘management’. What usually happens when you follow these self-professed career gurus and add such an inane line to your CV is that a firm looking to fill a technical role will decide you’re not technical enough: who has ever met a project manager with the time to get ‘hands-on’ in a multi-million dollar project? And requesting that a candidate ‘has the ability to plan’ is just insulting.

Now on to the central issue with this role: role-based access control (RBAC) is one of many skills an experienced IAM consultant acquires in the course of their career. Every organisation is different; there is no RBAC school or certification, just as there are no IAM schools. Working in IAM requires that you continuously keep several key questions in mind on every project: ‘who are you?’, ‘what is your relationship to the organisation?’ and ‘what are you allowed to access?’ It also requires ensuring that the answers to these questions, defined by business rules, can be continuously monitored, with irregularities rapidly identified and remediated.
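Those questions map onto the core RBAC idea: users are never granted permissions directly, only through roles, and business rules can then be checked against each user’s effective permissions. A minimal sketch in Python (every user, role, permission and rule name here is a hypothetical illustration, not any particular product’s API):

```python
# Minimal RBAC model: access flows user -> roles -> permissions.
role_permissions = {
    "teller":  {"view_account", "post_transaction"},
    "auditor": {"view_account", "view_audit_log"},
}

user_roles = {
    "alice": {"teller"},
    "bob":   {"auditor", "teller"},  # accumulated roles: a candidate for review
}

def permissions_for(user):
    """Effective permissions: the union over every role the user holds."""
    roles = user_roles.get(user, set())
    return set().union(*(role_permissions.get(r, set()) for r in roles))

def is_allowed(user, permission):
    return permission in permissions_for(user)

def separation_violations(banned_pair):
    """Continuous monitoring of a business rule: flag any user whose
    effective permissions combine two duties that must stay separated."""
    a, b = banned_pair
    return [u for u in user_roles
            if a in permissions_for(u) and b in permissions_for(u)]
```

Here `is_allowed("alice", "post_transaction")` holds because of her teller role, while `separation_violations(("post_transaction", "view_audit_log"))` flags "bob", who both posts transactions and audits them: exactly the kind of irregularity the business rules should surface for remediation.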

No two organisations are the same; ‘best practices’ are a myth, an example of business bureaucratic discourse. Suppose a candidate did have a professional history that, thanks to a series of contingent projects they landed, focused on RBAC. They get the job, and find themselves (hopefully) in a team of IAM ultra-specialists, four or five people each dedicated to a very narrow subset of typical IAM problems. If they notice a glaring non-RBAC security risk, it’s not in their job description to be concerned: expert X over there deals with user provisioning, so the RBAC guy should just sit this one out.

Now suppose that at their next interview for an IAM project, they are rejected because the interviewer felt their focus on RBAC meant they would not be as capable at user provisioning, or federation, or designing LDAP directories.

This is a typical example of unnecessary specialisation. The same happens with vendor product expertise, or programming language knowledge. And then marketing posters go up in the cubicles, courtesy of some HR genius, encouraging employees to take initiative. I suspect these posters are put up with a degree of irony; if not, I seriously worry about the level of brain-washed institutionalisation of these well-meaning people.

Computers, and the information era, were born largely thanks to school drop-outs who learned by doing (not Steve Jobs! He was just a ruthless businessman). This is the hacker mentality, and history has demonstrated that this approach works. An interview should assess a candidate’s capacity to rapidly learn new technology, not the number of useless multiple-choice IT certificate tests they have completed, nor what they happened to be doing in their previous job.

American corporations invented modern-day bureaucracy, and it pervades tertiary education and the workplace. Business bureaucracy, the practice of treading water making PowerPoint slides to pass the time, is the biggest enemy of innovation; stultification and unnecessary specialisation are its results. Once we deny the other the possibility of possessing the intelligence to learn new skills and adapt, this turn of events is inevitable.

Words and language, Rancière observes, have already been transmitted to the student without explication. Human children learn language at too young an age for explicators to instruct them – they learn in a non-systematic fashion, by hearing, retaining, repeating, making mistakes, and being corrected. However, we are given to understand that only when the process of formal education begins does the child who learned to speak through the work of her own intelligence and through a mode of instruction distinct from explication, only then does her proper instruction begin. This instruction proceeds as if the student were all of a sudden unable to learn with the same intelligence that she used in the past. -Dr Grace Hellyer

Jacques Rancière, in his book “The Ignorant Schoolmaster”, studied alternative pedagogical approaches, and came across the amazing story of a lecturer in the early 1800s. Joseph Jacotot had moved from France to Belgium, where he found that none of his students could speak French. His approach was to point them at a bilingual edition of Fénelon’s Télémaque and tell them to just ‘figure it out’. The results were amazing, and they led Jacotot, and through him Rancière, to regard the teacher/manager/professor as having no special access to knowledge. In fact, explication, a teacher’s explanation in simple language of some complex concept to students, is patronising and presupposes an innate superiority of the teacher over the student. Per Rancière, the role of the teacher is merely to stimulate the will of the student to learn and gain understanding on their own terms. The traditional method of explication, so rife in business and education, can only lead, Rancière warns, to stultification.

How does this tie into specialisation and the information era? Firstly, when candidates are assessed only on their existing skills, it is under the completely misguided assumption that the candidate is incapable of figuring out new things. Secondly, given the wealth of information online, not to mention the explosion of knowledge in an ever-changing world, being able to find and quickly apprehend new knowledge should be considered one of the most important skills of the 21st century. The hackers know this, and have gone on to challenge the entire IT landscape, once dominated by embarrassingly bad Microsoft on-premise applications, with web solutions beyond the very conceivability of enterprise IT.

Posters vaguely mentioning innovation, with stock images of Soviet-style propaganda, posted next to the water cooler, or people genuinely encouraged to learn by doing? So far, only lip service is paid to these concepts (as anyone involved in an enterprise Agile project is aware). Let’s remember the words of Heinlein:

A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. -Robert A. Heinlein