Architectural insights

A second postcard from Microsoft's Architect Insight conference


OK, after my first postcard from Architect Insight I promised something a bit longer and more low-level from Microsoft's Welsh conference (you can find the programme here).

I chose, largely, to follow the lifecycle track – and, a note to Microsoft, seven concurrent tracks is too many. Almost by definition architects have broad interests and I imagine most attendees found several sessions interesting in any particular slot.

Microsoft's lifecycle track centres on its VSTS (Visual Studio Team System) tools, of course, although "software factories" also feature – see later. VSTS seems to be built around a genuine rethinking of the development process in order to deliver more transparency (through "friction-free metrics" – the tools deliver metrics without getting in the way of the developers), team collaboration, and better QA.

There was also mention of "blind spot metrics", which cross-reference measurements to highlight overlooked issues (a module with impressive test results and coverage figures, say, but which also suffers from massive code churn – possibly you're just measuring the implications of a skimpy and inadequate test pack).
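To make that "blind spot" idea concrete, here's a minimal sketch of the sort of cross-referencing involved – comparing coverage figures with code churn to flag modules whose healthy-looking test numbers may be misleading. The thresholds, field names and sample data are all invented for illustration; this is not how VSTS actually calculates anything.

    # Illustrative sketch only: flag modules whose healthy coverage figures may
    # hide a skimpy test pack, by cross-referencing them with code churn.
    # Thresholds, field names and sample data are invented, not taken from VSTS.

    MIN_COVERAGE = 0.80      # coverage at or above this looks "healthy" on its own
    MAX_CHURN_LINES = 500    # churn above this, despite good coverage, is a blind spot

    modules = [
        {"name": "billing",   "coverage": 0.92, "churn_lines": 1200},
        {"name": "reporting", "coverage": 0.85, "churn_lines": 80},
        {"name": "widgets",   "coverage": 0.55, "churn_lines": 300},
    ]

    def blind_spots(modules):
        """Return modules that look fine on coverage alone but churn heavily."""
        return [
            m for m in modules
            if m["coverage"] >= MIN_COVERAGE and m["churn_lines"] > MAX_CHURN_LINES
        ]

    for m in blind_spots(modules):
        print(f"{m['name']}: {m['coverage']:.0%} coverage but "
              f"{m['churn_lines']} lines churned - is the test pack keeping up?")

Run against the sample data, only the "billing" module is flagged: its coverage looks fine, but the heavy churn suggests the tests may not be keeping pace with the code.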

VSTS is certainly a radical upgrade for the Microsoft development toolset, although I'm not sure that it could entice me away from other, more established, vendors (such as Telelogic or Borland) if I already had an effectively automated lifecycle development process. In fact, perhaps there are serious migration issues for anybody not already developing on the Microsoft platform: although VSTS emphasises its extensibility and customisability, it is firmly based in Microsoft technology – Team Foundation Server – for its version and configuration management.

I'm possibly being a tad unfair here, but buying configuration management from the firm that sold VSS (Visual SourceSafe [that link isn't necessarily Reg-endorsed, BTW – Ed]) as an SCM solution rather worries me.

One of the best (most common sense) presentations on VSTS came from Matthew Phillips, lead architect at Avanade UK (see what Avanade is doing with VSTS here).

His enthusiasm for the product was palpable but he also discussed the dangers of customising a powerful tool without due process and maturity (it's not the tool's fault, but you could make it as over-prescriptive and restricting as some early CASE tools, if you wanted to). Phillips also gave an endorsement to Borland's CaliberRM, which integrates with VSTS to add Requirements Management (an essential part of the lifecycle, in my opinion, that VSTS doesn't really deal with out of the box).

Oh, and the complexity of VSTS licensing still seems to be a real issue (see Tim Anderson's comments here). Attendees didn't seem impressed by the book that is supposed to explain it all. Some of them also seemed to be getting conflicting information from their Microsoft contacts on exactly what they could do with their reports from VSTS without buying extra licences, which isn't a good sign.

Ivar Jacobson was brought along to add independent gravitas to Microsoft's lifecycle approach with the Essential Unified Process (Ess UP). I must say I was a little disappointed by this. Yes, the Unified Process has sort-of been hijacked by IBM (although, to my mind, the Eclipse EPF process framework project mitigates this issue) and is a bit daunting; but making a "unified" process work for development generally is probably unavoidably hard when you get down to the practical nitty-gritty.

Ess UP's big idea of customising process to particular developments is good (but it has been tried before – see Trireme's Catalysis for example – and only a few of the more mature developers can be bothered). But Jacobson's approach of putting its components on glossy cards with explanatory documentation and playing them on a board in a development "game" seems more cosmetic than useful to me.

Yes, it is based on previous approaches that have worked (see Scott Ambler here), and it makes process more accessible; but people who need it this accessible probably won't cope with it anyway. But perhaps I'm being unfair again. According to Jacobson: "The card metaphor makes the process itself lightweight, agile and easy to use." You can check this out for yourselves here.

The presentations on Software Factories by Jack Greenfield and others were interesting, but some of the admitted limitations of their current implementation are rather serious:

  • Customisation isn't easy.
  • There's no logical overview of the application being developed.
  • It's hard to tell what to do and when to do it.
  • There are too many wizards.
  • It's hard to revisit past decisions.
  • It isn't optimised for team development.

I'm also a little worried that the Software Factory approach will, in practice, encourage developers to produce code faster without examining the abstractions behind the system they're automating – as I recall from my misspent youth in structured design, business users don't always fully understand their own business processes, and trying to "automate the current physical" is fraught with danger.

Nevertheless, Software Factories seem to be a significantly innovative approach to development and their next iteration (which I can't talk about just now, because all attendees were put under NDA) may well address many of the issues. And, of course, access to Microsoft future thinking, even under NDA, must be a plus for conference attendees.

The final focus group revisited the question of what an IT "architect" actually is. The consensus seemed to be that there were different kinds of IT architect but that they were generally experienced technologists with an understanding of the business – and the title was often more a matter of recognition of someone in the IT group already doing the job rather than the result of any specific training.

That's a bit worrying to me, as I suspect that an experienced business person would find it easier to pick up the principles of IT than vice versa – and they might have a broader view of architecture, going rather beyond the limits of today's automated systems.

As I said in my last postcard, I tend to see an IT architect, in practice, as just a specially well paid (and experienced) systems analyst (thinking of "systems" in the "Systems Theory" sense, perhaps).

Certainly, ego seems to be a qualification – I'm sure some of the attendees saw their architect title as a reward for being Truly Wonderful. And, there was an interesting comment dropped by Martin Fowler at the QCon conference this week: "We sometimes have trouble with the term architect at Thoughtworks, mainly because of some of the people we meet calling themselves architects."

But I'm beginning to think that there really is a special architect role, requiring skills beyond those of a top analyst with both technical and business experience. For a start, an architect should be responsible for the practical success of the business deployment – one architect told me that he wrote and signed contracts (analysts generally don't do that).

And, secondly, architects are (or soon will be, the way regulations are moving) accountable for the architectural quality of their projects. If the new widget system gets as far as going live and proves "unfit for purpose" perhaps it's the architect's fault (just as it might be if a new school didn't have a toilet block) and the architect responsible may get "disbarred" – or sued.

That last one didn't seem to go down well with some people in the focus group – I was reminded of a seminar for aspiring non-exec directors at the IoD when the assembled non-execs were told that they might be personally (financially) responsible for bad governance in the firms they were directors of – and might actually have to earn their fees. However, there is liability insurance – get it now while you can still afford it <grin>.

Overall, then, an interesting conference and a lot to take in. Software factories, in particular, might be a real breakthrough – although the current Microsoft implementation isn't there yet and I can't talk about its next iteration because of that NDA. Just as well really, as vapourware is always impressively free of interesting issues to talk about – I've commissioned Tim Anderson (also attending this conference) to look into what is actually available in this area.

However, I was left with the impression that Microsoft is growing up – and finding the harsh realities of enterprise computing a little daunting. Its old approach, that everything difficult can be made simple by merely adding a Wizard (or two, or a dozen) may be running out of steam. Or, at least, becoming "wizard-heavy".

Microsoft certainly has the brains, talent and enthusiasm for the job, but enterprise scale delivery of automated systems that align well with changing business requirements and can be deployed globally for large communities might just be a very hard problem. It may not be possible to make it as easy as building a little utility in VB and Microsoft may have to come to terms with that. ®

