Time for UML tools to evolve

Refactoring gap


There's an audible rhythm to development tools. Modern integrated development environments (IDEs) go like this: tap, tap, bam! Diagramming tools go like this: point, click, pause, point, click, pause...

Now that the two are starting to overlap through things like UML plug-ins for IDEs and round-trip engineering as standard, the overall differences are becoming more pronounced than ever. IDEs stand out as being fast and lovely to use, whereas - with a few possible exceptions - modeling tools are slow and cumbersome. They force you to use the mouse - an eminently serial input device.

While they're being left behind in terms of productivity, though, modeling tools are making headway with advanced features such as round-trip engineering - the ability to turn a UML model into code, and to reverse-engineer code back into a UML model. Right now the model is as close to the code as it has ever been, meaning there's little excuse for the two to fall out of sync.
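As a rough sketch of the round trip (the diagram notation in the comments is informal, and the generated skeleton is hypothetical rather than any particular tool's output), forward engineering turns a class-diagram element into source, and reverse engineering parses the source back into the same element:

    // Model element (informal notation, not any specific tool's format):
    //
    //   +---------------------------+
    //   | Account                   |
    //   +---------------------------+
    //   | - balance : double        |
    //   +---------------------------+
    //   | + deposit(amount: double) |
    //   | + getBalance() : double   |
    //   +---------------------------+

    // Hypothetical skeleton a round-trip tool might generate from the
    // element above; reverse engineering would parse this source back
    // into the same diagram element, keeping model and code in step.
    public class Account {
        private double balance;

        public void deposit(double amount) {
            balance += amount; // bodies are typically preserved on regeneration
        }

        public double getBalance() {
            return balance;
        }
    }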

There's still a huge gap when it comes to refactoring support, though. If the model and the code are really tightly integrated, then you'd expect to be able to perform operations on the code via the model elements.

How cool would that be, to be able to right-click on a class in a UML diagram and choose "extract interface", then have a little pop-up window that shows the before and after code segments? That would be true integration. It's certainly needed if the trend for integrating a UML tool into an IDE is to make any sense at all.
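As a sketch of what that preview might show (Logger and FileLogger are invented names, not the output of any real tool), extracting an interface from a class boils down to this before-and-after:

    // Before: clients can only depend on the concrete class.
    public class FileLogger {
        public void log(String message) {
            System.out.println(message); // stand-in for writing to a file
        }
    }

    // After "extract interface": a new interface captures the public
    // contract and the class is rewired to implement it, so clients
    // can depend on Logger rather than FileLogger.

    // Logger.java
    public interface Logger {
        void log(String message);
    }

    // FileLogger.java
    public class FileLogger implements Logger {
        @Override
        public void log(String message) {
            System.out.println(message); // stand-in for writing to a file
        }
    }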

Detailed design diagrams such as class and sequence diagrams are very nearly at the code level, so refactoring operations on these diagrams would be virtually the same as code refactorings. In fact, I can't help wondering why the UML support in NetBeans, for example, doesn't include the same right-click menu as its Java editor when you right-click on a class (including the refactor sub-menu, of course).

But why stop at code-level refactoring? A UML tool operates at so many more levels than just code. There are business processes, requirements analysis, preliminary design, deployment and tests to consider.

UML is also concerned with both static and dynamic models. A "refactor" sub-menu on a modeling element at these levels would more suitably be called "prefactor".

What sort of prefactorings would exist under this sub-menu? To answer that, we'd need to look at the nature of refactoring itself.

Refactoring is all about improving the design of code without modifying its semantics. In other words, the result of a single refactoring is better-designed code that does essentially the same thing. There's also an implied caveat: if we view Object Oriented Analysis and Design (OOAD) as a "stack", with analysis models at the top and code plus unit tests at the bottom, refactoring operates only at the code level.
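A classic code-level example is "extract method": the result is identical for every input, but the summing logic gains a name and can be reused or tested on its own (PriceCalculator and the flat 20 per cent tax are invented for illustration):

    // Before: one method mixes summing with the tax calculation.
    class PriceCalculator {
        double totalWithTax(double[] prices) {
            double total = 0;
            for (double p : prices) {
                total += p;
            }
            return total * 1.2; // 20 per cent tax applied inline
        }
    }

    // After "extract method": same observable behaviour, better design.
    class PriceCalculator {
        double totalWithTax(double[] prices) {
            return sum(prices) * 1.2; // 20 per cent tax
        }

        private double sum(double[] prices) {
            double total = 0;
            for (double p : prices) {
                total += p;
            }
            return total;
        }
    }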

It doesn't affect anything "above" it, such as requirements or the preliminary or logical design. Prefactoring should likewise operate at the level it's performed on. So prefactoring a use case - extracting a new use case with an "extends" relationship to the original, for example - should affect only the use case, nothing at higher "levels". It seems reasonable, though, that a prefactoring would have a knock-on effect on the levels beneath it: change the architecture and you'd expect the design to change too.

This would be round-trip engineering on steroids: probably beyond what current tools can do, but it's the direction they should be heading in.

We already have round-trip engineering to keep the model and the code in sync. I believe that native, prominent prefactoring support (with plenty of keyboard shortcuts) is the final missing link needed to bring UML fully into the agile world.®

Matt Stephens has co-authored Use Case Driven Object Modeling with UML: Theory and Practice, which illustrates how to get from use cases to code using a core subset of UML, and Extreme Programming Refactored: The Case Against XP.

