Fascinating keynote remarks on the future beyond Fannie Mae and Freddie Mac by Edward J. DeMarco of the Milken Institute, speaking to the American Enterprise Institute. DeMarco says that both enterprises are technically ready to exit government control. In fact, housing finance has had a substantial operational makeover, with industry data standards readying the sector for a bright future:
“For example, much progress has been made in developing data standards, both with regard to data definitions and developing industry-standard technology for reporting data. This work is important because it removes a key barrier to entry inherent in the Fannie/Freddie model, namely, the proprietary data standards and systems each used. Accurate, standardized data is also a prerequisite for improved disclosures to investors, which in turn enhances investors’ ability to price and manage mortgage credit risk.”
DeMarco sees standards as “building blocks for a stronger, more resilient and more competitive secondary market in a post Fannie and Freddie world. These changes have taken some time, and they have been costly, but the returns from these investments should be large and long-term.” The investments have been well made. Now we need to start reaping the rewards. Milken Institute
Why is the airline industry so late to the standards party? Here's a clue:
“The absence of a common, up-to-date data transmission standard has meant consumers must check multiple sources to identify competing fare options and information about the comparative cost of additional ancillary services, such as extra leg room, Wi-Fi and advance boarding, that many of today’s air travelers view as critical to a satisfying trip, DOT noted.”
In short, fear of competition. Many industries have tried to hold out against consumer comparison, worried that ease of substitution will erode their margins. Well, of course it will. But this is reality.
Trying to keep customers by denying them information or obstructing their attempts to compare offers just doesn't work. They'll move to intermediaries who can regularize the information for them. These aggregators then become the trusted brands and wield immense power. It's better to be on board with an industry standard, helping to shape it and getting early experience with implementing it, than to hold out. Travel Agent
“When looking for direction and advice for your metadata strategy, follow the leaders. In the world of metadata, quite a few industries survive and thrive based only upon the automated exchange of information. Information may actually be their product/service offering, but regardless, the similarity across successful industries is that the exchange is and has been a necessity. Without timely and accurate sharing of business information across organizations - including regulatory organizations - there is no functioning business.”
That's insurance in a nutshell. Increasingly, it's every other industry too. Bloomberg
Starting May 2015, the Department of Health and Human Services (HHS) will host a two-year pilot of the DATA Act. The project will test how data standardization works in a complex federal ecosystem.
HHS has already identified the key areas where different agencies use different data definitions for the same concepts. They call these the Fab Five: data to do with organizations, government programs, places, time periods, and “the concept of amount”.
The department's Amy Haseltine says: “It doesn’t make any of it wrong - it just makes it all different.” This is a diplomatic and positive attitude to evolved complexity which sets a good tone for the work ahead. But I can't help thinking that “different” is the new wrong. What's meaningful and usable at the local level is often dysfunctional at the collaborative level. And the collaborative level is where all human activity is headed.
Notice how it's pretty easy to outline the main areas of data difference. I guess the good people at HHS and their contacts have been struggling for years with the problems caused by the Fab Five. This is the 80/20 rule at its most effective. If they can transition to a Fixed Five, they'll make major gains and create space to reveal and tackle the next tier of unnecessary complexity. FedScoop
Good quotes about health and fitness data from Derek Newell, CEO of digital health care platform Jiff, in Forbes magazine: “Google Health [shuttered in 2012] never took off because consumers actually don’t want to aggregate their data […] They haven’t wanted to. What they want is information. They want meaning, rewards and a feedback loop.”
Now Google is set to launch an open API data aggregation service called Google Fit. Newell continues: “There’s lots of little players, and no data standards. [Apple, Samsung, Google] are going to step in the middle of all that and facilitate the collection of that data and let developers develop on top of it.”
He's absolutely right. Consumers don't want “data”, and they don't want “data standards”. They want information, so they can take decisions; and they want apps, so they can take action. Data is the lifeblood that supplies all applications. Data standards provide the rules for moving, sharing, and aggregating data. Customers don't need to know about any of this stuff.
But businesses do. If you're in the health business, or insurance, fitness data could be very valuable to you. You need to care about the data standards. You want them at least to map logically to the concepts you use in the business. Ideally, you want them to map closely to the data standards you already use.
For this reason, businesses have an interest in the success of open data standards for health and fitness data. This needs to be distinguished from the competitive forces working in the smartphone and lifestyle markets. Decision makers will follow the progress of Google Fit with close attention. Health Data
The Senate has said it will release machine-readable summary and bill information from the 113th Congress and legislation from the upcoming 114th. But the House and the Senate are not using the same data standards. This isn't an academic point. Hollister points out that if both House and Senate adopted the same standard, “then legislative drafters could use software tools that would automatically show the impact of what they're doing on the underlying law – automatically redlining proposed legislation into the U.S. Code.”
That's joined-up thinking. GCN
Some thought-provoking findings from recent Forrester research suggest that by prioritizing speed of data delivery over trustworthiness, organizations are creating new bottlenecks. Instead of improving decision making, they may be degrading it.
Here are the key stats: “Overall, 42% [of respondents] spend more than 40% of their time vetting and validating the data. For executives, 70% spend more than 40% of their time vetting and validating data.”
So it's good that people are checking the utility of data instead of acting on it without question. But it would be far better if users were given pre-validated data – especially since the same data is presumably being vetted and validated multiple times by different people.
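In code terms, pre-validation means checking each record once, at the point of ingestion, and handing downstream users only the records that passed. A rough sketch (the field names and rules here are invented for illustration, not taken from the Forrester research):

```python
def validate(record, rules):
    """Return a list of problems with one record; an empty list means valid."""
    problems = []
    for field, check in rules.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not check(record[field]):
            problems.append(f"bad value for {field}: {record[field]!r}")
    return problems

# Illustrative rule set for an incoming feed.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def ingest(records):
    """Split incoming records into vetted data and a rejects pile."""
    vetted, rejects = [], []
    for record in records:
        problems = validate(record, RULES)
        (vetted if not problems else rejects).append(record)
    return vetted, rejects
```

Validate once at the boundary, and the 40% of time everyone else spends re-checking goes back into actual decision making.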
It seems we've moved from Garbage In, Garbage Out to asking people to sort through garbage. We're wasting their time. Data governance is a critical element of organizational effectiveness. Forrester
The Google-led General Transit Feed Specification (GTFS) has produced a slew of third-party apps using open transit data. This standard is the poster child for open data – and it's encouraging people to think about how other kinds of open data can be exploited.
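Part of GTFS's appeal is how little machinery it demands: a feed is just a zip archive of plain CSV text files. As a minimal sketch (assuming a feed file already downloaded from a transit agency; `read_stops` is an illustrative name, not part of any library), reading the required `stops.txt` file looks like this:

```python
import csv
import io
import zipfile

def read_stops(feed):
    """List (stop_id, stop_name) pairs from a GTFS feed.

    A GTFS feed is a zip archive of CSV files; stops.txt is one of
    the files the specification requires every feed to include.
    """
    with zipfile.ZipFile(feed) as archive:
        with archive.open("stops.txt") as f:
            # utf-8-sig tolerates the byte-order mark some agencies emit
            reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8-sig"))
            return [(row["stop_id"], row["stop_name"]) for row in reader]
```

Because every agency publishes the same file names and column names, this one function works on feeds from hundreds of cities – that uniformity is what the ecosystem of third-party apps is built on.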
Now a coalition of parties is setting out to create a standard for building permit data. The challenges in creating the standard will be familiar to anyone who has worked on a data standard: “The tension in this process has always been how to make the data described in the standard generic enough that it accommodates different business processes but doesn’t gloss over any detail that’s important.”
Every authority structures building permits differently because, historically, there was no need to share the data: one city wouldn't need to share with another city. It's only now that citizens and the third-party software development community are connected that the data has wider currency.
This kind of data has apparently been called “civic exhaust” in the past – a new term for me but a vivid one. Organizations of all kinds are realizing that their by-product data may be useful in new, collaborative contexts. In this case, it's not that two cities will want to exchange building permit data; it's that other organizations may be able to derive value from the data – but not if they have to write new interfaces for every city's data feed.
Business data standards are vital for removing barriers to market formation. In fact, without data standards, many market opportunities can't even be visualized. You literally cannot imagine a viable industry for third-party civic data apps without common business data standards. That's how integral standards are becoming to our economy. GCN
On the proliferation of competing Internet of Things standards groups, one commentator argues: “We would be far better off if there were an umbrella organization, such as IEEE, that addressed IoT standards, rather than partisan camps. This would remove the possibility of incompatible solutions on the store shelves. More importantly, it would bring more effort to bear on solving the security issues that are not currently being addressed well by any of the groups.”
Meanwhile, elsewhere another commentator notes about IoT: “A lack of consistent standards in data or connectivity, and what appears to be a reluctance to share data, is stifling innovation.”
Players in this industry need to think carefully about the relationship between competition and collaboration. Without a certain amount of collaboration, there won't be a viable market, and therefore competition will be moot. IoT demands data sharing. It's too big a concept to be owned by one player.
Some developers resist standards because they think using standards is a short cut. Taking short cuts is unprofessional, and possibly dangerous. By creating their own formats, they feel they are taking proper responsibility for their software.
Using standards will certainly speed up development. However, this is not because standards offer short cuts. Rather, standards are highways.
Today's system development and integration isn't carried out on surface streets. It's not a neighborhood business. Everything we create or adapt has to work with other systems and other organizations. It really is a global world. And data standards are the highways of this world.
The Healthcare Information and Management Systems Society (HIMSS) has suggested four interoperability principles in a recent submission to Congress. I wonder if they could have been a little tougher in the first principle, which starts like this: “Standards development organizations should offer educational resources and support in order to implement and develop data standards.”
Or perhaps what I'm looking for is a Principle #0. There's nothing wrong with education and support – it's vital. But before you get to that point, you have to ask why there are standards organizations in the plural, what each organization's remit is, and how they work together. My zero principle would be that if there have to be multiple standards organizations then there should at least be a single coordinating body to which all those organizations federate.
Without interoperability among standards bodies, you won't get interoperability among systems.
Here's an example. About a decade ago, student mental health services came together to agree a standard for recording data. Thanks to this action, they now have data on over 100,000 students from 140 institutions. Analysis of the data shows, for example, that drug and alcohol problems are on the way down, but self-harming is on the way up. These findings can in turn influence how information and counseling services are designed and delivered.
Switch now to the business world. If you only know about the customers, partners, and staff in your own corner of the organization, how much do you really know about people? And if you can only look at your own transactions, achievements, and processes, how do you know how well you're doing? When you have islands of information, your metrics don't travel.
Switch up the focus and the same logic applies at the industry level. You can't achieve best practice unless you can meaningfully compare existing outcomes with other organizations. Also, you can't even know what collaborative opportunities you're missing unless you have systems of record which are data-compatible. Choosing not to exploit data standards is to opt for limited knowledge, reduced agility, and missed opportunities. Onward State
Creating data standards is a community effort. Standards imposed from the top down rarely work, because however well crafted they are technically, they always incorporate edge-case decisions that alienate one or more sections of the community. In some cases, the standard may have endorsed a formulation used by the majority of players in the domain, on the assumption that everyone will support it. It's not the solution that's the problem, but the assumption.
A phrase leaps out from a report about standardization in radiology. Dave Harvey, managing director at Medical Connections in the UK, is leading an effort to establish standards for metadata in the field. His group is aiming to enable interoperable systems which are “as compatible as local politics will allow”.
This is a very realistic approach. He says: “You’ll get trusts working on these projects, but when they try to do any sharing between them, they find the data models don’t work. There’s a better chance of integrating if you start with at least vaguely similar data models.”
He's right. Integration is a continuum. That's why it can be better to use a term like collaboration – “integration” sounds like it has to be all or nothing. Finding workable spots on a continuum requires negotiation. And negotiation requires flexibility. Remember, flexibility doesn't mean anarchy. It means tolerance – in the engineering sense, and in the social sense.
We work together because we're stronger together. We must build our data standards together as well. EHI
Baker shrewdly reviews the ongoing explosion in healthcare data, and warns that the volume and variety of data is only going to grow. She urges greater attention to proactive data management to avoid fragmentation and silos. She takes a broad view but the common thread is data standards: “It’s time for the entire medical and research community worldwide to come together, instead of working alone or in small groups, to address standardizing data from devices and fabrics and integrating data bases.” NUVIUN
The web has grown not just in size but in importance since HTTP/1.1 was standardized in the late 1990s. Developers have had to work around a number of the protocol's limitations, especially in speed of page delivery. So why did it take so long to bring out a new version?
My take is that, whatever the performance issues around HTTP in our massively online world, it worked to an acceptable level. Changing such a crucial and entrenched standard is not something you do lightly. Also, I notice that HTTP/2 doesn't replace existing APIs, but adds features to them. The new version of the standard retains the core functionality so existing investments are not wiped out.
It's a great sign of HTTP's strength – and its core simplicity – that the world has been able to rely on HTTP 1.1 for so long. The grace with which the standard has been advanced to the new version is a tribute to a reasoned, collaborative standards setting process. TNW
I lost the link but... Debra Walton of Thomson Reuters asks a great question: “Given that data is such a strategic asset, one of the biggest issues is understanding your infrastructure, your data standards and data governance. […] How do you have sufficient standards but create an environment that’s nimble, agile, and enables people to use data as an appropriately free flowing asset in the business and not a centralized bureaucratic process?”
My answer is that an effective business data standard is a pervasive part of the environment. Although a standard is in one sense a product, the utilization of a standard is an aspect of business processes – not a process in itself. A useful way to think of data standards is as a toolset you use to shape business processes.
Lubricating oil is a good analogy. Oil makes every machine in the factory run efficiently and extends the life of the plant. Different points in different machines need different oil types and regimes. But enacting these rules doesn't require a centralized Department of Lubrication. It's just part of the job.
Standards are about recognizing commonalities in order to remove barriers to sharing and exploiting information. When you spend time developing, promoting, or working with standards, it's not surprising when you become attuned to seeing parallels across domains. Challenges and opportunities encountered in other business areas resonate with you. Sometimes you can see how solutions in other domains address problems similar to those in your home domain. Other times, you catch neat articulations of your own concerns or conclusions, couched in interesting new ways.
Here's an example from a call for contributions to an oil and gas pipeline conference. Not my home turf. The organizers offer a number of prompts for submissions, including: “Construction Standards/Data Standards: What “should be/needs to be” captured when we put assets in the ground?”
It seems to me that working in the pipeline business has this major advantage for IT people: Everybody knows the business is about pipes, and they know pipes are laid in the ground. So issues relating to recording information about pipes (and other plant) are not far from the top of anyone's mind – no matter what part of the business they work in.
Insurance doesn't see itself as an infrastructure business. To us, a “pipeline” is an abstract concept. We'll think of sales pipelines, channels, workflows, connections, partnerships, networks. But most people in the business are naturally more oriented to products, contracts, rates, premiums and so on.
Insurance has always been about relationships, and with the growth of technology, relationships are increasingly enabled by systems. The industry has evolved into one where the data pipelines are crucial. The pipelines carry deals in various states of completion.
So perhaps we should be asking ourselves: What “should be/needs to be” captured when we put deals in the air? Okay, it's not very elegant. But it gets some of the flavor of today's – and tomorrow's – business, which is its circulatory, collaborative nature. This would help people to think about all the potential users of a piece of information, rather than the data requirements of a particular document produced at some point in the process. Pipeline
The Department of Transportation, led by CIO Richard McKinney, is showing true leadership with its commitment to creating data standards. Traffic and highways are getting smarter, and understanding traffic is increasingly important to all of us – drivers, transportation authorities, technologists, insurers, everybody.
“McKinney, speaking at the SmartAmerica conference, said the transportation industry doesn't want to find itself 10 to 15 years from now with independently developed data standards that hinder communications.” Absolutely. If you want a vision for standards, take a look at the nightmare down the road – the mess you'll be in if you don't do standards.
McKinney said he doesn't know what role the DOT should be playing in standards creation, but he's clear about getting the topic addressed seriously across the industry. This sounds good – standards movements need a convener, a connector, to get traction. Other government departments and agencies, please copy. ComputerWorld
The British Standards Institution (BSI) has published a conceptual data model to help local government create smart cities. This is a good way to bootstrap a standards process. By getting core domain concepts and relationships out into the community, interested parties can converse more meaningfully about priorities and details. BSI has worked with a bunch of local authorities and other organizations to create this starter model. Smart Cities
Say what you like about official government publications, when they go for plain English they can make some striking statements. A new policy paper called Personalised health and care 2020: a framework for action from the UK Department of Health is very clear on the value of standards: “Standardization [of processes, datasets, platforms and interfaces] enables innovation, reduces development costs, lowers barriers to user adoption, speeds up wide-scale adoption and supports an almost infinite variety of bespoke and personalised service offerings.”
Digging farther into the text, I find that the policy aims to be robust on data standards – at least, that's my interpretation of the following: “At times, the [UK's] health and care system has tried highly centralised national procurements and implementations. When they have failed, due to a lack of local engagement and lack of sensitivity to local circumstances, we have veered to the opposite extreme of ‘letting a thousand flowers bloom’. The result has been systems that don’t talk to each other, and a failure to harness comprehensively the overall benefits that come from interoperable systems. In future, we intend to take a different approach. We will be tight on standards and definitions, and clear on expectations regarding interoperability, but we will support local decision-making on systems, programmes, interfaces and applications.”
It makes me smile to see the traditional evolution of disparate systems described in terms of flowers blooming. But the image perhaps deflects from the implications of the message about future practice. Managing the tension between the implied centralization of data standards and local decision-making about technology will not be easy. The phrase “systems, programmes, interfaces and applications” sounds like it might have been hammered out by a committee, and it could be used to undermine “expectations regarding interoperability”. I wonder what “tight on standards” will mean in practical terms?
IBM's Lynn Kesterson-Townes blogs compellingly on the need for insurers to have what she calls customer-activated ecosystems. One of her great diagnostics is: “How much time does it take to onboard an ecosystem partner? Do you ever have to say 'No' to an onboarding request because you can’t support the technical requirements?”
This two-part question should be asked in every organization – and repeated at regular intervals. It's only by pressing these points that real change will come about. We've got to accelerate away from the vague idea that it would be nice to be more agile, more open to partnership, more customer-centric, more real-time – and make it happen. IBM
Where does advertising data go after tag management? Standards, of course. This is evolution: Industries sweat to cope with growing complexity until the point where they just can't avoid simplification. Standards will help advertisers and marketers understand data better, share data with each other, lower their costs, and speed up their processes.
Here's one of four things Josh Dreller says must happen in the industry: “Universal Data Standards. It’s still a bit of the wild west when it comes to data. Even if you can get your data easily out of your various customer engagement points, it’s hard to match up files from different systems. For example, one of the most common data fields in digital advertising is impressions. In some systems, it’s called impressions, but in others, it is called imps, served impressions, display impressions, viewed impressions, purchased impressions, etc. There’s literally a language problem between ad tech platforms that needs to get figured out before data can truly become as portable and flexible as it needs to be.”
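Until a universal standard exists, every integrator ends up writing a mapping layer by hand. A sketch of what that looks like, using the synonyms from Dreller's impressions example (the canonical names are my own choice for illustration, not an industry standard):

```python
# Crosswalk from each platform's local field name to one canonical name.
# In practice a table like this is maintained per ad-tech platform by
# hand - exactly the cost a universal data standard would eliminate.
CANONICAL = {
    "impressions": "impressions",
    "imps": "impressions",
    "served_impressions": "impressions",
    "display_impressions": "impressions",
    "viewed_impressions": "impressions",
    "purchased_impressions": "impressions",
}

def normalize(record):
    """Rename known synonym fields to their canonical names."""
    return {CANONICAL.get(key, key): value for key, value in record.items()}
```

Note that the mapping only papers over the naming problem. A served impression and a viewed impression are genuinely different measurements; deciding which meanings are equivalent is the part that needs an agreed standard, not more code.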
But Dreller doesn't say who's going to bring order to the wild west of advertising data. They need the advertising equivalent of ACORD. Marketing Land
All the best stories begin with “Once upon a time...” Here's a great one from the mortgage industry: “Once upon a time, documents themselves were considered the most critical element of the loan package… Now, the script has flipped, so to speak, so that data trumps docs in the eyes of investors and regulators alike.”
Once upon another time, the distinction between documents and data was only interesting to data processing nerds. Business folks' eyes would glaze over. Today, two drivers are making decision makers understand the deep value of this distinction. The first driver is regulation. The second is collaboration. As is often the case, these drivers also interact with each other.
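The distinction is easy to demonstrate in a few lines: the same loan fact stored as a sentence in a document and as a structured field (the figures and field names here are invented for illustration). Getting it out of the document takes fragile pattern matching; getting it out of the data takes a key lookup:

```python
import json
import re

# The same fact, trapped in a document...
document = "The borrower agrees to a principal amount of $250,000.00."

# ...and free, as structured data.
data = json.loads('{"principal_amount": 250000.00, "currency": "USD"}')

# From the document: a brittle regular expression that breaks the
# moment anyone rewords the sentence or reformats the number.
match = re.search(r"\$([\d,]+\.\d{2})", document)
from_document = float(match.group(1).replace(",", ""))

# From the data: a direct lookup that any system can repeat reliably.
from_data = data["principal_amount"]

assert from_document == from_data == 250000.00
```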
If you need to bring the distinction alive for any colleagues who are not getting it, I suggest you try this. Documents are for printing, reading, distributing, filing. They don't provide information flow, because the information is trapped in them. Documents are coffins for information. eLynx
A new study says that open data could generate $13 trillion over the next five years in the G20 countries. It's notable that the study mentions the release of open data by private organizations as well as public bodies, the usual providers of open data. The study also recommends that countries set targets for open data.
I wonder if, eight or ten years from now, we won't be talking so much about open data, but about closed data. Think about it. If sharing data has such great value, then people will want data to be open by default. And since we all value our privacy, we will shift our attention to making sure that certain classes of data are not shared. The emphasis will switch from provision to protection. Future Gov
SWIFT, with its data standards and network, is often viewed as the gold standard for industry data exchange. I interviewed the head of Standards at SWIFT many years ago and found that they share many of the same challenges we do.
They created the SWIFT Institute a few years ago and I am impressed. In fact, much of the work we are doing with the ACORD 20/20 effort may lead to my recommending something similar for ACORD. The "R" in ACORD represents research and driving industry innovation requires such an outreach effort.
Some may shy away from such collaboration as they think innovation is about competitive advantage. But when it comes to innovating anything involving data exchange with trading partners, suppliers and distribution channels in particular, the need for collaboration is imperative. And doing so frees up time to focus on innovations that do provide competitive advantage. Firms have more in common than differences.
The SWIFT Institute was founded in April 2012.
Here is a working document from their website; I include an excerpt that sounds a familiar refrain.
The development of regional standards enables interoperability between banks, payment processors, and corporates in different countries. The development of a common language through standards can be a difficult process that requires centralized management, broad inclusion of stakeholders, and a coherent vision for payments integration. But developing a standard is not enough. Stakeholders need to implement a common standard in a uniform manner.
The ISO 20022 standards used in SEPA for credit transfers and direct debits were developed by the EPC with input from a wide range of European stakeholders. However, there have been variations in how banks and ACHs in different EU countries have implemented the SEPA standards, which has led to problems with interoperability. Now that migration to SEPA schemes is complete, regulators and other stakeholders in Europe can begin to modify the standards to ensure uniform implementation.
The task of creating deep enough rule sets that enable uniform implementation is difficult even in a domestic environment. When bringing together different domestic schemes under a regional scheme, it may be necessary to leave room for variation in the initial implementation of a regional standard. But too much variation can lead to a fragmented payments environment, which is incompatible with the principle of regional payments integration.
Sharp focus from the legal profession on another reason why data standards are a good thing. Organizations are using the cloud more and more – although it's kind of worrying that “75% of CIOs had no idea how much cloud was being used in their companies”.
Lawyer Frank Jennings of the Cloud Industry Forum’s code of practice board says: “If there is an outage, you need to get back up and running as quickly as possible. It’s no use waving a contract around: you need a back-up plan.” And to get your data back and move it some place else, you need standards. Without standards, you can be locked out of your data, and unable to do business. Gazette
“One of One Mind’s open science principles involves adhering to widely accepted data standards. This makes sense because the standards help make the data useful. Sharing the data is not the end game. Using the data to accelerate the development of safe and effective treatments is what we’re after. Sharing data that cannot easily be interpreted or understood limits the value of the data. At the very least, it adds significantly to the cost and time needed to interpret and transform the data to make it useful. Using data standards adds to the value of the data in a measurable way, and promotes the positive effects targeted by data sharing and open science.”
I love the way blogger “shume” (modest, anonymous guy) puts the logic of standards together. “Sharing data” has become a shorthand for the benefits of data standards, but he rightly reminds us what the point of sharing is. We must always bring the discussion back to business benefit. In this case, shume is talking about data in the clinical research setting. But his comments apply to us all. Data Sharing
Blogging on end-to-end processes in the clinical sciences, Dave Iberson-Hurst at Assero makes an interesting aside: “I await the howls of ‘science cannot be made into a production line’; I don’t ask that it is. I don’t want standards to drive the science, I want the recording of the science to be well structured using standards, that is a big difference.”
This resonates with business. Some people worry that bringing analysis and design discipline to business processes will clash with “the art of business”. But that's to mistake haphazard, evolved, or needlessly complex ways of doing things for creativity, innovation, or good service. In reality, you need efficient processes so you can perform effectively.
Standards are an important enabler in end-to-end processes. Standards help to connect islands of information and to create flow. They provide the conditions for creativity, innovation, and service to flourish – sustainably.
But as Dave says: “End-to-End is not just standards, its process, tools, standards and many other things working together to meet the business need.” Standards are not a panacea in themselves – but they are a constituent of every good solution. Assero
The American College of Obstetricians and Gynecologists (ACOG) is beginning its data standards efforts by looking for conceptual consensus, asking questions like “When is a pregnancy full-term?” and “When is a labor considered spontaneous?”
From my experience with the standards process, this is the toughest part. It's the business part. There's ample scope for disagreement. It's also hard to manage, because these fundamental conceptual questions often spark other discussions.
However, these conversations also have a positive impact on the participants. They come to appreciate other people's points of view. They realize that sharing data means defining it in ways that make it meaningful and accessible to every relevant party.
Technologists are, if you'll pardon the term, midwives in this process. No one needs to mention any kind of computing system or communications network. This is all about the knowledge structures of the people involved, and the data they need to do their jobs. Maternity
One of our primary goals this year is the ACORD2015 event in Boca Raton this November. Our planning includes the formation of an Advisory Committee to help us re-shape and re-launch our flagship annual conference. I am truly excited about this new event and really appreciate the interest of our members, who came to our office today from near and far. The staff held breakout sessions on key topics, and we also recorded some videos with members of the group. We are looking to focus on innovation, and we realize that there are several audiences: some need solutions today, while others want a peek into the future. Lots of great ideas were collected.
I hear more and more about analytics today as I travel across the globe. Saporito says that most insurers fail to exploit their greatest asset, data, because their use of analytics is largely limited to stovepiped operational areas such as underwriting, claims, marketing, or risk management. Applied Insurance Analytics demonstrates how to use analytics to systematically improve operations. Even more important: it will help you drive more value everywhere by defining a focused, enterprise-wide analytics strategy and overcoming the challenges that stand in your way.
Saporito helps you assess your current analytics maturity, choose the new applications that offer the most value, and master best practices from throughout the industry and beyond. Throughout, she helps you gain more value from the data assets, technologies, and tools you've already invested in. You'll find new case studies, practical tools, and easy templates for improving the "Analytics IQ" of your entire enterprise. I know Pat... a must-read. Get the book on Amazon
On January 21, 2015 ACORD’s Architecture team conducted a webinar (recording found here) on the mappings effort involving the ACORD Reference Architecture (Framework) and the ACORD Data Standards. This webinar was a continuation of a session that was held at the ACORD Implementation Forum this past October. Shane McCullough is Chief Enterprise Architect. He can be contacted at firstname.lastname@example.org.
We are pleased to announce a record number of registrants (262) and attendees (158). With such a large number of people showing interest, it is clear that ACORD’s continued efforts in this area are in high demand. Because of this demand, we have set up a new Community (found in the Framework and Architecture Community Section on ACORD.ORG) to address member questions and to gather feedback. We also intend to host a follow-up webinar in the coming months.
These efforts showcase the importance of the ACORD Reference Architecture in the future of ACORD. More and more organizations are looking to ACORD to provide guidance, tooling and services for internal projects as well as B2B data standards and forms. This webinar will provide a springboard into our ACORD 2015 event with multiple sessions dedicated to Architecture across the entire Insurance value chain.
The spread of consumer technology is a much greater influence on people's behavior and preferences than their experience in the workplace. This is why the topic of BYOD (Bring Your Own Device) causes headaches for businesses. People want to use their own smartphones and tablets for work purposes – and why not? They don't want to break security, they just want the ease of use and flexibility that comes with using their own equipment. As our personal lives evolve in line with our use of technology, so we want to change the way we work.
Now that the good old “user” is actually the driver of IT, IT teams are working hard to reconcile the demand for flexibility with the needs of security and control. This is hard. But the bigger picture is all good. When you consider what this shift in leadership can do for our organizations, the benefits are huge.
First, and perhaps most trivially, the support burden is shrinking. People teach themselves to use apps. They find apps themselves. They configure them themselves. They update them themselves.
Second, and more profound, is that ideas about exploiting information and connectivity are starting to come from ordinary people rather than appointed managers or visionaries. “Wouldn't it be cool if...” Everyone can now be a player in the world of collaborative IT.
Third, there's the impact on strategy. The generation coming through won't see data governance, and data standards, as an esoteric, technical issue. They'll say: “My car can't talk to my house – that's crazy!” and fix it. Better still, they'll say: “Wouldn't it be crazy if my car couldn't talk to my house? Well, how come my business systems can't talk to yours?” Likewise, they won't have any time for data lock-in – every cloud will have to enable data withdrawal, so standards will be essential. There will simply be no argument for non-standard approaches.
Don DeLoach at Infobright says: “2015 will be the year when someone, whether an industry group or private company, steps up to establish the standards that will be accepted worldwide to ensure the proper technology is carried out to deliver on what the IoT promises.”
I hope so. But who? How do you get this party started?
Check out the OMG pages as well.
Okay, so I can see why it didn't make the front page. Still, it's probably more significant than anything that did make the headlines. It's a sign that China is becoming more integrated in the global economic – and political – system. By progressing steadily with international standards, China is embracing transparency while at the same time gaining greater access to capital markets.
Events like the G20 are sometimes seen as an excuse for a party. Well, this time around, a great nation signalled its commitment to joining the world community for good. That's a cause for celebration. China Daily
What a great image. It applies to businesses too.
Data standards form a key part of the promise of open data. If open data doesn't add up, we're not going to get any meaningful results. The other challenge cited in this article is citizen engagement. In short, data can only be transformed into policy via the participation of the community.
Is there also a lesson for business here? It's common knowledge that data standards are the way to make data usable across the organization – to turn data into a real business asset. But the need to tap the community as the transformative power that turns data into real-world outcomes has become obscured, I think, by a misplaced faith in the autonomy of technology. That is: Some folks hope just having lots and lots of data will improve the business. Just like that.
To get value from data, you need to ask questions. So you need to have explicit goals that in turn suggest relevant questions.
To get value from data, you need to share it. So you need to understand who can make use of the data, and who can collaborate around it.
And to get value from data, you need to act on it. So you need to interpret your analyses, contextualize them, convert them into plans, and execute.
All of these activities are participative. They benefit from technology. But technology can't do the whole job. With data standards, you're set to make smarter decisions and lead the business in an informed and adaptive manner. Data standards liberate. They help us do what only people can do. Beta Boston
The Internet of Things (IoT) has a new data standard. HyperCat is “an open catalogue specification that allows applications to discover and make sense of data automatically”. The new standard is aimed at ensuring data hubs don't become silos.
Without such standards, the IoT will surely die on the drawing board. The potential for using data streams created by devices in the environment rests entirely on standardization. Standards are implicit in the whole idea.
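The catalogue idea is easy to sketch. Here is a toy illustration in Python of what machine-readable discovery buys you: a hub publishes a catalogue of its feeds, and an application finds the ones it can use without prior knowledge of that hub. The field names here are simplified placeholders for illustration, not the actual HyperCat schema.

```python
# A hub's catalogue: a machine-readable list of the data feeds it offers.
# (Simplified placeholder structure, not the real HyperCat format.)
catalogue = {
    "description": "Building 7 sensor hub",
    "items": [
        {"href": "/feeds/temperature", "content_type": "application/json"},
        {"href": "/feeds/occupancy",   "content_type": "application/json"},
        {"href": "/feeds/cctv",        "content_type": "video/mp4"},
    ],
}

def discover(cat: dict, content_type: str) -> list:
    """Return the endpoints in a catalogue that serve the given content type."""
    return [i["href"] for i in cat["items"] if i["content_type"] == content_type]

print(discover(catalogue, "application/json"))  # ['/feeds/temperature', '/feeds/occupancy']
```

The point is that the consuming application never needs hub-specific code: one shared catalogue convention makes every compliant hub discoverable the same way.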
Will HyperCat get traction? Well, they've got a logo, and more than 40 organizations involved. Meanwhile, Google and Apple are doing their own things.
Whether the IoT evolves around community standards or de facto proprietary standards will depend initially, I think, on which type of technology component makes the commercial running. Will – for lack of a better term – end user devices be the driver of growth, or sensors?
Obviously, you've got to have both. But for me the sensors have the edge. Sensors can talk to each other, and to central systems, without needing to refer to, or be instructed by, any human peripheral.
For example, Fujitsu are growing (and selling, at a premium) lettuces using a clean-room environment, hydroponics, and sensors. Currently, they have people in contamination-type suits working the floor with tablets. But I guess that when they have enough data about growing the perfect lettuce, there won't be any need for the tablets. The sensors will be able to talk direct to the irrigation and lighting systems, coordinated by a central or shared rule base.
My guess, then, is that sensors and sensor hubs will be the faster growing part of the IoT by volume and downstream impact. Standardization at the sensor/hub side is likely to be established ahead of standardization via the end user device.
Of course, I don't know what Google and Apple are planning. Perhaps they're planning to go big into the sensor/hub side. Certainly, Google is interested in robotics and Apple are getting into control systems. It's one to watch. FAST Company
I believe people are comfortable with the idea that long-period batch practices are incompatible with an always-on world. So they know they have to be faster. It's called near-real time. But I'm less sure we've all assimilated the idea of acting in real-real time. Perhaps we don't believe it's necessary.
So here's an interesting insight from predictive analytics in health. According to Justin Lanning of Xerox: “Many are actually using retrospective claims data to predict the future while others are using more recent data sources and others still are using real-time data.” And yet:
“If one EHR sends us a piece of data, we might know that the way they date and timestamp their data is different than this other EHR, and another one and another one. Maybe it’s stamped differently based on when someone opens a window as opposed to when they click ‘save.’ When you’ve got time-driven analytics, you have to know all that stuff, too. So it’s one thing to say that if we had real smooth interoperability, that all the data would come over, but it requires that attention to these little details you might not think about.”
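The timestamp problem described in the quote can be sketched concretely. This is a minimal, hypothetical example: the source names and formats are invented for illustration, but the pattern – a per-source adapter that normalizes every feed to one canonical representation before time-driven analytics – is the general one.

```python
from datetime import datetime, timezone

# Hypothetical per-source timestamp formats: each feed stamps events
# differently (e.g. on window-open vs. on save, in different layouts),
# so each needs its own parser before the data can be compared.
SOURCE_FORMATS = {
    "ehr_a": "%Y-%m-%d %H:%M:%S",    # e.g. "2015-01-21 14:30:00"
    "ehr_b": "%m/%d/%Y %I:%M %p",    # e.g. "01/21/2015 02:30 PM"
}

def normalize_timestamp(source: str, raw: str) -> str:
    """Parse a source-specific timestamp and return canonical UTC ISO 8601."""
    fmt = SOURCE_FORMATS[source]
    dt = datetime.strptime(raw, fmt).replace(tzinfo=timezone.utc)
    return dt.isoformat()

# Two differently formatted stamps resolve to the same canonical instant.
print(normalize_timestamp("ehr_a", "2015-01-21 14:30:00"))
print(normalize_timestamp("ehr_b", "01/21/2015 02:30 PM"))
```

Note what the adapter cannot do: it can reconcile formats, but only knowledge of each source's semantics (was the stamp taken at window-open or at save?) tells you what the instant actually means. That is exactly the “attention to little details” the quote is about.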
In real-real time, seconds count. Think about it in the insurance context. In the not-very-distant future when your driverless car is involved in a collision with another driverless car, seconds will count. When the smart defenses in your office building react to an incoming storm, seconds will count. In both cases, fractions of an inch in positioning may count too.
Standards bodies must take real-real time on board and pursue its implications for business. We've always focused on details. Increasingly, those details are microscopic – but no less important. Health IT
The incredible growth in the power of IT, together with the tumbling costs of technology, has brought profound changes to our lives and businesses. The most noticeable change is the shift to “always-on” expectations. We expect processes to happen faster. Our organizations work on a more real-time basis.
A less obvious but equally important result is the massive increase in the volume, variety, and velocity of data. The world is producing great quantities of data and the amounts are about to get exponentially larger with the dawn of the Internet of Things.
So we're nearing the point where marveling at the technical achievements of IT is giving way to realization that managing and exploiting data is a real challenge – and opportunity. This struck me with new force when reading about big data and genomics. Barely 20 years ago, sequencing one genome was time-consuming, complex, and expensive. Now it's routine and cheap. So much so that “sequencing is merely data gathering for the biology”. But: “This deluge of data needs more standardised metadata – from which true insight will come.”
We've been here before and we'll be here again. As a domain moves from exploratory to production phases, prototypes are superseded by products, pioneering is replaced by processes, and disparate data is organized into coherent frameworks. For once, “paradigm shift” is an apt term.
Here's an authentic voice from the heart of the paradigm shift: “When I first heard about these standards developments, I was bored almost to tears. But now I realize that this is a very important aspect of getting the most out of genome data.” This researcher tells how, without full data standards for the domain, it's very hard to relate gene sequences to organs and body systems.
The movement for standards in genomic sciences is timely and well led. I wish everyone in the field success. I hope people in other walks of life will take note of their guiding principles – especially the advice that standards enable true insight. BioMed Central
“There will be a saturation point at some time where everything is connected up – where you won’t be able to buy a doorlock that isn’t connected. After that, it’ll be all the services created on top of these things,” says Alex Hawkinson of SmartThings. His colleague, ex-Googler Kelly Liang adds: “We’re talking to all the key players who have interest in this space – whether they’re telcos, insurance providers, tech companies.”
The smart home is usually sold as a kind of control-freak dream that leaves a lot of people cold. How many of us are organized enough to want to control our ovens from the freeway, or have the garage door opener talk to the refrigerator?
The real excitement around the smart home is in the services made possible by interconnected devices. There's home security, for example. Care of the elderly is another potentially vital service that can be layered on to the smart home. And insurance is surely a massive opportunity.
Soon, when you switch on an appliance, part of its boot sequence will be to fetch a range of insurance quotes. Soon, insurers will have a pretty good idea of what stuff you own and what condition it's in. They'll even know how active you are, based on energy use patterns in different parts of the house. And these are just off the top of my head. Forbes
The argument goes that regulatory forces will compel buy-side companies to put their data in order – break down the silos, connect up the sources. But the conclusion goes like this: “[Sapient's Duncan] Cooper believes industry standards will take some time to emerge, but firms should begin looking internally to help drive change. 'Most firms don’t even have internal data standards and this needs to be the first step. If you can get your own house in order, then external standardisation can come later,' he said.”
I disagree. If you don't have internal data standards then you're in the perfect position to adopt industry standards! You should grab hold of whatever industry standards are available, even if they are not yet complete. You should also regard regulatory requirements as heavy, heavy hints toward core entity definitions. This is the fast way to get your house in order while minimizing your overall costs of transitioning to a data-centric enterprise that's fully connected to your customers, partners, and regulators. Creating internal proprietary standards, only to have to migrate to industry standards later, is a waste of time. Go for industry standards first and ease their adoption internally. The Trade
There's an important article about IoT standards by Thomas Davenport and Sanjay Sarma in the Harvard Business Review. Did you know the term “Internet of Things” was coined in 1999? It's less surprising to learn the RFID standard – the authors' historical measure for IoT – took 15 years to develop and implement. And you've got to agree we can't take as long to develop new standards for the IoT.
Davenport and Sarma list a number of factors that worked well for developing the RFID standard, and make recommendations for additional factors needed to make IoT standardization a success. The ACORD community can be forgiven for feeling satisfied at the close match between the authors' prescription and ACORD's practices. Our standards achievements have benefited from collaboration across a diverse community, deep user involvement, highly influential players, a focus on outcomes, and device agnosticism.
We've also made great strides in two of the areas cited as “what needs to happen differently”. These are “faster process” and “bottom-up efforts”, where ACORD has made structural changes right across standards management, engagement, and education. I'd also have to say that I'd shift the authors' third recommendation from the “must do” list to the “already doing it” list, in the case of ACORD. This is the call for “more carrots than sticks”. I think ACORD members have developed a unique ecosystem in which we all encourage and help each other. We understand the benefits of collaboration – and we know it's ultimately the market that carries the big stick. HBR
Dr William Hersh explores this topic in an excellent piece about healthcare data. He uses great examples that are easy to understand while being taken directly from the real world. He makes it clear how implementing data standards right through the healthcare process has great benefits for everyone, not least patients.
Why take a stack of reports, PDFs, scanned images – and grind away at them to reconstruct the discrete data items they originally held? It's crazy. But it's happening everywhere.
It can look like a smart solution. You don't have to tackle any of the existing systems. You don't need to challenge anybody's processes. You can look like a light-touch, hands-off data manager.
However, from the viewpoint of the whole organization, this is a waste of time and resources. It's also a new source of risk. Reconstituting data involves interpretation, which can always go wrong. Also, if people use presumed-common fields in different ways, or with different value ranges, you'll never get back to correct data through a retrospective process that doesn't involve the originators.
Dr Hersh calls it unscrambling eggs. He's right. Informatics
Catherine Smola at CSIO notes that insurers are interested in using drones for claims work. The scope for using unmanned aerial vehicles (UAVs) to check damage and so on is large. At the other end of the business, the scope for insuring UAVs must be good too. Canadian Underwriter
Speaking at CES 2015 in Las Vegas (I was there), Samsung President and CEO BK Yoon said that the possibilities of the Internet of Things (IoT) are “infinite” – but also called for more collaboration and openness across industries to unlock that potential. In the meantime, all Samsung devices will be IoT-ready within five years' time, with IoT-enabled TVs leading the pack in 2017.
How do we square the calls for collaboration with the announcement of a single-vendor implementation timetable? I guess it's pragmatic leadership. Samsung is busy inventing the types of devices that will populate the IoT and it needs to get them out there.
I hope this turns out to be a wake-up call. If the electronics industry gets this wrong, it'll make VHS versus Betamax look like a squabble in a Stone Age village. And if the business world doesn't wake up soon and start to appreciate how the IoT is going to force real-time business into every area of life, then there will be a shake-up that makes the online revolution look tame in comparison. CIO
But I'll take away two things from the peripheral coverage of the paper. First, SITA says: “Access to data is the foundation for seamless travel”. Second, a helpful press release from Sky Assist reproduces survey findings which show the top five priorities for airlines and airports. The top two priorities are “system incompatibility and data integration” and “insufficient data standards”.
Reacting purely on the basis of this surface knowledge, I've got to ask: Could those top two issues be related? Just asking. SITA
This topic comes up at every meeting I attend. You know who you are! Well, Jack Uldrich pulls together some suggestive moves by Google and shows how they could add up to a major insurance play. The evidence is there and his argument makes sense. What do you think? Google seems to spread its bets right across the board. I'm sure they're interested in insurance. They're interested in everything. GoogleInsurance
A call for the UK to follow the US with a DATA Act of its own signals that America is leading the world with its embrace of open data. The principle that the people's data should be available to the people is one that we should be proud to assert in the wider world. Open Data Institute
This is a data standards parable, taken from a piece about the quantified self movement – that's people using personal performance and health data to improve their quality of life, get fitter, or manage a medical condition:
“The average person is not going to spend time creating data visualizations with the information gleaned from their FitBit. Even experts have trouble dealing with the data, especially when it's from hundreds of different fitness trackers, each with a unique definition of what constitutes a 'step' […] 'It would definitely be easier if everyone used a Shine [which tracks health and sleep] and an iPhone […] I'm not saying it would be better – I think it's good to have competition and diversity […] But it's always easier if you don’t have to normalize data from different types of devices.'”
The key phrase here is “unique definition”. If people don't share a common, stable language which defines their data, life is hard. One quick alternative is to clear the market of competition, so everyone uses the same definition by default. But the de facto standard created by the sole supplier may not be the best solution for everybody. A provider can come to dominate a market in many ways, without necessarily using data standards optimized for the whole community.
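What “normalize data from different types of devices” means in practice can be sketched in a few lines. This is a hypothetical example – the device names, field names, and stride-length conversion are invented for illustration – but it shows the cost: without a shared definition of a “step”, every new device means another hand-written adapter.

```python
# Two hypothetical trackers with "unique definitions" of activity:
# tracker_a reports steps directly; tracker_b reports distance, so we
# must convert using an assumed average stride length (~0.75 m).
def normalize_steps(device: str, reading: dict) -> int:
    if device == "tracker_a":
        return reading["steps"]                      # already a step count
    if device == "tracker_b":
        return round(reading["distance_m"] / 0.75)   # distance -> steps
    raise ValueError(f"no adapter for device type: {device}")

readings = [
    ("tracker_a", {"steps": 5200}),
    ("tracker_b", {"distance_m": 3900.0}),
]
total = sum(normalize_steps(device, r) for device, r in readings)
print(total)  # 10400
```

Every such adapter embeds a guess (here, the stride length) that the device maker could have resolved once, for everyone, with a shared data standard.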
So, simple choice creates a problem: incompatible data. Removing choice creates a different problem: inadequate data.
The way through this conundrum lies in rethinking the concept of choice. As customers, we want to make choices based on various criteria, many of which are implicit. I might want a fitness monitoring device that looks like a serious piece of sports kit, for example, rather than a medical appliance. Or I might want all my self-monitoring functionality available from my smartphone. And so on.
Providers need to recognize another implicit requirement: I want to be able to share my data across devices, analysis services, and other third parties. The implication of this requirement is that I want data standards.
Customers want data standards, even though (most) customers have never heard of them, and are not interested in sitting through a lecture about them. They want – and need – data standards just like they need all the other unmentioned standards built into modern products and services. Would you sell a fitness device that didn't take standard batteries? That couldn't recharge from the wall? That had to be paid for in goats? Absent data standards, the quantified self movement will surely stall. Data Overload
Gregory A. Maciag: The Business Information Revolution: Making the Case for ACORD Standards
This book was the end result of my writing monthly columns for ten years.