Open Standards and the World of SDN

Posted on November 15th, 2017


The sheer number of devices in today’s world creates a unique challenge in terms of bandwidth. The proliferation of computers and mobile devices demands a solution that can manage that level of bandwidth efficiently and effectively. Enter software-defined networking.

Software-defined networking (SDN) is one of the most innovative technologies designed for network control and automation. As this technology moves forward, so does the need for standards to manage its potential and ensure security.

Backing up just a bit, SDN, as defined in an article from CIO Review, means:

SDN technology is a unique approach to computer networking in which a network administrator can leverage a set of software tools to programmatically initialize, control, change and manage network behavior dynamically utilizing the open interfaces in the network. With an SDN application, network administrators can improve and change the methods in which network devices such as Routers, Switches and other components handle data packets. The application provides complete control of the network policies and rules with a centralized control panel.

The general consensus is that implementing SDN using open standards, or vendor-neutral standards, allows those applications to work seamlessly with, and simplify, the network design. In addition, the network is “independent of multiple, vendor-specific devices and protocols.”
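
The “programmatic control through open interfaces” idea can be made concrete with a small sketch. Assuming a controller that accepts OpenFlow-style flow rules over a northbound REST API — the field names and schema below are purely illustrative, not any specific controller’s — an administrator’s tooling might build a rule like this:

```python
import json

def make_flow_rule(switch_id, match_ip, out_port, priority=100):
    """Build an OpenFlow-style flow rule for a controller's
    northbound REST API. The schema here is illustrative, not
    any specific controller's."""
    return {
        "switch": switch_id,
        "priority": priority,
        "match": {"eth_type": "0x0800", "ipv4_dst": match_ip},  # IPv4 traffic to one host
        "actions": [{"type": "OUTPUT", "port": out_port}],      # forward out a chosen port
    }

# Steer traffic destined for 10.0.0.5 out port 2 on switch "s1";
# in practice this JSON would be POSTed to the controller's open API.
rule = make_flow_rule("s1", "10.0.0.5", 2)
print(json.dumps(rule, indent=2))
```

Because the interface is open and vendor-neutral, the same tooling works whether the switch underneath comes from one vendor or another — which is exactly the point the paragraph above makes.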

While SDN has immense potential, there are still challenges to overcome for it to survive in the modern cyber environment. To meet those challenges, a user-driven organization known as the Open Networking Foundation was formed. The organization “promotes implementation of SDN through open standards where such standards are important for the networking industry to move ahead.”

One of the major initiatives of the group is to develop “various open standards, as well as vendor-neutral standards, for the communications interface defined between the control and forwarding layers of an SDN architecture.”

As more and more technological innovations develop to help navigate the complex world of the Internet and connected devices, the need for open standards to encourage that development in an appropriate way becomes clearer.

Belief in the necessity of open standards, and the mission of organizations like OpenStand, means a lot to those working daily to raise awareness about the need for open standards. You can advocate for open standards, as well, by joining our growing community of OpenStand Advocates. Also, check out our very own work surrounding 5G standards here.

Posted in News

Oracle Making Moves Towards Open Source For Java EE

Posted on November 8th, 2017


Earlier this summer, Oracle, an American multinational computer technology company, made an announcement regarding the future of Java Enterprise Edition. They are considering moving Java EE technologies to an open-source foundation.

In a blog post, the technology giant stated that the move, “may be the right next step, in order to adopt more agile processes, implement more flexible licensing, and change the governance process.”

This decision is rooted in the fact that “Java EE is a hugely successful set of open standards for developing enterprise applications, but some Java EE enthusiasts and analysts have suggested it hasn’t kept up with developers’ preferences for lightweight frameworks, or with architectural trends.”

Oracle will continue to support Java EE implementations and any future implementations of Java EE 8. In addition, future evolution of Java EE technologies will also have the company’s participation. As the blog post also stated, “we believe a more open process, that is not dependent on a single vendor as platform lead, will encourage greater participation and innovation, and will be in the best interests of the community.”

A recent article discussing the move pointed out that those in the Java EE community had a growing concern that Oracle was failing to properly care for the framework. As such, they began suggesting the technology move away from Oracle to a foundation.

One of the most critical components of this argument is the increasing awareness of how important open-source foundations are to the advancement of the technology as well as to the public.

Open standards and open source, while different, still work towards similar goals of cooperation and empowerment across the Internet. As readers of this blog know, OpenStand Advocate David Ward of Cisco pointed out that while “new Open Source Consortiums (OSS) are being started daily to expedite innovation, it’s important to acknowledge that the cycle time of an OSS and a Standards Development Organization (SDO) are fundamentally different.” But they can work to complement each other, and moves like this one by Oracle will only serve to further the overall cause.

What do you think about this move by Oracle? Let us know in the comments below!

Posted in News

Can Open Standards Allow Feds to Get More From Their Data?

Posted on November 1st, 2017


Another industry leader has used her voice to speak in favor of interoperability and open standards. This time, we hear from Diane Gongaware, vice president of U.S. public sector services at Cisco.

Recently, the executive at the technology giant spoke out about the massive amount of data that government agencies collect, protect, analyze and refine, particularly as the continued growth of the Internet of Things (IoT) transforms agencies: “Agencies must have a network that is ready for this new data age and be open to incorporating the new IoT data.”

In her article published earlier this summer in FedTech, Gongaware stated that government agencies should establish an information technology network that will work to “help them analyze and derive insights from large data workloads as a result of the adoption of internet-connected devices.” She then added that, “the network foundation must include interoperability and open standards so that multivendor solutions can work together and scale.”

Governments are increasingly relying on the data they collect to inform critical decisions for their citizens. Whether those decisions are made on the battlefield or in the boardroom, better data analytics can help agencies make them faster and with better information. Gongaware outlines three ways agencies across the world can do just that.

1. Create a Network that Provides Data Insight and Pervasive Security: Agencies should ensure the investments they make take advantage of these technologies and provide a roadmap for future growth.

2. Collaborate and Establish Governance Across the Community: City leaders need to understand not only where that data comes from, but also how to share it and create governance across agencies and the business community.

3. Work with Partners to Fill Talent Gaps: Whether through leveraging existing relationships or creating new ones, working with the IT industry can help fill existing talent gaps.

The overarching theme of Gongaware’s article is a focused energy on creating a collaborative IT culture and partnership. Open standards like those shaped by adherence to our five guiding principles create a culture of open sharing of information, allowing agencies to “choose network partners that meet agencies’ needs today and have the breadth to provide industry best practices and an innovation roadmap moving forward.”

While industry may move quicker than government in terms of technology, that doesn’t mean the two can’t collaborate and partner in impactful ways to advance intra- and interagency collaboration and governance.

If you agree with the principles of openness, transparency, accessibility, and market-driven standards adoption, we hope you’ll consider becoming an OpenStand Advocate. You can help spread the word by displaying a site badge or infographic on your website.

Posted in News

Security in the IoT – Is it time for the government to get involved?

Posted on October 18th, 2017


Regular readers of this blog are certainly no strangers to the Internet of Things (IoT), where it’s going, and the potential standards-related issues that it will face in the future. And, it seems, those in the United States government are also acknowledging the critical need for standards in this area.

Homes are becoming increasingly “smarter” with the advent of new IoT devices. IoT devices can be anything from HVAC systems that change based on the time of day to a refrigerator that tells you when you’re low on milk – and where milk is on sale. While the concept of these types of intelligent devices that not only talk to the internet but also to each other sounds amazing, there are also new vulnerabilities that arise with the technology.

By allowing the internet into our lives through more than just computers and cell phones, we are also allowing more risk. And that’s where some members of the US Congress are stepping in. According to a recent article in Forbes:

“The question of cybersecurity on the internet of things is too huge an issue to address all at once. Senators working across two different parties are now working together to focus public attention on one of the most important aspects of this situation. It is the question of establishing proper security standards for the sale of IoT devices meant for use by government agencies. Senators Cory Gardner, Steve Daines, Mark Warner and Ron Wyden have sponsored a new legislation known as The Internet of Things Cybersecurity Act of 2017, which, among other things, aims to establish realistic standards with respect to security in connected devices sold to the federal government.”

This activity serves to even further underscore the need for established standards in this space to protect both innovation and security of the end user. These standards should, in our opinion, be shaped by adherence to the principles of the Modern Paradigm for Standards.

While the bill isn’t a cure-all for the issues of cybersecurity in IoT, it does, as the article says, go as far as any legislation can in the matter of security. “Security in the cyberworld is an ever-evolving term, one that requires constant and dedicated research impossible to be captured in pen and paper. The legislation in concern is not much of a safeguard in itself, but it is important inasmuch as it serves to bear evidence to the government’s growing concern for cybersecurity in the wake of an exceedingly large number of cyberattacks.”

What are your thoughts on this new legislative effort? Leave them in the comments below!

Posted in News

OT, IT, the IoT – And Baseball?

Posted on October 12th, 2017


In most of today’s modern industries, data is the reigning king for making decisions and measuring success. And for good reason – data analytics offer additional insight into almost any business. Recently, an article from IoT Agenda compared the benefits baseball teams saw once they started using analytics to how today’s businesses can find the same success.

“What can businesses learn from a similar use of analytics? With the advent of the internet of things and all the data being produced by field devices, businesses are rethinking their strategies as IoT gains ground. Operational technology (OT) managers see how shared data acquired by their controlled devices helps improve their own business decision-making. They also recognize how a common architecture that spans across their now disparate OT and IT infrastructures could improve efficiency, while reducing costs. Like sports sabermetrics, OT managers see IoT eventually leading them to a walk-off grand slam win.”

OT refers to using computers to monitor or change the physical state of a system. The term was established to clarify the fundamental difference between traditional IT and other, more industrial, control-system environments. Like IoT, OT collects data from sensors, meters and other devices to monitor and determine the health of machines.
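
To make the parallel concrete, here is a minimal sketch of the kind of health check an OT monitoring loop performs on sensor data. The thresholds and readings are invented for illustration:

```python
def machine_health(readings, low, high):
    """Classify a machine from recent sensor readings the way a
    simple OT monitoring loop might: flag any reading that falls
    outside the configured safe band."""
    out_of_band = [r for r in readings if not (low <= r <= high)]
    if not out_of_band:
        return "healthy"
    return "alert: %d reading(s) out of band" % len(out_of_band)

# Hypothetical temperature samples (degrees C) from a motor sensor.
temps = [71.2, 70.8, 96.4, 71.0]
print(machine_health(temps, low=60.0, high=90.0))
```

Whether those readings come from an industrial meter or a consumer IoT thermostat, the monitoring logic is the same — which is why shared data formats across OT and IT are so attractive.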

“IT and OT implementations evolved independently over time to solve different problems. In IT environments, the need to have different applications and systems interoperate with one another forced the requirement for open standards. Not so with OT. Working within the parameters of their pinpointedly focused proprietary systems, OT has continued to operate relatively sheltered from this pressure.”

However, that sheltering isn’t always a bonus. OT managers are beginning to understand the benefits of data sharing and, as such, have become more welcoming to the idea of open standards to eliminate inefficiencies and accelerate innovation. The hope is that the advancements of the IoT will also serve to improve the way OT works as well.

Interested in open technology standards? A good way to get involved is to become an OpenStand advocate! You can:

Posted in News

The Complexity of Standards in IoT

Posted on October 4th, 2017


Many of the parties with a voice in the standards world, including us, have discussed time and time again standards in the age of the Internet of Things (IoT). As the IoT industry grows, so do the number of standards associated with it – with new ones sprouting up all the time. It’s becoming an increasingly complex environment to work in.

To combat that complexity, a crop of new initiatives has emerged, designed to bring together the current vendors working in IoT development. One of the more recent is EdgeX Foundry – an effort through the Linux Foundation. While not a standards organization per se, it does produce open-source software that effectively defines a de facto standard, according to Philip DesAutels, the foundation’s senior director of IoT.

A recent article on the initiative explains it this way:

“The foundation’s modus operandi is to create open-source reference software that others can draw on for their own implementation. The Linux Foundation doesn’t have just one de facto not-standard for IoT, though – it has several. DesAutels carves them up by target audience: industrial IoT and consumer. These two worlds approach IoT differently, he argues.

Industrial vendors each produce a tiny cog in a machine rather than a finished package. They want their single component to talk with lots of others so that systems integrators can work with them. Integrating a combination of obscure industrial controllers may only happen once or twice, on an ad hoc basis, making an open interoperability layer a useful way of cutting integration costs.”

While the article certainly goes into more detail, the idea of open interoperability is one that we find particularly interesting and useful for the larger open standards in IoT discussion. Open standards, providing collaborative and open interoperability, would serve not only to unify the IoT universe but also help to proliferate it going forward.

The Linux Foundation isn’t the only one pushing these initiatives forward. For example, OpenStand affirming partner IEEE is working on a standard for an architectural framework for the IoT, which will bring together what it sees as fragmented efforts in various verticals. It will draw on existing standards and projects, and there will be a reference architecture.

While this is certainly one of the more complex issues, it is also a very exciting time for those working in IoT standards or open standards as a whole. There is a lot to do, and coming to an agreement on collaborative and open standards won’t be easy. But the end result can be worth the effort.

Do you feel passionate about open standards in technology development? Are you interested in the latest news from the various corners of the tech industry? Then why not become an OpenStand partner? You can:

Posted in News

Open Standards and Interoperability in Supply Chain

Posted on September 27th, 2017


We have seen time and time again the way that open standards can have a positive impact on the progress of nearly every industry. One place we’re now seeing significant growth is in logistics. Recently, One Network, a global provider of multi-party digital network platforms and services, announced the availability of the Global Logistics Gateway. This new solution enables carriers, freight forwarders, orchestrators, distributors, customs brokers and suppliers to execute the global fulfillment and transportation process from a single access point.

According to their press release, “The Gateway connects with business networks for sales, procurement, freight, and logistics services to create a ubiquitous network of networks at global scale. This offering supports interoperability between supply chain operating networks through open standards-based authentication and public API-based process orchestration to any buy/sell system or any transport management system (TMS).”

This level of open standards-based activity provides true value through collaboration. The Gateway, for example, doesn’t just focus on one function. It works across multiple enterprises and functions – automating optimized execution across the value network – which greatly streamlines fulfillment and transportation functions.

One of the main challenges supply chain management faces is in the ability to develop a viable ecosystem among manufacturers, logistics companies and all other players within the supply chain. For that to happen, they must all agree on the adoption of standards and the free exchange of documents.

Across any industry, open standards facilitate interoperability and provide a strategic advantage by enabling a networked ecosystem to deliver end-to-end solutions with each piece, regardless of vendor, working together seamlessly. That delivers the highest level of customer satisfaction. However, to get there, organizations need to find ways to adhere to the five fundamental principles of standards development: due process, broad consensus, transparency, balance and openness. Each of these is outlined fully in our OpenStand Principles.

What do you think about these efforts from One Network? Let us know in the comments!

Posted in News

How Can We Ensure Digital Accessibility in the Age of Internet Standards?

Posted on September 20th, 2017


One of the latest battles within internet standards is around digital content accessibility – more specifically online video and podcast accessibility for people with disabilities. Now, one would think that this isn’t really something to argue about. Of course they should have access.

However, as seen in a recent University of California–Berkeley legal battle, that is not so simple.

When a judge ordered the University to make 20,000 public videos and podcasts accessible to people with disabilities, the university shut the program down and locked the content behind a firewall. It was just too expensive to go back and revise all the materials to make them accessible.

A recent article looked at the questions around digital accessibility and access beyond just retroactively making these materials accessible. It points out that “accessible Web video is important. Deaf people need captions. Blind people need audio descriptions. People with photosensitivity need strobe warnings. There are tools that can modify video so that colorblind people can distinguish shades (especially important for interactive websites where, for example, everything on sale is marked in red). Controls for videos need to work for people using diverse interfaces.”

However, this summer, OpenStand affirming partner the World Wide Web Consortium (W3C) released a new set of guidelines around “Encrypted Media Extensions (EME),” a standard that allows streaming video in HTML5 to contain digital rights management (DRM) software. As the article puts it, “DRM technologies are ways that copyright owners try to limit users’ ability to copy, modify, or share their work.” This creates a problem for people with disabilities. It used to be that they could alter content to make it accessible. These new standards could make that illegal.

W3C is certainly not ignorant of that issue. Judy Brewer, director of the Web Accessibility Initiative at W3C, says that for the last 20 years, a W3C working group of accessibility experts has reviewed all specifications to make sure they support accessibility. The W3C does, however, believe the responsibility for ensuring accessibility lies with the content creator – not the end user. In response to a complaint by the Electronic Frontier Foundation, Brewer suggested “teasing apart the specific concerns rather than talking about accessibility too generally.”

While we all agree that accessibility is critical to an open and transparent internet, the path there is anything but easy. What are your thoughts on DRM, the new EME standards and the larger accessibility issue? Leave them in the comments below.


Posted in News

Making Strides Towards Inter-Cloud Interoperability

Posted on September 13th, 2017


On July 25, 2017, two major bodies in the standards world agreed to a collaboration that would set a joint standard for inter-cloud interoperability. The IEEE, an OpenStand affirming partner, and the National Institute of Standards and Technology (NIST) will work together on this collaboration, potentially increasing the possibility of a vendor-neutral means of moving from one proprietary cloud system to another.

According to one article on the pairing, that possibility “may be made easier after the backers of the Open Container Initiative reached agreement on an OCI 1.0 specification for a container format and container runtime environment. The initiative includes the major vendors of container tools, engines and deployment systems.”

Most of today’s internet giants, such as Google, Amazon Web Services and Microsoft, use proprietary APIs and virtual machine formats. As such, moving workloads from, for example, Google to Microsoft can be difficult. In the past, even attempting to do so was nearly impossible; today, those difficulties have lessened and are easier to navigate than ever before.
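
Part of what makes such movement feasible is agreement on what a container looks like at rest and at runtime. As a rough illustration, a minimal OCI runtime config.json — the top-level fields follow the OCI runtime specification, while the paths and command are invented — might be generated like this:

```python
import json

def minimal_oci_config(image_root, command):
    """Sketch of a minimal OCI runtime config.json. The top-level
    fields (ociVersion, root, process) follow the OCI runtime spec,
    trimmed here to the essentials; real configs carry much more."""
    return {
        "ociVersion": "1.0.0",
        "root": {"path": image_root, "readonly": True},   # unpacked image filesystem
        "process": {"cwd": "/", "args": command},         # what to run inside it
    }

# Any OCI-compliant runtime, from any vendor, can consume this layout.
config = minimal_oci_config("rootfs", ["/bin/app", "--serve"])
print(json.dumps(config, indent=2))
```

Because the format is vendor-neutral, the same bundle can in principle be handed to any compliant runtime — the kind of portability the OCI 1.0 agreement is meant to secure.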

“There is a growing recognition that the lack of cloud federation in a landscape of multiple independent cloud providers is limiting the service reach, resources and scalability that can be offered in a rapidly expanding marketplace,” said Bob Bohn, chair of the IEEE Intercloud Working Group. The group is already working on the Standard for Intercloud Interoperability and Federation (SIIF), also known by the designation IEEE P2302.

The IEEE understands that an open internet framework allowing workloads to move from one cloud to another would be extremely beneficial. It could, Bohn suggested, “have the same beneficial effect as the Internet had on information sharing and ecommerce.”

These cloud providers would need to agree on ways to make their systems recognizable to each other for this to succeed. A common system of trust, cooperation, adherence and transparency would be key.

The IEEE and NIST pairing up to drive this cloud openness forward is a key first step in creating just those systems.

Share your thoughts and leave a comment below.

Posted in News

Are Open Standards the “Key” to Smart Cities?

Posted on September 6th, 2017


From smart household appliances to smart cars, regular readers of this blog know that there has been a rapid increase in the adoption of Internet connected devices. This internetworking of physical devices and the Internet of Things (IoT) has been the subject of plenty of industry talk and speculation, including how open standards can help in their development. But now, the concept of the “smart city” is also seeing increasing adoption worldwide.

The thought is that by providing smart, digital services, these cities will become more attractive places for people to live. And that could very well be the case. However, a recent article from The Open Group explored the position that “in order for smart cities solutions to remain affordable, cities will need to adopt open standards and open platforms for their digital services so they can maintain competition and keep those services affordable.”

So, are open standards the “key” to the smart cities?

According to Kary Främling, Professor of Computer Science at Aalto University in Finland and Founder and CEO of Control Things, the answer could be yes.

In an interview with Främling, The Open Group dug a little deeper. The point of smart cities is to make life easier for citizens through these connected services and devices. However, that benefit disappears should they become too expensive due to vendor lock-in or proprietary standards. These services cover a range of issues. One, for example, is parking. As an increasing number of vehicles go electric, drivers need to find charging stations when they are in new or unfamiliar cities. “The challenge is that providers all tend to have their own portals, services, payment systems and so on. We want to have these services that we use in everyday life become simpler.”

How would open standards provide an advantage as cities move to become “smarter?” Främling puts it this way:

The new services are a value as such, but they need to be kept affordable. Keeping these systems open too is not even just about the money and maintaining competition in the marketplace. It’s also about once you have this data and services more open and available, that opens up completely new possibilities for innovating new services. Let’s say that even if Google is an excellent company with loads of smart people, if you extend that into more open information systems and services that you could combine on-the-fly, then even some start-up companies could come up with new services on-the-fly that use the existing ones. It could really spur innovation.
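
Främling’s point about combining open services can be sketched with invented data: if a city publishes, say, charging-station availability and parking fees in open, documented formats, a third party can join them into a new service in a few lines. The feeds, districts and field names below are all hypothetical:

```python
# Two hypothetical open data feeds a city might publish. The point is
# that open, documented formats let anyone join them on the fly.
charging_stations = [
    {"id": "cs-1", "district": "Kamppi", "free_slots": 2},
    {"id": "cs-2", "district": "Kallio", "free_slots": 0},
]
parking_fees = {"Kamppi": 4.0, "Kallio": 2.5}  # EUR per hour

def best_charging_options(stations, fees):
    """A start-up's mashup service: list stations with free slots,
    cheapest parking district first."""
    available = [s for s in stations if s["free_slots"] > 0]
    return sorted(available, key=lambda s: fees[s["district"]])

print(best_charging_options(charging_stations, parking_fees))
```

With proprietary portals, each provider’s data stays siloed; with open formats, this kind of recombination is exactly the “on-the-fly” innovation Främling describes.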

You can read the entire transcript of Främling’s interview here. Using open standards that follow the Modern Paradigm for Standards can help ensure these smart cities remain accessible and open to all.

If you agree with the principles of openness, transparency, accessibility, and market-driven standards adoption, we hope you’ll consider becoming an OpenStand Advocate. You can help spread the word by displaying a site badge or infographic on your website.


Posted in News