How Can We Ensure Digital Accessibility in the Age of Internet Standards?

Posted on September 20th, 2017

Image source: https://shutr.bz/2vQaK8J

One of the latest battles within internet standards is around digital content accessibility – more specifically online video and podcast accessibility for people with disabilities. Now, one would think that this isn’t really something to argue about. Of course they should have access.

However, as seen in a recent University of California–Berkeley legal battle, that is not so simple.

When a judge ordered the university to make 20,000 public videos and podcasts accessible to people with disabilities, the university shut the program down and locked the content behind a firewall. It was just too expensive to go back and revise all the materials to make them accessible.

A recent article looked at the questions around digital accessibility and access beyond just retroactively making these materials accessible. The authors point out that “accessible Web video is important. Deaf people need captions. Blind people need audio descriptions. People with photosensitivity need strobe warnings. There are tools that can modify video so that colorblind people can distinguish shades (especially important for interactive websites where, for example, everything on sale is marked in red). Controls for videos need to work for people using diverse interfaces.”
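
Open web standards already make several of these accommodations straightforward when content can be freely modified. Below is a minimal TypeScript sketch, with a hypothetical element id and caption file path, showing how HTML5’s track element attaches WebVTT captions to a video:

```typescript
// Minimal sketch: attach WebVTT captions to an HTML5 video element.
// The element id ("lecture-video") and the .vtt file path are assumptions.
const video = document.querySelector<HTMLVideoElement>("#lecture-video");

if (video) {
  const track = document.createElement("track");
  track.kind = "captions";                 // "descriptions" would carry audio-description text
  track.src = "/captions/lecture-en.vtt";  // WebVTT caption file (hypothetical path)
  track.srclang = "en";
  track.label = "English captions";
  track.default = true;                    // show these captions by default
  video.appendChild(track);
}
```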

However, this summer, OpenStand affirming partner World Wide Web Consortium (W3C) released a new specification, “Encrypted Media Extensions (EME),” a standard that allows streaming video in HTML5 to incorporate Digital Rights Management (DRM) software. As the article puts it, “DRM technologies are ways that copyright owners try to limit users’ ability to copy, modify, or share their work.” This creates a problem for people with disabilities. It used to be that they could alter content to make it accessible; these new standards could make that illegal.
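
To make the concern concrete, here is a rough TypeScript sketch of the EME flow the specification defines; the element id and key system name are illustrative assumptions, and support varies by browser. The key point is that once a Content Decryption Module takes over, the decrypted video is no longer available for the kind of user-side modification assistive tools rely on:

```typescript
// Rough sketch of the EME flow, assuming a page with a <video id="stream"> element.
// "com.widevine.alpha" is one commonly deployed key system; this is illustrative only.
async function setUpEncryptedPlayback(): Promise<void> {
  const config: MediaKeySystemConfiguration[] = [{
    initDataTypes: ["cenc"],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }];

  // Ask the browser whether a DRM-capable key system is available.
  const access = await navigator.requestMediaKeySystemAccess("com.widevine.alpha", config);
  const mediaKeys = await access.createMediaKeys();

  const video = document.querySelector<HTMLVideoElement>("#stream");
  if (!video) return;

  // After this point the Content Decryption Module handles decryption;
  // the decrypted frames are not exposed for modification by the page or the user.
  await video.setMediaKeys(mediaKeys);
}
```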

W3C is certainly not ignorant of the issue. Judy Brewer, director of the Web Accessibility Initiative at W3C, says that for the last 20 years a W3C working group of accessibility experts has reviewed all specifications to make sure they support accessibility. They do, however, believe the responsibility for ensuring accessibility lies with the content creator, not the end user. In response to a complaint by the Electronic Frontier Foundation, Brewer suggested “teasing apart the specific concerns rather than talking about accessibility too generally.”

While we all agree that accessibility is critical to an open and transparent internet, the path is anything but easy. What are your thoughts on DRM, the new EME standard and the larger accessibility issue? Leave them in the comments below.

 

Posted in News

Making Strides Towards Inter-Cloud Interoperability

Posted on September 13th, 2017

Image: https://shutr.bz/2jiW5kZ

On July 25, 2017, two major bodies in the standards world agreed to collaborate on a joint standard for inter-cloud interoperability. The IEEE, an OpenStand affirming partner, and the National Institute of Standards and Technology (NIST) will work together, increasing the likelihood of a vendor-neutral means of moving workloads from one proprietary cloud system to another.

According to one article on the pairing, that possibility “may be made easier after the backers of the Open Container Initiative reached agreement on an OCI 1.0 specification for a container format and container runtime environment. The initiative includes the major vendors of container tools, engines and deployment systems.”
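
As a rough illustration of why a shared container format helps, here is the outline of an OCI image manifest expressed as a TypeScript object; the digests and sizes are placeholders rather than values from any real image. Because every compliant registry and runtime reads this same structure, a workload packaged this way is not tied to a single provider’s format:

```typescript
// Sketch of an OCI image manifest; digests and sizes are placeholders.
const exampleManifest = {
  schemaVersion: 2,
  config: {
    mediaType: "application/vnd.oci.image.config.v1+json",
    digest: "sha256:<config-digest>",  // placeholder
    size: 7023,                        // placeholder
  },
  layers: [
    {
      mediaType: "application/vnd.oci.image.layer.v1.tar+gzip",
      digest: "sha256:<layer-digest>", // placeholder
      size: 32654,                     // placeholder
    },
  ],
};
```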

Most of today’s internet giants, such as Google, Amazon Web Services and Microsoft, use proprietary APIs and virtual machine formats. As a result, moving workloads from, for example, Google to Microsoft can be difficult; in the past, even attempting to do so would have been impossible. Today, however, these difficulties have lessened and are easier to navigate than ever before.

“There is a growing recognition that the lack of cloud federation in a landscape of multiple independent cloud providers is limiting the service reach, resources and scalability that can be offered in a rapidly expanding marketplace,” said Bob Bohn, chair of the IEEE Intercloud Working Group. The group is already working on the Standard for Intercloud Interoperability and Federation (SIIF), also known by the designation IEEE P2302.

The IEEE understands that having an open internet framework to allow workloads to move from one cloud to another would be extremely beneficial. It could, Bohn suggested, “have the same beneficial effect as the Internet had on information sharing and ecommerce.”

For this to succeed, these cloud providers would need to agree on ways to make their systems recognizable to each other. A common system of trust, cooperation, adherence and transparency would be key.

The IEEE and NIST pairing up to drive this cloud openness forward is a key first step in creating just those systems.

Share your thoughts and leave a comment below.

Posted in News

Are Open Standards the “Key” to Smart Cities?

Posted on September 6th, 2017

Image: https://shutr.bz/2uinpAx

From smart household appliances to smart cars, regular readers of this blog know that there has been a rapid increase in the adoption of Internet-connected devices. This internetworking of physical devices, known as the Internet of Things (IoT), has been the subject of plenty of industry talk and speculation, including how open standards can help in its development. But now, the concept of the “smart city” is also seeing increasing adoption worldwide.

The thought is that by providing smart, digital services, these cities will become more attractive places for people to live. And that could very well be the case. However, a recent article from The Open Group explored the position that “in order for smart cities solutions to remain affordable, cities will need to adopt open standards and open platforms for their digital services so they can maintain competition and keep those services affordable.”

So, are open standards the “key” to smart cities?

According to Kary Främling, Professor of Computer Science at Aalto University in Finland and Founder and CEO of Control Things, the answer could be yes.

In an interview with Främling, The Open Group dug a little deeper. The point of smart cities is to make life easier for citizens through these connected services and devices. However, that benefit disappears should they become too expensive due to vendor lock-in or proprietary standards. These services cover a range of issues. One, for example, is parking. As an increasing number of vehicles go electric, drivers need to find charging stations when they are in new or unfamiliar cities. As Främling explains, “The challenge is that providers all tend to have their own portals, services, payment systems and so on. We want to have these services that we use in everyday life become simpler.”

How would open standards provide an advantage as cities move to become “smarter?” Främling puts it this way:

The new services are a value as such, but they need to be kept affordable. Keeping these systems open too is not even just about the money and maintaining competition in the marketplace. It’s also about once you have this data and services more open and available, that opens up completely new possibilities for innovating new services. Let’s say that even if Google is an excellent company with loads of smart people, if you extend that into more open information systems and services that you could combine on-the-fly, then even some start-up companies could come up with new services on-the-fly that use the existing ones. It could really spur innovation.

You can read the entire transcript of Främling’s interview here. Using open standards that follow the Modern Paradigm for Standards can help ensure these smart cities remain accessible and open to all.

If you agree with the principles of openness, transparency, accessibility, and market-driven standards adoption, we hope you’ll consider becoming an OpenStand Advocate. You can help spread the word by displaying a site badge or infographic on your website.

 

Posted in News

Open Standards, Innovation, and Our World

Posted on August 30th, 2017

Image: https://shutr.bz/2uiGFCm

Where open standards exist, innovation is driven; disruptive technologies emerge.

That’s how a recent article about the critical need for open standards in our society began, and we couldn’t agree more. While we’ve been pushing for open standards across the Internet for years, it’s always refreshing and enlightening to read about that same push from another perspective. In this case, it is how open standards can help in manufacturing.

We often spend time focusing on the esoteric aspects of open standards. However, a piece in Automation World simply titled “The Importance of Open Standards” takes a look at how open standards can make the Industrial Internet of Things a true game changer for the manufacturing industry.

The article uses some familiar principles to define open standards – those accepted by OpenStand affirming partners IEEE, the Internet Society (ISOC), the World Wide Web Consortium (W3C), and the Internet Engineering Task Force (IETF). To review, those core principles are:

  1. Cooperation
  2. Adherence to principles
  3. Collective empowerment
  4. Availability
  5. Voluntary adoption

The author goes on to demonstrate how his organization, Profibus and Profinet International (PI), the largest automation community in the world, has proven its belief in open standards and each of the principles. For example, PI observes adherence to principles through an extensive Call for Experts process in which all members provide input equally, and its technical standards are developed in PI Working Groups, the processes and guidelines for which are published online.

Having organizations like this acknowledge the necessity of open standards in their field is exactly the type of promotion we need to further the mission of OpenStand throughout the world. They state, and we absolutely agree, that “now, in the fourth industrial revolution, analytics and Big Data collected via increased connectivity are being driven by open standards.”

Join us in supporting the OpenStand Principles and be sure to let us know in the comments ways that you have seen open standards impact manufacturing industries.

Posted in News

Are We Holding the ‘Death’ of Open Standards in Our Hands?

Posted on August 23rd, 2017

Image: https://shutr.bz/2vInS3E

Smartphones are everywhere these days. Nearly everywhere you look, people are using their devices to communicate, shop, track fitness, lock their doors or even change the settings on their home heating and air conditioning units. We are getting ever closer to controlling almost every aspect of our lives from the palm of our hand.

However, if you take away the phone, these other gadgets tend to exist in a vacuum. The phone is home base for all of them, and they have no awareness of each other. Interconnectivity as a whole remains incomplete, with no industry-wide standards to fix it. Open standards across Internet of Things (IoT) devices can be of major value to smartphone users, allowing for better, more inclusive interconnectivity.

Fast Company’s article “How the Smartphone Era Led to the Death of Open Standards” examines how, during the PC age, Microsoft was able to develop industry standards that were designed to help objects connect with each other. However, as the popularity of Apple and of Google (through Android devices) grew, and the smartphone became the dominant method of Internet usage, their proprietary standards became increasingly dominant.

However, as we’ve pointed out before, open standards in the age of the smartphone can be enormously valuable for users. Open standards can help to drive down costs, amp up innovation and improve access, benefitting and giving more choices to the millions of global smartphone users. Our guiding principles are built around the idea that innovation and collaboration within the framework of open standards allows for better global interoperability, scalability, stability, and resiliency as well as enabling global competition and continuously encouraging providers to deliver the best security possible.

While it’s too early to say what impact proprietary standards may have on development of IoT standards, “…if smartphones define the post-PC era, the Internet of Things may come to define a more disparate, decentralized post-smartphone era. Smartphones will still play a role–just as PCs continue to matter–but it wouldn’t be a central one from which the dominant companies can dictate standards. That would give industry groups the opportunity to build bridges between at least some of the islands that the smartphone era is creating.”

How do you think smartphones have impacted open standard development? Let us know your thoughts in the comments section below.

Posted in News

IACHR Publishes Comprehensive Report: “Standards for a Free, Open and Inclusive Internet”

Posted on August 16th, 2017

Image: https://shutr.bz/2umtKy6  

In March of 2017, the Inter-American Commission on Human Rights (IACHR) published “Standards for a Free, Open and Inclusive Internet” with the aim to “assist the member States in their efforts to incorporate a human rights-based focus in the design, development, and implementation of policies affecting the Internet.” The report drew upon the 2013 Report on Freedom of Expression and the Internet, but updated and broadened its analysis to address the new challenges faced in the exercise of human rights online, particularly freedom of expression.

In publishing this work, the IACHR continues its acknowledgment that the Internet is a unique tool with the potential to expand human rights, particularly the right to freedom of expression, through broader public arenas. As the Internet grows and expands in its complexity, so grows its ability to be that instrument and to increase levels of social benefits and inclusion. However, the Commission is also quick to note that “in order for the benefits of the Internet and other communications technology to be distributed inclusively and sustainably among the population, the relevant policies and practices must be based on respecting and guaranteeing human rights especially the right to freedom of expression, which facilitates and enables the exercise of other rights on the Internet.”

The report also focuses on narrowing down its guiding principles for a free and open Internet, including access to the Internet, multi-stakeholder governance, quality, and nondiscrimination. One such guiding principle is that the “relevance of the Internet as a platform for the enjoyment and exercise of human rights is directly tied to the architecture of the web and its governing principles, including the principles of openness, decentralization, and neutrality.”

These sorts of efforts mirror those of OpenStand and our Modern Paradigm for Standards. In order to have a free and open Internet that can benefit all, there must be standards developed that, among other things, encourage collective empowerment, contribute to the creation of global communities, and benefit humanity.

What do you think about these efforts from the IACHR? Did any part of the report specifically strike you? If so, let us know in the comments!

Posted in News

Are Standards The Next Defense Against DDoS Attacks?

Posted on August 9th, 2017

Image: https://shutr.bz/2uqJ4KD

The more data businesses keep online, the more opportunities cybercriminals have to attack. And, since digital isn’t going anywhere, businesses are forced to take measures to protect themselves from all manner of cyberattacks.

One particularly popular and nasty cyberattack is the distributed denial of service, or DDoS. When an organization, or group of organizations, is the victim of a DDoS attack, it means hackers from anywhere in the world are sending enormous amounts of useless data to their target. All of that garbage data overwhelms the target’s servers to the point where the target can no longer accept incoming requests. Eventually the network and servers slow to a crawl or, in some cases, shut down completely. In recent attacks, the endpoints used to generate this traffic went beyond laptops and PCs to all manner of connected, or IoT, devices such as baby monitors and printers.

These attacks show no signs of slowing down. In fact, according to leading content delivery network (CDN) services provider Akamai, DDoS attacks greater than 100 Gbps increased by 140% year-over-year in the last quarter of 2016.

While organizations continue to spend huge amounts of money combating these attacks, the answer may lie in a new direction. A recent article in Information Age asks whether it is time for software and hardware manufacturers to consider using standards to address security risks in the IoT.

“One key standard is the Open Trusted Technology Provider Standard, or O-TTPS, which addresses these issues around supply chain security and product integrity. Recently approved as ISO/IEC 20243, this set of best practices can be applied from design to disposal, throughout the supply chain and the entire product life cycle.”

These types of standards aim to keep tainted and counterfeit hardware from ever entering the supply chain, so it never has the opportunity to end up in Internet-connected devices. The standard also includes requirements for vulnerability analysis and for notification of newly discovered, exploitable product weaknesses, which can catch risk areas early. Attacks can then be blocked or slowed, significantly reducing the damage done.

While, as the article states, standards can’t categorically prevent the inception of DDoS attacks, what they can do is mitigate their effectiveness and limit their economic damage.

“Further steps need to be taken in the form of collaboration, whereby we reach a point where we can recognize which technology and technology providers can be trusted and which cannot. But adhering to global standards provides a powerful tool for technology providers and component suppliers around the world to combat current and future DDoS attacks.”

While we know standards aren’t the golden ticket to a future free of cyber attacks, they can certainly be a step in the right direction. This is especially true of those created in a collaborative environment and adhering to our Modern Paradigm for Standards.

Do you think standards could be the answer to slowing down the progression of DDoS attacks? Let us know in the comments!

If you’re interested in learning more about OpenStand, check out our OpenStand infographics.

 

Posted in News

OpenStand Partner IETF Holds Sixth Hackathon

Posted on August 2nd, 2017

Image: https://shutr.bz/2umit0G

Internet standards pioneer and OpenStand affirming partner the Internet Engineering Task Force (IETF) recently held its sixth Hackathon. By running open source code, the event was able to find and highlight missing or unclear areas of standards, subsequently improving those standards.

According to the IETF event recap, this Hackathon drew approximately 120 participants on site, plus more than 20 remotely. Work covered a broad range of IETF topics with valuable and inspiring results. The Hackathon had two primary goals:

  1. Advance the pace and relevance of IETF work
  2. Attract young people and developers to the IETF

One of the ways the Hackathon increases the pace and relevance of IETF work is via running code. Implementing evolving standards and producing running code validates the standards and highlights things that may be missing, wrong, or ambiguous in draft versions of these standards. Better still, if the code is open source, viewing and sharing the source code aids in understanding of a standard, makes it easier to use, and promotes its adoption. Open source projects that featured prominently this Hackathon included OpenDaylight, ONOS, VPP, Joy, and many others. For a list and brief description of the Hackathon projects, see the wiki.

These Hackathons work well and are a consistent draw in part because they are not designed for a single developer to “win.” The spirit is collaborative, and success is measured by how much they can improve the Internet as a whole. Free participation and bragging rights instead of prize money make for genuine, friendly competition.

The OpenStand Principles work towards transparency, openness, cooperation and voluntary adoption within the working Internet. These are qualities that align with the open source community. That’s why events such as these continue to demonstrate how open source code can be used to aid in the understanding, utilization and improvement of internet standards in an open way.

These types of events exemplify the collaborative spirit at the heart of the mission of the IETF. They have a goal to make the Internet work better by producing high quality, relevant technical documents that influence the way people design, use, and manage the Internet. For more information on winners or how to participate in the next Hackathon, read IETF’s recap here.

These are also the types of events supporters of Open Standards should participate in, if able. Through collaboration, the internet will only continue to grow and become a better tool for all. Join us in working to make the web a better place; become an OpenStand advocate! Go here to display a site badge on your website.

Posted in News

Are Credibility Standards the Answer to Misinformation on the Web?

Posted on July 26th, 2017

Image: everything possible

What if developing and defining a set of credibility standards could influence how people share and display content? The problem of misinformation online isn’t new, but we have certainly seen new focus on it lately. At a recent event designed to fight against the dissemination of misinformation, a number of digital projects saw a common theme emerge: credibility standards.

The event, called Trust, Verification, Fact Checking & Beyond: MisinfoCon, is a “global movement focused on building solutions to online trust, verification, fact checking, and reader experience in the interest of addressing misinformation in all of its forms.” Participants found that, as a heightened focus is put on the critical need to establish credible content, so too does the need grow for ways to communicate that credibility within digital content. And the answer that came up time and time again was credibility standards.

“Simply rating an article as ‘credible’ is not enough; we need to understand what parts of it are credible, how the conclusion about its credibility was reached, and how to communicate that credibility effectively. Defining a set of standards for content credibility gives us a more effective way to talk about it, and, importantly, to make important decisions about how we share and display that content, regardless of what site the content appears on.”

As more and more information is posted online, having both core and peripheral standards to determine credibility could certainly be a huge benefit to all involved. In fact, the World Wide Web Consortium (W3C), the main international standards organization for the web and an OpenStand affirming partner, recently announced the standardization of Web annotation. That is the sort of work this effort could build upon.
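
For a sense of what a machine-readable credibility signal built on that work might look like, here is a small hypothetical sketch modeled on the W3C Web Annotation Data Model; every identifier and text value below is made up. An annotation like this could attach a credibility assessment to a specific piece of content wherever it appears:

```typescript
// Hypothetical credibility annotation following the W3C Web Annotation Data Model.
// All URLs and text values are invented for illustration.
const exampleAnnotation = {
  "@context": "http://www.w3.org/ns/anno.jsonld",
  id: "http://example.org/annotations/1",
  type: "Annotation",
  body: {
    type: "TextualBody",
    value: "This claim is supported by the cited primary source.",
    purpose: "assessing",              // one of the model's defined motivations
  },
  target: "http://example.org/articles/some-news-story",
};
```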

However, also of interest to open standards supporters was the way recognition of this need arose. It was through a collaborative workspace where the processes were open to all interested and informed parties, a place where participants “hacked, designed and ideated” in a transparent way and reached a broad consensus that the need for standards existed. They are making the process “transparent and open, to help build trust about our decision making and make it open to public comment.”

This is how the best possible credibility standards can be created, making the web a better, more credibly informed space.

If this is something you are interested in participating in, reach out to MisInfoCon here.  

Do you think standards are needed to slow, or even stop, the dissemination of misinformation on the web? Are they necessary? What should be the core standards?  Leave us your comments below!

Posted in News

How to Modernize the Smartphone Network with Open Standards

Posted on July 19th, 2017

Image: https://shutr.bz/2qZkuz8

According to online publisher TechCrunch, there will be 6.1 billion smartphone users globally by 2020, led by huge growth in less mature markets. That works out to approximately 70 percent of the world’s population using these devices. In addition, research giant Ericsson predicts that “regions like Asia Pacific, the Middle East and Africa will account for 80 percent of all new subscriptions” by 2020.

As the need for access and communication across the globe grows, smartphones will only continue to grow in popularity. They are often the user’s primary lifeline to the Internet, and that doesn’t just mean while away from home. A growing share of smartphone users are now using their devices as their primary means of online access while at home.

So, with the growth of smartphone usage and the clear benefits of open standards, why have only 11 percent of federal agencies embraced an open networking approach?

That question is at the heart of a recent video from Brocade Communications Systems. The battle between proprietary and open standards in the era of smartphones has long been debated and documented. However, this video offers the opinion that open standards help to drive down costs and amp up innovation, which serves to benefit and give choices to the millions of people using smartphones every day. Many agencies incorrectly believe that proprietary standards mean a more secure network; in fact, the opposite is true. As our Principles of Open Standards show, innovation and collaboration within the framework of open standards allow for better global interoperability, scalability, stability, and resiliency, as well as enabling global competition and continuously encouraging providers to deliver the best security possible.

Mobile devices are creating an opportunity for dynamic Internet resources to become available to greater and greater numbers of people. Because of that, keeping the Internet safe, stable, open and interoperable is more important than ever. Open standards that align with our Principles will continue to play a critical role in creating an open internet and driving open innovation, serving as the building blocks for future development of the expansive Internet community.

Where do you land on the proprietary versus open standard argument for smartphones? Leave us your thoughts in the comments below.

Posted in News