OpenStand Marks Its Third Anniversary

Posted on August 26th, 2015

It's been three years since OpenStand was founded and The OpenStand Principles were jointly affirmed by our partners at IEEE, W3C, ISOC, IAB and IETF. Join us in celebrating our third anniversary this week!

This week we celebrate the third anniversary of the founding of OpenStand. The OpenStand Principles were jointly affirmed by IEEE, W3C, ISOC, IAB and IETF on August 29, 2012, in an effort to codify the principles that brought us the open Internet and decades of open technological innovation. The explosive growth of new technologies has only increased the need to adopt open, transparent, inclusive and accessible policies more broadly as we advance technology for humanity.

The OpenStand principles were created with an emphasis on standards development. Standards Development Organizations (SDOs) working within the OpenStand paradigm operate according to the principles of balanced representation, consensus, due process, and transparency. This results in an open, competitive system that has produced, and continues to produce, standards widely recognized for their extensibility and high-quality technical content. Without question, open standards developed under the OpenStand paradigm stand as fundamental pillars of worldwide economic growth and progress across all sectors of the global economy.

In parallel, the OpenStand Principles have also proven to apply on a much broader scale, to support open technology development of all kinds. The OpenStand Principles have also been influential in shaping open source development, yielding new technologies and specifications that support global participation, drive interoperability, encourage healthy competition, fuel innovation and create a market-driven environment that supports freedom of choice.

In addition to encouraging advocacy of the OpenStand Principles among individuals and organizations formally focused on standards development, we encourage non-standards-focused development bodies to leverage the OpenStand Principles as we move into 2016. We stand firm in our commitment to promoting open, market-driven standards and technology development, and to helping secure an open internet and an open future. It is an exhilarating time to be active participants in the rapidly evolving global technological landscape, and we thank our OpenStand Advocates and the SDOs that have submitted Formal Endorsements of OpenStand.

If you would like to become an OpenStand Advocate, here are three ways to Stand With Us:

  1. Sign Your Name to express your public individual or organizational support.
  2. Get a Site Badge to display your support on your site or blog.
  3. Submit a formal endorsement from your organization for our site.

Posted in News

Open Standards Opportunities: Vint Cerf on Interplanetary Protocols for Space Communication

Posted on August 19th, 2015

Vint Cerf has issued a call to action surrounding the need for standardized protocols for interplanetary communication. Sound like dialogue from a science fiction novel? Find out why Cerf believes this is practical.

Image: Shutterstock, Vadim Sadovski

“Space: the final frontier.”

Indelibly associated with Gene Roddenberry’s celebrated Star Trek mythos, these words also carry additional significance for the world of open standards in telecommunication. In a collaborative address at a conference hosted by the Internet Society (ISOC) earlier this year, web pioneer and industry juggernaut Vint Cerf identified the need for standardized protocols for interplanetary communication. While this call to action may sound like dialogue from a science fiction prequel, Cerf assured his audience that the need is a practical one.

“We have this fairly old infrastructure [for communicating with spacecraft] and the parties who are responsible, that have the problem of maintaining the equipment, don’t seem to be able to put in additional ground capability,” said Cerf, referencing the nearly fifty-year-old Deep Space Network created by NASA. “It’s my personal belief that we should be advocating for end-to-end infrastructure for space exploration. We should be providing ‘off the shelf’ capability for the interplanetary protocols, the DTN (delay-tolerant networking) protocols, available for both spacecraft and ground use.”

Cerf pointed to the emergence of private aerospace companies such as SpaceX, Blue Origin, and Orbital Systems as evidence of the changing landscape in the field of space travel. As new players enter the field, they will be mostly unexposed to the idea of standardized protocols, which could create problems for establishing a robust communications infrastructure in the future. If efforts are put forth to establish standardized protocols for space travel now, they could be put to immediate use in the low-orbital applications that ongoing near-earth missions demand, while establishing a foundation for interplanetary communication in the future.

Historically, the traditional practice of aerospace scientists like those at NASA has been to adapt communication instruments and strategies to the specific needs of a mission. Cerf cautioned against this sort of approach, suggesting that “Infrastructure and mission-by-mission thinking are sometimes at odds with each other.”

Cerf’s parting thoughts to his audience of engineers, scientists, and industry experts were that they should be vocal and intentional about the cultivation of standardized network protocols for interplanetary communication. Cerf went on to challenge those in attendance to collaborate and craft case studies based on realistic mission specifications to demonstrate the need for standards of this nature.

To follow the narrative surrounding standards for open protocols, both terrestrial and interplanetary, be sure to subscribe to the OpenStand blog!

Posted in News

Panelists of GCIG Conference Discuss Future of Internet Governance

Posted on August 12th, 2015

Who's who in the debate on Internet governance? The global reach of the Internet makes it unique in terms of technology and influence, and its ability to transcend national boundaries presents challenges for governance.
Image: Shutterstock, Moon Light PhotoStudio

“The point of open standards is not ‘one size fits all.’ In fact, it’s completely the opposite. It’s ‘What is the minimum we need to agree on in order to be able to talk to each other?’”

The above quote was delivered by Leslie Daigle, former Chief Internet Technology Officer of the Internet Society (now co-chair of the IANAPlan working group), at the GCIG Conference held at Columbia University this past May. The conference, whose panels included technologists, academics, and industry experts, featured thoughtful discussion of present and future practices of governance on the Internet. One of the primary lines of inquiry woven throughout the discussion was that of unity versus uniformity: as the Internet continues to grow into new and developing markets, and more and more IP-based applications become incorporated into our lives, what is the best strategy to preserve the usefulness of the net?

To put it quite simply, there seem to be two basic schools of thought surrounding the aforementioned question. The perspective held by the first school of thought is that the robust growth of the Internet necessitates intervention and legislation in order to keep web activity safe, legal, and controllable. The second school of thought maintains that the growth and equity of the Internet depends on non-interventionist approaches to development and that openness is essential for the utility of the web to remain intact.

So who’s who in the debate on Internet governance? Daigle cited two pieces of legislation introduced in 2011 (SOPA and PIPA) as misguided efforts of the United States government to confront the problem of digital piracy and copyright violation. While these are not unworthy goals in their own right, Daigle said, surrendering the control and technology of the Internet to the government would almost certainly have deleterious effects on the innovation of the web. She pointed to the breadth of technologies and applications that may now be considered part of the Internet family and suggested that the very openness of the web is what makes it so powerful. “No matter where you connect to [the Internet], you’re connecting to the same Internet; whether you’re connecting from somewhere in Africa or right here in downtown New York City.”

Truly, the global reach of the Internet makes it unique in terms of technology and influence, and its ability to transcend national boundaries does present some challenges for staid conceptions of governance. “This challenge between policy and technology is the fact that the ease of interaction between people across the globe is challenging our whole understanding of what it means to be a part of a culture or a part of a nation,” said Daigle. “And that’s not the problem. The problem is we need to find ways to get our policy approaches to grow up to that fact. To grow up to the fact that what it means to be a citizen of this planet is evolving.”

Daigle’s admittedly grandiose language may indeed point to a fundamental issue regarding Internet governance: the Internet is different from almost any other utility or service. Panel moderator and Columbia Business School professor Eli Noam noted that one might conceive of the Internet as a municipal good such as water or electricity, controlled and regulated by a centralized authority. But it would be more appropriate, he suggested, to think of Internet governance in terms of the UN model, with entrepreneurs, tech companies, and industry innovators playing the role of delegates.

One of the basic tenets upon which most of the panelists seemed able to agree is that innovation depends on openness. This would not be an openness that is entirely without structure, of course, because the power of the Internet is in its capacity for collaboration and interoperability. Interoperability requires agreement on standards of operation and this brings us back to our opening quote from Daigle, “What is the minimum we need to agree on in order to be able to talk to each other?”

Be a part of the conversation! Leave a comment or write us a message and tell us what you think about the future of Internet governance and how open web standards encourage or detract from innovation in web development!

Posted in News

Happy Birthday RFC Series!

Posted on August 6th, 2015

The RFC series has been a sterling example of the power of open standards and information sharing. Here's how the web standards community celebrated the 46th anniversary of RFC 1 this year.
Image: Shutterstock, Ruth Black

The documentation series known as Request for Comments (RFC) turned another year older this past April as the 46th anniversary of the first RFC, known as RFC 1, was observed and celebrated by the web standards community.

RFC 1 was authored by Stephen D. Crocker in 1969 as an effort to record and organize unofficial notes regarding the development of the groundbreaking packet switching network, ARPANET. Crocker was an undergraduate student at UCLA at the time, and his RFC proved an effective way to help ensure information fidelity in technical areas while making the development process available to a wider audience.

Since its inception, the RFC series has grown into a collection of over 7,000 published documents, all freely accessible in public indexes. RFC 2555, published on the 30th anniversary of RFC 1, serves as a collaborative portrait of reflections on the RFC series, with several luminaries of the web standards community lending their personal anecdotes to the collection.

What began as a simple text file documenting the development of a network prototype has grown into a series featuring a broad array of document types. And although the official file type remains simple ASCII text, RFC documents are available in many different formats. Proven time and again to be a tremendous boon to the web development community, the RFC series is a sterling example of the power of open standards and information sharing.

Be sure to get notifications for all the latest posts involving web standards and open data sharing by subscribing to our blog!

Posted in News

Corollaries Between Web Standards and Human Rights Lead to Important Discussions

Posted on July 23rd, 2015

At a summit meeting of the Internet Engineering Task Force (IETF) earlier this year, the question of human rights and standard internet protocols was brought to the fore.  One of the proposals under consideration was the creation of a new subcommittee named the "Human Rights Protocol Considerations Research Group" (HRPC).

Image: IETF

What does the standardization of web protocols have to do with human rights? It’s a reasonable question; after all, web protocols are technological and sophisticated, while human rights, by nature, are fundamental and human. Still, when you consider the political right of freedom of expression and the expressive power of the Internet, the query starts to take on real meaning.

At a summit meeting of the Internet Engineering Task Force (IETF) earlier this year, the question of human rights and standard internet protocols was brought to the fore. One of the proposals under consideration was the creation of a new subcommittee named the “Human Rights Protocol Considerations Research Group” (HRPC). The group, which is still in the initial review stage for IETF, would focus on if and how the freedoms of speech and association should inform the development of internet protocols and standards. An abstract of the proposal, which is available on IETF’s website, describes the HRPC agenda:

Work has been done on privacy issues that should be considered when creating an Internet protocol. This draft suggests that similar considerations may apply for other human rights such as freedom of expression or freedom of association. A proposal is made for work in the IRTF researching the possible connections between human rights and Internet standards and protocols. The goal is to create an informational RFC concerning human rights protocol considerations.

Given that internet protocols and standards are the “gatekeeping” technologies of the web, and that the web is the world’s preeminent tool of mass communication, it is appropriate that there be a conversation regarding the confluence of open standards and freedom of expression. You can follow the progress of the HRPC proposal using IETF’s Datatracker tool. If you are interested in participating in the conversation on open standards, leave a comment or send us a message.

Posted in News

Open Standards Opportunities: Financial Transactions & Seamless Global Commerce

Posted on July 15th, 2015

Despite today's technological climate, the ability to make global transactions remains challenging. W3C's Web Payments Interest Group is exploring new ways to solve this problem.
Image: W3C

For anyone who has ever sent money abroad, set up an international bank account, or simply made a Web purchase from a foreign vendor, the challenge of translating cost into domestic currency is a familiar annoyance. Beyond the basic arithmetic of conversion, other challenges include additional fees, unavoidable delays, and other technicalities that can frustrate transactions as well as the user. In some cases, an individual may find that a desired transaction is not possible, due to the limitations of the payment systems involved.

The Web’s ability to introduce standardized avenues of communication into disparate systems is well established. Building on that strength, the W3C’s Web Payments Interest Group is working at the forefront of web payments to explore new ways to streamline global transactions. The group has developed a number of ideas for the future of global eCommerce, which range in complexity and promise a number of tangible benefits for users and the global economy.

One possible solution proposed to improve global transactions is to create a new, standardized front-end application layer that masks complicated financial transaction details. This approach leaves existing payment systems in place, adding a new web application that runs “on top.” The application layer simplifies the user experience and interfaces with back-end systems to seamlessly handle transaction and conversion complexities, resulting in an improved and more reliable user experience. Because this solution focuses on the application layer, rather than the complexities of the disparate back-end systems, it would be unable to produce marked improvements in network interoperability, transaction speed, security, or other variables (such as the user input required for each transaction).

A second, more ambitious potential solution for streamlining global transactions involves implementing broader changes in Web standards infrastructure for financial payments. Rather than focusing on a higher-level application that masks transaction details, this scenario centers on modifying the way funds move from one system to another. This proposed solution would create a new, standardized environment within which users do not have to worry about having a particular payment method in common with a vendor. Instead, the system would ensure complete transferability – connecting all payment methods available to a user – from debit to credit, Bitcoin to PayPal and other payment types. While the up-front cost of implementing a new standard like this would be greater, the benefits of this approach would include guaranteed network interoperability, improved speed and security, and lower cost per transaction.
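The first, application-layer approach can be pictured as a thin facade sitting above unchanged back-end payment systems. The following Python sketch is purely illustrative; the class names, provider names, and conversion rates are invented for this example and are not part of any W3C proposal:

```python
class PaymentBackend:
    """Stands in for an existing back-end payment system,
    which only understands its own currency."""
    def __init__(self, name, currency):
        self.name = name
        self.currency = currency

    def charge(self, amount):
        # Each back end settles in its local currency.
        return f"{self.name} charged {amount:.2f} {self.currency}"


class PaymentFacade:
    """The proposed front-end layer: masks currency conversion and
    back-end selection behind one uniform interface."""
    # Fixed rates for illustration; a real system would query a rate service.
    RATES_TO_USD = {"USD": 1.0, "EUR": 1.1, "JPY": 0.007}

    def __init__(self, backends):
        self.backends = {b.currency: b for b in backends}

    def pay(self, amount_usd, vendor_currency):
        # The user thinks in one currency; the facade routes the
        # payment and converts the amount behind the scenes.
        backend = self.backends[vendor_currency]
        local_amount = amount_usd / self.RATES_TO_USD[vendor_currency]
        return backend.charge(local_amount)


facade = PaymentFacade([PaymentBackend("EuroPay", "EUR"),
                        PaymentBackend("YenPay", "JPY")])
receipt = facade.pay(22.0, "EUR")  # "EuroPay charged 20.00 EUR"
```

Because the facade sits entirely above the existing systems, it can improve the user experience, but it cannot change the interoperability, speed, or security of the networks underneath it.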

Without question, Web-based financial transactions are fertile ground for improvement. Open standards for global financial transactions promise to improve the user experience, simplify and streamline transactions, improve security, lower costs, increase transaction speed and more. While some standards are already in place for interbank connectivity and communication, such as the electronic data standard ISO 20022, none are as ambitious as those proposed by the W3C’s Web Payments Interest Group.

The Value Web Task Force of W3C’s Web Payments Interest Group is actively studying the need for Web standards in internetwork transactions. The Task Force is gathering industry use cases and requirements and using that data to aid proposal development. It is currently seeking interested parties such as banks, clearinghouses, cryptocurrency companies, and related organizations that recognize the potential of a standardized field of Web commerce and wish to contribute to future development. Participation is open. W3C is also an affirming partner of the OpenStand Principles.

If you are interested in contributing to the work of the Web Payments Interest Group, or would like more information about the mission of the group and the Value Web Task Force, you can email the group with the subject line [value web]. Where would you like to see development in the sphere of web payments and commerce? Share your thoughts in the comments below!

Posted in News

Open Standards Development Adds Value to Web TV and Video Streaming

Posted on July 8th, 2015

TV and other video media have been adapting to web infrastructure for over a decade, but open standards are needed to ensure widespread adoption.

Shutterstock, Semisatch

TV and other video media have been adapting to web infrastructure for over a decade. Netflix is now 18 years old, and YouTube was founded ten years ago. According to a Nielsen report, the number of American households subscribing to an internet video streaming service has reached 40%, but there is still room for improvement. The video streaming experience has yet to live up to user expectations, let alone exceed them.

The challenges of the current user experience in internet video streaming include the lack of seamless integration between broadcast TV and the web, and the length of time a user must wait before a desired program can be viewed. Streaming users typically must wait well past a program’s air date, which is undesirable in today’s instant-access economy. A second challenge pertains to broadcasters integrating streaming capabilities into their services. For a number of reasons (stand-alone solutions, lack of integration with cable services, etc.), these disjointed TV/web experiences have not achieved broad adoption.

Beyond meeting user expectations, Daniel Davis of W3C believes that, “There are also new ways to enjoy content that the web has the potential to realize, such as multiple simultaneous camera views or customizable synchronization with other online and data services.”

In order to achieve widespread adoption of these services, open standards are needed in the field of web TV. The W3C, an OpenStand affirming partner, is actively at work in this area. Several W3C groups are establishing use cases and requirements, and are addressing standards gaps under the support and purview of the Web and TV Interest Group, chaired by Yosuke Funahashi of W3C, Giuseppe Pascale of Opera Software, and Mark Vickers of Comcast Cable. The group embraces the principles of collaboration, effective empowerment, and voluntary adoption, reflecting the values of the OpenStand Principles.

According to Davis, some of the needs his group has identified, to date, include:

  • Multi-screen content delivery
  • Stream synchronization
  • TV function and channel control
  • Mixed media sources and content overlays
  • Stream recognition and identification
  • Server-side content rendering (e.g. for low-powered STBs)
  • Improvements to existing features (e.g. adaptive streaming, timed text)

Groups and projects have been mobilized to address these gaps as follows:

  • GGIE (Glass-to-Glass Internet Ecosystem) Task Force: With a goal of “identifying essential elements in digital video’s life cycle and features that would be appropriate for recommendation for standardization in the appropriate SDOs [standards development organizations], not just W3C,” they are currently gathering use cases and facilitating discussion in the interest group.
  • TV Control API Community Group: This group is developing an API to “control TV-like content and features…eventually producing a new standard for media devices, set-top-boxes and of course televisions.”
  • Multi-device Timing Community Group: This newly developed group is focused on synchronization of media streams across the web, opening up some of the unique potential of web viewing vs. traditional one-stream viewing.
  • Media Resource In-band Tracks Community Group: This very standards-focused project is building a spec to define and allow web applications access to in-band informational elements like metadata and captions through the media element itself.
  • Second Screen Presentation Community Group & Working Group: Davis points out that this group has evolved from an idea to a standard. It began as a collaborative proposal brought to W3C to be drafted by stakeholders, which eventually set the foundation for a new Working Group. Today, Davis says, “it’s officially on the standards track and further stabilization should see it implemented and brought to a big screen near you.”

What opportunities for video content distribution and development on the web do you see? What current problems with web video could be solved with further open standards development? Share your thoughts in the comments below!

Posted in News

Open Standards Opportunities: Tokenization and Ecommerce Security

Posted on July 1st, 2015

Tokenization may be the answer to some of the pain points in ecommerce today, including improved payment security, but open standards are needed to ensure widespread adoption.
Shutterstock, Rawpixel

The Web Payments Group of the W3C, an affirming partner in the OpenStand Principles, has been making progress on payment integration as part of the Open Web Platform and is doing a series of interviews on web payments. W3C’s Ian Jacobs interviewed Drew Jacobs and Tom Poole of Capital One, and Siva Narendra, CEO of Tyfone, probing into the vast potential of ecommerce and the open web, and the implications of tokenization to improve transaction security. The full transcript of this interview can be found here.

Drew Jacobs highlighted a number of “gaps and pain points across the value chain, from consumer, to merchant, to financial institution,” which include:

  1. Convoluted purchase processes
  2. Lengthy checkout
  3. Masses of data being submitted, without the guarantee of security
  4. Online credit card transactions that leverage less-secure static data.

While he acknowledged efforts to solve current dilemmas, citing vendors like Amazon implementing one-click payments, he said the industry is seeing a trend toward tokenization.

Tokenization is a process flow similar to a checking system in banking, in which the user is provided a token or placeholder (in digital terms, a meaningless sequence of numbers) by their financial institution. When a purchase is made, the user supplies the token to the merchant, who then redeems it with the bank or card issuer. This process is appealing for the same reasons checking accounts are: both the user and the merchant carry reduced liability under a tokenization system.
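The token-and-redeem flow described above can be sketched in a few lines of Python. This is a toy model for illustration only: the class and method names are invented here, and real tokenization systems involve far more machinery (provisioning networks, cryptograms, expiry rules):

```python
import secrets


class Issuer:
    """Stands in for a bank that vaults real account numbers
    behind meaningless, single-use tokens."""
    def __init__(self):
        self._vault = {}  # token -> real account number

    def provision_token(self, account_number):
        # The token is a random string that reveals nothing
        # about the underlying account.
        token = secrets.token_hex(8)
        self._vault[token] = account_number
        return token

    def redeem(self, token):
        # Only the issuer can map a token back to an account;
        # popping the entry makes the token single-use.
        return self._vault.pop(token, None)


issuer = Issuer()
# The customer's bank provisions a token for their card...
token = issuer.provision_token("4111-1111-1111-1111")
# ...the merchant sees only the token, never the card number,
# and redeems it with the issuer to settle the payment.
account = issuer.redeem(token)   # the real account number
replay = issuer.redeem(token)    # None: a spent token is worthless
```

The liability reduction Jacobs describes falls out of this structure: a breach of a merchant's database leaks only spent tokens, not reusable card numbers.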

The problem, as Jacobs points out, is that tokenization is not a cohesive cross-channel solution. Without open standards, tokenization is simply one of many payment execution methods. Jacobs asserts, “Tokenization should not be a separate process from other forms of payments, we need a cohesive solution across channels.” He emphasized the need for collaborative development in order to ensure widespread voluntary adoption of a standard tokenization system.

Narendra agreed that much improvement is required in order for tokenization to be effective. One of the key areas of focus must be security. He states:

“…there is a fraud rate of about .9% for ecommerce while it is .09% for other forms of transactions. So the fraud rate for ecommerce is 10 times what it is for non-ecommerce. There are a number of reasons for this, including the fact that passwords are not very effective. Tokenization, as Drew mentioned, is an important path for the future. But securely authenticating the right user is being provisioned the right token is necessary, otherwise criminals can steal tokens, too.”

Introducing more payment options will not solve the security and privacy problems involved in data sharing and online payments. Jacobs and Narendra agree that security and authentication for both the user and the transaction must be the first priority in order for tokenization to become a tenable, cohesive market solution.

In response to this need, W3C is working on a Web Crypto API, which Narendra explains, “gives developers access to cryptographic operations from JavaScript.” He continues,

“I think there’s an assumption in the browser community today that the only token that browsers will support is FIDO Alliance-based. But I think we need greater interoperability. We do need to be looking at secure elements, but chips in phones are not the only way to achieve that. There is a large existing infrastructure for security and we need to extend those capabilities to the Web to achieve scale and success.”

Tom Poole of Capital One identified a few key targets for more specific security improvement opportunities:

“There are three different levels where payments could be improved. The first involves adding support for secure storage of information, such as via a browser plug-in. An open standard would enable multiple providers of such plugins (and of course, browsers might provide their own solutions). The next level up is the “white label container” like Softcard that could provide consistency for payment scheme providers, but still allow for innovation. The third layer would be to build on something like Apple Pay, but that would mean very little differentiation and a single vendor would drive the normalization of payments. But I don’t think many people want to invest in that sort of centralized solution. “

There are opportunities at all three layers mentioned by Poole, but according to Jacobs, the most important reason for forming the working group is “the unique opportunity [for W3C] to provide underlying infrastructure standards that leverage existing work around tokenization. That is the biggest pain point for us today: tokenization doesn’t exist easily online, and we need greater security online. We think browsers can play a role in bringing this together. We also see opportunities around improved authentication and identification of the real user.”

The interview sheds light on yet another reason why open standards are critical to solving problems that impact each one of us. By working together and embracing essential principles for Open Standards development, we can unlock hidden potential and benefit global society. If you believe in the necessity of open standards, please Sign Your Name to voice your public support of the OpenStand Principles.

Posted in News

ISOC’s Kolkman Calls for Collaborative Internet Security

Posted on June 24th, 2015

Never before has the need been more evident for open standards that support privacy and security. The question is where do the solutions to our current security problems lie?
Shutterstock, STILLFX

Never before has the need been more evident for open standards that support privacy and security. Olaf Kolkman, open standards advocate and CTO of the Internet Society (ISOC), an affirming partner of the OpenStand Principles, has been focusing on the need for collaborative security. Kolkman authored this recent article for ISOC in which he asserts that use of the Internet, or participation in this great innovation we call the world wide web, carries with it collective responsibility:

“When you connect to the Internet, you become a part of its ecosystem. Even more, across the Internet there is no clear line between consumers and suppliers; every participant is a contributor.”

Kolkman argues that this collaborative and egalitarian culture is part of what makes the Internet so powerful; its collaborative nature creates valuable opportunities. However, as more participants join and more power becomes concentrated in various functions, security and governance are becoming more apparent concerns for both end-users and service providers.

We recently highlighted ISOC’s powerful new stance on collaborative security here on the OpenStand Blog. ISOC’s stance, outlined in a recent white paper, highlights the importance of “fostering confidence and protecting opportunities” in the internet environment. Unfortunately, Kolkman argues, there is little economic incentive for individual providers to develop, deploy and maintain key security technologies. In fact, because providers will most likely bear the expense of assuming a more proactive position on security, there is likely to be a disincentive for them to act more decisively, even if raising the level of security in the system and reinforcing confidence in the Internet are the desired outcomes.

As a solution to this problem, some point to establishing legal mandates driving increased security measures for providers. However, as Kolkman points out:

“That approach would go against one of the fundamental and foundational principles of the Internet: as an organic system, a network of autonomous networks, not built from a global blueprint but developing in accordance with local needs and conditions, deployment depends on voluntary agreement and collaboration. Forcing security and scalability through global mandates may be slow, and may have unintended side effects.”

Accomplishing global deployment of secure, resilient, future-proof Internet technology is better done, as Kolkman calls it, “the Internet way”: at the initiative of individual actors, based on their own decisions and leadership, and through sharing know-how and experience, both voluntarily and professionally.

He points to the example of a recently launched initiative in which the Dutch Internet community collaborated to set up a website that communicates deployment and access status for key Internet technologies.

Kolkman argues that the solutions to many of our current security problems lie not so much in legislation as in leadership. Leaders and supporters of the collaborative Internet must continue to be vocal and visible advocates for open standards. They must collaborate more transparently and openly on their security innovations and solutions, in order to achieve a more secure future “the Internet way.”

Posted in News

Open Standards Opportunities: Electric Vehicle Charging Technology

Posted on June 17th, 2015

As the sales of electric vehicles in America increase, the collective need to build, operate and supply charging stations represents an opportunity for open standards.
The Energy Collective
While electric vehicles represent a small percentage of overall vehicle sales in America, their numbers have skyrocketed over the past few years. As sales increase, the collective need to build, operate and supply charging stations represents an opportunity for open standards. Edward Dodge posted recently about this on the Energy Collective Blog, following the work of Greenlots, a leading advocate for open standards in EV charging.

Greenlots’ platform allows for demand response communication to help utility providers manage power on the grid. To manage power efficiently, a communications layer must interface between charging stations and central utility management. This layer, commonly called Automated Demand Response (ADR), “is simply the use of computers and internet communications to enable demand response to be completely automated and seamless to the user,” and presents an opportunity for open standards to facilitate communications. Some work is already being done here: Greenlots is a member of the OpenADR Alliance, a 130-member consortium dedicated to open standards for the Smart Grid. According to the article:

OpenADR is a communications protocol that standardizes messaging for dynamic price and reliability signals used by utilities, Independent Systems Operators and other participants on the power grid. OpenADR hopes to bring network protocol standardization to power grids globally, much in the same way that TCP/IP enables all devices on the internet to speak the same language.
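To make the idea of a standardized demand-response signal concrete, here is a minimal sketch in Python. It is illustrative only: real OpenADR 2.0 payloads are XML (EiEvent) exchanged between a utility-side server (VTN) and client devices (VENs), and the field names below are hypothetical simplifications, not the actual schema.

```python
from datetime import datetime, timezone

def build_dr_event(event_id, start, duration_minutes, signal_level):
    """Build a simplified demand-response event.

    Hypothetical simplification of an OpenADR-style price/reliability
    signal: the utility tells participating devices when to curtail
    load and by how much.
    """
    return {
        "eventID": event_id,
        "dtstart": start.isoformat(),
        "duration": f"PT{duration_minutes}M",  # ISO 8601 duration
        "signalName": "SIMPLE",                # basic level-based signal
        "signalLevel": signal_level,           # e.g. 0 = normal, 1 = moderate curtailment
    }

start = datetime(2015, 6, 17, 14, 0, tzinfo=timezone.utc)
event = build_dr_event("evt-001", start, 120, 1)
print(event["duration"])  # PT120M
```

The point of standardizing such messages is that any charging station, thermostat, or other load can interpret the same signal regardless of vendor.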

OCPP [the Open Charge Point Protocol] is a communications layer between EV charging stations and central management systems. OCPP is an emerging specification that is neither formally recognized by international standards bodies nor adopted across the entire EV industry. There is great hope that OCPP will emerge as an international standard, ensuring that hardware can operate across vendors’ networks and preventing customers from being stuck with useless equipment should their vendor go out of business.
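For a sense of what vendor-neutral charging-station communication looks like, here is a small sketch of framing an OCPP 1.6-J request in Python. The JSON-array framing (`[MessageTypeId, UniqueId, Action, Payload]`) and the BootNotification fields reflect the OCPP-J specification as we understand it, but the vendor and model names are hypothetical, and this is a sketch rather than a conformant implementation (a real charge point would send this over WebSocket).

```python
import json
import uuid

CALL = 2  # OCPP-J message type id for a request

def boot_notification(vendor, model):
    """Frame an OCPP 1.6-J BootNotification request.

    OCPP-J frames every request as a JSON array:
    [MessageTypeId, UniqueId, Action, Payload].
    """
    msg = [
        CALL,
        str(uuid.uuid4()),        # unique id correlating the eventual reply
        "BootNotification",
        {"chargePointVendor": vendor, "chargePointModel": model},
    ]
    return json.dumps(msg)

frame = boot_notification("AcmeEV", "FastCharge-50")
decoded = json.loads(frame)
print(decoded[2])  # BootNotification
```

Because the framing and payloads are standardized, a central management system can accept this message from any vendor’s hardware, which is exactly the interoperability the article hopes OCPP will deliver.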

In an emerging market like electric vehicles, a commitment to open standards is commendable. In the pursuit of new solutions, we encourage consortia such as the OpenADR Alliance to embrace the OpenStand Principles of cooperation, adherence to principles, collective empowerment, availability, and voluntary adoption as they move forward to pursue national and international standardization.

It’s easy to become an OpenStand Advocate and publicly endorse the principles that have brought us decades of open innovation. Simply:

Sign Your Name to express your public individual or organizational support.

Get a Site Badge to display your support on your site or blog.

Submit a formal endorsement from your organization for our site.

Posted in News