XML and Web Services In The News - 24 January 2007
Provided by OASIS
Edited by Robin Cover
This issue of XML Daily Newslink is sponsored by Sun Microsystems
HEADLINES:
XForms in Firefox
Elliotte Rusty Harold, IBM developerWorks
This article demonstrates basic XForms processing as currently
supported by Firefox and the Mozilla XForms plug-in. Using the
experimental Mozilla XForms extension, you can process XForms in your
browser today. While not yet deployed widely enough for use on the
public Internet, XForms may be suitable for some intranet applications.
XForms is not only a more powerful means of designing and laying out
forms than classic HTML forms; it's an easier one too. Because content
is separated from presentation, CSS can be used to full effect.
Furthermore, you can put the form elements anywhere on the page you
like, intermixed with any markup. Finally, form tricks that require
lots of JavaScript code, such as updating one field when the user
enters data into another, are a trivial amount of declarative code in
XForms. Except for one little detail, developing with XForms would
be a no-brainer. That detail is that no current browsers actually
support XForms out of the box. Needless to say, this severely limits
what you can do with XForms and where you can deploy them. However,
there are workarounds. Browser plug-ins exist for both Windows Internet
Explorer and Firefox that add XForms support to these market-leading
browsers. XForms processors have also been written in Flash that can
be deployed to any browser with a Flash runtime. Finally, there are
server-side solutions that precompile all XForms markup to classic
Hypertext Markup Language (HTML) and JavaScript programs. Client-side
XForms processing won't be possible for public-facing sites until
XForms is more widely deployed in browsers. However, that doesn't mean
you can't deploy it on your intranet today. If you're already using
Firefox (and if you aren't, you should be), all that's required is a
simple plug-in. After that's installed, you can take full advantage
of XForms' power, speed, and flexibility.
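For example, the classic trick of echoing one field into another as
the user types needs no script at all. Here is a minimal sketch (the
element names and greeting are invented) of an XHTML page that the
Mozilla XForms extension can process:

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:xf="http://www.w3.org/2002/xforms">
      <head>
        <title>Hello XForms</title>
        <xf:model>
          <xf:instance>
            <data xmlns=""><name/></data>
          </xf:instance>
        </xf:model>
      </head>
      <body>
        <!-- The input binds to the instance; the output recomputes
             declaratively whenever the bound value changes. -->
        <xf:input ref="name"><xf:label>Name:</xf:label></xf:input>
        <xf:output value="concat('Hello, ', name)"/>
      </body>
    </html>

No JavaScript is involved: the dependency between the two controls is
tracked by the XForms processor itself.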
See also: XML and Forms
Microsoft Makes AJAX Technology Available
Paul Krill, InfoWorld
Microsoft has released its ASP.Net AJAX 1.0 technology, formerly
called Atlas. ASP.Net AJAX 1.0 enables Web developers to build AJAX-
style (Asynchronous JavaScript and XML) Web applications by integrating
with the .Net Framework and the Microsoft platform. ASP.Net AJAX is a
free framework for building interactive, personalized Web experiences.
It functions with the Internet Explorer, Safari, Firefox, and Opera
browsers. Microsoft's AJAX offering debuted as a preview version in
October 2005, and another preview, with a Go-Live license enabling
live deployments of the technology, was offered in March 2006.
ASP.Net AJAX 1.0 comprises the Microsoft AJAX Library, standard
JavaScript code that runs in the browser, and the ASP.Net 2.0 AJAX
Extensions, server-centric pieces that enable a drag-and-drop
developer experience. In a
related development, Microsoft on Tuesday is updating its ASP.Net
AJAX Control Toolkit, which runs with ASP.Net AJAX 1.0 and features
controls for advanced effects such as animation and auto-complete
behavior. ASP.Net AJAX 1.0 includes all features of previous versions
plus enhanced training capabilities. Developers using ASP.Net AJAX 1.0
can build interfaces with reusable AJAX components and enhance existing
Web pages with AJAX controls. They also can access remote services and
data from a browser without writing a lot of complicated script,
Microsoft said. Additionally, developers can use the AJAX software with
Visual Studio, but Visual Studio is not a requirement.
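As a rough illustration of the client-side piece, here is a minimal
sketch using the Microsoft AJAX Library's Sys.Net.WebRequest class.
The /Service/GetTime endpoint is invented for illustration, and the
page is assumed to already reference the library's scripts:

    // Hedged sketch, not official Microsoft sample code.
    var request = new Sys.Net.WebRequest();
    request.set_url('/Service/GetTime');   // hypothetical endpoint
    request.set_httpVerb('GET');
    request.add_completed(function (executor) {
        if (executor.get_responseAvailable()) {
            alert(executor.get_responseData());  // raw response text
        }
    });
    request.invoke();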
CTO's Message: OGC Standards and the Geospatial Web
Carl Reed, OGC Newsletter
Location content is being created and utilized at many levels in the
internet/web infrastructure. Much of this content is not being created
by the GIS community! Consider DSRC (Dedicated Short Range
Communications): a short- to medium-range (1,000 meters)
communications service that supports both public safety and private
operations in roadside-to-vehicle and vehicle-to-vehicle
communication environments. DSRC is really about developing and
deploying an extensive roadside sensor and communication network. This
network will generate billions of location messages on a daily basis --
and it is being built entirely outside the traditional geospatial domain.
And this application area will be an integral component of the
Geospatial Web. The Geospatial Web has been evolving since the
mid-1990s, when the first mapping applications, such as MapQuest and Xerox
ParcMap, were deployed. During the last few years, various applications,
such as emergency services, spatial data infrastructures, and consumer
mapping have accelerated the growth and evolution of the Geospatial
Web. During this same time period, an increasing number of
applications have implemented and use a variety of geospatial standards.
Some of these standards are OGC standards but others are from the Open
Mobile Alliance (OMA), from ISO, from the IEEE, or from the Internet
Engineering Task Force (IETF). The reason for this is that the
Geospatial Web consists of many layers. A somewhat simplistic view is
that the models defining the layers of an IT infrastructure typically
run five to seven layers deep; OGC standards play an important role
at every level.
This does not mean that the OGC is developing standards for all levels
and all application areas. Instead, many other standards organizations,
such as the IETF, IEEE, OMA, and OASIS are building on the work of the
OGC to define profiles and application schemas for their specific
requirements. Many of these profiles and schemas, such as those used
in GeoRSS and IEEE 1451, are simple and lightweight. Each meets a
specific requirement. The Geospatial Web is not just a bunch of
mash-ups or even the hundreds of SDIs that have been successfully
deployed. The Geospatial Web is about the complete integration and use
of location at all levels of the internet and the web. This integration
will often be invisible to the user. But at the end of the day, the
ubiquitous permeation of location into the infrastructure of the
internet and the web is being built on standards.
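GeoRSS illustrates how lightweight these profiles can be. Here is a
minimal sketch of a GeoRSS-Simple point carried in an Atom entry; the
title and coordinates are invented:

    <entry xmlns="http://www.w3.org/2005/Atom"
           xmlns:georss="http://www.georss.org/georss">
      <title>Roadside sensor reading</title>
      <!-- GeoRSS-Simple: latitude then longitude, space-separated -->
      <georss:point>45.256 -71.92</georss:point>
    </entry>

A complete entry would also carry Atom's required id and updated
elements; the point is that adding location takes a single element.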
See also: GML references
Microsoft's Enterprise Services Bus (ESB) Strategy
Mohammad, Blog
This is the fourth post in a series of four about Microsoft's strategy
for Enterprise Service Bus. This post discusses Microsoft offerings in
the area of ESB. Microsoft does not believe in a one-size-fits-all
approach to ESB and has so far resisted the pressure to rename
BizTalk, or a variation of it, as an ESB. Microsoft's approach to ESB is to
understand the customer's core reasons for seeking an ESB and then
propose a solution that suits the need of that enterprise. I strongly
believe that this is a better approach than selling ESB-in-a-box;
however, it has made some analysts unhappy who believe that Microsoft
should have a product marketed under the ESB label. Based on my
experience of what is being sold in the marketplace under the ESB label,
BizTalk could definitely qualify as one, but I am glad that we have
refrained from renaming it so far. Microsoft has contributed to the idea
of ESB as an architectural pattern and has made available some of the
technology-agnostic thinking and best practices in the integration
patterns book and associated articles... In summary, do not buy an
ESB-in-a-box (yes, even BizTalk) unless you understand the benefits
it would provide to your IT and business; likewise, do not build
something that you can buy. I know that sounds contradictory. What I
am trying to communicate is that rather than starting from whatever
is being sold as an ESB, you should start with your organizational
needs, construct a requirements matrix, and then apply it to the
available choices to reach a conclusion that suits you.
A Meaningful Web for Humans and Machines: Explore the Parallel Web
Lee Feigenbaum and Elias Torres, IBM developerWorks
In this article, we will cover our notion of the parallel Web. This
term refers to the techniques that help content publishers represent
data on the Web with two or more addresses. For example, one address
might hold a human-consumable format and another a machine-consumable
format. Additionally, we include within the notion of the parallel
Web those cases where alternate representations of the same data or
resource are made available at the same location, but are selected
through the HTTP protocol. HTTP and HTML are two core technologies
that enable the World Wide Web, and the specifications of each contain
a popular technique that enables the discovery and negotiation of
alternate content representations. Content negotiation is available
through the HTTP protocol, the mechanism that allows user agents and
proxies/gateways on the Internet to exchange hypermedia. This
technique maps most naturally to the scenario where alternate
representations are found at the same Web address. In HTML pages,
the link element indicates a separate location containing an
alternate representation of the page. Historically, it is important
to note that in its original form, the content negotiation mechanism
left it completely up to the server to choose the best representation
from all of the available combinations, guided by the preferences
sent by the user agent. In the next (and current) version of content
negotiation, which arrived with HTTP/1.1, the specification introduced
choices with respect to who makes the final decision on the alternate
representation's format, language, and encoding. The specification
mentions server-driven, agent-driven, and transparent negotiation.
Server-driven negotiation is very similar to the original content
negotiation specification, with some improvements. Agent-driven
negotiation is new and allows the user agent (possibly with help from
the user) to choose the best representation out of a list supplied by
the server. This option suffers from underspecification and the need
to issue multiple HTTP requests to the server to obtain a resource;
as such, it really hasn't taken off in current deployments. Lastly,
transparent negotiation is a hybrid of the first two types, but is
completely undefined and hence is also not used on the Web today.
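To make the styles concrete, here is a sketch of server-driven
negotiation over HTTP (the host and path are placeholders): the user
agent states its preferences with quality values, and the server
picks the representation and declares the choice.

    GET /report HTTP/1.1
    Host: example.org
    Accept: application/rdf+xml;q=1.0, text/html;q=0.5

    HTTP/1.1 200 OK
    Content-Type: application/rdf+xml
    Vary: Accept

The parallel-address style instead advertises a machine-consumable
twin from within the HTML page via the link element (the href is
invented):

    <link rel="alternate" type="application/rdf+xml"
          href="http://example.org/report.rdf" />

The Vary header tells caches that the response depends on the Accept
header, so the alternate representations are not conflated.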
See also: the RFC
Leveraging Site Infrastructure for Multi-Site Grids
Von Welch (ed), Open Grid Forum Informational Recommendation
This document summarizes the Community Activity 'Leveraging Site
Infrastructure for Multi-Site Grids' held at the GridWorld/GGF15 on
October 3, 2006 in Boston, MA, USA. Extract: "(1) Ken Klingenstein
described the Shibboleth cross-site identity federation system and the
SAML standard that it utilizes. (2) Arnie Miles' presentation included
a discussion of Condor for high throughput computing and raised the
notion of both portals and command-line clients for users. (3) Jim
Basney described MyProxy as a means of federating between different
security domains. Marty Humphrey described work to add support for
Pubcookie, a web single sign-on package, to MyProxy. (4) Von Welch
described the Globus Toolkit and the work by the GridShib project to
allow for interoperability between Shibboleth and the Globus Toolkit.
(5) David Chadwick described PERMIS, an X.509-based policy decision
engine with dynamic delegation capabilities. (6) Abhishek Rana's
talk described a number of tools in use in the OSG RBAC architecture,
including GUMS, PRIMA, gPLAZMA, VOMS, VOMRS, authorization callouts
in the pre-web services version of the Globus Toolkit, authorization
callouts in SRM-dCache, and SAZ. (7) Tom Barton presented Signet and
Grouper, tools for managing and creating policies expressing groups
of users and their privileges. (8) Dane Skow described KCA/KX509 as
the basis for Kerberos-to-X.509 bridging at Fermilab."
See also: OGF Document Series
What's New in the Prototype 1.5 JavaScript Library?
Scott Raymond, XML.com
The latest release of Ruby on Rails, version 1.2, was announced last
week to great fanfare. But the announcement might have overshadowed
news of a simultaneous release: version 1.5 of Prototype, the popular
JavaScript library. Despite the synchronization and developer overlap
between the two projects, nothing about Prototype depends on Rails —
it's perfectly suitable for use with any server-side technology. In
fact, Prototype has amassed a huge user base beyond the Rails community
— from dozens of Web 2.0 startups to household names like Apple, NBC,
and Gucci. The Prototype library is fairly compact (about 15K), and
decidedly not a kitchen-sink library. It doesn't provide custom widgets
or elaborate visual effects. Instead, it just strives to make
JavaScript more pleasant to work with. In many ways, Prototype acts
like the missing standard library for JavaScript — it provides the
functionality that arguably ought to be part of the core language.
Prototype is perhaps best known for its top-notch Ajax support. Of
course, Ajax-style interactions can be created without a JavaScript
library, but the process can be fairly verbose and error-prone.
Prototype makes Ajax development more accessible by accounting for the
varieties of browser implementations and providing a clear, natural
API. The 1.5 release adds even more power, especially as relates to
creating RESTful, HTTP-embracing requests. Prototype now has the
ability to easily access HTTP request and response headers and
simulate HTTP methods other than GET and POST by tunneling those
requests over POST.
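Here is a minimal sketch of those 1.5 features; the URL, parameter,
and header names are invented for illustration:

    // Assumes prototype.js 1.5 is loaded on the page.
    new Ajax.Request('/widgets/42', {
      method: 'put',                 // tunneled over POST via _method
      parameters: 'name=Flux+Capacitor',
      requestHeaders: {'X-Demo': 'true'},
      onSuccess: function(transport) {
        // transport is the underlying XMLHttpRequest, so response
        // headers are available in the usual way:
        alert(transport.getResponseHeader('Content-Type'));
      }
    });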
OpenOffice, Office 2007 Ready to Rumble on Rival Document Formats
Elizabeth Montalbano, InfoWorld
Rivals Microsoft and OpenOffice.org both released toolkits that support
building applications for their competing document file formats and
productivity suites. OpenOffice's toolkit allows developers to add the
ability to save documents in Open Document Format for Office
Applications (ODF) to a variety of applications. Meanwhile,
Microsoft's kits help companies build applications for its Office 2007
productivity suite, which is based on Open XML, ODF's rival file format.
Office 2007 is available to business customers and will be in wide
consumer release on January 30, 2007. The OpenOffice ODF Toolkit Project
has published an initial version of its toolkit online and is inviting
members of the community to add to its development, said Louis
Suarez-Potts, community manager for OpenOffice. Previously, developers
would have to add "a good piece of OpenOffice" code to an application
to give it the ability to save documents in ODF, Suarez-Potts said.
The creation of the ODF Toolkit makes this easier, he said. Microsoft's
toolkits for Windows SharePoint Services 3.0, Microsoft Office
SharePoint Server 2007, and the Microsoft Office Project 2007 provide
technical guidance and sample code so developers can build what
Microsoft is calling Office Business Applications. The company hopes
these applications will allow employees to access information from
back-end systems through the new Office UI, which it has named Microsoft
Office Fluent. On Tuesday, Microsoft announced the name of the Office
2007 UI for the first time and said it will license Fluent royalty-free
so developers can build new applications that look like those in the
suite. In addition to the toolkits, Microsoft also announced that it
will have a new portal on Microsoft Developer Network (MSDN) to focus
on development around Groove, a P-to-P (peer-to-peer) application it
acquired when it bought Groove Networks. P-to-P has become a strategic
part of Microsoft's collaboration software strategy, and the company
has even made Ray Ozzie, Groove's founder and the brains behind it,
its chief software architect and heir apparent to Microsoft founder and Chairman Bill
Gates. The ISO recently approved ODF as an international standard for
document file formats. It is supported by companies such as IBM and Sun,
which markets its own version of OpenOffice called StarOffice.
Microsoft's Open XML, on the other hand, recently won approval by Ecma
International as a standard, but the ISO has not approved it yet.
Key XML Standards Pass W3C Muster
Clint Boulton, InternetNews.com
The World Wide Web Consortium (W3C) has confirmed the fitness of
several XML standards designed to query, transform and access XML data
and documents. XQuery 1.0, XSL Transformations (XSLT) 2.0, and XML Path
Language (XPath) 2.0 passed muster as cornerstones for developing the
Web with XML. Connections between applications, databases, operating
systems, Web services and Web servers have traditionally used middleware
to convert data between the formats used by various applications. XSLT
2.0 and XQuery 1.0 will make those conversions, enabling users to
focus on business logic. XQuery 1.0 allows data to be mined from
anything, including memos, Web services messages and multi-terabyte
relational databases. The standard is expected to serve as a unifying
interface for access to XML data, much as SQL has done for relational
data, said Don Chamberlin of IBM's Almaden Research Center, who is
co-inventor of the original SQL query language and one of the
co-editors of XQuery 1.0, in a statement. Though the W3C is only now
stamping its official seal of approval on XQuery 1.0, database makers
IBM, Oracle, and Microsoft already support the standard in their
database software. Meanwhile, XSLT 2.0, which triggers the
transformation and presentation of XML documents, adds new layers of
functionality not found in its XSLT 1.0 predecessor, which is already
used on Web servers and in browsers in some businesses. XSLT 2.0
includes more facilities for grouping and aggregating data and
provides more powerful text processing. XSLT 2.0 can also optionally
use XML Schema, enabling improved detection of compile-time and
run-time errors. Michael Kay, editor of the XSLT 2.0 spec, said in a
statement that
XSLT 2.0 is a huge step forward in functionality and developer
productivity, while also retaining a very high level of backwards
compatibility.
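To give a flavor of the two languages, here is a minimal XQuery 1.0
sketch (books.xml and its element names are invented):

    (: find inexpensive books, sorted by title :)
    for $b in doc("books.xml")//book
    where $b/price < 30
    order by $b/title
    return <cheap>{ $b/title/text() }</cheap>

And a fragment, to sit inside a template, showing XSLT 2.0's new
grouping facility (again with invented element names):

    <!-- group orders by customer and total each group's amounts -->
    <xsl:for-each-group select="order" group-by="@customer">
      <customer id="{current-grouping-key()}">
        <total><xsl:value-of select="sum(current-group()/@amount)"/></total>
      </customer>
    </xsl:for-each-group>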
The Open Source Initiative Still Lives
Steven J. Vaughan-Nichols, Linux-Watch
There was a time when the OSI (Open Source Initiative) was one of the
hotbeds of open source activity. After the retirement of its co-founder
and leader, Eric S. Raymond, in January 2005, the OSI lost much of
its fire. That may be changing soon, though. An investigation by
Linux-Watch has found that there is still heat in what appeared to be
the organization's quiet ashes. What's the OSI? Basically, the
non-profit OSI, founded in 1998 by Raymond and others, was formed for
the purpose of "managing and promoting the Open Source Definition for
the good of the community, specifically through the OSI Certified
Open Source Software certification mark and program." In short, the
OSI was traditionally the group that decided whether a license is
true-blue "Open Source" or not. After Raymond left, the organization
was briefly in the hands of Russ Nelson, founder of Crynwr Software;
then Michael Tiemann, Red Hat Inc.'s VP of open-source affairs,
became the OSI's president. Under Tiemann's leadership, the OSI
decided to cut back on what Danese Cooper, Intel's senior director of
open-source strategy and secretary/treasurer for the OSI, called the
ungoverned growth of "vanity licenses." Companies created these
licenses because each wanted one of its own. The overall effect was
that the licenses were seldom reused and there was "not much true
community around those vanity projects," according to Cooper.
In April 2005, the OSI followed up on this by adopting a new way
of approving open-source licenses and a classification system for
existing licenses. With these new rules, "Approved licenses must meet
three new criteria of being a) non-duplicative, b) clear and
understandable, and c) reusable."
See also: the OSI web site
XML.org is an OASIS Information Channel
sponsored by BEA Systems, Inc., IBM Corporation, Innodata Isogen, SAP AG and Sun
Microsystems, Inc.
Use http://www.oasis-open.org/mlmanage to unsubscribe or change an
email address. See http://xml.org/xml/news_market.shtml for the list
archives.