Enterprise IT Consultant Views on Technologies and Trends

December 5, 2011  1:46 AM

Integration using Direct Data Access

Sasirekha R

Data Integration is where the applications integrate at the logical data layer by allowing the data in one application (the source) to be accessed by other applications (targets).

In Data Integration, Direct Data Access involves the target application directly accessing the source database using SQL. Direct Data Access should be used very sparingly: it provides no isolation, so even a minor change in the source data structure forces modification of every target application program that has such direct access.

Direct Data Access may have to be used only in cases where the alternatives are impractical. It is an option worth considering when the source application:

  • Has tightly coupled business and presentation logic, making the business logic inaccessible externally; or
  • Has business logic implemented in a way that does not support remote access (likely to be true for legacy applications built on technologies that are decades old); or
  • Provides no API or web services to access its data.
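As a sketch of what direct data access looks like (and why it is fragile), the following uses Python's standard `sqlite3` module as a stand-in for the source application's database; the table and column names are hypothetical:

```python
import sqlite3

# Stand-in for the source application's database; the 'customer'
# table and its columns are hypothetical.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE customer (cust_id INTEGER, name TEXT, city TEXT)")
source.execute("INSERT INTO customer VALUES (1, 'Asha', 'Chennai')")

def get_customer_city(conn, cust_id):
    # The target application reaches directly into the source schema.
    # Any change to the 'customer' table (a renamed column, a split
    # table) breaks every query like this one.
    row = conn.execute(
        "SELECT city FROM customer WHERE cust_id = ?", (cust_id,)
    ).fetchone()
    return row[0] if row else None

print(get_customer_city(source, 1))  # Chennai
```

A rename of the `city` column in the source would break this query in every target program that embeds it, which is exactly the isolation problem noted above.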


December 2, 2011  3:12 AM

Integration – Standard Best Practices

Sasirekha R

There are multiple options available for integrating systems not designed to work together. The major classifications are Data, Functional, and Service Integration, with further choices between Push vs. Pull, Inbound vs. Outbound, Synchronous vs. Asynchronous, Online vs. Batch, EAI vs. ETL, and the use of messages or files.

Each form of integration has its own advantages and disadvantages – and there is no silver bullet. The challenge is in choosing the right kind of integration based on the specific context thereby achieving the right balance among the trade-offs. In any enterprise, combinations of Integration approaches have to be used depending upon the context (“one-size-doesn’t-fit-all!”).

While choosing the integration approach, there is a set of standard best practices to be considered.

December 2, 2011  3:08 AM

Message Oriented Middleware, the key behind ESB

Sasirekha R

Message oriented middleware (MOM), the key behind ESB (Enterprise Service Bus), offers flexibility in application development. MOM permits time-independent responses because it operates in an asynchronous mode.

MOM expands inter-component communication by allowing a client to send a request for service and then continue processing without having to wait for a response. Messaging does not impose the requirement of waiting for a response, as with an RPC. If no response is required, one need not be sent.

MOM products typically provide a feature called message queuing. Message queues allow clients and servers to run at different times; all communication happens by putting messages into, or taking messages out of, queues.
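The put/get pattern above can be sketched with Python's standard-library `queue` and `threading` modules as an in-process stand-in for a real MOM product; the message fields are illustrative:

```python
import queue
import threading

# A minimal in-process sketch of message queuing; a real MOM product
# would provide persistent, networked queues. Message fields are
# illustrative.
requests = queue.Queue()
replies = queue.Queue()

def client():
    requests.put({"op": "quote", "item": "widget"})  # send and continue
    # ... the client can keep processing here; it is not blocked
    # waiting for the server as it would be with an RPC ...
    return replies.get()  # collect the reply whenever it arrives

def server():
    msg = requests.get()  # the server may run long after this was sent
    replies.put({"item": msg["item"], "price": 42})

t = threading.Thread(target=server)
t.start()
reply = client()
t.join()
```

Because the queues decouple the two sides, the server thread could equally be started minutes later; the request simply waits on the queue.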

December 2, 2011  3:04 AM

Essentials of SOA, Web Services and ESB in the Integration context – Part III

Sasirekha R

Traditional Enterprise application integration (EAI) provided a hub-and-spoke architecture as a better solution compared to direct point-to-point connections. ESB moved further to provide the concept of bus – where the nodes have more intelligence – and offers better flexibility and scalability.

SOA offers great promise but also has great pitfalls. Making SOA work, avoiding the pitfalls and getting ROI, is hard. Usage of commercial tools simplifies SOA implementation enabling the focus to remain on business requirements – and not on implementation platforms and protocols.

Originally, an ESB product had a core asynchronous messaging backbone supplemented with intelligent transformation and routing to ensure messages are passed reliably. In other words, ESB was seen as a shared messaging layer for connecting applications and other services throughout an enterprise.

Today ESB is seen as a collection of architectural patterns based on traditional enterprise application integration (EAI), message-oriented middleware, Web services, .NET and Java interoperability, host system integration, and interoperability with service registries and asset repositories. Commercial tools offer support for various types of integration – EAI, BPM, SOA, message-driven, event-driven, and B2B – as well as adapters for communication with different protocols and databases and with standard products like ERP and CRM systems.

Reasons for using ESB as the integration backbone include the following capabilities:

  • Provide location transparency and enable service substitution;
  • Support integration in heterogeneous environments;
  • Support SOA principles, separating application code from specific service protocols and implementations;
  • Act as a single point of control over service addressing and naming.

In an ESB, there is no direct connection between the consumer and provider. With an ESB, the infrastructure shields the consumer from the details of how to connect to the provider. While the service endpoints can have their own integration techniques, protocols, security models etc., ESB provides a simplified view to the service consumers. Thus an ESB allows the reach of an SOA to extend to non-SOA-enabled service providers. ESB also supports a variety of ways to get on and off the bus.

ESB supports Integration at various levels including:

  • Database(s)
  • Application adapters
  • Connectivity to EAI middleware
  • Service mapping
  • Protocol transformation
  • Data enrichment
  • Application server environments (J2EE and .NET)
  • Language interfaces for service invocation (Java, C/C++/C#)
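A minimal sketch of the mediation idea – routing plus message transformation between a consumer and a provider – might look like the following. The `Bus` class, the `GetCustomer` service name, and the pipe-delimited legacy provider are all illustrative, not any particular product's API:

```python
# A sketch of ESB-style mediation: the consumer addresses a logical
# service name; the bus handles routing and message transformation,
# so the provider can be substituted without the consumer changing.

def legacy_customer_provider(record):
    # Hypothetical legacy endpoint that expects and returns
    # pipe-delimited strings rather than structured messages.
    cust_id, _ = record.split("|")
    return f"{cust_id}|Asha|Chennai"

class Bus:
    def __init__(self):
        # logical service name -> (transform_in, endpoint, transform_out)
        self.routes = {}

    def register(self, name, endpoint, transform_in, transform_out):
        self.routes[name] = (transform_in, endpoint, transform_out)

    def invoke(self, name, message):
        transform_in, endpoint, transform_out = self.routes[name]
        # The bus transforms the canonical message to the provider's
        # format, routes the call, and transforms the response back.
        return transform_out(endpoint(transform_in(message)))

bus = Bus()
bus.register(
    "GetCustomer",
    legacy_customer_provider,
    transform_in=lambda msg: f"{msg['cust_id']}|",
    transform_out=lambda raw: dict(
        zip(["cust_id", "name", "city"], raw.split("|"))
    ),
)

# The consumer sees only the logical name and a dict-based message.
print(bus.invoke("GetCustomer", {"cust_id": "1"}))
```

Swapping the legacy provider for, say, a web-service-backed one would only require a new `register` call; the consumer's code is untouched, which is the location transparency and service substitution described above.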


December 2, 2011  3:02 AM

Essentials of SOA, Web Services and ESB in the Integration context – Part II

Sasirekha R

According to the W3C, “Web services architecture is an interoperability architecture that provides a standard means of interoperating between different software applications, running on a variety of platforms and/or frameworks”.

The core technologies used for Web services are:

  • XML: a generic language that can describe any kind of content in a structured way, separated from its presentation to a specific device.
  • SOAP: a platform-neutral protocol that allows a client to call a remote service.
  • WSDL: XML-based interface and implementation description language. Using a WSDL document, the service provider specifies the operations that a Web service provides and the parameters and data types of these operations.
  • UDDI: Universal Description, Discovery, and Integration (UDDI) is both a client-side API and a SOAP-based server implementation used to store and retrieve information about service providers and Web services.
  • WSIL: Web Services Inspection Language (WSIL) is an XML-based specification that locates Web services without using UDDI.
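To make the SOAP piece concrete, the following builds a minimal SOAP 1.1 request envelope with Python's standard `xml.etree.ElementTree`; the `GetQuote` operation and the service namespace are invented for the example:

```python
import xml.etree.ElementTree as ET

# Standard SOAP 1.1 envelope namespace; the service namespace and
# the GetQuote operation below are hypothetical.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/stockquote"

def build_request(symbol):
    # Envelope > Body > operation element, as the provider's WSDL
    # would describe for a document-style service.
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(op, f"{{{SVC_NS}}}symbol").text = symbol
    return ET.tostring(env, encoding="unicode")

xml_text = build_request("IBM")
print(xml_text)
```

In practice this XML would be POSTed over HTTP to the endpoint address published in the service's WSDL, and the response would be another SOAP envelope.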

Web services have the following key properties:

  • Web services are self-contained: on the client side no additional software is required, and on the server side an HTTP server and a SOAP server suffice.
  • Web services are self-describing: the format definition travels with the service, so no metadata repository is required.
  • Web services can be published, located, and invoked across the Web.
  • Web services are language-independent and interoperable.
  • Web services are loosely coupled.
  • Web services are dynamic: with UDDI and WSDL, discovery is automated and Web services can be deployed without disturbing their clients.
  • Web services provide programmatic access: service consumers need to know only the interfaces to Web services; no knowledge of the implementation is required.

Web services alone cannot handle the complex requirements of SOA within an enterprise. That is where the ESB – the Enterprise Service Bus, seen as the universal integration backbone – comes in.

December 2, 2011  3:00 AM

Essentials of SOA, Web Services and ESB in the Integration context – Part I

Sasirekha R

Service-oriented architecture (SOA) is an approach to defining flexible integration architectures based on the concept of a service. SOA brings the benefits of loose coupling and encapsulation to integration at an enterprise level. SOA aims to enable an organization to implement changing business processes quickly and to make extensive reuse of components.

Services are the building blocks of SOA. Services can be invoked independently by service consumers to perform simple functions, or can be composed into a collection of functions that form a process. The key aspects of services are:

  • Encapsulate reusable business functions (e.g., Get Customer details, Update customer payments etc.); The importance of reusability cannot be stressed enough in an integration context. Every effort should be made to create functions that are reusable across different consumers (not necessarily service oriented) and avoid building specific point-to-point integrations.
  • Are defined by explicit, implementation-independent interfaces. Implementation-independent interfaces allow systems to change their implementation (say, the underlying database or even the platform on which the system runs) without affecting other systems. A service contract – an explicit interface definition – binds the service producer and the service consumer(s).
  • Are invoked through communication protocols that stress location transparency and interoperability. While the service is defined once through a service interface, there could be multiple implementations with different access protocols for the same service. Multiple implementation protocols allow reuse of service from multiple channels as well as heterogeneous systems (running in different platforms).
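The idea of an explicit, implementation-independent contract can be sketched in Python with an abstract base class; the service, class, and method names here are illustrative:

```python
from abc import ABC, abstractmethod

# A sketch of a service contract: consumers depend only on the
# interface, so the implementation (database, platform) can change
# underneath without affecting them.

class CustomerService(ABC):
    """The service contract binding producer and consumer(s)."""

    @abstractmethod
    def get_customer(self, cust_id: int) -> dict: ...

class InMemoryCustomerService(CustomerService):
    # One implementation; a database-backed or remote one could be
    # swapped in without any change to consumers.
    def __init__(self):
        self._data = {1: {"cust_id": 1, "name": "Asha"}}

    def get_customer(self, cust_id):
        return self._data[cust_id]

def consumer(service: CustomerService, cust_id: int) -> str:
    # The consumer codes against the contract, not the implementation.
    return service.get_customer(cust_id)["name"]

print(consumer(InMemoryCustomerService(), 1))  # Asha
```

Replacing `InMemoryCustomerService` with another subclass of the contract leaves `consumer` untouched, which is the encapsulation benefit described in the bullets above.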

While SOA is quite useful within the enterprise, the real need for SOA arises in integration with the external world – B2B, B2C, etc. This is where “Web Services” fit in: Web services are one of the key methods of enabling SOA.

June 30, 2011  2:42 AM

HTML5 holds promise though the formal release is expected to be in 2014

Sasirekha R

HTML5 is worth considering today even if the formal release has three more years to go

Apple’s Steve Jobs, in “Thoughts on Flash”, stated that “Flash is no longer necessary to watch video or consume any kind of web content”; this seems to have revived interest in HTML5 and the Flash vs. HTML5 debate. It is true that HTML5 lets web developers create advanced graphics, typography, animations and transitions without relying on third-party browser plug-ins (like Flash).

HTML5, though still not formally released, is supported by most browsers (of course, not all features are handled consistently) and has attractive features that make it worth considering right now. HTML5 supports video streaming, multi-threading, direct communication using Web sockets, asynchronous processing, etc. It is not just about Apple – Microsoft, Google, Mozilla and others are committed to HTML5. And it is not just Flash: HTML5 is capable of replacing technologies like Silverlight, Flex/AIR and JavaFX. Though there are debates about how each of them fares individually, the fact that HTML5 is a standard supported by almost all browsers does make it attractive.

The W3C recognized that HTML, though used to describe a large variety of documents, was primarily designed for semantically describing scientific documents and is not adequate for “Web Applications”. The result is HTML5, which:

  • Defines a single language (HTML5) which can be written in HTML syntax and in XML syntax.
  • Defines detailed processing models to foster interoperable implementations.
  • Improves markup for documents.
  • Introduces markup and APIs for emerging idioms, such as Web applications.

June 27, 2011  12:24 AM

Google Health to be discontinued – enables transfer to Microsoft HealthVault

Sasirekha R

Google discontinuing Health Service and enables transfer to Microsoft HealthVault

Google has announced that it is discontinuing Google Health – a service, launched in 2008, that allowed users to store, manage and share health information at no charge. Clearly these services come and go at the convenience of the service provider, and customers accept this as part of the freebie mindset. The product will continue to operate as usual until January 1, 2012, after which users will not have access to its current features and will not be able to enter, edit, or view data stored in their Google Health profiles.

In reply to “Why is Google Health being discontinued?”, Google says: “adoption of Google Health has been limited to specific groups of users like tech-savvy patients and their caregivers, and more recently fitness and wellness enthusiasts. We haven’t found a way to translate this to widespread adoption in the daily health routines of millions of people. Hence the difficult decision to discontinue”.

Google points out that the announcement has been made well in advance, and users have options to download their data or transfer their profile to another service. Users can download the data stored in Google Health in a number of formats – profile records as ZIP, PDF, CCR, or CSV, and notices as XML or HTML (http://www.google.com/support/health/bin/answer.py?hl=en&answer=1241448). Download capability will be available until January 1, 2013.

Interestingly, for people who want to continue tracking their health profile online, Google’s response is to move to Microsoft HealthVault. In the coming weeks, Google will provide the ability to directly transfer health data to other services supporting the Direct Project protocol – the emerging open standard for efficient health data exchange.

June 24, 2011  7:40 AM

Fast changing role of IT – Leading Business Transformations

Sasirekha R

IT personnel must combine innovation with traditional wisdom to lead Business Transformations

Traditional IT departments used to be “service providers”, with the business spelling out its requirements and IT personnel developing and maintaining applications accordingly. Process orientation (business users detailing the processes, which get translated into requirements) and transaction orientation (handling orders, accounts, payments) were the driving forces. The CIO’s responsibility was mainly to ensure that applications automating the business processes were built or bought, and that the SLAs demanded by the users were met.

IT personnel’s expertise was typically technical – related to platforms (Mainframe, OS/2, Windows, Mac, Unix, Linux) and programming languages (Assembler, COBOL, C, C++, Visual Basic, .NET, Java). IT skills had to do with debugging, troubleshooting, optimizing, etc. In essence, the IT department played a “make/buy to order” role, with business being responsible for spelling out its requirements – functional and non-functional – in detail.

IT can no longer limit itself to such a secondary role. While process-driven, transaction-oriented applications are still required, they are getting commoditized; the real value is in “IT leading Business Transformations – enabling business to exploit the technological advances”.

June 20, 2011  1:47 AM

SciDB – a database for scientific analysis – “For the toughest problems on the planet”

Sasirekha R

The SciDB Inc. website (http://www.scidb.org/) opens with a powerful statement: “For the toughest problems on the planet”. Typically, scientists are forced to retrofit business information technologies to suit their needs, or to build their own technologies with very limited resources. To work more productively, scientists need information solutions built for their purpose – for science.

In March 2008 at Asilomar, a representative group of science and database experts came together to determine whether the requirements of the different scientific domains (and some large-scale commercial applications) were similar enough to justify building a database system tailored to the needs of the scientific community. The answer was “yes”.

The result was the decision to build SciDB, an open-source database technology product designed specifically to satisfy the demands of data-intensive scientific analytics.
