Enterprise IT Consultant Views on Technologies and Trends


May 13, 2011  10:01 AM

GeoTools – Try out GIS for your applications

Sasirekha R

GeoTools – Open Source GeoSpatial Toolkit worth trying out

GeoTools is a free, open source Java geospatial toolkit for working with both vector and raster data. GeoTools is associated with the GeoAPI project, which creates a vendor-neutral set of geospatial Java interfaces derived from OGC specifications.

GeoTools is a general purpose geospatial library with a large feature set including:

  • Create and analyze graphs and networks
  • Powerful “schema assisted” parsing technology using XML Schema to bind to GML content
  • Interact with OGC web services, with both Web Map Server and Web Feature Server support
  • JDBC plug-ins for DB2, H2, PostGIS, MySQL, SpatiaLite, SQL Server Continued »
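
To get a feel for the API, here is a minimal Java sketch that opens a shapefile and walks through its vector features. It assumes the GeoTools shapefile plug-in is on the classpath, and the states.shp file name is only an illustration:

import java.io.File;

import org.geotools.data.FileDataStoreFinder;
import org.geotools.data.simple.SimpleFeatureCollection;
import org.geotools.data.simple.SimpleFeatureIterator;
import org.geotools.data.simple.SimpleFeatureSource;
import org.opengis.feature.simple.SimpleFeature;

public class ShapefileDemo {
    public static void main(String[] args) throws Exception {
        // Open a vector data source (a hypothetical shapefile, used only for illustration)
        File file = new File("states.shp");
        SimpleFeatureSource source =
                FileDataStoreFinder.getDataStore(file).getFeatureSource();

        // Iterate over the features and print each feature's id and geometry
        SimpleFeatureCollection features = source.getFeatures();
        SimpleFeatureIterator iterator = features.features();
        try {
            while (iterator.hasNext()) {
                SimpleFeature feature = iterator.next();
                System.out.println(feature.getID() + " : " + feature.getDefaultGeometry());
            }
        } finally {
            iterator.close();  // feature iterators hold resources and must be closed
        }
    }
}

The same SimpleFeatureSource abstraction is used whether the data comes from a shapefile, a PostGIS table, or a Web Feature Server, which is what makes the JDBC and OGC plug-ins listed above largely interchangeable at the code level.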

May 9, 2011  3:28 AM

Ubuntu 11.04, another milestone in making the Linux desktop mainstream

Sasirekha R


Ubuntu 11.04, code named “Natty Narwhal”, released recently, gives Ubuntu a new look and feel and takes a major step in the right direction, attracting both desktop and netbook users. Linux kernel 2.6.38, which includes a patch that boosts kernel performance across desktops, is part of Natty, so users upgrading to Ubuntu 11.04 can also expect significant performance improvements.

Unquestionably, the new Unity interface is the most appreciated change, and new users reportedly find its features intuitive and easy to adapt to. Ubuntu 11.04 ships with Compiz, which offers settings such as keeping the launcher backlight always on, launcher animations, urgent animations, and special window transparency options. Users who had considered Unity slow and buggy, and a major drawback of Ubuntu, now agree that it is on par with other major desktop experiences. Continued »


May 5, 2011  1:04 AM

Intel’s new 22nm 3D transistor to put it ahead of competitors

Sasirekha R

New 22nm 3D transistor to propel Intel ahead of competitors

Intel has announced that it expects to put its 22 nm (nanometer) 3-D transistors into volume production by the end of 2011. Intel demonstrated a computer using a microprocessor code-named Ivy Bridge, the first high-volume chip that will use 3-D transistors. Intel’s 3-D transistor technology at 22 nm is radically different from the currently used 2-D (planar) devices. According to Intel, the 3-D Tri-Gate transistor, and the ability to manufacture it in high volume, mark a dramatic change in the fundamental structure of the computer chip.

The smaller the transistor, the more power-efficient it is, and hence the better. For decades, transistors built with planar technology kept shrinking while packing in more power. But Moore’s Law seemed to have hit a roadblock in planar technology: features became so small that they started creating electrostatic problems, making it difficult to control the switching of transistors. Continued »


April 28, 2011  2:25 AM

Dynamic Scripting in CICS – brings best of both worlds

Sasirekha R


The CICS Dynamic Scripting Feature Pack (an optional product) seems to bring the best of both worlds: the benefits of quickly developing scripted Web 2.0 applications, with simple and secure access to CICS applications and data resources. The Dynamic Scripting Feature Pack essentially embeds and integrates technology from WebSphere sMash into the CICS TS runtime and includes a PHP 5.2 runtime along with Groovy language support. Access to CICS resources is achieved using the JCICS APIs.

The Dynamic Scripting Feature Pack can be used to:

  • Rapidly develop, evolve, and deploy rich web applications, using dynamic scripting languages.
  • Expose RESTful web services and widgets for use in mashups and consoles.
  • Compose composite services, widgets, and applications that combine Web 2.0-style public and enterprise data feeds. Continued »
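
As a rough idea of what the JCICS route looks like, here is a minimal Java sketch of linking to an existing CICS program. The program name ACCTINQ and the 100-byte COMMAREA are hypothetical, and in the Feature Pack such a call would typically be driven from a Groovy or PHP script:

import com.ibm.cics.server.Program;

public class AccountLookup {

    // Link to an existing CICS program and pass it a COMMAREA.
    // The program name and COMMAREA layout here are purely illustrative.
    public static byte[] lookup(String accountId) throws Exception {
        byte[] commarea = new byte[100];
        byte[] id = accountId.getBytes();
        System.arraycopy(id, 0, commarea, 0, id.length);

        Program program = new Program();
        program.setName("ACCTINQ");
        program.link(commarea);   // roughly equivalent to EXEC CICS LINK

        return commarea;          // updated in place by the linked program
    }
}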


April 27, 2011  2:47 AM

Tweaking COBOL Compiler Options for Improved Performance

Sasirekha R


COBOL is still the most widely used language on the mainframe, both for online transaction processing and for massive batch processing. Even a minor performance improvement in repeatedly executed COBOL programs translates directly into CPU savings and hence cost savings. One of the simple, and often neglected, ways to optimize COBOL performance is to use the right set of compiler options.

In quite a few performance tuning engagements, the culprit is the set of compiler options (specified in the JCL or the configuration tool) used by almost all programs: mostly left at their defaults, they bring down performance. Even veteran COBOL programmers tend to ignore these options and focus on the programs alone while trying to improve performance. In this article, I would like to highlight the COBOL compiler options that impact performance. Continued »


April 18, 2011  8:24 AM

Simplify Architecture Governance to derive value from Enterprise Architecture

Sasirekha R


As rightly pointed out by a Forrester report, “improving perception of EA” is the key challenge and the common goal of Enterprise Architects. While developing an Enterprise Architecture Blueprint is itself a commendable job, it is only part of the job done. The real uphill task is to make the blueprint useful, and this is where “Architecture Governance”, which has become the byword for ensuring the effectiveness of Enterprise Architecture, comes in. Today, there are multiple definitions, frameworks and tools available for establishing Architecture Governance. Continued »


April 11, 2011  12:20 AM

Japan Nuclear Radiation Impacts (nothing to do with IT!)

Sasirekha R

Japan Nuclear Radiation

With the explosion in the third reactor of the Fukushima Daiichi Nuclear Power Station, fears of nuclear radiation spreading in Japan have increased. The after-effects of nuclear radiation remain for a long time, and the possibility of it spreading to neighboring countries cannot be ruled out. So I tried to find out more details on nuclear radiation and its impacts, and am sharing them in this blog (though it has nothing to do with IT).

In the 1986 Chernobyl nuclear disaster, in addition to the almost immediate death of thirty workers, a few thousand deaths occurred due to radiation (according to the UN). Another sign of the damage to health is that about 6,000 people who were under 18 at the time of Chernobyl have developed thyroid cancer, a disease that usually affects only older people. These effects are mainly attributed to the fact that the accident was discovered more than a day after the explosion. Another sad factor is that research related to nuclear health hazards is not happening in full force, and in some cases has been dropped entirely due to lack of funds.

Before going into the specifics, I thought it would be a good idea to get some details on nuclear radiation. Continued »


April 8, 2011  2:52 AM

Understanding Case Management, the next zing thing of BPM

Sasirekha R

Case Management, the next zing thing of BPM

Traditional Business Process Management (BPM) focuses on activities and on the order and sequencing of those activities to solve a problem, rather like an imitation of mass production in a factory. While BPM does serve its purpose, there is a growing need to automate and track unpredictable “cases” that do not follow a well-defined process. There are situations where not all the activities, or the order of the activities, are known beforehand, and the specific context has to be taken into account to make these decisions. This is where “Case Management” comes in: taming the untamed processes.

Case Management as a term seems to have different definitions in different contexts, as can be seen from the Wikipedia definition. In that sense, I tend to agree with a Forrester blog comment that says “Unfortunately, case management is a lousy term for a great idea”. Continued »
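
To make the contrast concrete, here is a deliberately simplified Java sketch (illustrative only, not any particular BPM or case management product’s API): a process fixes its activities and their order at design time, whereas a case accumulates context and lets activities be planned at run time based on that context.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Traditional BPM: the activities and their order are fixed at design time.
class FixedProcess {
    private final List<String> activities;
    FixedProcess(List<String> activities) { this.activities = activities; }
    void run() {
        for (String activity : activities) {
            System.out.println("Executing: " + activity);
        }
    }
}

// Case management: not all activities are known beforehand; a knowledge worker
// (or a rule) adds them as the context of the individual case evolves.
class Case {
    private final Map<String, Object> context = new HashMap<>();
    private final List<String> plannedActivities = new ArrayList<>();

    void addContext(String key, Object value) { context.put(key, value); }
    void planActivity(String activity) { plannedActivities.add(activity); }

    void runNext() {
        if (!plannedActivities.isEmpty()) {
            System.out.println("Executing: " + plannedActivities.remove(0)
                    + " (context: " + context.keySet() + ")");
        }
    }
}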


April 7, 2011  6:52 AM

Understanding Inline vs. post-processing de-duplication

Sasirekha R


A major difference among de-duplication product offerings is when the de-duplication occurs: “inline” (or “real-time”, as the data is flowing, before it is written) or “post-process” (after the data has been written). The benefits and drawbacks of inline versus post-processing are much debated. Continued »
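
To illustrate just the difference in timing, here is a small Java sketch (illustrative only, not how any vendor’s appliance is built; a real product would use a strong content hash such as SHA-256 and work on disk blocks or variable-sized chunks):

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Contrasts only WHEN duplicate detection happens, not how data is really stored.
class DedupTiming {
    private final Set<Integer> knownChunks = new HashSet<>();
    private final Map<String, byte[]> landingArea = new HashMap<>();

    // Inline: the duplicate check sits in the write path, before the data is written.
    void inlineWrite(String id, byte[] chunk) {
        if (knownChunks.add(Arrays.hashCode(chunk))) {
            writeToDisk(id, chunk);              // first copy only
        }                                        // duplicates never reach the disk
    }

    // Post-process: everything is written first; a background job dedupes later.
    void postProcessWrite(String id, byte[] chunk) {
        landingArea.put(id, chunk);              // full-speed ingest, extra space used
    }

    void backgroundDedup() {
        landingArea.forEach((id, chunk) -> {
            if (knownChunks.add(Arrays.hashCode(chunk))) {
                writeToDisk(id, chunk);
            }
        });
        landingArea.clear();                     // reclaim the temporary copies
    }

    private void writeToDisk(String id, byte[] chunk) {
        System.out.println("stored " + id + " (" + chunk.length + " bytes)");
    }
}

The trade-off follows directly from the ordering: inline de-duplication never writes a duplicate but adds work to the write path, while post-processing ingests at full speed but temporarily needs space for the duplicate copies.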


April 7, 2011  5:08 AM

Data de-duplication, the hottest technology in storage

Sasirekha R


Data de-duplication, with its promise of reducing the storage capacity needed for backup environments by 95% (according to Forrester), has become fully mainstream, with more than 84% of Gartner survey respondents currently using it or planning to use it.

Data de-duplication (also called “intelligent compression” or “single-instance storage”) is a specialized data compression technique that reduces the storage needs by eliminating redundant data and storing only one unique instance of the data.

Unlike standard file compression techniques, the focus of data de-duplication is to take a very large volume of data, identify large sections, even entire files, that are identical, and store only one copy of them. The standard example is an email system where there could be 100 instances of the same 1 MB file attachment; with de-duplication, only one instance of the attachment is actually stored, reducing a 100 MB storage demand to only 1 MB. The single copy that is stored could in turn be compressed by single-file compression techniques, providing further storage reduction. Continued »
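
As a toy illustration of single-instance storage using the email example above (a sketch only; the class and method names are made up, and real systems de-duplicate blocks or chunks rather than whole attachments):

import java.math.BigInteger;
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;

// Keeps one physical copy of each unique attachment, keyed by its content hash.
class SingleInstanceStore {
    private final Map<String, byte[]> uniqueContent = new HashMap<>(); // hash -> bytes
    private final Map<String, String> messageIndex = new HashMap<>();  // messageId -> hash
    private long logicalBytes = 0;  // what would be stored without de-duplication

    void saveAttachment(String messageId, byte[] attachment) throws Exception {
        logicalBytes += attachment.length;
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(attachment);
        String key = new BigInteger(1, digest).toString(16);
        uniqueContent.putIfAbsent(key, attachment); // only the first copy is kept
        messageIndex.put(messageId, key);           // later messages just reference it
    }

    long physicalBytes() {
        long total = 0;
        for (byte[] content : uniqueContent.values()) total += content.length;
        return total;
    }

    public static void main(String[] args) throws Exception {
        SingleInstanceStore store = new SingleInstanceStore();
        byte[] attachment = new byte[1_000_000];    // the same 1 MB attachment
        for (int i = 0; i < 100; i++) {
            store.saveAttachment("msg-" + i, attachment);
        }
        // Roughly 100 MB of logical data, about 1 MB physically stored
        System.out.println("logical=" + store.logicalBytes
                + " bytes, physical=" + store.physicalBytes() + " bytes");
    }
}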

