“Sometimes the mainframe
gets hungry for some data
so it has a byte.”
Peter Humanik

I’ve had a lot of geeky conversations. When you’ve worked as a developer, lunch table discussions with fellow programmers can lend themselves to discourse on video games, internet memes, the latest hilarious video on YouTube, and what is new in the world of technology. On one particularly nerdy-joke-filled day, I recall a lengthy conversation over french fries and tuna sandwiches on the grammatical accuracy of the sentence: “Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo”. Yup, that’s a real sentence – and it really makes sense (check out the wiki page devoted to it)!
But I digress! Developing with a few other young folks, coding in REXX and Assembly Language on the mainframe, no less, it was still an effort for all of us to take a completely modern view of the work we were doing. The words that typically came to mind when we pictured mainframes were the usual ones: the mainframe has a rich, deep-rooted history of “reliability”, “security”, and “availability”. But how often does one sit back and think, “Mainframe: Innovation. Cutting-edge. Cloud Computing.”?
Now, “cloud computing” and “the mainframe” are not typically used in the same sentence. However, the two are not as disparate as they may sound, and the concept of the mainframe on the cloud has been discussed a fair amount in the tech community in recent months. Why?

Mainframe and Cloud Computing Models Share Basic Concepts. 

Businesses are moving to cloud computing environments for increased efficiency and decreased costs; the focus of the cloud model is on improved manageability, less maintenance, and capitalizing on a shared resource infrastructure. There are significant cost-saving benefits from cloud computing and the economic incentives are hard to argue with. The biggest and most often discussed concern is that of security.
So, cloud computing offers flexibility, agility, and high availability. Sound familiar? Scalability, flexibility, virtualization, and utilization of shared resources: these are all concepts the mainframe community is well-acquainted with.

Enter the ‘Mainframe Cloud’ Model.

The mainframe offers certain capabilities that aren’t found in other platforms: extremely strong security, coupled with multi-tenancy, pooled resources, sophisticated resource allocation, and so on. A ‘mainframe cloud’ model would boast all these capabilities, as well as allow for dynamic capacity management.
As businesses have to harness and manage more and more data, with growing needs for scalability and flexibility, it makes sense for mainframes to move into this arena. Recently (May 2012), IBM announced that it will add System z mainframes to its SmartCloud platform offerings later this year. As one blogger states, “fully virtualized from the start, the z is a natural for the cloud.”
This integrated model would offer businesses the ability to take full advantage of the benefits of using a mainframe, while also reaping the economic cost-benefits of a cloud model.
So, while perhaps not a term that rolls naturally off the tongue, the “mainframe cloud” is a concept that seems both natural and necessary. It becomes clear that the mainframe is far, far from “archaic” or “dead”; instead, it is alive, well, and thriving, pushing its way onto innovative new ground.

About the Author

Sarah Dandia
Master of Biotechnology; University of Pennsylvania
B.S. in Computer Science, B.A. in Psychology; Binghamton University

Sarah started out of college as a software developer, and after two years, went on to complete a degree in Bioinformatics. She currently works for IBM as a Client Technical Professional for DB2 Tools on System z. She is interested in books, travel, technology, perfecting her guacamole recipe, and learning guitar one painfully slow chord at a time.

Rational Software was acquired by IBM in 2003, becoming one of its leading software brands. For some time, the name itself intrigued me the most, and after diving into its different products, it made perfect sense to me: every result or impact was the direct effect of applying a smart but rational twist to how things were done. All in all, IBM Rational allows you to change how you design, develop, and deliver software, so you can build more innovative products and services, beat your competition, and shorten time to market, all with lower costs and reduced risk.

Each software project, whether large or small, undergoes certain phases, which together are known as the Software Development Lifecycle (SDLC). There are five stages in the SDLC model: Requirements Analysis, Design, Implementation (coding), Testing, and Evolution (maintenance). Within the Rational suite, there are many products that are key to a smarter development environment and that complement the development life cycle.

IBM has dubbed Enterprise Modernization a cornerstone of its Smarter Computing initiative. It offers an approach to revitalizing and continuously improving aging applications, empowering development with both existing and new skills through productivity enhancements, unifying teams to increase organizational agility across all platforms, and optimizing use of IT infrastructure by freeing up the capital and capacity needed to run critical production applications, all with the goal of making breakthroughs in IT efficiency and innovation.

There are several families of IBM Rational products in which Enterprise Modernization plays an important role:

Rational Asset Analyzer is an application discovery and impact analysis tool that improves your understanding of, and insight into, the relationships within and among System z and composite applications. Not only does it help you understand the code better, it also shows you existing application inter-dependencies, so changes can be made with fewer mistakes and more thorough testing, helping complete projects on time and within budget (a minimal sketch of the impact-analysis idea follows this product list).

Rational Team Concert is a unified and collaborative multiplatform team infrastructure, including support for System z and distributed systems, that helps streamline the entire application development and deployment life cycle across all operating environments. Following is a simplified example of why team collaboration is a very important factor in a productive environment. Imagine a kitchen in a restaurant where dinner (source code) is being planned (designed). The chef (project manager) will hand out tasks (jobs) to the rest of the cooks (developers). Each will have to prepare his list of ingredients (requirements) and cook his dishes (artifacts). At the end, all dishes collectively leave the kitchen (lab) to form the dinner (end product). For this event to be successful, the chef (project manager) needs to be aware of what each cook (developer) is working on and make sure that everyone is in sync and that no duplicates are being cooked (developed), securing a smarter development environment (a toy model of this shared-task idea also appears after the product list).
Rational Developer for System z helps make traditional System z development, web development, and integrated Service Oriented Architecture based multi-platform development faster and more efficient. It supports modern user interfaces, full web application processing, and web services to integrate these application styles and processes. It creates, maintains, debugs, and deploys transactional and batch applications on the z/OS platform, and it promotes the reuse and transformation of existing applications to help reduce costs and shorten the development cycle. For all of us young mainframers, RDz offers a GUI front end to the good old green screen while adding great new features and capabilities.
Rational Development and Test Environment for System z provides a small-scale unit test environment in which developers can run z/OS and z/OS middleware on an Intel or Intel-compatible (x86) personal computer, bringing flexibility and reducing the cost of developing in a mainframe environment. The developer is now capable of building and testing new System z applications virtually anytime and from anywhere. The z/OS features are still the same; however, developers can now create applications in the unit test environment before they are released on the mainframe, so shared mainframe environments and processes are left intact. Since the testing runs in the unit test environment, mainframe development MIPS (millions of instructions per second) are spared for production capacity (a rough, purely illustrative calculation follows this product list).
Rational Host Access Transformation Services quickly transforms your 3270 and 5250 green-screen applications into intuitive web, portlet, rich client, or mobile device user interfaces, and extends 3270, 5250, and VT green screens as standard web services.
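
As promised above, here is a minimal sketch of the impact-analysis idea behind Rational Asset Analyzer. RAA discovers these relationships automatically and its analysis is far more sophisticated; the program names and dependency map below are purely hypothetical, and Python is used only to keep the sketch short.

```python
from collections import deque

# Hypothetical dependency map: each program -> the programs that depend on it.
# (Rational Asset Analyzer discovers relationships like these automatically;
# here they are hard-coded purely for illustration.)
dependents = {
    "PAYROLL": ["PAYREPT", "TAXCALC"],
    "TAXCALC": ["YEAREND"],
    "PAYREPT": [],
    "YEAREND": [],
}

def impact_of_change(changed_program, dependents):
    """Return every program that could be affected, directly or indirectly,
    by a change to `changed_program` (a simple breadth-first walk)."""
    affected, queue = set(), deque([changed_program])
    while queue:
        for dep in dependents.get(queue.popleft(), []):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

print(sorted(impact_of_change("PAYROLL", dependents)))
# -> ['PAYREPT', 'TAXCALC', 'YEAREND']
```

The same kind of walk, run over thousands of discovered artifacts instead of four hard-coded names, is essentially what lets you ask "what could break if I change this program?" before you change it.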
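In the same spirit, the kitchen analogy for Rational Team Concert boils down to a shared record of who owns which task, so the chef can see at a glance that no dish is being cooked twice. RTC itself tracks this through work items and change sets; the toy class below is only a sketch of the coordination idea, not RTC's API, and every task and name in it is made up.

```python
class WorkBoard:
    """Toy model of a shared work-item board: one owner per task,
    so no two cooks end up preparing the same dish."""

    def __init__(self):
        self.assignments = {}  # task description -> developer who owns it

    def claim(self, task, developer):
        """Assign a task, refusing the claim if someone else already owns it."""
        owner = self.assignments.get(task)
        if owner is not None and owner != developer:
            raise ValueError(f"{task!r} is already assigned to {owner}")
        self.assignments[task] = developer

board = WorkBoard()
board.claim("login screen", "Alice")
board.claim("batch report", "Bob")
# board.claim("login screen", "Bob")  # would raise: already assigned to Alice
print(board.assignments)
# -> {'login screen': 'Alice', 'batch report': 'Bob'}
```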
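Finally, the cost argument behind Rational Development and Test Environment for System z can be shown with a rough back-of-envelope calculation. The figures below are entirely hypothetical and real capacity planning is far more involved, but the arithmetic shows the shape of the saving: whatever share of the shared machine was spent on development and unit testing becomes available for production once that work moves to the x86 environment.

```python
# Entirely hypothetical figures, for illustration only.
total_capacity_mips = 2_000   # capacity of a shared mainframe
dev_test_share = 0.15         # fraction historically consumed by development and unit testing

# Moving development and unit testing onto the x86-based unit test
# environment frees that share of the mainframe for production workloads.
freed_for_production_mips = total_capacity_mips * dev_test_share
print(f"Capacity freed for production: {freed_for_production_mips:.0f} MIPS")
# -> Capacity freed for production: 300 MIPS
```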

The purpose of this post was to give you a quick overview of what IBM Rational software has to offer to the development world. More detailed follow up posts will be coming your way soon, so stay tuned and remember to always think Rationally.

On occasion, the IBM mainframe will make an appearance in an IBM ad campaign. A couple of great ones produced in the mid-2000s focused on mainframe virtualization:

Now I’m not quite sure if IBM produced this one, or if a clever mainframe fan put this one together…but it’s pretty funny!

Great humor here, but…think about the messages within these – they convey two key themes:

  1. Extreme virtualization – System z can run hundreds, if not thousands of virtualized guest OS instances. As the ads illustrate, this really does enable System z customers to combine huge server farms into a single System z frame.
  2. Security – While System z isn’t a very good babysitter for your kids, it does a pretty nice job babysitting your valuable information resources. System z security is unparalleled in the software and server marketplace, with built-in functionality for encryption, access control & resource authorization, and some of the best logging & auditing technology in the industry.

There’s a wealth of great System z material on YouTube. We’ll be posting more in the future!
