With over 90 million customers in over 60 countries, MetLife is one of the world’s largest insurance companies. They are also one of the best examples of a Fortune 500 company combining mainframe and other back-end technologies with agile, born-on-the-web technologies and a culture previously seen only in smaller start-ups.

As we wrote in a previous post on corporate-academic partnership, MetLife has a compelling vision of the future of Enterprise IT. In this vision, mainframe is the “bedrock of technology… and foundation on which innovation, technology, and business grows.” The key to success is therefore effectively “bringing together Mainframe and Emerging Technology and taking those solutions to places never thought possible.” One example of these efforts is a new Facebook-like application called “The Wall,” which uses MongoDB to integrate over 70 back-end systems into a single dashboard encompassing all customer transactions. The effort brought together mainframe and web technologists, and, by using agile methodologies, a prototype of the application was ready in two weeks. After three months, the application was rolled out internally to MetLife call centers, dramatically simplifying access to customer data spanning disparate databases and systems.

“In insurance… working in months, not years, is really a startup mentality.” – Gary Hoberman, MetLife CIO and SVP of Regional Application Development

Given that success in this sort of dynamic environment requires techies able to combine mainframe expertise with things like JSON and web services, it should be no surprise that MetLife is aggressively hiring college students to bring in a new generation of Millennial Mainframers. To attract such talent, MetLife recently created a program called MetLife Tech U to provide new hires a mix of education and hands-on training over the course of their first six months. Tech U participants spend half their time in experiential learning activities and half their time working with a MetLife business unit. Tech U culminates with a capstone project that each participant presents to senior MetLife leadership.

For those completing the mainframe track of Tech U, training also includes an online three-course certificate program in z/OS systems programming or application development from Marist College. If you’re interested in reading more about the Marist College program, check out the Millennial Mainframer post by Keith Shaffer on his experiences with Marist College’s online mainframe education.

I personally know two Millennial Mainframers who have recently joined MetLife, and they both seem to be quite content with their jobs. The first is Natalie Chalco, a personal friend whom I’ve known for several years. The second is Dontrell Harris, a Millennial Mainframer blogger who previously wrote a post on his experience in the Master the Mainframe Contest.

In my opinion, MetLife is just about the best place for a millennial to start a career in mainframes. As an employee, you’d work on creative projects in an environment that blends the creativity and agility of a start-up with the scale and stability of a global organization.

The great news in all this is that MetLife is actively hiring college seniors and young Millennial Mainframers (up to three years out of college) for the next iteration of MetLife Tech U. The positions are based out of its Raleigh, NC and Clarks Summit, PA locations. If you’re interested in this fantastic opportunity, please click on one of the links below and fill out an application. Please also feel free to share this blog post with others who might be interested via the “Share This:” buttons at the bottom of the page.

Clarks Summit, PA Roles (Mainframe): http://jobs.metlife.com/pennsylvania/it/jobid4999448-technical-associate-jobs

Raleigh, NC Roles: http://jobs.metlife.com/raleigh-durham/it/jobid4999447-technical-associate-jobs

For additional information about MetLife Tech U, please visit http://www.metlifegto.com/jobs/university

Editor’s Note: If you are considering the Marist College mainframe certificate, check out the free Marist College Intro to Enterprise Computing MOOC at https://mooc.marist.edu/web/ecc. Look out, though: you may end up getting hooked like Keith Shaffer and earn your Associate, Professional, and Expert certificates!

In the fall of 2003, I interviewed at a large insurer for a database administrator position. I knew a bit about databases, but other requirements of the job were completely unfamiliar to me – including the operating system being used. Strange acronyms like TSO, ISPF, JCL, and DB2 left me scratching my head – and this was before answers to mainframe questions were a quick Google search away. As part of the interview, I had the opportunity to shadow one of the DBAs. I watched him navigate through the unfamiliar screens and sat trying to figure out what he was demonstrating.

I showed promise and, after I was hired, worked very hard to learn systems descended from technologies older than I was. But even equipped with a strong drive and helpful encouragement from colleagues, learning how mainframes operated was a massive effort. My employer, which hadn’t hired someone without existing mainframe skills in over a decade, did not have a formal training plan in place. In its defense, this wasn’t uncommon at the time. When I started my career in data processing, groups like IBM’s Academic Initiative and SHARE’s zNextGen were in their infancy and weren’t well known. Instead, I relied solely on coworkers for training. It took several months to understand what I now consider the “basics” of operating in a mainframe environment.

In 2007, I heard about an organization at Marist College called the IDCP, or Institute for Data Center Professionals. The program that caught my attention was the z/OS Systems Programmer track of their “Enterprise Systems Education.” I had never taken any online courses, so I was skeptical of how much could be conveyed outside of a classroom. Their website provided background information on all of the instructors, including their impressive accomplishments. When I showed this to my coworkers and manager, they thought it was worth pursuing, and so I enrolled in the systems programming track.

The program’s intended audience is wide. Students come from mixed backgrounds – varying by age, experience, and of course, geographic location. The systems programming track is three years long, and each year is broken into three courses referred to as modules. The first module is “Introduction to z/OS and its Major Subsystems.” This module, like the eight that followed it, offered a variety of ways to learn. Each week, the instructor would provide students with a lengthy presentation that included audio commentary, along with technical documentation to read. IBM Redbooks, IBM white papers, and industry articles were often used. Students were encouraged to post questions, as well as thoughts, on a Marist online forum. The forum was particularly helpful because students with previous experience often provided an alternative way of understanding a concept. In addition to the slides, readings, and forum postings, we were also assigned one or two long-term projects, such as essays or reports. At least one of the reports required me to interview several subject matter experts (“SMEs”) I worked with. This not only helped me understand the technical material, but it also gave me an opportunity to develop closer relationships with coworkers and gain a better understanding of the work they perform.

Of course, every good class inevitably comes with quizzes and tests. At the end of each week, the instructor would post a quiz, usually a combination of multiple-choice and fill-in-the-blank questions. There were also a midterm and a final exam, each usually consisting of 5-6 open-ended questions that required a good deal of writing.

Some modules lent themselves well to hands-on work. Those courses were accompanied by labs in which we connected to Marist’s z/OS systems. The z/OS Installation, DB2 Fundamentals, and z/OS Reliability, Availability, and Serviceability courses (the last of which also focused on assembler language) all involved regular lab assignments. In another course, z/OS Performance Fundamentals, we reviewed performance aspects of z/OS environments. In that class, we could either use the instructor’s sample data or analyze data provided by our own shop.

Although the workload varied by module, the Marist curriculum was thorough and demanding. On average, I dedicated 3-4 hours each week to reviewing the assigned material, and even more during midterms, finals, and major long-term deadlines. Instructors were always available to clarify the material and help advance my understanding.

Importantly, the material covered in several of the modules pertained directly to the work I was undertaking at my employer. In particular, z/OS Installation, z/OS Advanced Topics, and z/OS Security were all vital areas that I now share responsibility for managing. Today, I use all of these concepts on a day-to-day basis and often refer back to the course materials for review.

Given that mainframes are no longer a dominant area of study in schools, the Marist program offers both individuals and employers an invaluable alternative to traditional training. Typical mainframe classes are short and costly; a full year at Marist is priced comparably to what others charge for a week. The program also offered me a way to accelerate my learning without requiring the attention of my technical colleagues. Marist enabled me to learn on my own the practices common across many mainframe shops and, afterwards, seek guidance from my colleagues to understand our employer-specific customizations.

For anyone considering the Marist IDCP or a similar program, I’ll end with some advice. If you’re already familiar with z/OS, then you know what to expect. If you enjoy it, certainly consider stepping into the Marist program or a similar program at another school affiliated with the IBM Academic Initiative. However, if you’re just starting to learn z/OS, take your time to understand what’s involved in working in a large systems environment. The entire mindset of working in a mainframe environment is fundamentally different from that of other platforms. Decide if z/OS is something you’d like to devote significant time to learning. If, after 6-12 months, you’re still interested, then Marist is the best next step for your career. If you’ve got some experience on the mainframe and decide it’s not for you, that’s perfectly fine, too. In fact, just knowing the basic terminology and concepts will help anyone be a better technician in a shop that uses mainframes – even if you’d rather focus on other areas, such as project management.

About the Author

Keith Shaffer
B.S. Computer Engineering, Syracuse University
z/OS Associate, Professional, Expert Certificates, Marist College

Keith graduated from Syracuse University in 2003. That fall, he took a position with a large northeast insurance company as a z/OS DB2 DBA. Since 2006, Keith has been with the same company as a z/OS systems programmer. Keith enjoys soccer, photography, and extensive traveling with his wife.
Connect with Keith on LinkedIn


Unless you have been living under a rock, you have probably heard of the new and innovative networking movement sweeping datacenters worldwide. This movement is called Software Defined Networking, or “SDN,” and it has grown to pose quite a threat to traditional networking designs.

What is SDN?

By definition, SDN is an approach to networking in which the network’s control logic is decoupled from the forwarding hardware and handed to a software application called a controller. In simpler terms, SDN removes the network operating system from each individual device and moves it to a server. The controller then gives switches rules, or “flows,” which describe how to forward specific traffic across the network. This makes the network faster, more efficient, and far easier to innovate on.
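
To make the control/data split concrete, here is a minimal, self-contained Java sketch of the idea. Every class and method name below is an illustrative stand-in, not a real controller API: the switch only performs table lookups, and a table miss is punted to the controller, which makes the decision and installs a new flow.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy model of the SDN split: the "switch" only matches packets against
// flow rules; all decision-making lives in the "controller". Names are
// illustrative, not taken from any real SDN library.
public class SdnSketch {

    record Packet(String srcIp, String dstIp) {}

    // A flow rule: a match condition plus a forwarding action (output port).
    record Flow(String dstIp, int outPort) {}

    static class Switch {
        private final Map<String, Flow> flowTable = new LinkedHashMap<>();

        // The controller pushes rules down; the switch just stores them.
        void installFlow(Flow flow) { flowTable.put(flow.dstIp(), flow); }

        // Data plane: pure table lookup, no routing logic on the device.
        void forward(Packet p, Controller controller) {
            Flow flow = flowTable.get(p.dstIp());
            if (flow == null) {
                // Table miss: punt to the controller, which decides
                // and installs a new flow for subsequent traffic.
                flow = controller.onPacketIn(p, this);
            }
            System.out.println(p.srcIp() + " -> " + p.dstIp()
                    + " out port " + flow.outPort());
        }
    }

    static class Controller {
        // Control plane: centralized decision, then rule installation.
        Flow onPacketIn(Packet p, Switch sw) {
            int port = p.dstIp().hashCode() & 0x3; // stand-in for a real routing decision
            Flow flow = new Flow(p.dstIp(), port);
            sw.installFlow(flow); // later packets match at "line speed"
            return flow;
        }
    }

    public static void main(String[] args) {
        Switch sw = new Switch();
        Controller ctl = new Controller();
        sw.forward(new Packet("10.0.0.1", "10.0.0.2"), ctl); // miss -> controller
        sw.forward(new Packet("10.0.0.3", "10.0.0.2"), ctl); // hit -> fast path
    }
}
```

The point of the sketch is the division of labor: once a flow is installed, the switch never consults the controller again for matching traffic.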

[Figure: Traditional network vs. SDN network]

[Figure: An overview of a Software Defined Network]

Why is SDN better?

Network efficiency and speed are improved with SDN in several ways:

  • Network processing handled by a far more capable server 
  • Line-speed packet forwarding, since switches perform only forwarding lookups 
  • Centralized network control 
  • Custom-defined network behaviour 
  • Virtualized and sliced networks 
  • Multi-vendor device interoperability 
  • The ability to pay for hardware alone, without unnecessary added features 


The need for innovation

Networks today are not evolving quickly enough to satisfy the growing needs of consumers. Networks are built from switches, routers, and other devices from various vendors, and these devices have become exceedingly complex because they implement an ever-growing set of IETF-standardized protocols alongside proprietary interfaces. Many network administrators need network features tailored to their needs – a request that is very difficult to fulfill with standardized protocols and proprietary features. SDN solves this problem by allowing network administrators to define network behaviour that suits their needs.

Although SDN is in its early stages, research and development is already under way. One very popular and developer-friendly controller is provided by the OpenFlow-based startup Big Switch Networks. Big Switch actively supports “Floodlight,” an entirely open source OpenFlow controller written in Java and licensed under the Apache License. You can clone the project from GitHub and start programming your own custom network behaviour in a matter of minutes. Bringing an open source option to networking not only gives administrators choice, it gives anyone a platform to innovate on. Meanwhile, large companies like Google have already implemented their own private versions of OpenFlow in their own datacenters.
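
For a flavor of what “programming your own custom network behaviour” looks like, here is a small, self-contained Java sketch modeled loosely on the packet-in listener pattern that controllers such as Floodlight use. The interface and names are hypothetical stand-ins, not the actual Floodlight API; the point is that a policy like “block this host” becomes a few lines of ordinary application code instead of box-by-box device configuration.

```java
import java.util.Set;

// A toy "module" in the spirit of a controller packet-in listener.
// The interface and names below are illustrative stand-ins, not the
// real Floodlight API; they show how custom network behaviour (here,
// blocking a host) can be expressed as controller software.
public class BlockHostModule {

    enum Command { CONTINUE, STOP } // let later modules run, or not

    interface PacketInListener {
        Command onPacketIn(String srcIp, String dstIp);
    }

    // Custom network behaviour: drop all traffic from blacklisted hosts.
    static class Blacklist implements PacketInListener {
        private final Set<String> blocked;

        Blacklist(Set<String> blocked) { this.blocked = blocked; }

        @Override
        public Command onPacketIn(String srcIp, String dstIp) {
            if (blocked.contains(srcIp)) {
                System.out.println("dropping " + srcIp + " -> " + dstIp);
                return Command.STOP;     // no flow installed; packet dies here
            }
            return Command.CONTINUE;     // hand off to the forwarding module
        }
    }

    public static void main(String[] args) {
        PacketInListener module = new Blacklist(Set.of("10.0.0.66"));
        System.out.println(module.onPacketIn("10.0.0.66", "10.0.0.2")); // STOP
        System.out.println(module.onPacketIn("10.0.0.1", "10.0.0.2"));  // CONTINUE
    }
}
```

In a real controller, modules like this are chained: returning CONTINUE hands the packet to the next module in line, such as the default forwarding logic.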

Marist, IBM, and OpenFlow

Here at the IBM/Marist Joint Study, we are developing an OpenFlow testbed to evaluate the effectiveness of different OpenFlow devices and controllers. Our goal is to contribute our findings back to the community in the hope of advancing the adoption of OpenFlow in modern networks.

So where does the Mainframe come into play?

As an intern at the IBM/Marist Joint Study, I have the privilege of working with other interns involved in various mainframe-related projects. Consequently, we share rack space in the same datacenter that holds our very own mainframe, which supports the research done by our fellow interns. After joking around about running an OpenFlow controller on the Z, we realized that we might have a very rare opportunity on our hands. After all, how many OpenFlow researchers have a z114 dedicated solely to research in the same datacenter as their own rack?

As a team, we are now taking steps toward researching the benefits of combining these two robust enterprise solutions. We imagine that the mainframe could be a favorable platform for a robust, scalable, and efficient enterprise software defined network. The mainframe would be able to handle a high volume of network processing, scale in the event of a network traffic spike, and maintain high availability, all while running as efficiently as possible.

We have already begun evaluating the benefits of running OpenFlow controllers on the Z ensemble. We teamed up with our fellow intern Doug Rohde and started by configuring and running a controller benchmarking tool called “CBench.” We ran the tool on a zBX blade and averaged nearly 3 million flows per second. Although there is no thoroughly documented research concerning the use of CBench with Floodlight, other researchers have been finding similar results on comparable systems.
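
For intuition about what a number like that measures, here is a toy, self-contained Java sketch of a CBench-style measurement loop: fire synthetic packet-in events at a controller stub for a fixed window and report flows per second. This is purely illustrative – the real CBench emulates OpenFlow switches over TCP against a live controller – and all names here are hypothetical.

```java
// A CBench-style micro-benchmark sketch: drive synthetic "packet-in"
// events at a controller stub and count responses per second. This is
// an illustrative toy, not the real CBench tool.
public class FlowBench {

    interface Controller {
        // Stand-in for handling a packet-in and emitting a flow-mod.
        void onPacketIn(int switchId, long payload);
    }

    public static void main(String[] args) {
        Controller stub = (sw, payload) -> { /* pretend to install a flow */ };

        final int switches = 16;                   // emulated switch count
        final long durationNanos = 1_000_000_000L; // 1-second test window

        long flows = 0;
        long start = System.nanoTime();
        while (System.nanoTime() - start < durationNanos) {
            // Round-robin packet-ins across emulated switches, as CBench does.
            stub.onPacketIn((int) (flows % switches), flows);
            flows++;
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        System.out.printf("%,.0f flows/sec%n", flows / seconds);
    }
}
```

Measured this way, throughput is dominated by how quickly the controller can make a decision per packet-in, which is exactly what the zBX result speaks to.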

Since our team’s primary goal is not really affiliated with the mainframe, we can only give the concept so much attention. For this reason, progress is slow, but we have some more ideas that we think will yield valuable results. Our next step is to test and benchmark the controller on zLinux. The rest of our high-level ideas will be presented next week, in detail, by my partner Ryan Flaherty.


About the Author:

Jason Parraga
B.S. Computer Science, Marist College (In Progress)

Jason is currently a junior doing OpenFlow research for the IBM/Marist College Joint Study. As an OpenFlow researcher, Jason is a hybrid student with a passion for Computer Science as well as Information Technology, which allows him to apply his knowledge of networking concepts by programming modules and functions for open source OpenFlow controllers such as Floodlight.
Connect with Jason on LinkedIn
