


How to manage the data crunch: IHEs of every size are finding solutions to their server and storage needs that require less manpower to maintain.

The justification behind the use of servers on college and university campuses is as simple as "more is never enough."

Servers--the software and hardware that store data and handle its processing--are often known by their end users mainly as the sources of blame for various computing failures. "The server's gone down" may be the single best-known phrase in information technology (IT). But schools are relying on their servers to do more than in the past, especially in the area of storage, which they look to as a path to the paperless office they've been promised for more than a decade by the apostles of high technology (see sidebar).


The University of Houston's Advanced Computing Research Laboratory in Texas provides computing resources for computer and computational science research to academic units across the university. It runs 60 HP zx6000 workstations with dual 900 MHz Itanium 2 processors and an HP rx5670 Itanium 2 four-way server with 1 GHz processors, all on the Linux operating system. With this hardware, the center has reached new levels of processing speed in a setup flexible enough to serve its diverse and changing user needs.

Princeton University (N.J.) took a different route to outfitting one of its own research centers. The Center for the Study of Brain, Mind and Behavior (CSBMB) uses a 64-node G5 Xserve cluster from Apple to handle its image analysis and simulation modeling. Though it is housed in the university's psychology department, it is an interdisciplinary facility that touches on many academic areas, including chemistry, computer science, applied math, and more. "We are really an imaging facility," says Randee Tengi, CSBMB system administrator. "People collect huge amounts of data, then they go back to the lab and analyze it."

To store the vast quantities of images needed for work in simulation and analysis of neuroimaging data, CSBMB uses an 11-terabyte SiliconServer from BlueArc Corporation. (A terabyte is a measure of storage capacity; one terabyte is 2 to the 40th power bytes, or roughly 1,000 gigabytes.)
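For a rough sense of scale, a quick back-of-the-envelope calculation (sketched here in Python; not from the article) of what those figures work out to:

    # One terabyte, in the binary sense used here, is 2**40 bytes.
    terabyte = 2 ** 40                 # 1,099,511,627,776 bytes
    gigabyte = 2 ** 30                 # 1,073,741,824 bytes

    print(terabyte / gigabyte)         # 1024.0 -- "roughly 1,000 gigabytes"
    print(11 * terabyte / gigabyte)    # 11264.0 -- an 11 TB server, expressed in GB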

When the center opened six years ago, every user (there are now about 100) used the same machine that ran the imaging software. "We still wanted a central processor for parallel jobs, so we needed a cluster machine," says Tengi. "We wanted all of the data on one file server so users could also access it from their desktops."

As a result, the center's technology lets users access the image data from the central file server on their own desktop computers, whether they run Mac OS X, Windows, or Linux.
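The article doesn't spell out how those desktops reach the file server. As a minimal sketch only, assuming the shared BlueArc volume is already mounted at an OS-specific path (the mount points, directory layout, and file extension below are hypothetical), cross-platform access from a user's own machine might look like this in Python:

    import platform
    from pathlib import Path

    # Hypothetical mount points for the shared imaging volume on each desktop OS.
    MOUNT_POINTS = {
        "Darwin": Path("/Volumes/csbmb_data"),   # Mac OS X
        "Windows": Path("Z:/csbmb_data"),        # mapped network drive
        "Linux": Path("/mnt/csbmb_data"),        # NFS mount
    }

    def list_scans(subject):
        """Return the imaging files for one subject from the shared file server."""
        root = MOUNT_POINTS[platform.system()]
        return sorted((root / subject).glob("*.img"))  # file extension is an assumption

    for scan in list_scans("subject01"):
        print(scan)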

Though server power is the name of the game at CSBMB, manageability is near the top of the list of benefits Tengi has gained from the new system. The center has limited resources for administering its technology, so she appreciates having a system that is easy for her to maintain and easy for the end users to operate.

Austin Community College (Texas) worked with IBM to better serve its growing student enrollment, which is up more than 19 percent since 1993 and has increased the amount of data the IT department must manage and support. The school specifically wanted to speed up its web-based and campus-based services around the clock.

The school implemented an enterprise server consolidation project that moved applications from four independent servers onto a single IBM server, an eServer pSeries 670 running AIX 5L. The switch is expected to save the school approximately $50,000 a year, especially in the area of student grading. In addition, faster online processing has allowed faculty to submit end-of-semester grades via the web, rather than on optical scan sheets.