Research Computing is prepared to assist you with your proposal from start to finish. Whether you are just beginning a new proposal and need assistance with proposal writing, or are in the late stages of a proposal and looking for a letter of support or final details, Research Computing can provide the necessary language and information to complete your proposal. Our staff often work with researchers on grants and new projects, and are therefore equipped to collaborate with you and help define the relationship between your research team and Research Computing. If you have questions about how Research Computing can be a part of your research, or would like to request proposal support, please contact us by email at [email protected].
The sections below contain facilities statements for each individual unit within Research Computing. These statements can be used in your next grant proposal to showcase the support and expertise provided by Research Computing.
MSI Facilities Statement (Short Version)
The Minnesota Supercomputing Institute (MSI) is the University of Minnesota's principal center for computational research. MSI also operates several data centers on campus. Its main data center is located in the basement of Walter Library (room B40) on the University's Twin Cities campus, with a raised-floor IT surface of approximately 3,700 sq. ft. and over 1 MW of available power. The Institute's HPC systems comprise over 40,000 x86 64-bit compute cores and 145 TB of RAM, supporting over 1.6 PFLOPS of peak performance. HPC nodes are equipped with between 64 GB and 2 TB of RAM to support applications with both small and large memory requirements; 32 nodes have solid state drives (SSDs) to support applications with demanding input/output (I/O) requirements; and 57 nodes include various configurations of NVIDIA general-purpose GPU accelerators (K40 and V100), from 2- to 8-way.
In addition to its supercomputing systems, MSI supports on-premises cloud platforms for specialized data use agreements, prototyping, and customizable software environments, as well as interfaces and systems for advanced scientific visualization and interactive computing. MSI manages two large storage systems: a high-performance parallel file system (6.5 PB) and a Ceph/S3 tier-2 object storage system (4.0 PB). MSI hosts a SpectraLogic T950 tape library, expandable to over 30 PB of online storage, which is used to back up high-value files. The data center is connected to the 100 Gbps campus research network via multiple 40 GbE connections. The University maintains 100 Gbps connections to our regional optical network, which in turn is connected to Internet2 and beyond. MSI provides infrastructure and expertise to the greater University of Minnesota system. In addition to these diverse systems, more than half of the MSI staff are available to provide expert consulting in areas such as research informatics, software development, and algorithm optimization.
MSI Facilities Statement (Extended Version)
Established in 1983, the Minnesota Supercomputing Institute (MSI) is the University of Minnesota's principal center for computational and data-intensive research. MSI provides services to over 880 active groups that sponsor more than 4,500 unique users from 19 different university colleges, maintaining an array of systems dedicated to the computational and data-intensive research needs of investigators in the state of Minnesota's higher education institutions and their collaborators.
High Performance Computing
MSI's High Performance Computing (HPC) systems are designed with high-speed networks, high-performance storage, GPUs, and large amounts of memory in order to support some of the most compute- and memory-intensive programs developed today. MSI's HPC systems comprise over 40,000 x86 64-bit compute cores and 145 TB of RAM, supporting over 1.6 PFLOPS of peak performance. HPC nodes are equipped with between 64 GB and 2 TB of RAM to support applications with both small and large memory requirements; 32 nodes have solid state drives (SSDs) to support applications with demanding input/output (I/O) requirements; and 57 nodes include various configurations of NVIDIA general-purpose GPU accelerators (K40 and V100), from 2- to 8-way.
Interactive Computing and Scientific Visualization
In collaboration with the Laboratory of Computational Science and Engineering, MSI supports a visualization laboratory. The Lab can accommodate up to 24 people and is located in the same building as MSI. MSI also supports specialized interfaces (i.e., NICE EnginFrame and Jupyter notebooks) and hardware for remote visualization and interactive computing. Interactive HPC systems allow real-time user input in order to facilitate code development, real-time data exploration, and visualization. Interactive HPC systems are used when data are too large to download to a desktop or laptop, software is difficult or impossible to install on a personal machine, or specialized hardware resources (e.g., GPUs) are needed to visualize large data sets.
MSI supports an on-premises cloud computing platform built on OpenStack that supports special data use agreements and allows quick deployment of web, database, and other non-HPC systems. Virtual instances in this environment are available for a fee in a variety of sizes, depending on the number of processors and the amount of memory and disk space required for the project.
All MSI researchers have access to a high-performance parallel storage platform. This system provides 6.4 PB (petabytes) of storage with sustained read and write speeds of up to 48 GB/sec. The integrity of the data is protected by daily snapshots and tape backups. High-value data sets are backed up to an off-site facility as part of the Institute's disaster recovery plan. MSI also supports a second-tier storage solution designed to address the growing need for resources that support data-intensive research. The system is tightly integrated with other MSI storage and computing resources in order to support a wide variety of research data life cycles and data analysis workflows. It uses Amazon's S3 (Simple Storage Service) interface, so that researchers can better manage their data, share data more seamlessly with other researchers, and migrate entire data analysis pipelines to cloud-based platforms.
Data Centers, Network Connectivity, and Office Facilities
MSI enables interdisciplinary research through its robust data center facilities, with over 1 MW of IT capacity to support leading-edge computational and data storage systems. MSI supports two data centers, both of which are connected to the campus network at speeds up to 100 Gbps. Campus networks connect to our regional optical network and Internet2 at 100 Gbps, giving our researchers the network capacity and capability needed to collaborate with researchers from around the world. MSI's office and data center space (~18,000 sq. ft.) in Walter Library is centrally located on the Minneapolis campus. MSI also maintains office space on the Saint Paul campus, where additional researchers are located, and provides computer and teaching laboratories, which are primarily used for outreach and teaching workshops.
UMII Facilities Statement
The University of Minnesota Informatics Institute (UMII) supports eight full-time staff who work to foster and accelerate data-intensive research across the University of Minnesota system in all scholarly pursuits by providing informatics services, competitive grants, and consultations. Consultation services are conducted in close collaboration with a number of UMN-based high-throughput core facilities, with a focus on genomics and metagenomics, optical and electron microscopy, small animal imaging, neuroimaging, materials science, and mass spectrometry. Researchers consult with informatics experts who specialize in the downstream analysis of data acquired at the core facility. A base level of informatics analysis support is offered free of charge to UMN researchers, including consultations on best practices for handling large volumes of data and on how to best leverage the many complementary campus resources available to our faculty. UMII staff may also be included in proposals submitted to external funding agencies when dedicated support is required. Lastly, UMII provides funding for graduate student fellowships and for seeding research priority areas related to informatics.
U-Spatial Facilities Statement
U-Spatial is nationally recognized as a leading model for how universities can successfully integrate spatial data, visualization, analysis, and spatial thinking. U-Spatial has supported over 2,000 researchers across 150 departments and centers at the University through its help desk, training, consulting, and events. While we collaborate closely with large research centers and programs, U-Spatial also seeks to serve researchers working in the so-called "long tail" of the scientific enterprise. These are often smaller projects that may not be able to support full-time spatial research staff but that, with help from U-Spatial, cumulatively enable and advance the many missions of a land-grant university and offer disproportionately great benefits.
U-Spatial is home to 11 professional staff and three to six undergraduate and graduate students with expertise in geographic information systems (GIS), remote sensing, and spatial computing. Staff design and develop full-stack solutions, which may include database hosting and server- and client-side applications, such as web-based maps and decision support tools. Staff primarily leverage the Esri infrastructure for project development efforts, but are also skilled in drawing from open-source solutions where projects call for such an approach. Together with the UMN Libraries, U-Spatial co-supports a Spatial Data Curator, whose job is to ensure that all spatial data carry full and appropriate metadata.
U-Spatial is physically located on two UMN campuses, the Twin Cities and Duluth. U-Spatial labs have dedicated desktop computers, and staff utilize a robust, centrally managed virtual infrastructure to provide spatial software and databases. Dedicated development and database hosting options are available to UMN researchers for a reasonable cost.
The University of Minnesota has an extensive site license with Esri (a leader in GIS software), which gives all students, faculty, and staff access to nearly all of Esri's enterprise resources for teaching and research, including ArcGIS Online, desktop software, database software, and extensive data resources. Over 18,000 students, faculty, and staff have an ArcGIS Online account through the University. U-Spatial also provides and supports broad access to other geospatial applications, including LAStools and SHELDUS, and participates in the Open Geospatial Consortium and the University Consortium for Geographic Information Science.
RC Proposal Routing Form (PRF) Routing Information
The PRF Routing Chain information is the same for all units under Research Computing:
- Minnesota Supercomputing Institute - DeptID 11062
- University of Minnesota Informatics Institute - DeptID 12165
- U-Spatial - DeptID 12221
PRF routing chain:
- Department Administrator- Brian Carlton <[email protected]>
- Department Head- James Wilgenbusch <[email protected]>
- Research Associate Dean- Michael Oakes <[email protected]>