Cyberinfrastructure Day at the University of Arkansas at Little Rock
The University of Arkansas at Little Rock is hosting CI Day on Friday, May 4 on its campus. This exciting FREE program is intended to showcase state-of-the-art computing technologies available to researchers at the university and across the region.
Visitors to UALR may park in any of the metered parking lots for a fee of $1.00 per hour with a 2-hour limit, or they may park in the parking deck for a fee of $1.00 per exit.
We are currently inviting poster presentations related to cyberinfrastructure.
If you are a student interested in presenting work that has resulted from the use (or study) of cyberinfrastructure (e.g., high-speed networks, high-performance computing resources, large-scale storage, visualization tools), please be sure to register as a "Student Poster Presenter". Prizes will be awarded in the Student Poster Competition.
If you are interested in presenting your work or promoting your organization/product, please be sure to register as a "Poster Presenter".
On-site Registration and Continental Breakfast
Welcome and Opening Remarks
Eric Sandgren, Dean of Engineering and Information Technology, UALR
Patrick Pellicane, Vice Provost of Research, Office of Research and Graduate Studies, UALR
|9:10||Cyberinfrastructure in Arkansas|
“CI-TRAIN Project Overview”, Doug Spearot, Interim Co-Director of Arkansas High Performance Computing Center, University of Arkansas, Fayetteville
“Network Infrastructure in Arkansas”, Mike Abbiatti, Executive Director, Arkansas Research and Education Optical Network
“CI at UALR”, Kenji Yoshigoe, Associate Professor, Department of Computer Science, University of Arkansas at Little Rock
Mini tour of the UALR HPC facility starts at 10:15 AM for interested parties; led by James Joyce, HPC Administrator, University of Arkansas at Little Rock
|10:30||Simulation, Modeling, and Performance Evaluation|
“Dynamical Models of NGC 3124: A Galaxy with an Apparent Counter-Winding Bar-Spiral Hybrid”, Patrick Treuthardt, Postdoctoral Research Fellow, Department of Physics & Astronomy, University of Arkansas at Little Rock
“A Hybrid Monte-Carlo Molecular Modeling Approach to Predict the 3-D Structure of Proteins”, Jerry Darsey, Professor, Department of Chemistry/Department of Applied Sciences, University of Arkansas at Little Rock
“Measuring the Overhead of Intel C++ Concurrent Collections over Threading Building Blocks for Gauss–Jordan Elimination”, Peiyi Tang, Associate Professor, Department of Computer Science, University of Arkansas at Little Rock
Arranged by HP and Dell
|1:00||Student Poster Competition and Networking|
|2:00||Data Analysis and Visualization|
“Visualization of Unstructured Social Network Data”, Edi Tudoreanu, Associate Professor, Department of Information Science, University of Arkansas at Little Rock
“Large-scale Data Mining in Bioinformatics Research”, Jiang Bian, Assistant Professor, Bioinformatics, University of Arkansas for Medical Sciences
Poster Awards and Closing Remarks
|3:15||End of CI Day|
Please register ahead of time for catering purposes.
Abstract: The Cyberinfrastructure for Transformational Scientific Discovery (CI-TRAIN) project is a partnership of institutions of higher education working to transform the practice of information technology services for enabling scientific discovery. The project was founded by institutions in Arkansas and West Virginia in a partnership that builds on common research in nanoscience and geosciences and leverages complementary expertise. This talk describes the mission and activities of the CI-TRAIN project and its achievements over the past three years.
Bio: Dr. Spearot is an Associate Professor in the Department of Mechanical Engineering and a member of the Institute for Nanoscale Materials Science and Engineering at the University of Arkansas. His research focuses on modeling and simulation of nanoscale material behavior and multiscale structure-property relationships. Dr. Spearot was awarded the 2010 NSF CAREER Award to elucidate the nanoscale mechanisms associated with phase nucleation during vapor deposition and the 2007 Ralph E. Powe Junior Faculty Enhancement Award to study plasticity in nanostructured materials. Originally from Bloomfield Hills, Michigan (a suburb of Detroit), Dr. Spearot received his B.S. in Mechanical Engineering from the University of Michigan. He completed his M.S. and Ph.D. degrees in Mechanical Engineering from the Georgia Institute of Technology working with Dr. David McDowell and Dr. Karl Jacob.
Abstract: The Arkansas Research and Education Optical Network (ARE-ON) is a high-speed optical network built to promote research, education, and economic development in the state of Arkansas. ARE-ON has successfully established network infrastructure in Arkansas for inter-campus collaboration. This talk describes the mission and current development of ARE-ON.
Bio: Michael D. Abbiatti, Executive Director of the Arkansas Research and Education Optical Network (ARE-ON), is an active proponent of innovation in teaching and learning. His current focus is linking regional optical networking with a dynamic set of core missions including an aggressive research agenda, a comprehensive e-learning agenda, support for current and future telemedicine/telehealth activities, and a far-reaching emergency preparedness agenda. His professional responsibilities have historically involved providing leadership and support for statewide technology initiatives in higher education. Abbiatti is a member of the core team that created and received a $102M federal grant with colleagues from the University of Arkansas for Medical Sciences (UAMS) and the Arkansas Association of Two-Year Colleges (AATYC) that will extend high-speed networking and computing resources to greatly enhance Arkansas' telemedicine and e-learning capabilities.
Abstract: Ever since the first terascale computer was deployed four years ago, UALR has successfully acquired several high-performance computing (HPC) systems of various sizes and capabilities, and more students and faculty at UALR, along with their collaborators, are using these resources to advance their research. This talk provides an overview of these computational resources and research activities at UALR.
Bio: Kenji Yoshigoe received his Ph.D. degree in Computer Science and Engineering from the College of Engineering at the University of South Florida in 2004, and is currently an Associate Professor of Computer Science at the University of Arkansas at Little Rock (UALR). His research interests are in computer and network systems. He is currently investigating the scalability, reliability, and security of systems including high-performance computing (HPC) clusters, high-speed packet switches, and wireless sensor networks (WSNs). He leads the high-performance computing initiative at UALR and the Arkansas Cyberinfrastructure (CI) initiative at the state level. He is a principal investigator (PI) or Co-PI for various National Science Foundation-funded projects. He has published widely in leading computer networking conferences and journals, and serves as a chair and committee member for many symposia and conferences. He is a member of the ACM and IEEE.
Abstract: The bar component in the spiral galaxy NGC 3124 appears to be a very open spiral pattern winding in the opposite sense of the outer spiral arms. It is clearly observed in the high resolution B, V, R, I, and Ks-band images from the Carnegie-Irvine Galaxy Survey. We present preliminary results of our attempts to recreate the observed gaseous and stellar morphology through test particle simulations.
Bio: Dr. Treuthardt is nearing the end of a 3.5-year postdoctoral research fellowship at UALR under the supervision of Dr. Marc Seigar in the Physics and Astronomy Department. He received his PhD from the University of Alabama in 2007 and held a short-term postdoc at the University of Oulu in Oulu, Finland during 2008. His supervisor at the University of Oulu was Dr. Heikki Salo, and they worked on creating numerical models of the gaseous and stellar components of 23 barred spiral galaxies in order to compare observational and theoretical parameters. Dr. Treuthardt’s research interests include studying the evolution, morphology, and dynamics of galaxies; dark matter halos; and supermassive black holes.
Abstract: Protein structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine (for example, in drug design) and biotechnology (for example, in the design of novel enzymes). Identifying a protein's structure is the key to understanding a protein’s biological function and role in health and disease. It could be said that correctly modeling the 3-D structure of a protein is the “Holy Grail” of computational molecular modeling. Therefore, the ultimate goal for modeling is to be able to predict the native structure of a protein based on nothing more than the sequence of amino acids. There are many reasons why understanding the three-dimensional (3-D) structure of proteins is so vital. Some of the most important are: (1) In the pharmaceutical industry, understanding protein structure holds the prospect of greatly reducing the cost of developing new therapeutic drugs. (2) The development of antiviral drugs is greatly facilitated by knowledge of the structure, shape, and mechanism of target protein molecules. Knowing how a virus gets into a cell is key to obtaining a better understanding of inhibitory drugs to prevent this key part of the viral life cycle. Taking into account the angle ω is very important in predicting the overall 3-D structure of a protein molecule, and it is an angle which is often ignored. Monte-Carlo modeling is the only viable procedure for developing structures which can directly account for non-180° angles for ω. It is also the only modeling technique which can probabilistically bias sampling of the entire conformational space of a protein molecule.
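The probabilistic sampling the abstract describes can be illustrated with a minimal Metropolis Monte-Carlo walk over a single torsion angle ω. Note this is a toy sketch, not Dr. Darsey's actual model: the energy function below is a hypothetical one-term cosine potential chosen only so that the trans (180°) conformation is the global minimum.

```python
import math
import random

def toy_energy(omega_deg):
    """Hypothetical torsional energy: minimum at 180° (trans),
    maximum at 0° (cis). Purely illustrative, not a real force field."""
    return 1.0 + math.cos(math.radians(omega_deg))

def metropolis_sample(steps=5000, temp=1.0, seed=42):
    """Metropolis Monte-Carlo walk over the dihedral angle omega."""
    rng = random.Random(seed)
    omega = rng.uniform(0.0, 360.0)
    energy = toy_energy(omega)
    samples = []
    for _ in range(steps):
        # Propose a small random rotation of the dihedral.
        trial = (omega + rng.uniform(-30.0, 30.0)) % 360.0
        e_trial = toy_energy(trial)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_trial <= energy or rng.random() < math.exp((energy - e_trial) / temp):
            omega, energy = trial, e_trial
        samples.append(omega)
    return samples

samples = metropolis_sample()
# The walk visits all angles but spends most time near the 180° minimum,
# which is how the sampling is "probabilistically biased".
near_trans = sum(1 for w in samples if 120.0 <= w <= 240.0) / len(samples)
print(f"fraction of samples near 180 degrees: {near_trans:.2f}")
```

Because uphill moves are accepted with nonzero probability, the walk can escape local minima while still concentrating samples in low-energy regions of conformational space.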
Bio: Dr. Jerry A. Darsey is a professor in the Department of Chemistry & the Department of Applied Sciences at the University of Arkansas at Little Rock and is an adjunct professor in the Department of Biopharmaceutical Sciences at the University of Arkansas for Medical Sciences in Little Rock, Arkansas. He also consults for the FDA at the National Center for Toxicological Research. His B.S. degree is in Physics and his PhD is in Physical Chemistry, both from Louisiana State University. Dr. Darsey is the author or co-author of approximately 120 manuscripts in the fields of molecular modeling, neural network simulations, conformational modeling of polymers and proteins, and Monte-Carlo modeling, and a book in final revision on applications of neural networks to molecular and biomolecular systems. He has made over 200 presentations at regional, national, and international meetings. He also is inventor or co-inventor on 6 patents issued by the United States Patent and Trademark Office. Dr. Darsey’s research interests are primarily in computer modeling techniques of atomic and molecular systems for the elucidation of their chemical and physical properties. He also works in the area of nanotechnology to model atomic and molecular nanoclusters for developing new materials to enhance hydrogen storage. As either a PI or Co-PI, he has received over $2 million in funding from NASA, the Department of Energy, the Arkansas Science and Technology Authority, the American Chemical Society-Petroleum Research Fund, the Michael J. Fox Foundation for Parkinson’s Disease, and others.
Measuring the Overhead of Intel C++ Concurrent Collections over Threading Building Blocks for Gauss–Jordan Elimination
Peiyi Tang, Associate Professor, Department of Computer Science, University of Arkansas at Little Rock
Abstract: The most efficient way to parallelize computation is to build and evaluate the task graph constrained only by the data dependencies between the tasks. Both Intel's C++ Concurrent Collections (CnC) and Threading Building Blocks (TBB) libraries allow such task-based parallel programming. CnC also adopts the macro data-flow model by providing only single-assignment data objects in its global data space. Although CnC makes parallel programming easier by specifying data-flow dependencies only through single-assignment data objects, its macro data-flow model incurs overhead. Intel's C++ CnC library is implemented on top of its C++ TBB library, so we can measure the overhead of CnC by comparing its performance with that of TBB. In this work, we analyze all three types of data dependencies in the tiled in-place Gauss–Jordan elimination algorithm for the first time. We implement the task-based parallel tiled Gauss–Jordan algorithm in TBB using the data dependencies analyzed and compare its performance with that of the CnC implementation. We find that the overhead of CnC over TBB is only 12%–15% of the TBB time, and CnC can deliver as much as 87%–89% of the TBB performance for Gauss–Jordan elimination, using the optimal tile size.
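For readers unfamiliar with the underlying algorithm, here is a sequential, untiled sketch of in-place Gauss–Jordan elimination in Python. The talk's actual implementations are tiled, task-parallel C++ codes built on CnC and TBB; this sketch only shows the numerical kernel whose tile-level data dependencies those implementations exploit.

```python
def gauss_jordan_solve(a, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting.
    a is an n x n list of lists; b is a length-n list.
    Sequential sketch only: the tiled parallel version partitions the
    matrix into tiles and runs tile updates as dependency-driven tasks."""
    n = len(a)
    # Work on an augmented copy [A | b] so the inputs are not modified.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        p = m[col][col]
        if p == 0:
            raise ValueError("singular matrix")
        m[col] = [v / p for v in m[col]]  # normalize the pivot row
        for r in range(n):                # eliminate the column in ALL other rows
            if r != col and m[r][col] != 0:
                f = m[r][col]
                m[r] = [v - f * pv for v, pv in zip(m[r], m[col])]
    return [row[n] for row in m]

x = gauss_jordan_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])
print(x)  # solves 2x+y=5, x+3y=10 -> [1.0, 3.0]
```

Unlike Gaussian elimination, each pivot step updates rows both above and below the pivot, which is what produces the three kinds of inter-tile data dependencies the abstract refers to.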
Bio: Peiyi Tang graduated with a PhD in Computer Science from the University of Illinois at Urbana-Champaign (UIUC) in 1989, a Master's degree in Computer Science from East China Normal University in 1982, and a Bachelor's degree in Mathematics from Fudan University in 1970. From 1983 to 1989 he was with the Center for Supercomputing Research and Development at UIUC, where his work concentrated on compiler algorithms of dynamic scheduling for large multiprocessor systems like the Cedar system. He was with the Department of Computer Science of the Australian National University from 1989 to 1995 and the Department of Mathematics and Computing of the University of Southern Queensland from 1995 to 2001, and has been with the Department of Computer Science of the University of Arkansas at Little Rock since 2001. His research is in parallel processing and high-performance computing, with a focus on parallelizing compilers, parallel programming languages, parallel programming, distributed coordination and programming, data mining, and parallel processing applications including parallel data mining and parallel mathematical software.
Abstract: This talk describes a method of using visualization to integrate data sources with different information quality characteristics in order to improve their overall quality. Multiple quality dimensions are covered in a case study in which geographically referenced Twitter information is introduced into a geo-spatial environment similar to Google Earth. The information was seamlessly integrated into the environment in such a manner that no additional activity is required from the user to explore Twitter data. Multiple computers, organized in a cloud-like architecture, are employed for real-time Twitter data collection, analysis, and interactive visualization.
Bio: Dr. Tudoreanu's research interests are related to the use of graphics to better understand data and the processes that affect data. His work encompasses areas in information visualization, human-computer interaction, information quality, and virtual reality. Specifically, Dr. Tudoreanu is exploring the use of interactive visual representations to understand changes in dynamic data and assist in debugging complex computations. He is also interested in the impact of advanced interaction techniques on the capacity of a user to gain insight into virtual environments that have a high density of graphical objects, often a result of rendering large, complex data sets. Dr. Tudoreanu has been the keynote speaker at ABSEL 2010, served as a panelist for the National Science Foundation, and received grants from the National Science Foundation, Air Force Research Laboratory, NASA, US Department of Education, and Acxiom Corporation.
Abstract: It is common to find very large datasets in bioinformatics research. Here, we present two very distinct bioinformatics projects that share one common property: the data are so large that they outrun our ability to analyze them on a single computer. The first project studies the human brain connectivity network to understand psychological diseases. We construct functional brain connectivity networks (graphs) from fMRI (functional magnetic resonance imaging) data based on co-activation of brain regions. We then extract topological features of the brain network, such as the degree, the betweenness centrality, and the local clustering coefficient of each node, and use these features to build a Support Vector Machine (SVM) classifier to identify subjects with depression. The second study describes an approach to find drug users and potential adverse events resulting from drug use by analyzing the content of users' Twitter messages (Tweets). Two sets of features (i.e., textual features that construct a specific meaning in the text, and ontological/semantic features that express the existence of semantic properties) are extracted from users' Twitter timelines; these features are used to build two SVM classifiers (the first to find users of a specific drug, the second to identify whether the drug has caused serious side effects). Both studies exhibit common problems encountered when analyzing large datasets. In the brain network study, a human brain functional connectivity network can contain more than 30,000 nodes (vertices) and millions of edges (connections between vertices). In the Twitter mining study, we have 2 billion Tweets collected from May 2009 to October 2010, about 2 TB of raw text files. In both studies, parts of the analysis had to be conducted on a High Performance Computing (HPC) platform (i.e., the Amazon EC2 cloud computing environment and its Elastic MapReduce cluster). We parallelized the tasks as much as possible. However, our experience showed that there is a lack of an easy-to-use HPC-based analytic framework for dealing with large datasets in bioinformatics research.
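Two of the topological features mentioned in the abstract, node degree and the local clustering coefficient, can be computed with a few lines of Python. The five-node graph below is a hypothetical toy example, not the study's data: the actual networks are built from fMRI co-activation and can exceed 30,000 nodes, which is why the analysis needed an HPC platform.

```python
from itertools import combinations

# Toy undirected "connectivity network" (hypothetical example).
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def degree(node):
    return len(adj[node])

def local_clustering(node):
    """Fraction of a node's neighbor pairs that are themselves connected."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

# Per-node feature vectors [degree, clustering]; in the study, features
# like these are aggregated per subject and fed to an SVM classifier.
features = {n: [degree(n), local_clustering(n)] for n in sorted(adj)}
print(features)
```

At scale, each node's features can be computed independently, so this step parallelizes naturally across a cluster; the betweenness centrality mentioned in the abstract is costlier, since it requires shortest-path computations over the whole graph.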
Bio: Jiang Bian, Ph.D., M.S., serves as an Assistant Professor in the Division of Biomedical Informatics at the University of Arkansas for Medical Sciences as well as an Adjunct Graduate Faculty member in the Department of Computer Science at the University of Arkansas at Little Rock. He earned his Ph.D. degree in Integrated Computing and an M.S. degree in Computer Science, both at the University of Arkansas at Little Rock. Dr. Bian has an active role in bioinformatics research, especially on electronic medical records (EMRs) and medical data sharing. He has led the development of Arkansas's Trauma Image Repository, which creates a statewide imaging repository to allow sharing of critical images between unaffiliated emergency departments. His research interests include computational neuroscience, secure health information exchange, patient privacy preservation, and large-scale machine learning and data mining techniques applied in biomedical fields using High Performance Computing (HPC). He has numerous publications on secure communication, secure distributed file systems, medical image archiving and sharing, mining social networks for public health information, and studying brain connectivity networks to correlate brain states with diseases.